Steam’s PC Survey: The Future Is Almost Now

It’s always fun to have a nose at where PC gaming in general is at these days, and Valve’s regular hardware survey can give a pretty good picture of that. From the latest results, it’s pretty much as you’d expect: the vast majority of gamers are running Core i7 (and not even the Extreme Edition) systems with just 12GB of RAM and only three SLIed GeForce GTX 295s. Which is a dispiritingly low-end spec for the average system, suggesting the recession is hitting gamers much harder than anticipated. PC gaming is indeed doomed.

Ahahahaha, etc. In fact, one thing these surveys do reveal is how little take-up there is on the ludicrously high-end components that the tech sector ballyhoos about: only 2.40% of systems on the Steam survey have multiple GPUs, for instance. SLI/Crossfire may be neat marketing tools, but in the real world they’re just not something that the vast majority of gamers really want or even need. A cheap PC can do so much now, and we should never forget that.

Good to see that dual-core systems are now the norm, however, as that really does make a difference to a lot of gaming and everyday performance alike. Meanwhile, the number of quad-core CPUs out there is almost level with the number of single-core systems. Times have definitely changed in processorworld, though Intel’s near-40-point lead over AMD in chip ownership is a little depressing in competitive-market terms.

Similarly, NVIDIA is way ahead of ATI for graphics card adoption, at 63% to 29%. A little surprised by that, as a lot of ATI’s recent cards have been astonishingly good bang-per-buck value; I don’t give a rat’s ass who’s in the lead, but would have thought gamers would flock to the cheaper option. Also surprising is DirectX 10 take-up: despite its whipping-boy legacy, and even though we haven’t had an upgrade-mania-inciting game since Crysis, there are more DX10 cards out there than anything else by quite a margin now.

This is despite the most common OS remaining the resolutely non-DX10 (without hacks, at least) Windows XP, with Vista and 7 quite some distance behind. Though it’s interesting to note that Windows 7 64-bit is pretty hot on Vista 32’s heels, and quite some distance ahead of both Vista and 7 32-bit. So gamers are interested in 7, and word seems to be out that we really should all be switching to 64-bit OSes now.

Those are the broad strokes, anyway – finer detail here, if you’re interested.

147 Comments

  1. Concept says:

    Is it bad that I stay with AMD just for my Athlon 64 days?

    • Jerricho says:

      No, that’s perfectly acceptable. I’m still running an AM2 5.2. I still have my old Cyrix 5x86 knocking around somewhere… the little chip that could.

    • dancingcrab says:

      CYRIX!!!

      I love you AMD. I went with the (at the time) top of the line Phenom II chip and an ATI card. Just to stick it to Intel and Nvidia.

    • Malibu Stacey says:

      I bet everyone on the Intel & NVidia boards is crying into their cornflakes every morning because of you.

  2. Stense says:

    I recently entered the shiny electro world of multi-core CPUs, a quadified one in fact. So allow me to just say: hooray, I’m finally part of progress!

  3. MD says:

    How is the survey conducted? As in, do they collect data automatically from all users, or is it an actual opt-in survey, or automatic but opt-in only, or some other method?

    • CMaster says:

      @MD
      Automatic opt-in

      Every so often when loading up Steam, Valve asks you to participate. You click “yes” and it collects data and whams it off to Valve, after asking you a question or two about microphones and speakers. So no, the data isn’t perfect, but it’s the best you’re going to get for PC gamers’ hardware.

    • Christian says:

      Steam asks you now and then if you want to participate, and if you agree it also asks you some questions about bandwidth and audio and then sends the results, linking you to the overview page.

      So hopefully they only collect such data when you agree to share it.

      Valve does collect lots of gameplay-related data without asking, though… I especially like the ‘deathmaps’ for TF2, showing you where most people died.

    • Wisq says:

      They also offer “don’t know” options for all their questions, up to and including “do you have a microphone” (!!), meaning they’re not self-selecting for the more technically oriented users.

  4. CMaster says:

    @Alec Meer
    Funny thing is, I’ve noticed gamers getting more and more calcified towards nVidia. There’s this perception of ATI as the big corporate evil (which is odd, as A: they are now owned by AMD, who gamers perceive as the plucky underdog, and B: all these electronics companies are massive corporations with insane revenues – and insane costs) and that nVidia is the better one to have. This, combined with the huge amount of press the GeForce 8800GT got for good value when it came out, seems to mean a lot of gamers always go nVidia. This is despite AMD making a lot of much better value cards recently.

    That said, my current HD4850 seems to have an occasional overheating problem, so I’m very much considering an nVidia for my next update.

    • Monchberter says:

      The 8800GT is/was a fantastic card and I will frame mine once its useful life is over; it now resides in my just-built media PC and is doing a fantastic job of pumping out such intensive games as GTA4, Dirt 2 and Mirror’s Edge with not much difficulty.

      I’ve actually moved on to a cheap DX11 card, the ATI 5750 and it’s pretty much flawless in running Crysis at 1080p. I’m happy with my mid-low end set up and savings.

    • Jerricho says:

      I’m still running Crysis on my single 8800GTX. It’s a great card and still does a good job of it. There’s an ATI 4850 in my laptop and that just about manages to keep up with DoW2 and Borderlands… but without the extra shiny.
      The 8800 will have to melt before I consider buying something new, and I’d probably lean towards nVidia. That said, way back when, I used to run ATI cards all the time (the ATI Rage Fury II was a great card).

    • Carra says:

      Yes, the 8800 GT was a great buy. It’s running new games such as Crysis, Dragon Age, Mirror’s Edge and Call of Duty: Modern Warfare 2 at 1680 x 1050 with quite a few graphical bells and whistles on.

      So I see no reason to upgrade my 8800 GT.

    • Wisq says:

      My main reason for being a stalwart nVidia user is Linux compatibility. Sure, I have to use their proprietary driver, but Linux distros have been very good about making it easy to do so as of late, and it’s way better than either the proprietary or open source ATI drivers.

      However, in my experience, the 8800 GTS (note the ‘S’) 320MB was a bust. It was a high-speed, low-VRAM (relative to the speed) card I purchased at the peak of games’ need for speed, just before they started needing more VRAM again.

      Incompatible with the motherboard I initially tried to use it with. Too much power drain on a single rail for my original 600W (noise-oriented) PSU. Too much heat for my well-ventilated case. Eventually cooked its own VRAM to death.

      Granted, mine was factory OCed, which was a whole new level of foolhardiness. Here’s a card that has roughly the same GPU as the 8800 GTX, and about half the VRAM, and yet someone decides that whereas the GTX apparently needs two PCIe power connectors, the GTS can get by with only one. And then several manufacturers decide to overclock it on top of that. (And then I decide to buy it, but it’s not my job to know all this ahead of time, right?)

      I love the “lifetime warranty” trend amongst the big video card makers, though. (OCZ, XFX, BFG, and particularly EVGA.) When my card cooked itself, at a time when I had pretty much no money to spare, it took only $30 for me to ship my old card in (cross-border), and within a couple of days of it getting there, have a brand new one in my hands via FedEx. I realise these companies probably finance the lifetime warranty program via all the people who don’t take advantage of it, because the card goes obsolete before it goes bad, but that just means QA is doing as good a job as they need to — and it’s a lifesaver when you do need it.

    • Kelron says:

      @CMaster

      I think ATI’s reputation was badly damaged by the 2xxx and 3xxx cards; Nvidia were offering considerably better cards for so long that it’s become the norm to expect ATI’s offering to be poor. It also seems to have skewed the balance of games optimised for Nvidia or ATI cards: where previously it was roughly 50/50, there now seem to be a lot more games that have performance issues with ATI.

      I’ve always bought whatever offers best value for money, but if many games aren’t running well with ATI cards then you have to take that into account, even if the hardware itself is good.

    • CMaster says:

      Note that I wasn’t bashing the 8800 series at all. The 8800GT was indeed exceptional value when released (perhaps to make up for the pretty overpriced and poorly future-proofed 8800GTS?). However, people continued speccing it and buying it as a “good buy” for a long time after ATI heavily undercut it. Much like Intel’s Q6600 processor, its reputation for good value meant that it stuck at much the same price while other models dropped sufficiently to undercut it.

    • Casimir's Blake says:

      I had one of those bloody GTS cards as well, finally replaced with a Radeon 4850. Crysis sucked on those 320MB 8800GTS cards, but then Nvidia has a habit of making as much shit as good cards. Oh and yes, Crysis was the only game that caused it to overheat – after a while I could clearly see rendering defects!

    • Monchberter says:

      If we’re talking ‘herd mentality’, my last major build was at the time when the 8800GT and Intel Q6600 offered unparalleled value and I went with them. I can see them lingering in my two setups as components for a good while yet.

    • Starky says:

      I also got my Q6600 and 8800GT when they were beyond exceptional value. The Q6600 was £115 (it’s £150 to buy now) and took the stock 2.4GHz easily up to 3GHz. I had it running stable at 3.4 but didn’t want to run it that hot (74°C at full load, compared to the 60°C full load I get now), or push my FSB that high, though the mobo (MSI Neo2 FR) could probably handle it.

      A £115 chip, and a £25 heatsink, gave me the equivalent speed of a stock £500 Quad Extreme model.

      The 8800GT (G92 512MB) was a stunner too. I found one with a 16% factory overclock (600MHz stock to 700MHz – Inno3D iChill) and an aftermarket cooler for £115 – the same price as most stock models.
      It offered speeds respectably close to the stinkingly overpriced 8800 Ultra, matched the G92 8800GTS 512MB for £100 less, and utterly smoked anything that ATI could offer at the time.

      That was June 2008, and it is still my main machine. I’ve been sitting on several hundred quid for over 6 months now as my “next build” money and, except for dipping in to buy a 1TB HDD, I find no compelling reason to upgrade.
      A budget build from 18 months ago that still offers remarkable gaming performance on bleeding-edge titles is simply unheard of in my entire history of PC gaming.

      I still don’t think I’ll bother to build that new system for another 6 months; the cost/gain balance simply isn’t there yet despite having the money ready.
      I might upgrade to a DX11 card soon, given it will carry forward to my new platform (mobo/RAM/CPU) without any problems.
      Intel’s i5 looked promising for a while, but the benchmarks just don’t offer enough incentive to fork out £400 on a new core system upgrade – the Q6600 @ 3GHz still holds its own against them – and everything from AMD, while outstanding value, doesn’t really offer any more speed than I currently have.
      If I was upgrading from some old single-core machine I’d go AMD quad Phenom II without a doubt, but going from a fast Intel quad to an AMD quad that might be a tad faster and save a few watts of power is a waste of time.

      ATI’s great bang-for-your-buck offers came too late to the party and weren’t enough of an improvement to really justify an upgrade for most 8-series nVidia owners. I think they firmly realised this, which is why they clearly made an effort to get a good-value DX11 card on the market first.
      Almost everyone I know is intending their next card to be a 5xx0-series card. Unless Nvidia pull something stunning out of the box, which I fully suspect they will soon.

    • Muzman says:

      Seeing the difference in the way Mirror’s Edge runs (and looks) on nVidia I’m wondering if all the power and price in the world won’t let ATI back in if they don’t have something like PhysX.
      Do they?

  5. Ian says:

    I basically whip my PC until it can’t give any more. When I buy a new game and can’t even run it on the lowest settings, that’s when I upgrade something.

  6. Mithrandir0x says:

    At least, it’s now possible to run SLI on motherboards without official SLI support:

    link to xdevs.com

  7. Marar says:

    I’m shocked that 3-core CPUs have only 0.95%. I spent a little under 100 euros on mine, and the performance is just mind-blowing for the money invested.

  8. Heliosicle says:

    Well, I’ve had the same hardware for 3 years, so I deserve an upgrade from a 256MB card. I’m behind the times :(

  9. Vague-rant says:

    Sounds about right. SLI/crossfire has always seemed a bit too fiddly for a decent number of people to buy into it, especially since the performance gains aren’t as high as I would like considering the extra cost.

    I would also like to stress how much gaming you can do on a weak (also read cheap) system. I just bought a laptop with an ATI 4570 and whilst considered the low-middle end of graphics cards (at best) it seems to be churning out a pretty decent fps and decently pretty pictures.

    About the whole ATI vs Nvidia thing… I’m going to guess Nvidia is still in the lead due to the success of the 8800 and variants; after all, it is the most popular DX10 card. Nonetheless, the fact that DX10 systems tend to have the ATI 4800 might suggest a shift in popularity?

    • Babs says:

      While I wouldn’t buy 2 cards straight up, to ignore SLi/Crossfire means you miss a nice little upgrade path.

      Basically just buy a really good single card (I recently got an HD5850) and then later on you can pick up a second one for a lot less money than buying a new high-end card and get an 80%-90% performance boost.

    • jalf says:

      @Babs: Everyone says that. No one ever does it. By the time you’re thinking about buying the second card, it just isn’t worth it. SLI is so fiddly with which cards can be matched, so you have to buy another mostly obsolete card, just to get performance that’s not quite as good as if you just bought a new GPU and ditched the old one. It’s just not worth it.

    • vagabond says:

      As far as I am aware, there has never been a situation where purchasing two of any given card has had a significantly better price/performance ratio than a single card of the equivalent model in the generation above it.

      Waiting 12 months or more to buy the second card risks that you can no longer find the exact card you need to SLI anymore, and by then the generation above has come down in price anyway.
      As far as I can tell, the only reason to SLI anything is at the very top end where the generation above doesn’t exist yet, and since there’s hardly anything to tax one high-end card, having two in your system seems solely about the size of your e-peen.

      I am curious as to whether the SLI stats count things like the GTX295 (which I am under the impression is actually 2 GPUs with SLI built into the card) as SLI, in which case the uptake of “I bought two separate graphics cards and connected them together” is even lower than the figures suggest.

  10. gaspardo says:

    Not too many surprises there, I’d say, but only because the results pretty much sum up my own situation as a non tech-savvy “average consumer”.

    I can only speak for myself, but I’d have a hard time switching to ATI, for example, regardless of quality or value for money, as the only time I see them mentioned in various games’ tech support forums is when they aren’t compatible/supported at launch (not to say Nvidia cards don’t have the exact same problems; it just makes the switch that much more difficult). And that’s pretty much the biggest turn-off ever, game-wise.

    Whether Nvidia’s lead has to do with business interests or industry take-up, I couldn’t say, but I might investigate ATI cards further on principle alone following this survey.

  11. Monchberter says:

    Remember 7 comes in both 32 and 64 bit flavours in one box, so you can choose. I guess most desktops are running 64bit while laptops and netbooks account for the 32bit option.

  12. Finstern says:

    Guess people don’t want to be dependent on the absolutely horrendous ATI graphic drivers.

    I had so many problems with drivers for all my ATI cards back in the day that now I wouldn’t put one in my rig even if they paid me to.

    • Jerricho says:

      This was always the problem. As much as I loved my ragefury, ATI made great cards with awful drivers. Getting it into that sweet spot of functionality was a moment of joy and was largely the cause of many of my replays of Deus Ex.

    • Babs says:

      Everybody always says this but I’ve used ATi cards for 6-7 years without any driver problems at all. In fact the only card I’ve ever had that didn’t work properly was an Nvidia card, but since that was an original GeForce I don’t really hold it against them.

      And haven’t Nvidia released a string of really duff drivers recently forcing people to stay with version 181?

    • invisiblejesus says:

      ATI drivers have been fine for years now. They haven’t always been entirely without problems, but I’m not aware of them having any recent problems on the same level with Nvidia’s issues with Age of Conan and Left 4 Dead when those games were released. The whole myth about bad ATI drivers is so far in the past I’m surprised people still mention it.

    • goodgimp says:

      ATI drivers are NOT fine. I ripped out my 4870 in disgust and switched back to nVidia thanks to a driver corruption issue that forced me to reinstall my entire OS, since nothing could fix the problem (I ran every driver removal utility out there, to no avail).

      ATI also has major problems with scaling on HDTVs; the screens are always underscanned. This can be corrected by overscanning in the Catalyst Control Center, but guess what? My friend’s CCC absolutely refuses to work anymore on his Win7 x64 machine. No amount of uninstalls / reinstalls / tweaks has got it fixed, so he’s perpetually left with an obnoxiously wide black border around his display. This can probably be solved… by a reinstall of the OS. Probably. He would switch to nVidia as well, but unfortunately for him he’s running a laptop.

      Another one of my friends also has constant battles with ATI drivers. He’s also looking to switch. Look, I love ATI, I used their cards for years and years, but they still have massive driver issues for a lot of people. Out of the 3 friends of mine who are ATI users (all power users who know what they’re doing on a PC), all 3 are switching or have switched to nVidia due to driver hassles.

      Until ATI gets that fixed, it doesn’t matter how good their hardware is. They’re not gonna catch nVidia with such poor quality drivers.

    • Farewell says:

      I switched from ATI to Nvidia about 13 years ago (starting with the Riva 128), and have stuck with them since.
      The only good reason I’ve stayed this path is the drivers: After having to deal with the steaming pile of software called Catalyst (on a computer at work) I don’t really want anything to do with ATI cards ever again.

      The only thing that could make me reconsider is if ATI can deliver quiet cards at competitive price and performance – as much as I love my Geforce 9800, I wish it would be less noisy all the time..

    • Vitamin Powered says:

      As a strange historical sidenote, approximately 25% of the issues people had in the early days of Windows Vista were due to buggy nVidia drivers. I do remember the madness of trying to maintain my new 8800GTX under Vista64. Highlights included STALKER, and Call of Juarez’s “Sorry, no plans for 64bit support”. The latter still makes me cringe, as they taught me in first year uni how to program for bit-level independence.

      *ahem* If anybody wants me I’ll be on my comfy couch by the fire, smoking my pipe, wearing a cardigan, writing a letter to the local about the youth of today.

    • Psyk says:

      “as much as I love my Geforce 9800, I wish it would be less noisy all the time..”

      Amen to that, thinking of water cooling mine when I can.

  13. Dantokk says:

    ATI needs to work on their drivers on Windows, and on Linux it’s not even close.

  14. Generico says:

    ATi’s problem is that developers are simply not optimizing their games for ATi hardware. It doesn’t really matter how awesome your GPU is if nobody is writing their game to take advantage of your card’s specific capabilities. Without that, the benchmarks just don’t show how superior ATi’s hardware actually is, and a lot of people use those benchmarks to make their buying decisions. ATi/AMD needs to ramp up their marketing (specifically toward developers) if they intend to provide legitimate competition for Intel and nVidia.

    • archonsod says:

      Nobody is optimising for Nvidia either. The difference is, the Nvidia driver is nice and robust and happy to work with virtually anything you chuck at it, whether an obscure shader implementation or a non-Windows OS. ATI drivers, on the other hand, are tetchy and prone to throw a fit if there’s anything out of place. I wouldn’t be surprised if optimising for an ATI card caused problems for non-ATI cards, either.

      You can get around it by using third-party drivers, but people tend to shy away from unofficial drivers, even if in this case they tend to be better than the real thing. Plus, I guess with so little difference between the top-end cards in terms of providing a playable experience, trawling for a decent driver is added hassle most people don’t want.

    • Skinlo says:

      Lots of developers optimise their games for Nvidia, hence the whole ‘The Way It’s Meant To Be Played’ thing you see at the beginning of lots of games. Nvidia pays the developers lots of money to come and optimise games for Nvidia cards, which is why they often work better on Nvidia cards than ATI cards.

    • archonsod says:

      That just means they sponsored the game, not that it’s been optimised for their hardware specifically. Last game I can recall that they did get the dev to optimise for was Morrowind. These days it’s easier and cheaper for Nvidia to create an optimised driver themselves.

    • Optimaximal says:

      “That just means they sponsored the game, not that it’s been optimised for their hardware specifically.”

      So, Nvidia chucked a load of money at the developer in order to get their name on the box/splash screens and they aren’t going to favour them?

      Also, they do have sway – Assassin’s Creed was a TWIMTBP release that ran better in DirectX 10 on ATI hardware (which supported 10.1), so Nvidia made them patch 10.1 support out because it was making their cards look bad. Well, officially it was for ‘compatibility reasons’, but it’s too convenient.

  15. DavidK says:

    Is it interesting that there are twice as many 64-bit Windows 7 users as there are 32-bit Windows 7 users? Seems odd to me.

    • Monchberter says:

      See my answer above. Plus, almost any reasonably specced PC is able to take advantage of 64-bit these days.

    • Malibu Stacey says:

      Windows 7 installation media include both x86 & x64 and install whichever one is applicable for your hardware. Since the majority of PCs sold in the last 5 years will be x64-compatible, it will more often than not pick the x64 version of the OS automagically. Not sure how or even if you can force the x86 version to install on x64 hardware, but there’s no benefit to doing so & plenty of reasons not to.

      Interestingly Windows Server 2008 R2 doesn’t have an x86 version (unless you’ve an Itanium CPU which is pretty unlikely) so I wouldn’t be surprised if the next version of desktop Windows also drops x86 hardware support.

  16. Supertonic says:

    Possible reason for the NVIDIA popularity: on Linux, Wine gaming is best done with an NVIDIA card; AMD aren’t quite there yet with their drivers. Given that Valve’s games are known for working well on Linux/Wine and are thus quite popular in the Linux community, it’s no great leap to see that the prevalence of NVIDIA cards among Linux users extends into Valve’s userbase.

    • ascagnel says:

      As an avid Linux/Mac/*NIX OS fan, I can say with total certainty that the number of people playing with WINE is well under 1% of the total audience of a game. nVidia has always had better Linux drivers than ATI (although I haven’t checked out the cleanroom ATI drivers, since I haven’t had an ATI card since the 90s).

  17. Supertonic says:

    Btw, for those who are interested in such things: ATI’s proprietary Linux drivers are much better than they were, but still not on the same level as NVIDIA’s. However, the open ones are flying along at an awesome pace. In the last year they’ve gone from unusable to the current state with kernel 2.6.32, where they outperform ATI’s own drivers on 2D and are finally making headway on 3D. Within 12 months I predict ATI will be the place to be on Linux.

    Ubuntu users wanting to try the open drivers:
    link to kernel.ubuntu.com
    Grab linux-headers (all)
    Grab either linux-headers i386 or x64
    Grab linux-image i386 or x64
    Install in that order
    Add the xorg-edgers repo
    Done
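
    For anyone who’d rather script that sequence, here’s a rough Python sketch of the same steps. The .deb filenames and the PPA path are placeholders/assumptions (use whatever you actually downloaded from the kernel PPA page above), so treat it as an illustration of the order rather than a copy-paste installer.

    ```python
    #!/usr/bin/env python3
    """Sketch of the mainline-kernel + xorg-edgers steps described above.
    The .deb filenames below are hypothetical -- substitute the versions
    you actually grabbed from the kernel PPA page."""
    import subprocess

    # Order matters: headers (all), then arch-specific headers, then the image.
    debs = [
        "linux-headers-2.6.32-999_all.deb",    # hypothetical filename
        "linux-headers-2.6.32-999_amd64.deb",  # or the i386 build
        "linux-image-2.6.32-999_amd64.deb",    # or the i386 build
    ]

    for deb in debs:
        subprocess.run(["sudo", "dpkg", "-i", deb], check=True)

    # Then add the xorg-edgers repo for the bleeding-edge open drivers.
    subprocess.run(["sudo", "add-apt-repository", "ppa:xorg-edgers/ppa"], check=True)
    subprocess.run(["sudo", "apt-get", "update"], check=True)
    ```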

  18. Pessimal says:

    Because the ATI Catalyst drivers are SHIT.

  19. The Dark One says:

    ATI has made great strides on Linux. Dantokk must’ve smoked some bad granola.

    • Sam says:

      Yes, ATI has made great strides on Linux.

      They’re still lagging behind nVidia, however, and behind their Windows drivers also. And, as people have mentioned, their Windows drivers don’t compare well with nVidia’s, either.

      Improving crap drivers doesn’t necessarily make them brilliant; it can just end up making them a bit less crap.

  20. Lucas says:

    I’m curious about whether gamers are buying SSDs, and how well they work out. HDD access times and throughput are one of the longest standing bottlenecks, and even with 3GB/s+ interfaces the spinning hardware just isn’t there to use it yet. The anecdotes are interesting at least. Steam doesn’t seem to have numbers on these.

    Someone slapped one in a PS3, which is designed for slow spinning media anyway so the games have hugely optimized loading layouts on disc, and reported next to no improvement (possibly asset decompression is the bottleneck, rather than transfer rates).

    PC games on the other hand often have many small files and random access patterns because they expect to be on an HDD instead of playing from disc.

    Even better would be a setup with the flash memory on a PCIe card (no old HDD interface), which some companies are doing, but they aren’t bootable yet, and Windows can’t split its install or system files across more than one drive/partition (afaik). Still expensive, but crazy fast.

    I was very happy with my 2005 upgrade to an all-SATA box, and will have to do some investigating when I look into a new gaming rig (probably soon). A mix of the different drive technologies could provide a robust spectrum of price/performance options in one system if it’s worth going that far.

    I did notice the survey’s most popular display ratio is 5:4 (1280×1024), which is pretty funny. Here’s hoping the widescreen fad expires soon. Good 4:3 screens are getting hard to come by.

    • Lucas says:

      Oops, SATA-II must be 3Gb/s (not GB/s). I guess the 6Gb/s version is still going to be overkill for the average setup.
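
      For anyone wondering what that lowercase b works out to in practice, here’s a quick back-of-the-envelope conversion, assuming SATA’s usual 8b/10b line encoding (10 bits on the wire per byte of data):

      ```python
      # 3 Gb/s (SATA II) in actual payload bytes, assuming 8b/10b encoding
      # (10 line bits per data byte), so divide the line rate by 10.
      line_rate_bps = 3e9
      payload_mb_per_s = line_rate_bps / 10 / 1e6
      print(f"~{payload_mb_per_s:.0f} MB/s usable")  # ~300 MB/s, nowhere near 3 GB/s
      ```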

    • Sam says:

      So, the advantages of SSDs, as you mention, are primarily in seek speed at present (mainly because an expensive HDD can just about max out the 3Gb/s burst transfer bandwidth of the most popular current interface). Since a lot of files will hopefully be precached by the OS in memory (given that the fraction of free memory available for cache on a modern PC is quite large most of the time), the major effect will probably be to speed up initial loading of the game (which Windows Vista/7 can do automatically if you give them some solid-state storage to precache popular applications on – you don’t even need to install to the medium), and then to speed up loading of new content when it’s first used.
      At present, I suspect the price/performance ratio isn’t quite there yet for this advantage. I’m about to evaluate SSDs for my work, and even there we’re not convinced it’s obvious they’re worth the money yet.
      I’d say they’ve got another iteration of Moore’s Law for price/capacity before they start displacing HDDs in mainstream computers, even for a lot of “extreme gamers”.

    • CMaster says:

      I’ve got a VelociRaptor in my machine (actually, it’s defragging at home as we speak). The difference in startup/loading times is there and real, but far from huge, and I’m not sure I can say it was worth the cost. Interestingly, in the past week I have gone from 2 to 1 to 3.25GB of RAM (32-bit OS). Now that made a notable difference to boot time (going from 2 to 1 was noticeably slower) and certain game startup times (all transitions).

      Of course, I have for a good few years now carefully looked at benchmarks to ensure I was buying a fast HDD, not just a big one, so perhaps those with slower HDDs would notice it more.

      I will say that a slow HDD is, as far as I can tell, one of the biggest reasons for poor performance in everyday computing on the typical household laptop. And yet it’s something that most people don’t give even a moment’s thought to.

    • Babs says:

      I got an SSD with my new system and I absolutely love it. Windows 7 boots unbelievably quickly and shuts down immediately. Apps start instantly (though I am aware that this will also happen with prefetching on).

      I’ve got a small one, 64 gig, so all my Steam games have to be on the normal disk, but I’ve played Men of War, Fallout 3, Mirror’s Edge, Crysis and SotS from the SSD. I can definitely notice the difference, especially in games with frequent reloading or area changes like MoW and Fallout. The first time I played a game off the disk drive after playing from the SSD I thought ‘Why the hell is this taking so long?’, and then I realised :)

    • Babs says:

      Also regarding SSDs, they are silent. It’s now very, very obvious when my disc drive spins up.

    • Carra says:

      An article on Coding Horror went into the benefits of having an SSD. It almost convinced me to buy one:
      link to codinghorror.com

    • Malibu Stacey says:

      Lucas, putting an SSD in a PS3 wouldn’t give any benefits at all. The bottleneck in PlayStations has always been the amount of RAM (PS3s have 256 MB of system RAM & another 256 MB of video RAM).

      Steam will never have the numbers on SSDs vs HDDs. To Windows, it’s a storage device therefore to Steam it’s a storage device.

      I’ve looked at replacing HDDs with SSDs here at work to speed up compile times but the cost isn’t really worth the benefit right now. We have 100+ projects in a solution, on an average dev’s laptop with the HDD regularly defragmented (automatically using O&O defrag) it takes about 15 minutes to build. Using the SSD it took about 10 minutes to build. This was using the same machine, simply swapping the 2.5 inch SATA II drive out between tests. When comparing that to my desktop which has 2 SATA II drives in RAID 0 it’s even less competitive.
      Considering compiling is a very disk access intensive operation (both reading and writing concurrently) I would agree with Sam that SSDs need more time before they become a viable replacement for HDDs with regards to cost vs performance.

    • Wisq says:

      The main benefits of SSDs for me so far have been the ruggedness, and obviously only in the netbook/laptop form factor. With enough RAM and (in Windows) a good defrag regimen — not just defragmentation, but also file reordering based on usage patterns — a fast HDD can do a pretty good job, and doesn’t suffer from the size limitations, high price, and slow write speed of an SSD.

      Also, the article mentions an endorsement from Linus Torvalds. Keep in mind, his main usage pattern is almost certainly compiling the Linux kernel. It’s a huge assortment of relatively small files, and it probably won’t all stay in cache (amidst all the compiler activity and output writing) unless you’ve got a ton of RAM, so an SSD would probably have a disproportionate benefit there. Game content, on the other hand, tends to be a bunch of small stuff read on launch, and then large chunks of content read during load times, where older SSDs aren’t going to help as much.

      Of course, if the newer ones have dramatically increased the bulk read rate as well, then yeah, we’re looking at an improvement in all respects, and I’m glad we’re finally starting to see some SSDs that are actually outperforming HDDs. There was quite a while there where the whole “better, faster” thing was pure marketing — up to and including 16 months ago, when I assembled my current gaming PC and researched the matter. Now it’s finally becoming true.

    • goodgimp says:

      I’ve got an SSD for my system partition and it makes a difference over a 7200rpm HDD. If you’re running a VelociRaptor, it’s probably about comparable.

      Some people would like you to believe (and I’ve heard this phrase tossed about) that SSD is like going from dialup to broadband. It’s not. But it is like upgrading your broadband connection from 3 megabit to 7 megabit. It’s an improvement without a doubt, but it’s not fap-worthy.

    • manveruppd says:

      @ Goodgimp: You’re absolutely right. Everyone knows that NO HARDWARE UPGRADE is, as you call it, “fap-worthy”.

      Unless of course it’s a Geforce 6800 Ultra, but everyone knows that

  21. mbp says:

    Even though I consider myself a serious gamer my current gaming PC is almost FIVE YEARS OLD and I am happily playing Dragon Age on very high graphics at the maximum resolution of my monitor (1600*1050). I admit to having upgraded a few components: extra ram and several graphics card replacements and I did eventually upgrade my single core Athlon 64 with a dual core. I have also been pretty meticulous at maintaining the machine. This type of longevity would have been unthinkable a decade ago when Moore’s law was still in force and a machine had a useful gaming life of about 18 months.

    • TheApologist says:

      Yep – my PC is well over 3 years old now and I haven’t needed to touch it. Only having 2GB of RAM is getting to be a bit of an issue, but I leapt in back then with a quad core and an 8800GTX, and it runs everything to this day.

      My longest serving constant hardware set up (not counting the PSU that died about 18 months ago) by far.

      Considering Windows 7 now though…

    • Malibu Stacey says:

      Read his post again. Extra RAM, “several graphics card replacements” and a CPU upgrade doesn’t equal a 5 year old PC. Unless he specifically bought 5 year old parts to upgrade his machine with which I very much doubt.

    • mbp says:

      @Malibu, actually the motherboard is the original, and the RAM and CPU are all models that I could have gotten in 2005 but didn’t because they were expensive at the time. The only component that is definitely not 2005 vintage is the graphics card, which is an ATI 4850 bought in late 2008. My sound card is actually a 2002-model SB Audigy, which still outperforms onboard sound.

    • jackflash says:

      You realize this basically coincides with the console “revolution” – current generation came out around five years ago, basically locking in hardware requirements in most cases. There’s a reason Dragon Age runs nicely on old hardware – it’s also a 360 title. This has nothing to do with Moore’s law, and everything to do with Microsoft and Sony.

    • Starky says:

      Moore’s law is still fully in force.

      The number of transistors on a cost-effective integrated circuit is still doubling every 18 months.
      That does not equal a doubling of CPU speed, though; realistically, doubling the number of transistors might see a 30% increase in processing power (rough numbers worked through below).
      Hell, I think we actually jumped a little ahead of Moore’s law with quad cores and the 45nm process.

      Every time it seems like we’re going to fail to live up to Moore’s law, some paradigm shift (e.g. from GHz to parallelism) or advancement ensures we make it.
      We’re on track to keep achieving Moore’s law for a good decade yet.
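
      To put rough figures on that, here’s the compounding worked through – just extrapolating the numbers quoted above (a doubling every 18 months, ~30% more speed per doubling), not official roadmap data:

      ```python
      # Compounding the figures above over a typical upgrade gap: transistor counts
      # double every 18 months, but each doubling only buys ~30% more real speed.
      months = 60                     # e.g. a five-year-old gaming PC
      doublings = months / 18         # ~3.3 doublings
      transistors = 2 ** doublings    # ~10x the transistors
      speed = 1.3 ** doublings        # but only ~2.4x the speed
      print(f"{doublings:.1f} doublings: {transistors:.1f}x transistors, {speed:.1f}x speed")
      ```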

    • Nick says:

      “There’s a reason Dragon Age runs nicely on old hardware – it’s also a 360 title”

      Nah, it’s more to do with the fact that it *was* a PC-exclusive game and in development for a very long time.

  22. Rich says:

    “….resolutely non-DX10 (without hacks, at least) Windows XP”
    A hack I have subsequently found. I plan to have a go at installing DX10 on my machine later.

    Also, I wish I could even plug a multi-core CPU into my archaic mobo. I’m doomed to keep my Socket 754 board for another year at least.

  23. Carra says:

    I don’t care about manufacturers either. He who provides me the fastest electronics for the cheapest price will get my buy. For my newest PC I bought a Geforce 8800 GT which was king back then and the PC before that had a Radeon 9800 which was the best value at the time. That Geforce card still crunches the numbers from everything I throw at it so I’ll wait a bit longer to upgrade. I’m sure lots of people in that 11% are doing the same.

    I’m a bit surprised by the relatively low resolutions. I’ve been using 1680 x 1050 for two years now, yet the average resolution seems to be 1280 x 1024. Still, about 30% are using 1680 x 1050 or higher, and quite a big share of users are on widescreens these days.

    • skalpadda says:

      I’m surprised as well, especially since monitors are getting fairly cheap now and investing in a larger widescreen monitor was most definitely the thing that made the biggest difference the last time I upgraded my PC.

    • Plinglebob says:

      The likely reason the resolutions are low is that a lot of pre-built systems only come with either 17 or 19 inch monitors. I know the last two I bought (19 inch widescreen) have a maximum of 1440×900, and the 17 inch’s maximum was lower (1280×1024 sounds about right). Also, monitors last longer than the system usually does, so most people only replace a monitor when it breaks, not when they upgrade.

    • pirate0r says:

      Those 1280×1024 monitors are old – think 19″ non-widescreen old. From my experience as a seller of computers *shudder*, people buy a new computer but tend to keep their old monitor unless it happened to be a screen of small proportions, a CRT, or plain dead. People tend to wait for 2-3 generations of upgrades before they buy a new one, because the monitor has no real effect (high-resolution Crysis aside) on system performance. It’s kind of like how some people will buy an awesome home theatre package but keep their crappy couch.

      On my desk I have a 5 year old 1280×1024 BENQ LCD, right next to my 2048×1152 Samsung (merry boxing day sales to me).

    • Dean says:

      I’d rather have two screens, than one big one. And anything above two 17″ panels is just too big for the desk, especially widescreen displays. And the screens kind of have to be the same model or it just looks messy. That said, I’d love a 22″ widescreen display that also had a 17″-ish friend that was the same height and design…

  24. skalpadda says:

    About the ATI vs nVidia and AMD vs Intel thing, are we forgetting that most people don’t build their systems themselves and couldn’t care less about benchmarks and choosing a manufacturer? To find out why one is so dominant I think you’d have to ask the companies who build PCs, not enthusiasts who build their own.

  25. GCU Speak Softly says:

    “mbp says:

    Even though I consider myself a serious gamer my current gaming PC is almost FIVE YEARS OLD and I am happily playing Dragon Age on very high graphics at the maximum resolution of my monitor (1600*1050). I admit to having upgraded a few components: extra ram and several graphics card replacements and I did eventually upgrade my single core Athlon 64 with a dual core. ”

    I’ve been using the same broom for 5 years, just replaced the head three times and the handle twice…

  26. dragon_hunter21 says:

    In re: ATI v. Nvidia, probably the only reason I chose an Nvidia card over an ATI card in my recent upgrade is the fact that I know the Nvidia lineup and hierarchy. Not having looked at the ATI setup, and not having any knowledge about what specs to rate graphics cards by, their card titles just seem like random characters.

    • Stense says:

      I had a similar restriction when I built my new PC, but with ATI and Nvidia swapped round. I’ve been using ATI cards since the good old 9800 Pro (and not had any real problem with ATI drivers, other than needing to get the AGP hotfix versions instead of the normal ones), so when I came to choose a new card, I was utterly lost with Nvidia’s offerings.

  27. Taillefer says:

    People only upgrade often to show off in forum sigs, right?

  28. Velvet Fist, Iron Glove says:

    After my Voodoo2 (remember those?), my graphics cards have been:

    NVIDIA Geforce2MX400
    ATI Radeon 9600
    ATI Radeon X1600
    NVIDIA Geforce 8800GT
    ATI Radeon HD 4850

    So I’ve been supporting both ATI and Nvidia about equally. Although of these, the X1600 and HD4850 came in my iMacs, so I didn’t choose the graphics card specifically. Speaking of which, the 27-inch quad-core iMac makes a great gaming machine: very good performance, and an oh-so-gorgeous screen!

    • The Sombrero Kid says:

      lol

    • Malibu Stacey says:

      Shame there are hardly any games worth playing on the Mac. But then you probably dual boot into Windows to play games right? In which case congratulations, you have a very nice looking but very overpriced PC.

    • Donkeydeathtasticelastic says:

      Me too, sort of.

      I had a TNT2, then a Radeon 9600, then a Geforce 7800 and now a Radeon 4890.

  29. The Sombrero Kid says:

    i doubt i’ll go back to nvidia after the stunt they pulled with the 8800gt, missing out on PhysX doesn’t hurt as much as that did.

    • jalf says:

      What stunt? Releasing a kick-ass card that was priced for the low-midrange, and performed as a high-end card?

      Yeah, I don’t think I’ll ever forgive them for that either… Waitwhat?

    • The Sombrero Kid says:

      link to news.softpedia.com

      it covered second-generation 8800GTs as well.

    • Starky says:

      You know that was all utter BS right?

      It was a “story” led by the Inquirer that had a lot of people jumping up to sue. There was a problem, yes, but it was tiny really.
      It only affected notebook GPUs (NOT desktop cards), and only a tiny number of models/manufacturers faced the issue (mainly HP, as I recall). It was due to some bad materials, and Nvidia helped fund the repair/replacement of faulty systems.
      The fail rate of those “defective” chips Nvidia were supposedly ripping people off with? Around 10% – higher than would be expected in modern chips (which is around 5%), but hardly the end of the world, oh woe is Nvidia.
      So overall, probably less than 10% of Nvidia-chip-using laptops manufactured during that period suffered a higher risk of failure – 5% to 10% – and I love how people were throwing around “100% increased chance to fail”. Which sounds dramatic and is technically correct, but doubling a small chance still leaves a small chance.
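
      For the record, here’s the relative-vs-absolute arithmetic being argued over, using the rough rates quoted above (the thread’s numbers, not verified figures):

      ```python
      # "100% increased chance to fail" is a relative figure; the absolute change
      # is much smaller. Rates below are the rough ones quoted in this thread.
      baseline = 0.05   # ~5% failure rate claimed as typical for modern chips
      affected = 0.10   # ~10% claimed for the problem parts

      relative = (affected - baseline) / baseline   # 1.0 -> "100% increased chance"
      absolute = affected - baseline                # 0.05 -> 5 extra failures per 100 units
      print(f"relative: +{relative:.0%}, absolute: +{absolute:.0%} (percentage points)")
      ```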

      The whole thing mainly consisted of anecdotal evidence touted around the internet like it was fact, fuelled by tabloid trash… and people acting like Nvidia was purposefully ripping them off.
      I think PC gamers, and bloggers just got a little bit jealous of Xbox kids and their oh so dramatic red ring of death issue.

      The whole thing was nothing more than a big wild and stupidity fuelled rumour started by a tabloid trash idiot.

      I bet ATI/AMD loved it though. Hell, it would not shock me if they nudged it along a little – I would have if I was them; a couple of people posting on forums with horror stories is all it would take.
      Hell, if I was them I’d have put on an offer: “Send us your broken 8-series card and we’ll give you a $50 rebate on any ATI card of $100 or more.”

    • manveruppd says:

      @Starky The Inquirer does spout some BS from time to time, but that story was very, very real. Only notebooks were affected, not because desktop parts were fine but because notebook parts are subject to more heat variation, as notebooks are meant to be. All those chips were defective; it’s just that laptop parts went first because they go through more heat cycles (mainly because aggressive power management sends a laptop into suspend mode more quickly than desktops tend to do).

      The 10% figure is also a little bit misleading: it wasn’t 10% of the chips that failed, but 10% of the laptop models. Nvidia wised up to the error and fixed those parts about 6 months after that bad batch, and kept selling the same chips (which now weren’t melting), and they went into the same HP, Apple and Lenovo laptop models (among others). If 10% of a particular HP model have died, that means 10% of the laptops carrying that model number had the defective chips in them, and the rest either had later spins of those chips (manufactured after Nvidia fixed them) or just haven’t failed yet because their owner might only use the machine once a month. But all those chips were defective, and everyone I know who bought a laptop from back then has basically been burned (no pun intended). Also, remember, it wasn’t just discrete graphics parts but also chipsets that were affected. That’s an awful lot of Nvidia-powered laptops.

      Finally, there was the whole ignoble way they dealt with it, trying to shift blame onto the OEMs, and then releasing that driver update that basically forced the chip’s fan to run at 100% all the time, iirc, and calling it a “fix”. It was cringeworthy; I genuinely felt embarrassed on their behalf. They could have worked out a deal to quietly compensate the OEMs they sold bad parts to and shared some of the cost of repairing or replacing them in return for discounts on future deals, but no, first they had to deny the problem existed, then try to blame it on HP or whoever, then issue that driver “fix” and shout “mission accomplished”. Compare that to the way Microsoft dealt with the faulty 360s, or how Apple recently dealt with plastic MacBooks whose casings cracked due to overheating, and you get an idea of the amount of ill will Nvidia generated, both among consumers and with their industry partners.

  30. RLacey says:

    Interesting that, although XP 32-bit is the most common OS, versions of Vista and 7 combined just outnumber it.

    Presumably DirectX 10-only games are now commercially viable?

    • Tei says:

      Well… considering Nintendo now targets gamers who will never play a videogame (link to youtube.com), I think it may make sense to target any type of gamer who might actually play something.

    • Malibu Stacey says:

      Exclude 50% of your potential market for your product? Good idea, why aren’t more companies doing this?

      sigh.

    • The Sombrero Kid says:

      a well-written DX11 render path will be compatible with DX9–11, so it’s likely DX10 will be skipped by most devs.

  31. Langman says:

    Regarding ATI vs nVidia, it’s all about bang-for-buck – and ATI for the last 18 months have simply been producing much better midrange cards at a lower price than nVidia. Simples.

  32. kwyjibo says:

    ATI’s problem right now, is that it cannot get its cards out fast enough.

    I have no idea why ATI are still relying on the idiots at TSMC and not their gurus at Globalfoundries, probably a contractual thing. TSMC are dicking over nVidia too with their crap yields on Fermi, but whereas nVidia can continue selling their old cards, ATI have all their chips in with their latest generation.

    The alternative to a 58xx series ATI card, is an nVidia 200, which is why nvidia managed to claw back share in December.

  33. Buemba says:

    My previous two cards were ATI and I never had problems with them, but I’ve got an 8600 and I’ll be sticking with nVidia once I upgrade this year, mostly due to PhysX support. I know it’s just eye-candy that has absolutely no effect on gameplay, but those fluttering banners in Mirror’s Edge sure were pretty…

  34. blah says:

    Looking back at all my GPU purchases, I find that I had a lot more problems with Nvidia’s drivers than I’ve ever had with ATI. I have 3 friends with recent ATI drivers that have never had an issue (except maybe once with NFS Shift, but that’s the game’s fault), whereas I, with an 8800GT, had so many.
    Nvidia’s drivers have been especially shitty as of late, random nv4disp BSoDs, graphic driver crashes, stuttering, etc… Once I went back to 182.50 and 178.24, most of the problems were gone, but there are still some minor issues here and there (like turning AA on in L4D2 or SF4 causes the games to chug for whatever reason, which is fixed in later drivers, but sadly those proved to be much worse in terms of stability).

    Point is, don’t fall into all this hype about Nvidia and their perfect drivers, and don’t believe everyone and their bashing of ATI. Whether you buy this or that, there is always an equal chance of problems, so just buy whatever gives you the best value for your money.

  35. bill says:

    The thing I like about Steam’s surveys is that they don’t make me feel totally inadequate. Hanging out on places like VE3D tends to make you feel that your basic Dell laptop is something to be horribly ashamed of… but it turns out to be pretty average.

    Was surprised but happy to find out it ran Mirror’s Edge very smoothly at 1440×900

  36. blah says:

    An edit button would be nice:
    “I have 3 friends with recent ATI cards that have never had an issue”

  37. Hobbes says:

    Re: the XP-Vista-7 and the 32bit-64bit transitions, I got these numbers (assuming Windows 2000 is the 32 bit edition):

    32 bit total: (-3.46%) 73.03
    64 bit total: (+3.49%) 26.89
    Unknown total (-0.03%) 00.08

    Windows XP total: (-3.01%) 45.41
    Windows Vista total: (+0.54%) 30.71
    Windows 7 total: (+2.47%) 23.06
    Other total: (0.00%) 00.82

    Make of that what you will….

    • Optimaximal says:

      Although it’s fantastic that 7 has 25% of the Steam-using Windows market after just 3 months on general release, a lot of people are probably still using the RC till March, after which XP or Vista usage might very well jump back up as the cheapskate gene kicks in.

  38. Azhrarn says:

    My last PC was an AMD X2 3800+ with an ATI x1950 Pro and 2GB of RAM, which worked really well, even in newer games (although I needed to sacrifice some detail-settings to get decent performance).

    My new build is an Intel Core i5-750 with an ATI HD5870 and 8GB of RAM, which is overkill for most games unless your screen is huge or you have several (and mine is nothing special – the 20″ 1400×1050 Acer screen I have is still excellent, so I didn’t bother getting a new one).
    The HD5870 especially is amazing; the amount of graphical horsepower it has is staggering, and it’s fairly quiet too, with extremely aggressive fan settings.

    I feel the reason many go for nVidia by default is all the horror stories you used to hear about ATI driver issues and such; these days AMD drivers are pretty good, and with a monthly release cycle stuff gets fixed fast.

  39. jsutcliffe says:

    One thing, re: NVIDIA/ATI and gamers buying what’s cheapest in the article

    I always buy ATI, [almost] always have, and probably always will. I think it’s more a matter of buying what you’re comfortable with, or what you understand — naming conventions for graphics cards are out of control (4350 is worse than my 2900? But it’s 1450 better!), and while I kind of get the ATI numbering system now, I haven’t a clue about which NVIDIA cards are good. Also, the one NVIDIA card I did ever have, which I chose in part because its name had a suffix that made it sound like it could do fancy things, was so utterly shite that it would be specifically called out on the back of game boxes as incompatible.

    I don’t know if NVIDIA is better (as I understand it, the two companies are pretty much neck and neck) but I don’t care. I want what I know.

    • Bremze says:

      The 2XXX and 3XXX series ATI cards were quite bad, but the 4XXX were better bang for buck, and the 5XXX are both that and can perform better than anything Nvidia has to offer.

  40. J0J0 says:

    I’m running Win7 64-bit on an Intel i7, just 6 gigs of RAM, and a single 1GB Sapphire ATI Radeon 4870 card. CoD4 runs really well, so that’s all I care about.

    • J0J0 says:

      My bad, I think my card might be 512 megs….honestly I don’t remember. Still it’s more than enough for my CoD4 game. That’s the only game I play to be honest.

      The reason for the 8 cores is that I usually run a lot of simulations when not playing so the cpu was not bought for gaming purposes at all.

  41. cjlr says:

    I am perpetually astounded that my CPU puts me above 95th percentile.

    It’d be nice to see a breakdown over ATI’s 4800-series cards, since there’s a rather significant range there. Ditto nVidia’s 8800s.

  42. Joseph says:

    I think the reason most people have DirectX 10 cards is because that’s pretty much all that they make now.

    • Joseph says:

      In the way of new cards, anyway. I didn’t buy my DX10 card for DX10. I bought it because it was the fastest card on the planet at the time. (I use it with XP.)

      Haha, below post :o)

  43. dsmart says:

    “Also surprising is DirectX 10 take-up: despite its whipping-boy legacy, and even though we haven’t had an upgrade-mania-inciting game since Crysis, there are more DX10 cards out there than anything else by quite a margin now.”

    Uhm, there is no correlation between the number of DX10 cards and its whipping-boy legacy. So it’s not like gamers are buying DX10 cards because of the DX10 games. They are not.

    The number of DX10 cards out there is simply because the DX9 part is legacy – nobody is making them anymore. So it’s not that there are DX9 cards out there but gamers are buying DX10 cards anyway despite the DX10 SDK being under-adopted.

    So, you can have a DX10 (or DX10.1 or DX11) card – just because thats the spec of the card – but not have any DX10, DX10.1 or DX11 games to play on them.

  44. Kadayi says:

    One of the factors that might explain why Nvidia is higher up in usage might be the fact that their GPUs are significantly better at protein folding vs the ATi cards (link to folding.stanford.edu), and quite a lot of PC enthusiasts use their machines for this when they aren’t gaming. Just a thought.

    • Vitamin Powered says:

      Errr, what?

      For quite a while folding at home only supported ATI cards, because ATI cards were the only ones that supported double precision floats, while nVidia waited until the GT2whatever series before allowing double-precision.

    • Kadayi says:

      That ATi got the jump back in the day still doesn’t undermine the fact that the Nvidia cards are significantly better at folding, and generally more widely used as a result.

    • D says:

      Are you joking? Do you really think this factors into people’s decisions about which graphics card to buy, rather than the performance/cost ratio? The reason for the mass of Nvidia owners is simply that, the last time people had to upgrade, Nvidia had the best offer. Or the best marketing. Is it even documented how many people help do this distributed folding, or is “quite a few” maybe as low as 0.1% of the Steam population? I think you give us people too much credit.

    • Kadayi says:

      @D

      Given that there are apparently over 1 million PS3 owners out there folding, I don’t doubt there are a fair few PC users folding as well, especially given GPU hardware has become a lot more productive in the couple of years since that news:

      link to allconsolegamers.com

      So yeah I’d say there is probably a degree of impact there & certainly more than 0.1% as you purport. Feel free to come back with something stronger than scoffing disbelief though.

    • Walsh says:

      I seriously doubt it. In fact, you will see the numbers go down as more and more people realize how much money their PC is costing them in energy bills while it sits there and churns at full power all day.

      It’s not very green.

    • manveruppd says:

      I think Kadayi mistook Steam’s numbers as representing “usage”. That 63% doesn’t stand for hours of processing or WUs or anything like that, it just stands for number of Steam users that own Nvidia GPUs.

    • Kadayi says:

      @Walsh

      Not green? Neither is playing 100+ hours of WoW a month, but that doesn’t stop a few million people doing it every month, and folding is a lot less demanding on your GPU than that. In fact, if you’ve a decent rig you could play WoW (or any other game) and fold at the same time, and the most it’s going to cost you is a few fps.

      @manveruppd

      No. Alec was wondering why there was a discrepancy and I’m offering a possible contributing factor (I didn’t say it was the sole one; I’d say that’s more down to Nvidia having a bigger mainstream reputation than ATi, especially with the ‘way it’s meant to be played’ game tie-ins). Everyone in the team I fold in is a gamer, and nine tenths of them use Nvidia cards because the performance is significantly better than that of the ATi cards. Regardless of how tempting the prices are on the latest ATis, there’s no way I’m buying one given I know that it’s going to cost me 80% of my production.

      The thing to bear in mind also is that short of gaming, folding or at a pinch rendering (as there are better dedicated rendering cards), general GPUs aren’t much good for anything else (it’s only recently that companies like Adobe have started tapping into the power of GPUs vs CPUs). Sure, there are bound to be a few hardcore folders out there who don’t game, but I’d say the vast majority probably do, and if they’re savvy/geeky enough to be part of Folding@home I’d say they’re as likely to be savvy/geeky enough to be on Steam as well.

    • Vitamin Powered says:

      @Kadayi

      Whilst Folding@home is not something I’m overly concerned with, I do agree with the point that nVidia’s better support for GPGPU usage via CUDA does influence me towards nVidia.

      I’m not entirely sure how much influence better Folding support has on sales, though; my point about nVidia’s later support makes me wonder how much of a pro-nVidia influence Folding could really have, given the G2whatever’s relatively recent arrival.

      Of course, this is assuming that the lack of double-precision is still an issue; Folding might have bypassed that and simply gone along with nVidia’s lower accuracy.

    • drewski says:

      @ Kadayi – you’re probably not overestimating the effect folding has on the card preferences of hardcore folders, but I suspect you are dramatically, horrifically and other superlatives-ly overestimating the number of hardcore folders there are out there.

      I remember back when SETI@home was all the rage with some of my uni mates, and after installing it most people forgot they’d done it. I suspect folding’s the same thing – a lot of people sign up, most people promptly forget about it and it’s only the real hardcore who even pay attention, let alone make videocard purchasing decisions on that basis.

      I don’t have anything more to go on than experience of human nature and a cynical disposition, but I suspect your position is based on an unrealistic extrapolation of your own perspective – it’s important to you, therefore it’s probably important to lots of other people too. But those lots of other people are, I suspect, a teeny teeny teeny minority of Steam users.

    • Kadayi says:

      @Drewski

      Please feel free to disagree with me, but as a thought, perhaps drop the needlessly patronising tone in future. There are no schoolboys here.

      Hardcore folding and hardcore gaming go hand in hand. Certainly one element might far outweigh the other in terms of numbers, but the rate at which a GPU folds is increasingly being used as a form of performance benchmark. Next time you see a copy of Custom PC magazine, check out the graphics card reviews and you’ll see what I mean. Generally people like to feel that they are getting their money’s worth with any hardware investment, and buying a card that appears to have an Achilles heel, no matter how inconsequential, isn’t something your hardcore gamer is necessarily inclined towards, in my experience.

  45. bookwormat says:

    If there was one thing I hated about the 90s, apart from the Macarena, it was the constant hardware upgrades that were required to play games. I’m so glad we left that behind us.

    Today my only fear is that I will have to buy a new license of fucking Windows one day just to play videogames.

  46. DrugCrazed says:

    I’m using a 4850 here, but that’s because I’m not planning on running a massive screen. I do have lag issues at times though. I’m also able to run most things on high or medium, but graphics are just eye candy. They aren’t really important, are they?

  47. Jimbo says:

    Is there any point splashing out on a ‘high end’ PC nowadays? To play what? Nobody is out there pushing technical boundaries, because PC exclusive sales are no longer high enough to warrant that kind of expense. For the foreseeable future we’re mostly going to be hard-capped by whatever a 360 can do.

    The unrelenting pace of technological advancement in PC gaming used to drive the entire industry forward – it used to force console manufacturers to release new hardware just to keep up. So long as ~80% of us are happy to steal the games we play, we aren’t going to see those days again.

    • Kadayi says:

      Hard capped and handicapped, given the PC is already muscling above the limitations of the 360.

    • SheffieldSteel says:

      ooh, I failed to enter the captcha phrase, so now my text is gone

    • bookwormat says:

      @Jimbo

      Nobody is out there pushing technical boundaries, because PC exclusive sales are no longer …

      Please do not try to make this a “bad news for PC gaming” thing. The reason nobody wants to push technical boundaries is because consumers voted with their wallets that they don’t want technical boundaries to be pushed. This might be bad for the minority of gfx-nerds, but for the rest of us it’s a good thing.

      I personally think 3D graphics have looked fine since the year 2000 or so, and 2D graphics since the late 80s. Improvements are welcome, but they should come slowly. And the problems in artificial intelligence are not solved by faster hardware but by smarter algorithms.

      The current bottlenecks in gaming are solved through smart processes and creative ideas in software and game design. We are currently breaking boundaries through exploration of new software platforms, techniques and tools, developer autonomy, knowledge engineering, community projects, new business models and lots of thinking.

    • Taillefer says:

      I’m with bookwormat; whether you consider it a help or a hindrance, a console-led gaming market has meant I haven’t had to upgrade for years, and I’ve been able to use the extra money to buy a huge monitor and USB missile launchers instead. I haven’t enjoyed gaming any less and I have more money. Thanks, consoles.

    • Kadayi says:

      @Taillefer

      I agree that there is a benefit to it in some ways. The problem occurs when it means developers are forced to curtail their ambitions because the console architecture isn’t able to adequately handle them. If you look at Dragon Age: Origins, pretty much every reviewer has said that if you intend to play it then do so on the PC, because neither the 360 nor the PS3 version handles it as well, and as PC games go it’s not that demanding. Hats off to Bioware for making separate SKUs, but they probably could have just as easily made it for the 360 and then palmed us, the PC crowd, off with a port.

  48. DarkNoghri says:

    I’m kinda curious what’s supposedly confusing about ATI card naming schemes? I’ve seen several posts talking about them thus far.

    3xxx, 4xxx, 5xxx are the most recent generations. x1xx-x4xx are low end, and therefore possibly worse than the high end of the previous generations. x5xx-x6xx are the budget gaming cards. x7xx-x9xx are the mid-range to high end cards. In general, the biggest number is the best performance. But that’s only WITHIN each generation. So a 4890 may be better than a 5350 (if it exists, haven’t really been keeping up). The 4890 would be a high end gaming card, and the 5350 would be a low end card.

    I also saw at least one comment about nVidia cards. They have a similar naming scheme, only theirs is several thousand numbers higher. And for the most part, it’s just as simple, with one caveat. So they have the 7xxx, 8xxx, and 9xxx cards. Again, WITHIN the generation of cards, the bigger numbers are better, for the most part. The caveat: nVidia recently hit the number 10000 with their cards, essentially. They have now wrapped around and are using 2xx model numbers. Oddly, they’re being sold alongside 9xxx cards, I think, such as the 9800. It’s at this point that I get confused, due to having upgraded recently and not needing to follow it. The 2xx models should be numerically in order of performance, but I have no idea how they compare to the 8xxx-9xxx cards.
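
    (If you want that rule of thumb as code, here’s a rough Python sketch – the generation/tier split is just the convention I described above, nothing official from ATI, and classify_radeon is only a made-up name for illustration:)

        # Rough sketch of the four-digit Radeon HD rule of thumb described above.
        # Generation is the thousands digit, tier is the hundreds digit; the tier
        # boundaries are the ones from this comment, not official ATI segments.
        def classify_radeon(model):
            model = int(model)
            generation = model // 1000        # e.g. 4 for the 4xxx series
            tier = (model // 100) % 10        # the hundreds digit
            if tier <= 4:
                segment = "low end"
            elif tier <= 6:
                segment = "budget gaming"
            else:
                segment = "mid-range to high end"
            return generation, segment

        # Only meaningful within a generation: a 4890 can still beat a low-end 5xxx.
        print(classify_radeon(4890))   # (4, 'mid-range to high end')
        print(classify_radeon(5350))   # (5, 'low end')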

    That said, I have no idea what a lot of the suffixes are/were. This could very well go for either company. nVidia 9800 + GTX OC BIG BOY etc etc. Is a GT better or worse than a GTX? That’s the type of thing that’s always confused me, especially when there were roughly 400 different versions of the same number card.

    Also, what’s wrong with ATI’s drivers? Having had very few problems over the years, and never having run nVidia, what am I supposedly missing? Mind, that’s because when I was upgrading, ATI always had the best bang for the buck, or was included in laptops, etc, not because I really have anything against nVidia.

    Some of the other things mentioned: I would guess that 1680×1050 is popular simply because of monitor sizes. That should be the standard resolution for anything between 21 and 23 inches or so, if I recall correctly. I don’t think most people who grab widescreen LCDs usually get much larger than that, as it gets rather expensive.

    What I do find odd is the large number of 1280×1024 screens. They’ve been selling widescreens for years now.

    And now I’ve written more than I expected. Oh well.

    • Joseph says:

      “2xx models should be numerically in order of performance, but I have no idea how they compare to the 8xxx-9xxx cards.”

      All anyone needs to do to find this out is google for some reviews/benchmarks/comparisons of the cards. All I know is a GTX 285 is approximately twice as fast as an 8800GTX / 9800GTX (these two are essentially the same card).

      “That said, I have no idea what a lot of the suffixes are/were. This could very well go for either company. nVidia 9800 + GTX OC BIG BOY etc etc. Is a GT better or worse than a GTX?”

      GTX is better. Going to refer you to a website I know, it’s useful: http://www.google.com

    • R3D says:

      A GTX 280 is now approximately as fast as a 9800GX2 (two 9800s glued together with onboard SLI). Now it’s the GTX 285; I believe it has pulled in front, but probably not by enough to actually matter.

    • drewski says:

      Don’t you think the fact you just wrote an essay on the naming scheme of videocards automatically renders it somewhat complicated to the vast majority of people who go “oooh that’s quite a big number, I’ll have one of those”?

      You’re obviously a pretty experienced amateur computer enthusiast, as a minimum, so it’s not surprising you’re fairly comfortable with the differences, but to someone who’s only ever paid attention to one of the model lines and is more about gaming than computers per se, is it not very reasonable that they would find a plethora of model numbers and suffixes and generations confusing?

      I have a lot of mates who are pretty reasonable with IT and I’m still frequently correcting their misconceptions when it comes to videocards or other tech.

    • DarkNoghri says:

      To be fair, that was a very long-winded way of saying that in general, higher numbers are better, within each generation. My point was that while somewhat confusing, it’s not all that bad.

      And yes, I’m well aware that I can Google such things. I’ve just had no need to thus far.

      I happen to think that the CPU market is actually much more confusing. Just on the AMD side of things, you can get Athlon X2s, Athlon II X2s, Phenoms, Phenom IIs, Semprons, and so on.

  49. Nobody Important says:

    No, the Windows 7 installation media does not include both. Windows 7 comes with a 64-bit DVD and a 32-bit DVD. They could not fit them both on a single disc.

    Windows 7 64-bit doesn’t stink as much as XP’s or Vista’s implementation did; it’s about as good as 64-bit Linux these days, which is pretty dang solid and has few to no regressions (aside from no native Flash).

    • Azhrarn says:

      Windows 7 64-bit takes up 21 GB of disk space for a good reason: that 32-bit compatibility layer isn’t there for nothing.

      Most programs don’t run natively in 64-bit, so Windows needs the 32-bit layer to make it all work. Only the drivers have to be 64-bit (and signed too, or they’ll refuse to install); the rest has both 64-bit and 32-bit modes available as required.

      Even Internet Explorer comes standard in 64-bit and 32-bit versions with W7 64-bit (and runs in 32-bit mode by default).

    • Joseph says:

      “Even Internet Explorer comes standard in 64-bit and 32-bit versions with W7 64-bit (and runs in 32-bit mode by default).” – Azhrarn

      Why does it run in 32-bit by default? Problems with 64-bit? Guess it’s worth it just for the extra RAM, atm.

    • Optimaximal says:

      It’s for add-on compatibility.

    • Azhrarn says:

      @Joseph: no idea why exactly it runs 32-bit by default, but as Optimaximal said, compatibility with addons may indeed be the issue.

      Runs like a treat in 64-bit though, much quicker to load pages than it does in 32-bit mode. Although my browser of choice is Firefox, for its plug-in features. It’s 32-bit though, since again there is no 64-bit FF for Windows.

      A little under half of the processes I have running in the background use 64-bit executables (you can tell because the 32-bit ones are marked with “*32” in Task Manager), and most of those are utility programs from device drivers.
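
      (For what it’s worth, that “*32” is just Task Manager flagging processes running under the WOW64 compatibility layer. Here’s a rough Python sketch, assuming you’re on Windows, that asks kernel32 the same question about the current process – running_under_wow64 is only an illustrative name:)

        import ctypes

        def running_under_wow64():
            """True if this process is a 32-bit process on 64-bit Windows (WOW64),
            i.e. the kind Task Manager marks with '*32'."""
            kernel32 = ctypes.windll.kernel32
            # Old 32-bit Windows versions don't export IsWow64Process at all.
            if not hasattr(kernel32, "IsWow64Process"):
                return False
            is_wow64 = ctypes.c_int(0)
            handle = kernel32.GetCurrentProcess()
            if kernel32.IsWow64Process(handle, ctypes.byref(is_wow64)):
                return bool(is_wow64.value)
            return False

        print(running_under_wow64())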

  50. R3D says:

    Yeah, I have a 9800GX2 and it counts towards the SLI stats.
    I got my PC after having my old one for 8 years; I had the cash and bought the best I could get at the time for my budget.
    I don’t know if I’d get two of any card, even a second GX2, though, as heat is a huge issue. I had to mod two 140mm fans (overkill, but if I’m going to the hassle of fitting one I wanted two) over my PCIe slots to get the thing stable in summer, and we don’t even have extremely hot weather – 30°C at most. I love my card though, and I’d even go for a single-card SLI option again, but I’d wait to see if there was a redesign, as they fix a lot of heat issues with that sort of thing.