Graphics Arms Race Costs An Arm & A Leg

There’s been plenty of prediction lately that the age of supermegapixelshaderooed blockbuster games on PC might be drawing to a close, in favour of lower-spec, lower-profile inventiveness from the indie, MMO, browser-based and casual scenes. What there hasn’t been is much hard data reflecting this possible sea change. The news that current 3D card king NVIDIA recorded an eyewatering $30 million loss last year (after a $797.6m profit in the preceding year) could have something to do with it.

Of course, it could also have something to do with the general worldwide moneypocalypse, or with eejits shouting about the death of the PC potentially having something of a point. Nah. I suspect it’s to do with the lack of a game that really demanded a system upgrade last year. 2007 was the year of Crysis, and despite that game’s debatable merits, it was a waving flag for a new graphical generation – a spur to buy a better graphics card even if you weren’t interested in that game specifically. 2008 lacked such a game – the PC’s biggest titles were either less graphically ambitious or reworks of console titles. Poor optimisation may have meant that half the time the latter did need a powerhouse 3D card anyway, but that’s not a card-shifter in the way LOOK AT THESE AWESOME GRAPHICS is.

I wonder if NVIDIA’s investment in PhysX, and (presumably) expensively rolling out physics-via-GPU support as a result, is an experiment that really didn’t help. It’s not something that’s proven its usefulness to Johnny Average Gamer – until there’s an NVIDIA physics killer app, it’s not likely to sell many cards. In other words, it could repeat the failures of PhysX back when it was Ageia’s struggling baby.

Here are NVIDIAn emperor Jen-Hsun Huang’s thoughts on the bad news:

The environment is clearly difficult and uncertain. Our first priority is to set an operating expense level that balances cash conservation while allowing us to continue to invest in initiatives that are of great importance to the market and in which we believe we have industry leadership. We have initiatives in all areas to reduce operating expenses.

Which sounds worryingly like it could mean job losses – always upper management’s sucker-punch response to money problems. Let’s hope not. Whatever they do, can the big N pull it back, or are 3D card upgrades only going to go further out of fashion as integrated chips grow ever more capable and the bulk of new PC games less demanding? ANSWER.


  1. Feet says:


  2. Rook says:

I think the loss is far more likely to be due to the write-off they had to take over the defective mobile graphics chips, as well as AMD/ATI actually having a competitive card with the Radeon HD4xxx series, rather than the abysmal HD29xx and HD39xx.

    That being said, AMD/ATI will run out of money pretty soon even if they hit all their sales targets, so the field is still wide open. I’m fairly interested to see what Intel is going to bring to the g/card table with Larrabee or whatever it’s called.

  3. Kelron says:

    It could also be related to the return to form of ATI cards. Is it possible to find out how much of a profit/loss they made, or would it be skewed due to being owned by AMD?

  4. Dave Gates says:

    I don’t really see this as a particular problem; I imagine most companies, irrespective of their product, probably had a record drop in sales in the last year or so. Indie gaming may have a hand in it, but if that forces developers to make their games interesting as well as beautiful then it can only be a good thing. In the end we may be getting the backlash that Hollywood recently had, where people were sick of jumped-up spectacle and wanted better plots and better films full stop. Please let this herald the return of traditional LucasArts. Day of the Tentacle 2, anyone?

  5. Feet says:

    More helpfully, I intend to upgrade and invest in a new PC soon; my 6800 is good enough for L4D but couldn’t cut the Fear 2 mustard, and it never even managed Crysis.

    Yes, I think we’re seeing the beginning of the end of high-performance graphics cards being really big business, at least for the time being. As you say, there was no figurehead killer app last year, and I don’t think there’s going to be one this year either. PC-exclusive cutting-edge games are probably a thing of the past – Crysis was the last of that breed. In 2009 they’ll have to bundle Solium Infernum with their GFX320 XT eXtreme cards instead.

  6. Dinger says:

    Isn’t the word on the street that they’ve burned so many bridges with Sony/Microsoft/Nintendo that their long-term strategy needs to be aggressive, since they won’t be seeing any console deals any time soon?

  7. Duoae says:

    I always thought that the big N was Nintendo… Nvidia are small fry compared to them :)

    To be honest, Nvidia made a stupid mistake, and there are more reasons why they made that loss:

    First, Nvidia kept concentrating on the super-expensive, over-the-top, high-margin, low-volume graphics cards. When AMD/ATI came out with the 4870 and 4850 at reasonable prices and with excellent performance (with single-slot cooling as an option), it meant Nvidia had to cut the prices of all of its mid-range cards – which is where the bulk of the money comes from.
    Of course, AMD/ATI have also been squeezing Nvidia in the high-volume, low-end market of the 7300 level of cards, so that’s a reduced income as well.

    Second, there’s the whole problem with their solder (or whatever it is), with at least three generations of graphics cards from the low end to the high end (including laptops) potentially affected by it – which means lots of returns and either reimbursements or ‘free’ replacement cards – which means a big loss of income and consumer faith.

    Third, there was the acquisition of PhysX – which means they will have posted a loss on the amount they paid for the company.

    Finally, the future of graphics cards is definitely one card and a complex CPU to handle physics etc. This whole multi-GPU malarkey was a long, drawn-out waste of money for both ATI and Nvidia, just as it was when the idea first came around in the late 90s.

    The other end of the problem (i.e. not controllable by Nvidia and ATI) is that development focus is shifting to consoles, and as a result the number of high-end gamers on PC is diminishing, or the games they prefer are becoming less demanding.
    The second part of the development problem is diminishing returns. Graphically, we’ve reached a point where it’s no longer feasible to keep pushing the graphics envelope. The huge cost increase isn’t justified when small improvements in gameplay, AI, story and usability (which were left by the wayside a few years ago) will accomplish much more for a fraction of the cost and time.

    This all means that you don’t (and won’t) need a super-high-end, brand-new computer to play dedicated (or well-ported console-to-) PC games. My PC (P4 3GHz, AGP X1950 Pro 512MB, 2GB PC2100) still handles everything I throw at it, with the exception of stupidly buggy or unoptimised games, or something like Crysis. If that hunk of lovely junk can make it through almost everything while still managing to make games look pretty at a playable framerate, then what does that say about newer mid-range PCs, which have a really cheap (but more powerful) dual-core CPU and a sub-£100 PCIe graphics card?

  8. Morberis says:

    My graphics card is the one thing I haven’t looked at upgrading; with an 8800GTX I look to be sitting pretty for at least two more years. Hell, I may even be able to run everything on high for those two years.

    The one thing I have noticed holding me back however is my CPU, but that’s because I run CPU intensive games like Dwarf Fortress.

  9. Subject 706 says:

    Yeah, well, Nvidia does release too many monstrous gfx cards in too short a time-span, if you ask me. Add to that a global financial crisis and cheaper, competitive cards from ATI.

    Short of Nvidia starting a game studio of their own, and pumping out increasingly demanding games, there isn’t that much need to go for their high end cards at the moment.

    Then again, seeing that games are increasingly lacking in everything but the gfx department, slowing down that arms race is probably not a bad thing.

  10. Heliocentric says:

    As soon as Intel’s on-core GPU/loadsa-core is ready, none of this will matter.

    The whole concept of rendering graphics externally from the core is kind of stupid and limits the development of graphics anyway. For example, AI characters don’t actually see; they are presented with an abstraction.

  11. Nallen says:

    They need to slow the hell down; as stated, there is nothing recent that has required all this grunt. I have an 8800GTX that I got for about £150 new and a two-year-old E6600, and I’ve really had no problems running anything in the current crop.

  12. The Sombrero Kid says:

    nvidia’s fab transition was a disaster. the g92 is a broken component and most people know it. nvidia need to sell it because of the r&d cost, but consumers don’t want it because it’s broken – hence nvidia’s GTX cards, the equivalent of ati’s, costing £30–£60 more than ati’s. they need to make the money off those.

  13. redrain85 says:

    Rubbish. As I’ve said before, in a previous comment: the PC has become the proving ground for future console technology. All the innovation happens on the PC, and then filters down to the consoles later.

    If companies like ATI and nVidia stop creating new high-end graphics cards . . . then who will want to buy the next console, if it doesn’t have/benefit from the advances made on the PC side? If the next generation of consoles doesn’t offer much better – in the way of visuals – than the previous generation, there isn’t going to be much incentive to buy.

    It pisses me off, though, to think that the PC and PC gamers have become the guinea pigs for what’s to become future console technology. Companies like Microsoft, Intel, ATI, and nVidia are just going to turn around and provide all the benefits of what they developed on the PC to the consoles, and continue to leave us in the lurch from now on.

    Sure, they’ll throw us a bone now and then and continue to prop up PC Gaming, but only because they’re constantly thinking about how they can bring these things over to the console space. It’s disgusting.

    And nVidia only have themselves to blame for their miserable last quarter. They became way, way too arrogant and ATI smacked them silly for at least 6 months. Then add in the whole economic downturn toward the end of 2008, and it spelled complete doom for nVidia’s sales forecasts.

  14. The Sombrero Kid says:

    i don’t think that’ll happen any time soon – it’s a waste of resources in development, and for the computer, when it’s easier to fake it. side note: mit blew through a massive budget trying to get a machine to recognise objects from any angle, as ‘seeing’ would require, and they failed miserably.

  15. Magic says:

    Funny story: I bought a new GPU last year because my old one started to cause trouble at exactly the point where Dwarf Fortress refused to start. (Source games etc. still worked, with a few crashes sometimes.)
    I actually paid about 100€ to play DF O.o

  16. clive dunn says:

    The word on the street is that Dell are going to sue the crap out of them for the laptop shenanigans. I doubt Nvidia are looking forward to 2009 too much!
    You make your bed, you lie in it, I guess.

  17. Bobsy says:

    How have ATI fared under the money-melting then?

  18. Dave Gates says:

    Yeah, I agree; I think most people are making the point that there’s no need to really shove this uber technology at everyone. I’ve also been running the same card and had no problems. I played Crysis and it nearly made my PC foul itself, but I thought, “I’m not shelling out hundreds of pounds just to play one game”. A year later Left 4 Dead arrives, it’s a better game and it runs like a dream. Innovation for innovation’s sake is a pointless endeavour if only a few can afford to experience it.

  19. Fitzmogwai says:

    A couple of points from the comments and story above:

    Nvidia seem to have done everything they possibly could to destroy their business over the last few years. Alienating their potential console customers and losing that business. Designing broken components and then trying to cover up the story. Alienating their PC manufacturer partners with misinformation about it, and then being stung for hundreds of millions of dollars in costs to make up for their mistakes. Their products are overpriced and they’ve lost the middle ground of the GFX market to ATI, which is where the volume sales and real money lie. Flagship cards are all well and good, but they’re rarely actually available to buy, and sales are so small that any profit from them is simply a drop in the ocean.

    Their share value has gone through the floor, and frankly I’m amazed that Jen-Hsun Huang hasn’t been ousted in a boardroom coup.

    Anyway, redrain’s comment made me think. His analogy reminds me of Formula 1. If top-end PCs are the F1 cars of the gaming world, leading technical development which then filters down to other, lesser machines, then doesn’t the PC games industry need to make some noise about this?

    Maybe we’re in the same situation as F1, where the races are boring and there’s dwindling interest. Just as F1 needs a shakeup to recapture the attention of the wider world, so PCs and PC gaming perhaps need something similar. I’ve no idea what though.

    Any ideas?

  20. The Sombrero Kid says:

    @Magic if it’s a g92 chipset underclock it you’ll get some more life out of it.

  21. subedii says:

    The problem for Nvidia is also that devs have finally realised that pushing system specs for their own sake is killing their sales. We’ve largely reached the peak of what the current generation of consoles can achieve, and developers aren’t really going to push graphics beyond that level until the next console generation, several years from now. The problem is that an 8800 will pretty much suffice for that level of graphics.

    Whilst a GTX 260 or 280 might be nice to smooth things out and add in a bit more of the shiny, they aren’t actually needed to keep up with modern releases, and probably won’t be necessary until the XBox 1080 (or whatever they want to call it) comes out.

    I think that’s also part of the reason Nvidia have made a recent push at renewing interest in “3D” gaming with the goggles and stuff. You need much higher-grade hardware to run games when you’re effectively rendering each frame twice from separate angles. Unfortunately, I expect it’s not going to take off (like it failed to the past 3-4 times they tried) and it’s just going to remain a gimmick.

  22. Pags says:

    Is it okay if I’m really boring and just say that it’s probably down to people’s unwillingness to spend silly amounts on what is essentially a frivolous purchase, particularly when every swinging dixie is proclaiming the economic meltdown to be on a par with the end times?

  23. FoolsGold says:

    I believe that slowing the rapid pace of GPU adoption is a *very* good thing. As the gaming market adapts to games with a broader appeal than the traditional core gamer – spearheaded by titles like The Sims and, latterly, Spore – the need for the latest ultra-whizz-bang graphics accelerator declines. The push for realism is one thing, but it completely ignores the fact that people don’t need such high-fidelity graphics to enjoy a game. A reasonable level of overall graphical maturity is enough: millions of WoW players can’t all be wrong…

    And, as already noted, if you concentrate on graphics, the other things that can sell a game lose out – story, gameplay, etc. It’s long past time devs got back to using their creative bones instead of relying on graphical polish to gloss over a game’s jarring faults. Valve have the right idea – they have the metrics to pinpoint the graphical level their games should run at and pitch their games accordingly, avoiding the exceptionally elitist Crysis technique of demanding people spend hundreds of dollars/pounds on hardware to run it at a decent clip. (God, how could they have been surprised that their game didn’t sell?)

  24. FoolsGold says:

    Another point: I wonder if PC graphical fidelity has now reached the DVD/Blu-ray point, where plain old DVD seems to be good enough for the masses and only enthusiasts/hobbyists see any benefit in the HD format?

  25. subedii says:

    @Pags: Well sure you can, if we want to bring real life into things. And who wants that?

    Realistically, I think it’s just a combination of what everyone’s already said; there are plenty of reasons why Nvidia should be doing poorly right now. As far as I can see:

    – Global economic downturn
    – No need to upgrade given current gen games
    – Significant investments that didn’t turn out the way they were hoping (or perhaps just not fast enough)
    – Problems on the console front
    – ATI getting their act together a bit better this gen
    – Supply issues

    There’s probably a few others too.

    Personally, I think that if the graphics market starts to slow down, that’s a good thing. What I really hope is that someday integrated graphics make a comeback, but that’s likely a pipedream, at least for the moment.

  26. StalinsGhost says:

    For me, I don’t think the ATI factor should be underestimated. The 4x00 range has been an undeniable field-leveller.

  27. Gap Gen says:

    Part of the problem with physics in games is that it’s not immediately obvious. Cell Factor just looked like crate porn with dubious gameplay, so you do need something more subtle – but at the same time I didn’t notice that, say, Ghost Recon: Advanced Warfighter had noticeably better physics than games without PhysX. Better graphics are much more obvious.

    I think there’s a technological gap in game physics that requires a leap to game mechanics that people don’t even know they want. For example – fluid physics (for wind, explosions, water, etc) could add interesting dynamics to games, but it’s very computationally expensive to do properly and I don’t know if there’s really the impetus to see games that have it.

    Other than that, “physics” at the moment just means more solid-body physics and more crates flying around.

  28. phuzz says:

    Personally, when I built my current rig about this time last year I picked up an 8800GT, because it was about the right price and seemed to be quite good.
    Now, a year later, it’ll still play new games at full everything, even on my big widescreen monitor. Guess how long it’s going to be before I upgrade?

    (Mind you, I tend to switch back and forth between ATI and nVidia: started with a 3DFX Voodoo 3, then a GeForce 2 something, then an ATI 9600XT, then an nVidia 6600GT, then I forget.)

  29. Talorc says:

    link to

    The problem was a complete collapse in fourth-quarter (i.e. Christmas) revenue – $480 million instead of the $1.2 billion they made last year.

    Kind of fits with the general economic meltdown scenario, rather than an industry specific problem. Up until October 2008, they were actually slightly ahead of where they were in 2007, albeit with a significant slow down already beginning.

    So I would suggest that consumers simply failed to pony up for big-ticket graphics card upgrades in these uncertain economic times.

    As for write-downs from stuff-ups etc., these were not too serious – only around $330 million for the full year. Not that big in the overall scheme of things.

  30. skillian says:

    Even the layman will appreciate the quality of Blu-ray over DVD when they get a 50″ screen, and the same applies as standard PC monitors get to 22″, 24″ and 30″ screens.

    I think graphics still have a very important role to play in gaming, and until our games look like live-action movies, the demand for better and better visuals will not go away.

  31. vicx says:

    (0-0) 3D STEREO needs twice the frames.

  32. Mark says:

    Could we have reached the graphics plateau? Could we have come to the point where it is simply not economical to make graphical content so complex that it requires the latest hardware to take full advantage of it? Might people have caught on to the fact that it saves money to be a generation or two behind? Could developers have learned that they will be better served by ensuring games can run on lower-end machines? Have, in short, the various influences on the GPU market finally demonstrated an ounce of sanity and common sense?

    It is a great mystery.

  33. Radiant says:

    I said this before, but I really do think the touted poor sales of PC games have less to do with piracy and more to do with the rise of the affordable netbook.

    With the rise of consoles that have the perception of being as good as a high-end PC, why anyone would buy a near-equivalently-priced component instead is beyond me.

    I think this downturn in the gfx card industry is an effect of that and won’t be easily turned around.

    Integrated chips are one thing but the days of the ‘killer’ gfx card are numbered.

    [Not to sidetrack this wonderfully over my head technical discussion.]

  34. Gap Gen says:

    Integrated chips need to be far better, I think. Basic computers come with utterly shit graphics cards, so it’s possible that the low end is holding up the market.

  35. Tei says:

    On software-ish rendering, I will make a comment with no words, just a few screenshots.

    Sim Copter
    link to

    Magic Carpet
    link to

    Dungeon Keeper
    link to

    Liquid War
    link to

    Cortex Command
    link to

  36. Radiant says:

    I hate this tiny ass comment text entry area.
    It’s like reading back your post through a letter box.

    I feel like I need to feed it money or its going to slam shut.

  37. Tei says:

    There isn’t a Lemmings screenshot because I hate Lemmings. I hate them so much that I would love to play a Turret Defense Kill All Lemmings game.
    Hmm… I lied, here’s a screenshot:
    link to

  38. Dinger says:

    There are plenty of ways to expand, especially on the procedural front. PhysX and GRAW sucked, and will continue to suck, if it’s an option. Let me explain: “physics acceleration” in GRAW consisted of throwing more particles out of explosions – that’s no change in gameplay, only in eye candy.
    Physics has always been one of the underappreciated catalysts for games. Games where you interact with a world are simulations in a broad sense of the word. And the vast majority of game types – FPSs, FRPGs, platformers – are all simulations.
    So you can see why nVidia would go the PhysX route, and why it would seem to make sense. But then a hardware specialist will point out that a graphics card has an end result that is outside of the computer box (the display), and 3D acceleration is a bunch of tools to describe objects in space.

    Physics? Time and eternity, space and void, motion and rest. Weren’t CPUs developed in the first place to deal with physics problems?

    Anyway, heck, nVidia took a $330M writedown, made $800M less in Q4 2008 than in Q4 2007 — a full two-thirds less — and still only lost about $50M? That’s not so bad. Everybody knows these are rock hard times.

  39. Garg says:

    Lots of people are commenting on how ATI have seized the middle ground with the 4850, and saying NVidia focussed too much on the flagships.

    But looking at Tom’s Hardware and other sites, ATI seem to be king of the top end too, with the 4850×2 and 4870×2, either of which seems better than NVidia’s corresponding price-point offerings. Hence why I recently upgraded to the 4870×2 from my old NVidia card.

    But looking at market share – on something like the Steam survey, for instance – NVidia are still miles ahead of ATI. I think this is mostly due to the credit crunch, with people not buying new PCs or upgrading their old ones.

  40. mrrobsa says:

    I don’t think we’ll see an end to increasingly powerful GPUs; I know of companies creating engines that make better use of the amazing parallel processing power of modern GFX cards. I’m told my 8800GTX blows my 2.6GHz Duo away for many processing tasks, and I expect to see this utilised and graphics cards get increasingly beefy.

  41. Fitzmogwai says:

    It’s a fair point, Garg, but the fact is that top-end cards sell in (relative terms) single-digit numbers. You’ll only buy one (or two) if you’ve got money to burn and you’re after cock-waving bragging rights.

    The manufacturers want the top spot because it’s great advertising, but those cards are, to all intents and purposes, loss-leaders. It’s the middle ground where the battle’s fought and won, especially now that more and more people are realising that a new £100 card each year will give you consistently better performance over time than a £400 top-end card that you hang onto for four years.

    With the 4xxx, ATI have cut the ground out from underneath nvidia, and, rabid fanboys aside (of which there are mercifully few), most people are not brand-tied, and will buy on perceived value, especially in times when money’s tight.

  42. Francisco says:

    I want to know the SALES difference.

  43. Francisco says:

    I can’t edit, and the info is in the link. Revenues are down 16% but the profit was a lot lower. So, where did they spend the money?

  44. Fitzmogwai says:

    Garg, rereading that, I don’t want to sound as if I’m insulting you for buying a 4870×2 – I’m not! I’d have one if I had a monitor large enough to warrant one.

  45. Pags says:

    Even the layman will appreciate the quality of Blu-ray over DVD when they get a 50″ screen

    Goddamned right, I just got done watching The Thing on Blu-Ray and by Jesus does that film look good.

    Also I’m glad Talorc provided the figures to back up my oh-so-wildly speculative theory. Namely “maaaan these things are pretty expensive, I probably shouldn’t buy one right now”.

  46. Gap Gen says:

    “Physics? Time and eternity, space and void, motion and rest. Weren’t CPUs developed in the first place to deal with physics problems?”

    I’m not sure exactly what the first sentence means, but no, processors were first invented for code-breaking, as I understand it. If it helps, the NVidia Tesla is specifically designed for HPC (so complex physics problems and so on). The reason is that lots of slower cores can use less power per flop, so if you want to use thousands of cores to solve a big problem, you’re spending less on electricity. That, and solving some problems on GPUs can be many times faster.

  47. Jeremy says:

    I think in a way, graphics are like robots (bear with me). The closer they look to reality, the more horrific they become. Something that looks 95% human is much scarier than something that looks 20% human. So graphics may not be scary, but the closer they get to looking real, the more we’ll realise they aren’t real. It seems like computer games are about to have a full-on renaissance, with a focus on a more artistic style rather than realism. No more shiny human skin (CoD 4), hair floating just above the skull line, or creepy marble eyes (G-Man, HL2). Kinda glad about all of that, to be honest, and I’m looking forward to games that present their ideas more artistically. I’ve always thought style should take precedence over anything else; World of Warcraft (no, I’m not a fanboy, I actually don’t play WoW) is a prime example. I would much rather run around in a less graphically impressive world with a ton of character than vice versa.

  48. Fitzmogwai says:

    The “uncanny valley”, Jeremy.

    link to

  49. Arathain says:

    Jeremy: they call it the Uncanny Valley. It’s a well recognised problem.

    I’m inclined to think the problems stem more from the global downturn and general belt-tightening. But I have often wondered if we’ll reach a point where creating extraordinary fidelity in games becomes economically non-viable simply from the amount of labour it takes to create those worlds. While I accept that one needs the PC to push the graphical envelope so the consoles can follow, I wonder if only the consoles can generate the sales volume to sustain the expensive-to-make, high-graphicsability titles. I think this will lead to a slower increase in quality, as people have to work out how to do more with less, rather than just throwing cycles at it.

  50. JoeDuck says:

    Hmmm, I think most people in this thread are right: the cards are overpriced, the companies are overhyping, the games are not coming, and the important thing is gameplay anyway.
    BUT: Empire: Total War, Armed Assault 2 and Flashpoint 2.
    March 4 is coming, so next week I’m buying a new computer, with, of course, a stupidly expensive graphics card.
    My head and my wallet tell me not to do it.
    And I’m still doing it…