3D Cards: Calculus For Dummies

By Alec Meer on May 7th, 2008 at 11:59 pm.

So, SLI, triple SLI or quad SLI? The decision is so easy, and so cost-effective!

Some potentially promising news from the hardware side of PC gaming. Gamesindustry.biz has been chatting to NVIDIA’s Roy Taylor, who’s admitted that the graphics card company’s dreadful naming conventions (should we buy a GeForce 8800 GS, GT, GTS, GTX or Ultra? And with 320, 512 or 640MB of memory?) are a little on the bewildering side, and proffered vague promises to simplify them. Somewhat ironically, Taylor is “VP of Content Business Development”, a title which does absolutely nothing to explain what his job actually involves – but hey, he sounds important.

Imagine, though, a world where choosing your next graphics card didn’t involve an hour of head-scratching research. Does a bright future await us? Mild venting beneath the cut.

“There is a need to simplify it for consumers, there’s no question. We think that the people who understand and know GeForce today, they’re okay with it – they understand it. But if we’re going to widen our appeal, there’s no doubt that we have to solve that problem.”
– NVIDIA’s Roy Taylor

Dunno if this means a genuine shakeup – like stripping the line back to something like GeForce Basic, GeForce Mid and GeForce Pro – or if it’s just hinting at a futile rebranding come GeForce 10, as abortively attempted with the 5 series’ pointless renaming to ‘GeForce FX’, or even ATI’s current ‘Radeon HD’ gibberish. Let’s hope for the former, as it’s a problem that desperately needs fixing, and not just by NVIDIA.

I was nosing at Mass Effect’s system requirements today, and found this under minimum graphics card:

NVIDIA GeForce 6 series (6800GT or better) / ATI 1300XT or better (X1550, X1600 Pro and HD2400 are below minimum system requirements).

I mean, for goodness’ sakes. So… The 1300XT is better than the X1550, X1600 and the HD2400? But the latter is a full 1100 better! Um. 1100 whats, exactly?

Facetiousness aside, I’m fortunate enough to be able to follow this stuff due to having spent some years working on a tech mag, but how in hell does it make any sense to someone who isn’t au fait with the increasingly nonsensical graphics card market? In fact, during that time on a tech mag, by far the most common reader phone call was from people wanting to know what graphics card they should buy. I wanted to cry whenever I got that call, but I did sympathise. Why is it not more obvious?

A traditional answer to that last question (or, at least, the established wisdom during my tech mag days; I’ve not, I stress, seen any reports to actually support it) has been that there’s deliberate obfuscation on the part of the graphics card companies. If the GeForce 11800 FX Pro Ultra XT is currently agreed to be the best-performing card on the market, word may trickle down to the unwashed masses. Except it will be diminished word – they’ll just pick up on ‘GeForce’, or maybe ‘GeForce 11’, and will be fooled into thinking the cheapie GeForce 11300 GS card they’ve spotted for what seems like a bargainous price is somehow awesome, just because it sports that hallowed prefix. It probably isn’t, though. It’s probably an overpriced shelf-warmer that can barely run Counter-Strike: Source. The same happens with processors – people picking up dreadful Celeron machines from PC World just because they think the Intel sticker on the front signifies uber-power.

Another contributing factor is simply that these are tech companies, operating in an industry where incomprehensible number-based names are de rigueur, because hardware is made by stern men in labcoats who aren’t interested in impressing the kids. Take a look at the motherboard market, for instance, and you’ll be screaming in rank terror within minutes. For a firm like NVIDIA to do something different is actually a major break with tradition.

If NVIDIA’s talking about changing their naming system (a mess further exacerbated by a) a long war with ATI, each company pilfering the other’s card name suffixes in the hope of thunder-stealing, and b) trying to come up with a board for every single possible price point), clearly it isn’t working so well these days. With super-cheap, Facebook’n’WoW-friendly PCs on the rise, it could be times are scary for a firm that largely flogs performance parts.

I wonder too if this has anything to do with NVIDIA’s involvement in the PC Gaming Alliance? One of the stated intentions of that much-sneered-at body is, I believe, to demystify PC gaming, or at least their apparently rather narrow idea of what PC gaming is, for a broad audience. There’s also the ongoing cold war between NVIDIA and Intel about the future of 3D – faster cards, say the former; processor-rendered Raytracing, say the latter. Perhaps NVIDIA hopes to avoid the axe by worming its way into more people’s hearts with a sudden burst of clarity.

Whatever, I’d certainly love to see a return to simply cheap card / expensive card, rather than wading through another sea of Pros and XTs and GTs and GTSes and GTXes. Silly buggers.


67 Comments

  1. Leeks! says:

    Is “A board audience” some kind of cleverness that’s being wasted on me (like a pun), or am I being one of those annoying posters who points out typos?

  2. Saul says:

    The last time I was not confused about graphics card names, I was in the market for a card from 3dfx. And I’m a computer-building nerd! They really need to do something about this problem.

    For the record, I bought the card with 6MB of memory, over the one with 4MB.

    Leeks: I almost pointed out the typo, but then I rose above it. Plus you got in first.

  3. Alec Meer says:

    I wish it were the former, Leeks.

  4. cqdemal says:

    When NVIDIA came up with a new iteration of the 8800GTS, I thought it couldn’t get any worse. However, they then renamed the 8800GS to 9600GSO, while ATI/AMD has been rebadging products for quite some time.

    Utterly ridiculous. As an aside, I want the single-board, multi-GPU solutions to be gone too.

  5. Kommissar Nicko says:

    I want to know why my card is the size of my forearm. They seriously need to start giving you some of the exact dimensions for these bastards.

  6. a-scale says:

    I honestly can’t keep up with the market even as a major geek. In the end I know I can’t look at all of the reviews or specs, so I trust a reviewer to tell me that such and such card is “one of the best on the market” and go with that. The graphics card companies would do well to narrow their lineup to, say, 5 cards, and then just focus on improving those to the utmost of their abilities.

  7. Mman says:

    This is some “dumbing down” I can get behind. The current card names are almost humorously convoluted.

  8. caesarbear says:

    Deliberately obfuscated names make money for Nvidia and AMD, plain and simple. It sells cheaper cards. Retailers like it, resellers like it, the manufacturers like it. It would take a mandate from on high to prevent these deceptive naming practices.

  9. Zarniwoop says:

    So are nVidia trying to differentiate themselves from their competition?

    Ever since Intel announced they were going to invest more heavily in the graphics market, nVidia have been acting really tensor.

  10. SenatorPalpatine says:

    Less bewildering naming conventions would be great for Nvidia, though I can get along with a little research as is.

    And it’s funny how no one cares about AMD at this point.

  11. bobince says:

    Deliberately obfuscated names only make money when they fool people who /think/ they know what they’re doing but don’t. Worked well for the [shudder] GeForce4 MX. But now I think the complexity is getting to the point where the average customer hasn’t the first clue… I know I barely do.

    A four-digit model number with multiple completely random suffixes is laughable when the first digit isn’t the most important performance criterion (or even a reliable process generation number), the second digit can be less meaningful than the suffixes, and the other digits are nigh-on always zero.

    Perhaps we should invent our own comprehensible naming system for the chipsets, and provide a web converter. So you could enter a “nVidia GeForce 11600GSTO Super XP” and it’d translate it to a “Green-team Midrange+ v7.1”. Or something.
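
    (For what it’s worth, a toy Python sketch of that converter idea, with invented card names and invented plain-English tiers; none of these mappings are real benchmark data.)

        # Toy version of the "web converter" idea above: look up a marketing name
        # and return a plain-English tier. Every entry here is invented purely
        # for illustration, not real benchmark data.
        CARD_TIERS = {
            "nvidia geforce 11600gsto super xp": "Green-team Midrange+ v7.1",
            "nvidia geforce 11800 fx pro ultra xt": "Green-team High-end v7.3",
            "ati radeon hd 99990 xtx": "Red-team High-end v7.2",
        }

        def translate(marketing_name: str) -> str:
            """Return a comprehensible tier for a marketing name, if we know it."""
            key = " ".join(marketing_name.lower().split())
            return CARD_TIERS.get(key, "Unknown - go and trawl the benchmark sites")

        print(translate("nVidia GeForce 11600GSTO Super XP"))
        # -> Green-team Midrange+ v7.1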

  12. voice of pessimism says:

    It’s never going to be that simple – it’s a reality of the chip business that not every GPU on the wafer is going to have every pipeline working and be able to run at the intended frequency. There are always going to be a few speed grades around each model. Plus, there are the PCI/AGP, full/half-height, active/passive cooling versions…

  13. MeestaNob! says:

    The simple fact of the matter is that a basic naming convention is impossible. Can’t be done. Example:
    Ati announce tomorrow a range of cards for various levels of user: Ati 1000a, Ati 1000b, Ati 1000c. Now, at this moment I’d like to point out we don’t even know which is better, A or C, but we’ll assume A is the enthusiast model (power gamer MOAR POWAH rawr etc), C is for people who play solitaire under Vista and still need dx10, and B is a happy middle ground that can run most programs fine but will still shit itself if you even just see a web ad for Crysis.
    That all said, what happens when Ati inevitably release a revised model of any (all?) of these? They’ll no doubt become 1000A2 (100A1, 1000AA?) etc, but to the average consumer it becomes a confusing mess of names, badges and other superfluous letters/numbers that all amount to the same question at the store: “Will my daughter be able to run Happy Tree Friends on this?”, whereas people who know better (nerds) will be more interested in clock speeds and onboard memory and shit. The layman’s question can only be answered with further information anyway, “How much RAM do you have, how fast is your CPU, what’s the ambient temperature in Ballarat when a hummingbird sings?”, and the only way to possibly help these people is with diagrams and a 5-week seminar.

    Too hard.

  14. DigitalSignalX says:

    I still have a 3DFX over-drive board in the garage somewhere, which got me thinking about how we could label video cards today.

    I’d guess the vast majority of people who care a great deal about their graphics cards play video games with them. The remaining few are likely concerned about some fancy-nancy dual screen setup or require a graphics workstation for work purposes. Those few rendering schmucks aside, why not just make a standardized benchmark that is easy to read and comprehend for the layman, but can be dug into by the truly geek-oriented? Then we can require all chipsets to advertise their performance values based on it.

    This is what Voodoo and Matrox did back in the day, posting the FPS numbers for popular games at max settings right on the fracking box. Nowadays, you can spend 3 hours sitting on Tom’s Hardware looking at performance analysis for a dozen different graphics benchmarks and still not have a clue what the best bang for your buck is in the GPU department.

  15. Jason Lefkowitz says:

    I never thought I’d hear myself saying kind words about Microsoft, but isn’t this problem what the “Windows Experience Index” is supposed to solve?

    In other words, it doesn’t matter if NVidia names their new ubercard the GeForce 11 Pro or the GeForce 1100GTO Macho XP 2009 Platinum Edition; as long as someone benches it under the WEI, you can be confident it outperforms cards with a lower WEI score. Right?

  16. Eric says:

    I would just like to see them cheaper. =P

  17. bobince says:

    Unfortunately the WEI “gaming graphics” score isn’t adequate as a benchmark, as it’s somewhat influenced by other system factors than purely the graphics card (and also in some cases constrained by DirectX support level).

    Plus of course nV/ATI would optimise/cheat the hell out of it, defeating the purpose. (I should imagine they’re probably working on this already…)

  18. Jason Lefkowitz says:

    Plus of course nV/ATI would optimise/cheat the hell out of it, defeating the purpose.

    Well, I’d imagine that would be true of any benchmark, so it’s hard to count that as too much of a strike against WEI.

  19. Noc says:

    @MeestaNob: All you need to do is put a LITTLE effort into making naming conventions clear.

    You’ve got three classes:
    A Series – L337 Hax.
    B Series – Decent.
    C Series – Cheap.

    Update them? A1, B1, C1, et cetera. Or, since you need to make them sound AWESOME, the A1000, B1000, and C1000 series. Then, when you upgrade, they turn into either the A1100 or the A2000, depending on how AWESOME the upgrade is.

    Thus, consumers can easily see which class of card they’re going for, and will know the most recent version of each class. Simple. It requires about fifteen seconds of education for the consumer, yes, but once they’ve learned it then they understand and don’t need to navigate a labyrinthine and arbitrary set of nomenclature. It only screws up when you add more numbers and letters for purely cosmetic (as opposed to informative) reasons, because you want your GPUs to be named like sports cars. So there’s no reason to make the AA4200XD-6 Ultra.

  20. Lukasz says:

    i don’t follow ati cards. have completely no idea what is the newest card and which one is the best (for performance, for budget)

    with nvidia it is pretty easy though. the second digit gives you the price range and power (600 is budget while 800 is performance) while the first digit gives you the series, each new series making the previous one weaker (but you have to stick to a category: budget cards from the 7 series are weaker than budget cards from the 8 series, but performance cards from the 7 series might not be weaker than budget cards from the 8 series)
    if i need correction please do so :)

    but in the past it was much easier. tnt, tnt2, ultra. higher number, addition of ultra made your card better. very easy to follow.
    ati was always problematic!
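
    (A literal-minded Python sketch of that rule of thumb: first digit as the series, second digit as the tier. The tier labels below are illustrative guesses only, and real suffixes like GT, GTS and GTX cheerfully break the pattern, which is rather the point of the article.)

        # Toy decoder for the rule of thumb above: in a four-digit GeForce model
        # number, the first digit is the series and the second digit hints at the
        # tier. Tier labels are illustrative guesses; suffixes (GT, GTS, GTX,
        # Ultra...) and oddball models break the pattern in practice.
        TIER_LABELS = {3: "budget", 4: "budget", 5: "low-mid", 6: "midrange",
                       7: "upper-mid", 8: "performance", 9: "enthusiast"}

        def decode(model: str) -> str:
            digits = "".join(ch for ch in model if ch.isdigit())
            if len(digits) != 4:
                return f"{model}: not a four-digit model number, all bets are off"
            series, tier = int(digits[0]), TIER_LABELS.get(int(digits[1]), "no idea")
            return f"{model}: series {series}, {tier} tier (only compare within a tier)"

        for name in ("GeForce 8800 GTX", "GeForce 8600 GT", "GeForce 7900 GS"):
            print(decode(name))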

  21. Lh'owon says:

    Surely a clear naming scheme would benefit NVIDIA – many if not most of those same confused customers would jump at an easier-to-understand product line and shy away from ATI’s string of numbers.

    But I am cynical about the chances of any major change happening… these companies live off watching and mimicking each other.

  22. luckystriker says:

    Come on gents, it surely can’t be that hard to spend an hour browsing a hardware review site to get the info on what card is best in your price range. Even if they simplified the naming process, nothing would change for me; I’d still have to read up on it before my purchase. After all, we are dedicated PC gamers and choosing your graphics card is just part of that process!

  23. James T says:

    Lukasz: Get back to the Freedom base, you! This is Duty territory!

    Anyway, yes, card naming bad. I should get a 9600 though, rite guys?

  24. much2much says:

    This is why “DirectX 10” cards are promoted so heavily: it’s a differentiator that people who know a little (and unfortunately a lot of tech reviewers fall into this category) think is the way to go. Over the past year or so you would have been better off spending your money on a good Dx9 card instead of a weak Dx10 card. Think the ATI X1950 Pro or XT. Now we are at a point where a Dx10 card is a no-brainer.

    It’s also interesting that Nvidia are doing this at a point where they have the graphics market by the balls (the REAL 9 series card is still sitting in their labs). A lot of this has been the old-school industrial marketing of having 50 million products to convey you are a juggernaut even though you only sell two of them.

    cqdemal said “Utterly ridiculous. As an aside, I want the single-board, multi-GPU solutions to be gone too.” I don’t understand this. This tech is the way of the future. It’s just that at the moment it’s far from transparent and not yet working as intended. And right now the only reason to buy one is if you are allergic to money.

    In the next two years they will be as indispensable for a new system as a dual-core CPU. The difference is they will also be the obvious choice just for a graphics upgrade into any existing system with PCIE. There is a reason that Nvidia, whilst holding back their powerful tech, quickly caught up to ATI’s talk of two GPUs on one card.
    So much more elegant than SLI or Crossfire too, and better for everybody. One less part to suspect when you have hardware hassles. Not to mention the extra cost of a dual-card motherboard.

  25. Jeremy says:

    This stuff always made my head hurt. I still think I screwed myself on my graphics card.

  26. Larington says:

    What I find myself wondering is why you can’t just have the year in which the graphics card was made (designed, first released, whatever) as a way of simplifying which version/revision of the card it is. So you know an A-class uber graphics card dated 2008 won’t be as powerful as an A-class dated 2010. Plus variants marked as super cooled.

    At the moment, I’m having to make my purchasing decisions based on core speed and memory speed of the card (kind of assumes the website gives this information) and I’m just not sure if that’s a reliable way of doing it.

  27. Lukasz says:

    “What I find myself wondering, is why you can’t just have the year in which the graphics card was made (Designed, first released, whatever), as a way of simplifying which version/revision of the card it is”

    that would be totally confusing. How would you know which one is better? Yeah, a 2-year difference is good enough, but what about a 3-month difference?

    Super powerful cards are released earlier than their budget versions.

    Sorry mate. but that idea wouldn’t work.

    PS Any decent human being knows that Freedom is the only right choice.

  28. Gap Gen says:

    To be honest I end up checking Tom’s Hardware for benchmarks anyway. So I don’t really mind what it’s called as long as I can remember it. Might be good for the general public who don’t know about internet searches, though.

  29. John P says:

    I murdered Lukash earlier! He dropped some bullets, a gun and some slime.

  30. Fat Zombie says:

    Wait! I have an X1600 Pro. I thought it was a decent card (a while back when I purchased it, admittedly).

  31. James T says:

    PS Any decent human being knows that Freedom is the only right choice.

    But you keep deciding I’m an enemy! Also, all that George Michael…

    (…actually, I kinda like George Michael. Hey, this gives me a great idea for a STALKER audio mod…)

    What I sometimes find puzzling are the discrepancies between various brands, and what these brands actually do to the cards to warrant the price difference. “Do these Albatron, Xpertvision, BFG guys manufacture the cards wholly, or are they some kind of middleman who bolt stuff onto existing hardware they get from Nvidia? Why is the Albatron twenty bucks cheaper than the Gigabyte? Why is the ASUS thirty bucks cheaper than the Palit?” (Easier questions to answer if the specs of each card are laid out uniformly, but I find that’s not the case; some product descriptions are a wealth of information, some are as serenely blank as Keanu Reeves.) Should I trust the TH review of, say, the 8800GT equally for all these brands? Going for the cheapest variant hasn’t led me to misfortune yet, but I certainly am curious about the differences.

  32. Lukasz says:

    nvidia and ati make a standard model.
    then different companies can change it: perfect it, overclock it, add some stuff.
    Also with brands comes warranty, customer support, etc.

    that’s the difference between the brands. what TH is actually testing is the stock model, the unchanged version given to them by ATI and Nvidia.

    and I will never die!
    (btw. Lukasz is my given, real name :D )

  33. Steven Hutton says:

    This summer I will be building my own PC for the first time ever. Confused doesn’t begin to cover it.

  34. James T says:

    Thanks Luk’, that puts me on the straight & narrow somewhat.

    …There aren’t enough Jameses in gaming.

    Steve: As in, selecting parts for the shop, or assembling it yourself? I don’t envy you the latter, my fingers feel like they’re cut to ribbons after the simplest fiddle with internal hardware, *sigh*. Bring on biotech sacs of gelatinous processing power!

  35. Cradok says:

    I’m very much in favour of this. Every now and again I think about upgrading my comp, but generally have to go and take a lie down long before I can reach any kind of decision.

    See also memory. Especially memory.

  36. Lukasz says:

    There is even more confusion with motherboards. How many people choose one based solely on price and whether their CPU will fit?
    Motherboards are one of the most crucial components of a PC. Learned that the hard way!

  37. Acosta says:

    It’s quite sad but I think Alec is spot on when he says: “Perhaps NVIDIA hopes to avoid the axe by worming its way into more people’s hearts with a sudden burst of clarity”. This is the key: Nvidia is absolutely in panic mode about a future where GPUs are irrelevant and consumers just need one CPU, motherboard, RAM and a hard drive to have all they need for their PC. Much simpler, probably smaller systems, and less confusing for the consumer.

    So now is when they start waving their hands and making statements about how key they are to PC gaming’s future, and how they are going to make life easier for the consumer (when they had years to solve what they knew was a confusing scenario, one that has put many people off building a PC for gaming). It’s quite sad that they need competition before they start warming up their engines.

    I must say that I just bought a GeForce 8800 GT with my new PC and I think Nvidia has done a fantastic job with this one. And all the talk about Larrabee and raytracing sounds a little sci-fi to me, but I think Nvidia is going to need important changes in their business if they want me to buy another product from them in two or three years when I change my PC again.

  38. Matu says:

    Acosta: fantastic job besides the fuckup called “AA doesn’t work in 40% of games and when it does, the quality of it is lower than expected”

  39. Ginger Yellow says:

    “That all said, what happens when Ati inevitably release a revised model of any (all?) of these? They’ll no doubt become 1000A2 (100A1, 1000AA?) etc, but to the average consumer it becomes a confusing mess of names, badges and other superfluous letters/numbers”

    The obvious solution is not to revise the cards so often. Produce a new range every two years or so, and spend the time in between bringing down the costs and improving the drivers. A mass market product can’t have five new SKUs a year. All you need is a range marker (eg A-Z) and a price marker (eg 1-3).

  40. Paul Moloney says:

    .

  41. Gulag says:

    Just bought a new Geforce 8800 GTX yesterday, and like a previous commenter, I was a little taken aback by the size of the beast, but I figured, “Meh, that’s progress.” What I wasn’t prepared for was the fact that it needed TWO power connectors! When did that become standard?! (My last card was a Radeon 9800 Pro.)

    If there is one thing they should put on the box for us once in a thousand years upgraders, it is this: “YOU ARE NOT PREPARED!”

  42. Sam says:

    On high-end cards, like the 8800GTX, it’s been standard since… well, the GTX, I believe.

  43. voice of pessimism says:

    much2much – dual GPU/SLI is basically the same as doubling the number/speed of graphics pipelines, but doing it in a really hacked-together inefficient way. Unless they can make it completely transparent to apps and provide proper unified memory to both GPUs it’s not really a sensible consumer solution. Of course if there’s a market for it then they might as well do it, but I don’t see it ever being mainstream, there’s no need when GPUs are inherently parallel and constantly getting faster anyway.

  44. Grant Gould says:

    A couple of years ago, I broke the code of video card naming and numbering:

    “If you don’t want to spend hours parsing meaningless numbers, badges, buzzwords, and markups through a hundred reviews only to discover that what you chose is out of stock, priced incorrectly, or only available on alternate Tuesdays for users of the next, prior, or subsequent version of Windows — and then do this again every year… buy a console.”

    PC gaming will catch up with consoles when the last video card product marketer has been hanged with the entrails of the last computer equipment salesman.

  45. Nallen says:

    Just bought a new Geforce 8800 GTX yesterday, and like a previous commenter, I was a little taken aback by the size of the beast, but I figured, “Meh, that’s progress.” What I wasn’t prepared for was the fact that it needed TWO power connectors! When did that become standard?! (My last card was a Radeon 9800 Pro.)

    If there is one thing they should put on the box for us once in a thousand years upgraders, it is this: “YOU ARE NOT PREPARED!”

    I did exactly the same thing, cue theft of brother’s PSU.

  46. Geoff says:

    I don’t believe it’s deliberate obfuscation. As mentioned there’s the Intel example (which is better, my Celeron 3.2 GHz, my Core Duo 2.8 GHz, or my Core 2 Duo 2.6 GHz?) but it’s also common in high-dollar IT stuff like Cisco and IBM products:
    3524 – oh, makes sense, 3500 series with 24 ports.
    3550-24 – uh, what? 3550 series with 24 ports?
    3750G-24-TS – okay, now you’re just messing with me…

    It’s not to confuse people into buying the wrong stuff (because usually when you’re buying 20 of the $40,000 switches, you’ll have consultants and sales engineers to explain everything and select the right ones) it’s just that the technology itself is complicated, and when the tech people and the marketing people fight over the naming conventions, for once the tech people win.

    It’d be lovely if we could get a standard benchmark for all cards – Jason mentions Windows Experience Index above, but I’m thinking more neutral like flops or polygons per second or somesuch. But then, you’re just measuring raw speed. What about the RAM? Is a 1.2 Gigaflops card with 256M RAM better, or the 1.1 Gigaflops with 512M RAM? What about the 1.0 Gigaflops card with 512M DDR3 RAM instead of DDR2 RAM?

    The problem is that there are actual technical complexities involved, and it’s hard to abstract them all behind one or two metrics.

  47. much2much says:

    voice of pessimism:
    I’m not trying to be funny here but I don’t think I understand you.

    Do you mean that an ideal solution would be one video card with twice the pipelines as opposed to two video cards? If this is the major limitation at the moment why don’t they do this? I imagine there is a technical consideration. Where I live the trains are a maximum of 8 carriages. Sometimes 16 carriages would be better but they can’t do it.

  48. Sharkwald says:

    If the PC Gaming Alliance wanted to get their collective thumb out of their arses and actually help consumers with graphics cards, here’s what I think they should do. Pick 3 current-gen 3D engines: right now I’d pick WoW’s, the Unreal 3 engine and the Cry Engine, roughly representing the low, mid and high ends. To achieve PCGA certification, each card has to have a 3×3 grid on the back of its box for each of these engines. This 3×3 grid shows performance in that engine at 1280*960, 1680*1050 and 1920*1200 at minimum, medium and maximum settings (all other system variables being declared and equal, of course). Have the average FPS in the cell, and colour the box red for FPSs lower than 20, yellow for 20-40 and green for 50 and above. Every 2 years pick 3 new games, and 3 new resolutions. Things like Valve’s hardware survey can help out there, giving an indication of what resolutions and CPUs are actually being used.

    Then, also to get PCGA certification, games can declare themselves as being in the WoW class, the Unreal 3 class or the Crysis class.

    That would actually help me as a consumer — Crysis doesn’t interest me, but Bioshock does, so I’d know to get a card that did well in the middle category, but could save the expense of a truly top-end card. As it is, I can find this info by trawling Anandtech and TH, but that takes several hours and commitment, whereas checking the colour of a box on the back of a card takes 0.5 seconds.
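
    (Purely to illustrate, a rough Python sketch of how that colour-coding might work, reading each engine’s grid as settings by resolutions. The thresholds are Sharkwald’s (red below 20fps, green at 50 and above), with the unspecified 40-50 band treated as yellow here, and every frame-rate figure below is made up.)

        # Toy sketch of the proposed PCGA grid for one engine class: rows are
        # quality settings, columns are resolutions, each cell coloured by
        # average FPS. Thresholds follow the comment above (red < 20,
        # green >= 50); the unspecified 40-50 band is treated as yellow here.
        # All frame-rate numbers are invented for illustration.

        def cell_colour(avg_fps: float) -> str:
            if avg_fps < 20:
                return "red"
            if avg_fps >= 50:
                return "green"
            return "yellow"

        RESOLUTIONS = ["1280x960", "1680x1050", "1920x1200"]
        SETTINGS = ["minimum", "medium", "maximum"]

        # Made-up average FPS for a made-up card in the "Unreal 3 class":
        # fps_grid[setting][resolution]
        fps_grid = [
            [95, 72, 58],   # minimum settings
            [61, 44, 33],   # medium settings
            [38, 24, 16],   # maximum settings
        ]

        for setting, row in zip(SETTINGS, fps_grid):
            cells = [f"{res}: {fps}fps ({cell_colour(fps)})"
                     for res, fps in zip(RESOLUTIONS, row)]
            print(f"{setting:8s} " + " | ".join(cells))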

  49. voice of pessimism says:

    Exactly, the ideal situation is one GPU with more pipelines – that’s basically how they go from an 8500 to an 8800 or whatever instead of just bolting 4 8500 GPUs on one board.
    They can’t just add pipelines indefinitely though, because of problems with yields (more stuff means more chance it comes out broken, and a bigger GPU means fewer to sell per wafer of silicon, and also power supply/heat dissipation gets tricky eventually), so multi-GPU happened. This is a pretty reasonable solution if you absolutely must have the maximum amount of rendering power, but normal people don’t.