
3D Cards: Calculus For Dummies

Some potentially promising news from the hardware side of PC gaming. Gamesindustry.biz has been chatting to NVIDIA's Roy Taylor, who's admitted that the graphics card company's dreadful naming conventions (should we buy a GeForce 8800 GS, GT, GTS, GTX or Ultra? And with 320, 512 or 640MB of memory?) are a little on the bewildering side, and proffered vague promises to simplify them. Somewhat ironically, Taylor is "VP of Content Business Development", a title which does absolutely nothing to explain what his job actually involves - but hey, he sounds important.

Imagine, though, a world where choosing your next graphics card didn't involve an hour of head-scratching research. Does a bright future await us? Mild venting beneath the cut.

"There is a need to simplify it for consumers, there's no question. We think that the people who understand and know GeForce today, they're okay with it - they understand it. But if we're going to widen our appeal, there's no doubt that we have to solve that problem."
- NVIDIA's Roy Taylor

Dunno if this means a genuine shakeup - like stripping the line back to something like GeForce Basic, GeForce Mid and GeForce Pro - or if it's just hinting at a futile rebranding come GeForce 10, as abortively attempted with the 5 series' pointless renaming to 'GeForce FX', or even ATI's current 'Radeon HD' gibberish. Let's hope for the former, as it's a problem that desperately needs fixing - and not just by NVIDIA.

I was nosing at Mass Effect's system requirements today, and found this under minimum graphics card:

NVIDIA GeForce 6 series (6800GT or better) / ATI 1300XT or better (X1550, X1600 Pro and HD2400 are below minimum system requirements).

I mean, for goodness' sakes. So... The 1300XT is better than the X1550, X1600 and the HD2400? But the latter is a full 1100 better! Um. 1100 whats, exactly?

Facetiousness aside, I'm fortunate enough to be able to follow this stuff, having spent some years working on a tech mag, but how in hell does it make any sense to someone who isn't au fait with the increasingly nonsensical graphics card market? In fact, during that time on a tech mag, by far the most common reader phone call was from people wanting to know what graphics card they should buy. I wanted to cry whenever I got that call, but I did sympathise. Why is it not more obvious?

A traditional answer to that latter question (or, at least, the established wisdom during my tech mag days - I've not, I stress, seen any reports to actually support it) has been that there's deliberate obfuscation on the part of the graphics card companies. If the GeForce 11800 FX Pro Ultra XT is currently agreed to be the best-performing card on the market, word may trickle down to the unwashed masses. Except it will be diminished word - they'll just pick up on 'GeForce', or maybe 'GeForce 11', and will be fooled into thinking the cheapie GeForce 11300 GS card they've spotted for what seems like a bargainous price is somehow awesome, just because it sports that hallowed prefix. It probably isn't, though. It's probably an overpriced shelf-warmer that can barely run Counter-Strike: Source. The same happens with processors - people picking up dreadful Celeron machines from PC World just because they think the Intel sticker on the front signifies uber-power.

Another contributing factor is simply that these are tech companies, operating in an industry where incomprehensible number-based names are de rigueur, because hardware is made by stern men in lab coats who aren't interested in impressing the kids. Take a look at the motherboard market, for instance, and you'll be screaming in rank terror within minutes. For a firm like NVIDIA to do something different is actually a major break with tradition.

If NVIDIA's talking about changing their naming system (which was further muddled by a) a long war with ATI, each company pilfering the other's card name suffixes in the hope of thunder-stealing, and b) trying to come up with a board for every single possible price point), clearly it isn't working so well these days. With super-cheap, Facebook'n'WoW-friendly PCs on the rise, it could be times are scary for a firm that largely flogs performance parts.

I wonder too if this has anything to do with NVIDIA's involvement in the PC Gaming Alliance? One of the stated intentions of that much-sneered-at body is, I believe, to demystify PC gaming - or at least their apparently rather narrow idea of what PC gaming is - for a broad audience. There's also the ongoing cold war between NVIDIA and Intel about the future of 3D - faster cards, say the former; processor-rendered raytracing, say the latter. Perhaps NVIDIA hopes to avoid the axe by worming its way into more people's hearts with a sudden burst of clarity.

Whatever, I'd certainly love to see a return to simply cheap card / expensive card, rather than wading through another sea of Pros and XTs and GTs and GTSes and GTXes. Silly buggers.
