Week in Tech: Nvidia’s Mighty New Maxwell Graphics

By Jeremy Laird on February 20th, 2014 at 9:00 pm.

Nvidia’s new Maxwell graphics kit, then. It’s out, but what’s it all about? Epic performance density and power efficiency is the elevator pitch, with a spot of improved cryptocurrency hashing thrown in for good measure. But are the first new Maxwell boards – the GTX 750 and 750 Ti – the bomb or a bum deal?

And so to Maxwell. That’s the codename for Nvidia’s latest and greatest GPU architecture. In time it will entirely replace Kepler, which is the basis for nearly all the GeForce 600 and 700 series GPUs to date, including the Titan models.

Confusingly, the first Maxwell GPUs are actually badged GeForce GTX 750 and 750 Ti, which doesn’t exactly scream ‘new architecture’. But we’ll come back to the branding shenanigans in a moment. Let’s deal with Maxwell as an overall architecture and then have a look at the first members of the new family.

The big news is that Nvidia is pitching Maxwell as ‘mobile first’. In other words, Maxwell is designed as a mobile architecture first and then adapted for desktop. Shades of Intel Core? Absolutely.

As a corollary, Nvidia is claiming the big win with Maxwell is efficiency, or performance per watt. How much of this is expediency is a tricky question. I reckon there’s little doubt Nvidia was hoping to have access to a new, smaller production node from its manufacturing partner, uber chip fabber TSMC, by now.

That’s right. Shout it. The most efficient GPU ever. Allegedly.

As it is, they’re still stuck on 28nm and that means launching big Maxwells isn’t an option. Instead, the first GPUs are relatively low end and so the message can’t be about record-breaking performance.

That said, Maxwell certainly has extensive new mobile optimisations. It does indeed look a lot like Kepler re-jigged with a view to power efficiency.

So in terms of functionality, not all that much has changed. There’s no whizz-bang new rendering tech. The basic DX11 feature set is carried over from Kepler, which helps to make sense of the 700-series branding for the first Maxwell-based boards. That’s actually good news if you own a Kepler card because it means you’re not suddenly going to be locked out of some wondrous new rendering feature.

Instead, it’s a case of taking existing technology and rearranging it for more efficient operation. Result is a claimed doubling of performance per watt. If true, it’s genuinely impressive – remember these first Maxwells don’t get the benefit of a new production node. Of course, as the first truly mobile optimised architecture, you’d expect big gains. Ye olde low hanging fruit syndrome. Future gains probably won’t be as dramatic.

Anyway, the headline stuff involves a new block design. Kepler was based on a streaming multiprocessor block known as an SMX containing 192 shader cores, four warp schedulers and eight dispatch units, the latter two effectively feeding the 192 shaders as a homogeneous pool.

Maxwell: Lots of little green squares become a few less green squares in a different arrangement. Stop me if I’m being too technical.

With Maxwell, the SMX becomes an SMM and now contains just 128 shaders. However, those 128 shaders are split into four groups of 32, each with its own warp scheduler and two dispatch units.

What’s a warp scheduler, you ask? Let’s not go there. Just know that Nvidia says the new 128-core SMM delivers 90 per cent of the performance of the old 192-core SMX. Yeah, really. Needless to say, the new 128-core SMM is also much smaller than the old 192-core SMX. So you can have much more performance from a given size of GPU. Or the same performance in a much smaller, lower-power chip.
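If you fancy sanity-checking that claim, the arithmetic is simple enough. Here’s a quick Python sketch – the per-core figure it derives is my own back-of-the-envelope number based on the 90 per cent claim above, not an official Nvidia spec:

```python
# Back-of-the-envelope check of Nvidia's SMM claim. Core counts and the
# 90 per cent figure are from Nvidia's own messaging; the per-core gain
# derived here is illustrative, not an official spec.

smx_cores = 192           # Kepler SMX shader count
smm_cores = 128           # Maxwell SMM shader count
smm_relative_perf = 0.90  # claimed: one SMM ~ 90% of one SMX

# Per-core throughput, normalised so one SMX core = 1.0
smx_per_core = 1.0 / smx_cores
smm_per_core = smm_relative_perf / smm_cores

print(f"Per-core gain: {smm_per_core / smx_per_core:.2f}x")  # ~1.35x
```

In other words, if the claim holds, each Maxwell shader is doing roughly a third more work than its Kepler equivalent – which is where the performance-per-watt and die-size wins come from.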

To be frank, Nvidia was already doing pretty spectacular things with Kepler when it came to the size of its GPUs and the performance they kicked out, compared to AMD at least. Maxwell looks truly epic by this metric.

There are plenty of other changes, including a massive uptick in the amount of on-die cache memory. Nvidia has also upped Maxwell’s general compute ante – it will be much more competitive with AMD than Kepler when it comes to mining cryptocurrencies, if that sort of thing is your bag. But that shift from SMX to SMM is the bit you really need to comprehend.

Small chip equals big money. For Nvidia, that is.

As for the first Maxwells out of the gate, the aforementioned GeForce GTX 750 and 750 Ti, they’re based on the new GM107 GPU. Price-wise, we’re looking at roughly £90 and £115 respectively, which will give you an idea of how they’re positioned – they’re entry-level gaming GPUs.

Respectively, they sport 512 and 640 shaders and share the same maximum boost clock of 1,085MHz. At this point, it’s worth noting that the equivalent chip from the old Kepler family, GK107, had just 384 shaders.
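For the numerically curious, those figures translate into rough theoretical throughput via the usual shaders × 2 ops (one fused multiply-add per clock) × clock speed formula. The shader counts and boost clock below come from the specs above; the resulting GFLOPS numbers are theoretical peaks, not real-world performance:

```python
# Rough theoretical single-precision throughput for the two GM107 boards,
# using the standard shaders x 2 (FMA) x clock formula. Specs are from the
# article; actual game performance will be lower and workload-dependent.

boost_clock_ghz = 1.085  # shared maximum boost clock

for name, shaders in [("GTX 750", 512), ("GTX 750 Ti", 640)]:
    gflops = shaders * 2 * boost_clock_ghz
    print(f"{name}: {gflops:.0f} GFLOPS (theoretical peak)")
```

That puts the 750 Ti at around 1.4 TFLOPS on paper – respectable for a board this cheap and this frugal, but firmly entry-level territory.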

If there is a catch with GM107 it’s the memory bus. It’s a miserable 128-bit item. That added on-die cache I mentioned might help offset that a bit. And in reality nobody is going to buy either of these new Maxwells to game beyond 1080p. But I still instinctively gag at the mention of a 128-bit bus on any GPU. Memory bandwidth is simply too important.

One final point to note re the specs is that all the effort on power efficiency means neither the 750 nor the 750 Ti requires a supplementary six-pin PCI Express power connector as standard. They get sufficient power from the PCI Express bus. That’s pretty significant in the context of building a low-power, super-silent game box for the living room. Just be aware that most if not all of the factory-overclocked boards already being offered do have a six-pin power port.

Anyway, what should you make of these new boards and Maxwell in general? The 750 and 750 Ti themselves aren’t going to change your world much unless you’re hoping to build a small form factor PC and seek the best possible performance per watt.

No need for supplementary power for the reference 750 and 750 Ti boards

Early benchmarks from the usual suspects around the web agree that Maxwell sets new standards for power efficiency – the 750 Ti looks to be getting on for three times quicker than the GT 640, the fastest 600-series board without a supplementary power connector.

As a pure performance proposition, they’re less immediately compelling. But that’s thanks to how Nvidia has positioned them price-wise. In simple terms, AMD’s Radeon R7 260X is faster than the GTX 750 and the R7 265 is faster than the 750 Ti. The end.

Oh, yeah, did I mention AMD recently tweaked its product range in preparation for the new Maxwell? No surprise there.

When you start looking beyond pure performance, it all gets hideously complicated, of course. Suddenly, it’s Mantle vs G-Sync or whatever and you tumble down the proprietary-tech rabbit hole. The way things are set up in the graphics market right now, you can’t have it all. You must pick sides.

But Nvidia has definitely done some pretty staggering things with Maxwell as regards efficiency and performance density. Even on existing 28nm chip manufacturing tech, Maxwell is a bit special in that regard. When higher-end Maxwells based on TSMC’s 20nm node appear – presumably later this year – I’ve a feeling the results are going to be truly spectacular.

Quite what Nvidia is going to call the more powerful Maxwell chips is hard to say, of course. I doubt there will be a Maxwell with better raw performance than a 780 Ti any time soon – GK110 is still one hell of a chip – which makes launching, say, a GeForce GTX 880 that’s slower than a 700-series board a bit tricky. But that’s the way GPU branding has gone these days. It’s a mess.


60 Comments »

  1. FurryLippedSquid says:

    It’s an astonishing piece of hardware for the size, cooling & performance. Great for a budget micro build, but not much else. There are more capable cards that cost less for the everyday PC gamer.

  2. Josh W says:

    I think “Maxwell Graphics” would be a great name for a journalist.

  3. eldwl says:

    Is it worth upgrading from a GTX460?

    • CrazedIvan says:

      Yes, you’ll see a major improvement over the 400/500 series of cards. But, you’ll start having trouble justifying the upgrade if you have a 600 series card.

      The thing to understand is that the majority of Nvidia’s vast user base is sitting on the 500 series, and can’t justify spending $250+ on a new card for tech they don’t “need.” Being able to offer people a card under $150 allows that user base to upgrade en masse with a good performance boost.

      • DoctorCool says:

        Sound reasoning, but I’m sat on a 560 and after reading a few reviews with gaming benchmarks, this really doesn’t seem to be worth making the jump, especially if you’ve already decided against a 660/760 or higher. Personally I’ll be saving my pennies for a Maxwell 860, but you are perhaps right that the 750ti’s price might be right for some to make it a stepping stone in-between, or maybe even an introduction to SLI.

      • HadToLogin says:

        I think I’ll wait with my 560 for 8xx series anyway – still can play games smoothly and I’d rather get card I won’t need to change again in two years – by 8xx release we should see how good/bad ports are now.

        • CrazedIvan says:

          I currently have a two-card 560 Ti SLI setup and I can play most games on max settings with little to no chug. The 750 Ti would be a no-brainer if you could join two with an SLI bridge, but alas! you cannot. I may wait for the 800 series line-up, but a $150 price tag is a hard deal to pass up.

          • HadToLogin says:

            My biggest problem is that I can’t really spend $150 today for 750 and another in a year for 850 when it turns out most developers make shitty PC ports from consoles.
            With my old PC I was double lucky: first because 9800 is still enough to play most games and then I got lucky again and was able to buy another $700 PC few years later (my bro got old one).

  4. rockman29 says:

    This bodes well for the gaming laptop I will be buying in the near future :)

  5. ResonanceCascade says:

    Should I get this or a Voodoo2?

  6. Geebs says:

    Better mobile parts = more hi-DPI laptops = text rendering will finally not suck. I for one welcome our green-tinged overlords. Given that the current crop of consoles means a 7970 or GTX 680 should be enough for anyone for the next couple of years, cooler and quieter laptops are a good trade-off for shinier graphics.

    • Wisq says:

      … although “should be enough for anyone” is contingent on screen resolution not increasing further.

      Just because the consoles can (barely) do 1080p doesn’t mean that’s where PC gaming is going to stay for the duration of this console generation. In fact, many of us have been pushing beyond that for years already.

      • Geebs says:

        I think 4k is too much of a jump still. Those cards are not too bad for 1440, and the current consoles can’t even do 60fps at 1080 for a last-gen title.

      • TacticalNuclearPenguin says:

        And it’s also dependent on what is “officially” declared as “running smooth” by random internet strangers.

        Not even a 680 will save someone who wants to be actually locked ( for real ) at 60 fps on 1080p with everything maxed, not for all games at least, let alone those who want either even more FPS or higher resolutions.

        But this is beside the point: the real Maxwell still isn’t ready, but the focus on performance per watt will still be there. So yeah, the target remains less noise and heat, but the performance is also supposed to increase from the previous generation. Simply put, the 880 still needs some time.

        • TacticalNuclearPenguin says:

          Another small note: unless you buy a passively cooled card, chances are that the cheaper boards are noisier than some higher-level offerings.

          Even if the decibels are lower, it’s more than possible that the fan rotor will still produce the trademark high pitched sound of crap fans. Or maybe some resonance related to the cheap assembly.

          • Shadowcat says:

            I went for passive cooling years ago, and I don’t intend to ever go back. The video cards with multiple fans crammed onto them give me the jitters — there is no game that I need to play that urgently (passive cards will get there eventually, and I’m patient).

            I’m always happy to hear about efficiency improvements in video cards, though — all the more likely that someone’s going to slap big-ass heat sinks and heat pipes all over one, and raise the bar.

  7. caff says:

    I’m interested in G-SYNC, but when will larger monitors start appearing with it?

    I use an LCD TV for gaming, because quite frankly it’s ok and it’s massive. So when can I get a 30-inch+ G-SYNC enabled screen?

    • Jeremy Laird says:

      I’m afraid my guess is probably never. I doubt G-Sync will truly catch on.

      • SuicideKing says:

        True, especially after AMD’s “FreeSync” demo suggested that the same capabilities will finally end up in the DisplayPort 1.3 spec.

        • frenchy2k1 says:

          Haven’t seen any mention of FreeSync and DisplayPort.
          All AMD demos were run on laptops, with a direct connection to the display and access to the refresh pin.
          In those cases, G-Sync is redundant, as laptop LCDs already support custom refresh driven by the graphics.

          For desktop though, no such luck yet.
          We can only hope a sync signal will find its way into the next DP format. And even then, it will take years for it to spread…

          • SuicideKing says:

            Laptops using eDP (embedded DisplayPort), which supports variable VBLANK, were used for the demo. The current eDP spec will apparently be merged with DP 1.3.

  8. soldant says:

    This is probably a case of the tech being more exciting than the card itself. It’ll be interesting to see where future iterations end up taking us.

  9. TillEulenspiegel says:

    Epic performance density

    “Epic” density, really? I know the word has lost all nuance, but that’s a new one.

  10. Herzog says:

    I will be getting a low profile 750ti for my HTPC build. Awesome tech! 300w psu should be enough!

  11. PopeRatzo says:

    What kind of a man plays games on a “laptop”? And why do they call it that anyway? I never see anyone use one on their lap.

    Someone who games on a laptop is only marginally better than someone who uses a…console.

    A proper man gamer sits astride a full-size case with fans and LED lights, with his nose inches away from a 27″ monitor. Mouse in one fist and W-A-S-D in the other (however, I highly recommend a lead codpiece if you intend to reproduce at some point in the future).

    This is what I’ve got warming my inner thighs at the moment: http://www.coolermaster.com/case/full-tower/haf-932/. It gets cold here in Chicago. And my wife just shakes her head, with a snort of derision. Like there’s something wrong with me.

    • Low Life says:

      What kind of a man plays games on a “laptop”? And why do they call it that anyway? I never see anyone use one on their lap.

      I use one on my lap every day, in fact I’m typing this message on a laptop that’s on my lap. Where else would you place the laptop while sitting comfortably on a sofa?

    • jalf says:

      They… don’t. Pretty much every single manufacturer of laptops calls them notebooks.

      It’s us, the users, who call them laptops. (And I do regularly use mine on my lap)

      Also, can we move past the faux-elitism already?
      This isn’t 1993, lots of people use perfectly fine laptops to play games perfectly well.

  12. Singularity says:

    I don’t know whether to laugh or cry that my GTX 690 is still king for playing games.

    • PopeRatzo says:

      I think it’s a solid “laugh” if you’ve made a long-lasting choice that has survived the test of time.

    • Max.I.Candy says:

      Well, what do you expect? I would add that in games that don’t support SLI you will only have the performance of a single 680. I bought a 780 Ti last month and I think I’ll be happy with that until the dual 790 is made (if that never actually happens, I’ll wait till the 880, I suppose).

  13. Slight0 says:

    Anyone care to mention the scrypt hashrate they get with cudaminer?

  14. Moraven says:

    NVIDIA and AMD leapfrogging each other is a good thing. Keep pushing the tech.

  15. SuicideKing says:

    Quite what Nvidia is going to call the more powerful Maxwell chips is hard to say, of course. I doubt there will be a Maxwell with better raw performance than a 780 Ti any time soon – GK110 is still one hell of a chip – which makes launching, say a Geforce GTX 880 that’s slower than a 700-series board a bit tricky. But that’s the way GPU branding has gone these days. It’s a mess.

    Well, if Nvidia keeps the die area the same, they’ll be able to cram many more transistors into the same area, especially at 20nm. So if the GM110 does make it to the desktop, it’ll likely manage to out-perform GK110. I think.

  16. Bull0 says:

    I’ve had my desktop for getting on for three years, and am starting to get the upgrade itch, but I bought an obscene overpriced machine with a GTX 580 and 16GB of ram and my concern now is that to see any kind of meaningful performance improvement I’d have to lay out a ton of money, it isn’t a simple “chuck a new card at it” situation :/

    I also think maybe I’m getting old, because I’m struggling to picture what benefit shinier graphics will actually get me. Everything current performs OK.

  17. RProxyOnly says:

    So when will laptop/netbook gaming be as powerful as desktop?

    • Tekrunner says:

      So when will my car be as fast as a plane?

    • po says:

      My laptop is already more powerful than most gamers’ desktops, with an i7-4700MQ, 16GB RAM, and a GTX 770M (all of which are upgradable).

      You just have to forget about having a laptop built with weight, thickness, battery life or cooling fan noise as considerations.

  18. Werthead says:

    I’ve got a 550 Ti which still runs almost everything at maximum (not CRYSIS 3, obviously, and I have to switch the ‘ultrahair’ option off for TOMB RAIDER, and BIOSHOCK INFINITE and METRO: LAST LIGHT chug a little bit when there’s a lot going on). I was thinking of upgrading without spending more than £150. Would a 750 Ti be worth it?

    ADVISE ME INTERNET PEOPLE.

    • nrvsNRG says:

      I advise you stop talking shit.
      Yes a 750Ti would be worth it. Obviously.

      • Werthead says:

        No, not ‘obviously’.

        Would it be better to invest extra money in a higher-end 600 series? Would it be better to wait until a high-end 600 or 700 drops in price? Would the performance boost from a 750ti versus a 550ti be not that great versus another option?

        I mean, thanks for answering the question, but I was after a bit more useful information and less of a prissy attitude. What happened, did your pet aardvark die?

        • nrvsNRG says:

          Well, a 750 Ti is well within your price range and would be an OK boost to what you already have (it’s around the same performance as a 560 Ti or a 650 Ti Boost).
          A 660 Ti or 760 is around £50 over your budget, but would give you a very nice boost. Up to you whether you wait for them to drop. It may take a while, but who knows.
          I’m really sorry, but the reason I had a prissy attitude is because I thought it was funny that you said you can play most games maxed out on a 550 Ti. You could barely get 40fps on medium settings, 1080p, with no AA and 4x AF, on recent AAA games. It’s an entry-level card. You only have to look at benchmarks if you don’t believe me. Dunno why people with low-end cards always say stuff like that, especially on this site.

          • rpsKman says:

            Coz they be fools. You don’t max out BioShock Infinite with anything less than a 560 SE. I should know.

  19. Shandrakor says:

    I’m running on a 6800 still. Noisy as heck though, as it’s one of the first that were produced before they fixed the fan problem. Is it worth upgrading or has the tech just not progressed enough yet?


  21. frenchy2k1 says:

    I think the best example of the purpose/usage of those cards is described here:
    http://www.pcper.com/reviews/Graphics-Cards/Upgrade-Story-Can-GTX-750-Ti-Convert-OEMs-PCs-Gaming-PCs/Power-Consumption-Ch

    Those are cards that can transform an aging machine – but especially a cheap pre-built brand computer – into a reasonable gaming machine for a low investment. As they use so little power and do NOT require any additional plug, they can pretty much be dropped into ANY pre-built computer.
    The article shows gains of 5x to 12x the performance of integrated graphics (at 1080p).
    If you have relatives with grey-box computers who would like to participate in modern gaming, this is the safest bet to update their computers for play.
