The GTX 680: NVIDIA’s New Big-Boy Graphics Card

By Alec Meer on March 23rd, 2012 at 2:00 pm.

Pedestal not included

You’re probably going to have to be a little patient with me here. I used to talk of graphics cards and processors regularly back in my past life on PC Format magazine, but my technical mojo has diminished sharply in the intervening years. I retain a working knowledge of what’s half-decent and what’s a big pile of donkey doo-doo, but if you want me to talk dirty numbers to you, you’re going to be disappointed. It does seem jolly good, and I know that I want one in my PC, but I am no Jeremy Laird and RPS has not enjoyed a review unit with which to test NVIDIA’s claims in full. So, in reporting the news and details on NVIDIA’s new flagship graphics card, formerly known as Kepler but to be released as the GeForce GTX 680, I shall have to report what I was told and leave you to draw your own conclusions.

As regards RPS and hardware coverage, it’s something we want to do a little more of – Hard Choices being our vanguard. Obviously we’re a gaming site first and foremost, but equally obviously PC gaming requires PC hardware so it’s silly to overlook it entirely. We’ll try to cover the major events/releases as they happen, but do bear with us while we work out exactly how and to what extent that happens.

Let’s kick off with a noisy official introduction video:

Okay. The GTX 680 is the high-end of the new range, and carries an asking price of £429, $499 or 419 Euros. More affordable, mid- and low-end cards will follow (and taking wild bets on the names being stuff like 660 and 620 is eminently reasonable) but haven’t been detailed or announced as yet. So, for most of us, the 680 specifically isn’t going to be something we’d ever consider buying, and instead acts as a guideline to what this generation of GPUs is going to offer PC games.

Here are some headline numbers for you:

- 1536 total cores
- 195 watt power draw, across two 6-pin power connectors (no 8-pin ones required)
- base clock speed of 1GHz
- 2GB of GDDR5 memory running at 3GHz
- 28 nanometre die
- PCI Express 3.0
- four display outputs

What’s interesting, to my only semi-informed eye, is none of those things, however. Yes, yes, it’s all incremental improvement both on NVIDIA’s 500 series cards and ATI’s until-now benchmark-leading 7970 board: such is the way of the graphics card market. The main thing for me, a man who spends his every day sitting next to a hulking PC chucking out extreme heat, ravaging the environment and resulting in fuel bills that make me weep, is the fact that this thing both uses a lot less power and brings in a whole new scaling system, whereby the card drops its performance and thus power usage depending on what your applications actually require.

NVIDIA claim it’s “the most efficient graphics card ever” in terms of power draw, and in real terms that means it should drop to a teeny 15 watts of draw when you’re just idling on the desktop, with the GPU running at just 324MHz.

Even in demanding games, you can set the card to not go full pelt if there’s not going to be any benefit to it. For instance, while it’s very nice to know that your PC is running a game at 600 frames per second, unless you’re an oddball who spends games scrutinising the image with microscopes and speedometers rather than actually playing, you’re unlikely to really spot much difference from it running at 60 frames per second. So, you can set your drivers to limit the frame rate to, say, 60, or 120, or whatever helps you sleep at night, and if that requires less than the card’s maximum output/speed, it’ll scale itself down in 13MHz increments to suit.
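
If you fancy seeing what that frame-cap idea boils down to, here’s a minimal sketch of a limiter loop. To be clear, this is my own illustrative C++, not NVIDIA’s driver code; render_frame and the 60fps target are just stand-ins:

```
#include <chrono>
#include <functional>
#include <thread>

// Minimal frame limiter: draw one frame, then sleep away whatever is left of
// that frame's time budget so the card never works harder than the cap asks.
void run_frame_limited(const std::function<void()>& render_frame, int target_fps) {
    using clock = std::chrono::steady_clock;
    const auto frame_budget = std::chrono::microseconds(1000000 / target_fps);

    while (true) {
        const auto frame_start = clock::now();
        render_frame();  // draw and present one frame

        const auto elapsed = clock::now() - frame_start;
        if (elapsed < frame_budget) {
            // Finished early: idle instead of drawing frames nobody will see.
            // That idle time is exactly what lets the GPU drop its clocks.
            std::this_thread::sleep_for(frame_budget - elapsed);
        }
    }
}
```

Call it as run_frame_limited(draw_the_game, 60) and you’ll never render more than 60 frames a second; the driver presumably does the equivalent at a lower level, and uses the slack to step the clocks down.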

All this means a much quieter card too, which is particularly appealing to me – I prefer gaming with speakers rather than headphones, but all my PC’s fans going bbbbbrrrrrrr buzzzzzz chitterchutter rather spoils the experience. This card is supposed to offer 41 dBA at peak usage, against the 580’s 48-odd decibels. I’ve only seen the card running in a noisy room so I can’t attest to what that means in practice yet, however. Hopefully I can find a way to get hold of one and have a listen myself before I make any real judgements there.

The other thing is that the card can scale itself in the opposite direction too – boosting its clock speed if a game demands it. NVIDIA reckon that, while the base clock is 1006MHz, more typically you’re going to see it running at 1058MHz. If you have a well-cooled case, it will keep raising itself above that, until either hitting heat problems and scaling back down, or reaching an as-yet undisclosed speed ceiling. I saw Battlefield 3 happily running on a card that had clocked itself about 20% higher without issue, which was promising. Again, complicated new driver settings and third-party control panels enable you to mandate exactly what the card gets up to if you don’t trust it to do its own thing or demand that it works at full-pelt at all times.
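
As I understand it, the boost logic is essentially a thermostat for clock speed. Purely as an illustration – the names are mine, the ceiling figure is a guess (NVIDIA haven’t disclosed the real one), and only the 1006MHz base, the 13MHz steps and the roughly 80-degree operating temperature come from what I was told – the decision amounts to something like this:

```
// Illustrative GPU Boost controller: nudge the clock up while there's thermal
// headroom and the GPU is busy, back it off when things get too warm.
struct BoostState {
    int clock_mhz = 1006;  // start at the base clock
};

constexpr int kStepMhz    = 13;    // the 13MHz increments mentioned earlier
constexpr int kBaseMhz    = 1006;  // advertised base clock
constexpr int kCeilingMhz = 1110;  // assumed cap: the real ceiling is undisclosed
constexpr int kTempLimitC = 80;    // roughly where the card likes to sit

void update_boost(BoostState& s, int gpu_temp_c, bool gpu_fully_loaded) {
    if (gpu_fully_loaded && gpu_temp_c < kTempLimitC && s.clock_mhz < kCeilingMhz) {
        s.clock_mhz += kStepMhz;   // cool case, busy GPU: boost
    } else if (gpu_temp_c >= kTempLimitC && s.clock_mhz > kBaseMhz) {
        s.clock_mhz -= kStepMhz;   // too hot: scale back down
    }
    // otherwise hold steady
}
```

Which is also why the warning below about crappy old cases matters: feed that loop a high temperature and the only direction it ever goes is down.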

Third-party manufacturers are already offering all manner of tweaked cards that are pre-overclocked to higher speeds and can stand to boost to yet higher, but the downside of all this is that if you have a crappy old, poorly-ventilated case in which your PC runs hotter than a thousand suns, the card is going to downclock itself in order to avoid melting into a pool of horrifyingly expensive grey goo. This may mean it actually underperforms, and your games run like crap.

So, if you’re even considering picking up one of these boards, make sure your PC’s airflow and cooling are up to scratch if you want to make the best of it. I’m a bit worried people who aren’t all that techy are going to pick up one of these cards cos they’ve heard it’s good, slap it straight into an ancient box coated in dust and grime and then have a mysteriously bad time – hopefully the message can be spread far and wide though.

OK, onto features, which is where I’m on even shakier ground than before. The biggest news, in terms of promotion, about the GTX 680 is that it can run the fabled Unreal Engine ‘Samaritan’ tech demo in real-time on a single card. When Samaritan was shown off to a chorus of coos and wows last year, it was running on three GTX 580s, which were drawing 700 watts and making a hell of a racket. I saw it running live on one 680, drawing 195 watts and being fairly quiet about it. It looked amazing, and I was duly astounded that a single card could render such a scene.

The catch is that Samaritan’s newly-efficient performance is as much to do with software and driver optimisations as it is the new hardware. That particular build of Epic’s next engine had been tweaked and poked and prodded to specifically run well on this particular card, with all manner of smoke and mirrors employed to ensure its pre-determined scenes looked top-notch.

So, this does not mean that games can necessarily look quite that good on a GTX 680, at least not without devs working incredibly closely with NVIDIA before release. However, that it is at all possible on a PC that doesn’t cost an outlandish number of megabucks is exciting stuff indeed.

Of course, with so many high-end PC games still being branches from development that’s focused so heavily on increasingly aged consoles, I don’t know how many games we’re going to see that truly take advantage of the 680’s box of tricks. I mean, something like Skyrim can already be made to run smoothly at max settings on a mid-range card from the last year, so upgrading to a 680 isn’t going to add something new to the experience.

Unless more Cryses or Battlefields or games of that hyper-graphics ilk are en route, I suspect it’s going to be the power-scaling stuff that’s the most immediate boon from this new card and whatever follows it. Apparently we’re going to see some impressive new physics stuff in Borderlands 2, and Max Payne 3 will look its bestest on the 680 (the screens throughout this post are MP3 on a 680), but what we don’t know is whether that will also be possible on older cards from both NVIDIA and ATI.

Similarly, these new tech demos below are terribly impressive, but this is the software specifically designed for the card, running in idealised circumstances with minimal environments. Sure, it can render this stuff, but rendering it as part of a larger game with tons of other stuff going on is an entirely different matter:

Real-time fur:

The possible future of the PhysX stuff – destructible objects that fracture convincingly and into ever-smaller parts:

More realistically, there’s little-known Chinese body-popping tour de force QQ Dance, which looks like this on a 680:

Some other facts I’m just going to drop in here rather than pretend I can offer entirely useful context about them:

- FXAA, a speedier if less precise take on anti-aliasing which is becoming ever-more prevalent in games, is now a toggle in the driver control panel, so you should be able to force it on even in games which don’t seem to offer it. NVIDIA claim FXAA offers roughly the same visual gain as 4x standard anti-aliasing, but running 60% faster.

- DirectX 11 tessellation happens at 4x the speed of the ATI 7970, apparently.

- Power throttling can turn off cores completely, but it will never go all the way down to zero cores.

- The throttling/scaling stuff and the effect of the local environment thereupon means it’s more than possible you’ll never get the same benchmark result twice.

- The card runs about 80 degrees C normally.

- Another new feature is ‘adaptive V-sync’, which essentially turns V-sync on and off as required at super-quick intervals, so in theory you’ll suffer neither screen-tearing nor that damnable V-sync lag (there’s a rough sketch of the idea just after this list).

- Each 680 will ship with a free kitten.

- There’s no longer a separate shader clock. It’s all one unified clock, with a longer shader pipeline – so a boost to the main clock should affect shader performance too.

- It’s 10.5″ long, so will fit into any case that takes a 580 etc. Unsuited to weeny cases, however.

- It can run 3D Vision stuff on a single card, if you’re deluded enough to want that.
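
Since the adaptive V-sync bullet above is the one most likely to change how games actually feel, here’s my reading of the decision it makes each frame – a sketch, not NVIDIA’s implementation: keep V-sync on while you’re hitting the refresh rate, drop it the moment you’re not, so the frame rate degrades gradually instead of snapping down to half refresh.

```
// Adaptive V-sync, roughly: sync while the game keeps up with the monitor,
// stop syncing when a frame takes longer than one refresh interval, so you
// trade a little tearing for not being locked down to 30fps on a 60Hz screen.
bool adaptive_vsync_enabled(double last_frame_ms, double refresh_hz) {
    const double refresh_interval_ms = 1000.0 / refresh_hz;  // ~16.7ms at 60Hz
    return last_frame_ms <= refresh_interval_ms;             // fast enough? keep syncing
}
```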

So there you go. The 680 is out now, as are assorted pre-overclocked variants. As far as I can tell from reading reviews and chatting to people who follow this stuff more closely than I, it’s a suitably impressive card if the highest end is where you need to be – but most of us (i.e. those of us who don’t have £420 going spare and/or don’t rock 27″+ monitors) would be wise to hold out for the mid-rangers based on the same Kepler architecture, or whatever the similarly-priced and specced ATI response might turn out to be. Let’s hope for those later this year.


149 Comments

  1. roryok says:

    I started watching that first video, skipped in a bit to see some GPU action and ended up watching someone stroking a yeti.

    Too creeped out to watch the rest of them.

    • RakeShark says:

      That yeti is hard to track down, I tell you. Perhaps I should have groped its fur instead of trying to shoot it.

  2. psyk says:

    “If you’re only here for pure gaming performance and number crunching then the GTX680 certainly has performance in spades. It’s nearly identical to the HD7970 in the numbers we saw throughout our testing and only in a couple of instances was the HD7970 ahead.”

    http://www.overclock3d.net/reviews/gpu_displays/nvidia_gtx680_review/19

    Time to read some more benchmarks, then I think my wallet is going to hate me.

    • marach says:

      Note this is only true at up to 1080p; after that things start to change.

  3. AmateurScience says:

    This looks like a beast, but I’m unlikely to be upgrading the 560Ti any time soon.

    Hopefully some of the new driver features like adaptive V-Sync and improved FXAA will filter down to the 5xx range too. It’s really interesting what NVidia are doing with anti-aliasing at the moment.

    • Lukasz says:

      Well… unless you have lots of cash to spare or your livelihood depends on having a beast machine, there’s really no reason for you not to skip a generation. Get a 750 or 8970 a year or so from now; your 560ti will handle anything at 1080.
      Consoles are bottlenecking PCs very much and monitors above 1080 are a bit too expensive for most people, so a 560 or 6950 isn’t worth upgrading from as they still run pretty much anything at 60fps.

      • AmateurScience says:

        You’re right of course. Very pleased with the 560ti too. Sounds like some of the more interesting software changes will be filtering down to us proles too!

    • Droniac says:

      The Geforce website actually states that both FXAA and Adaptive V-Sync will be made available to all Geforce 8-series and later cards eventually with new drivers. See their R300 driver overview: http://uk.geforce.com/whats-new/articles/nvidia-geforce-gtx-680-R300-drivers-released/

      • deke913 says:

        That is good news, and I’m very interested to see how that will work with my current GTX 460 (old beast just keeps on chugging along). Thanks for the link.

  4. D3xter says:

    Those are some epic beards in that first video.

    Oh yeah, and anything pushing tech forward is good, this looks like it could slowly combine all those things like Tessellation, PhysX, Hair Simulation etc. into one and not drop into the single digits, or if not right now maybe at least a generation away :P

  5. Memph says:

    “NVIDIA claim FXAA offers roughly the same visual gain as 4x standard anti-aliasing”

    That just has to be bollocks. I have never seen FXAA deliver MSAA/CSAA quality visual sharpness.
    Battlefield 3 is a prime example: turn it on low or high and, compared to even the limited 4xMSAA, it’s a blurry, fuzzy mess.

    • Screamer says:

      I tried using FXAA in Skyrim……. I still can’t see properly afterwards.

      • Dozer says:

        I tried using FXAA in Skyrim, but then I took an arrow to the knee.

    • LionsPhil says:

      This is because FXAA and its ilk are just blurring. It’s a godawful malfeature. It is not anti-aliasing at all, because that is a defined signal-processing term.

      The obvious failure case for them will be things like ladders, fences, and tapering points, since there’s no supersampling. It’s all very well (read: hideous, but vaguely workable) to blur away the jaggies on diagonals with edge-detection hacks, but without actually computing the extra resolution to resolve sub-pixel boundaries, you can’t cope with geometry at the scale where pixel quantization is significant.

  6. Meat Circus says:

    I remember there was a time when I would have found this stuff exciting. But now… it’s just some numbers.

    And you know what? I don’t think it’s my fault.

    • Bostec says:

      Same here actually, my eyes just glazed over the numbers. Now all I really care about in a graphics card is how quiet it is. It certainly shows my age.

    • michaelfeb16 says:

      I guess that is just a difference in expectations for us. Skyrim and other recent games are the things that have forced me into a new card. I’ve been playing every game I’ve been interested in maxed out since I first purchased my 4870×2. Once I was forced to stop scaling back, even a little, I set aside my money for the 7990…if it would just be released already.

    • LionsPhil says:

      I find myself scowling at the hot, noisy things, and the fact that it’ll encourage developers to move their targets needlessly upward and eventually render my previous hot, noisy thing obsolete and the less-hot, less-noisy Intel chipset in the laptop completely incapable of handling anything, rather than simply a bit slow and low-quality.

      It’s actually somewhere I wish the green environmental craze would hit with their crippling Earth-hugging blow. Death to cooling fans!

    • bill says:

      You are not alone.

      I’m happy they’re focusing on energy saving though.

  7. Aaarrrggghhh says:

    “Each 680 will ship with a free kitten.”
    Aaaawwwww!

    • HothMonster says:

      Yeah but those things sit on store shelves for months before you, the consumer, buy them. So you will most likely get a dead cat instead of a live kitten; really it’s a whole different message than you think.

      • theleif says:

        Maybe they are working with Schrödinger to mitigate the problem?

      • deke913 says:

        Cats are arrogant bastards anyway, at least this way I can chunk him in the other room without having to wait for him to decide he’s ready.

    • Lugg says:

      Also, if as part of your purchase they end up being owned by you, the buyer – how can we still call them “free”?

    • Ragnar says:

      But beware, the free kitten carries a terrible curse!

  8. Brun says:

    Sorely, sorely tempted to upgrade to this guy – I have a GTX 480 at the moment and I designed my computer around upgrading GPUs every other generation. That said, the 480 still doesn’t feel that old, and dropping $500 on a new GPU that would leave my other $500 GPU on the shelf collecting dust isn’t that appetizing.

    I know you can’t run two different GPUs in SLI, but can’t I use one as a main GPU and another as a dedicated PhysX card? Upgrading would be a lot more palatable if my 480 wasn’t essentially going to waste.

    • AmateurScience says:

      Why not get a second 480 and SLI them together?

      • Brun says:

        Because 480s are still relatively expensive. If I’m already spending $300 on a second GPU I might as well spend the extra $200 and get the latest and greatest thing that can match or exceed the power of two GPUs all by itself.

    • Kdansky says:

      You are considering upgrading from a card which I would consider upgrading to, but I can’t be arsed because Skyrim actually runs fine (as in: acceptable frame rate at medium settings) on my prehistoric ATI 4870, and there are no games worth playing that actually require more power.

      Though I really would like an HDMI-out, because then I could stream files to my TV that the PS3/WDTVLive (don’t buy that crap) can’t handle.

      • Brun says:

        This is the direction I’m leaning as well. My 4-year-old gaming laptop is no longer necessary (due to my 2-year-old gaming desktop) and it doesn’t serve as a practical laptop when I’m traveling. I see an Ultrabook in my future…

      • zergrush says:

        You can set up your pc as a media server for the PS3, then it’ll be able to play any format your computer can handle.

        Just google PS3 Media Server.

      • DrGonzo says:

        Upgraded from a 4850 to a 6870, the old card popped. And although I saw no real point in upgrading before, the difference is actually quite phenomenal. Before I was running games fine, but now almost everything is maxed out and running at 60fps solid. Now I really do understand why people rave about it.

        Problem is, now I’m being spoilt and there is no way I will be in this position for very much longer.

      • againstthagrane says:

        step 1: go to monoprice dot com
        step 2: add HDMI to DVI cable to your cart
        step 3: checkout
        step 4: thank me in any way you wish

    • Valvarexart says:

      Correct me if I’m wrong, but can’t you SLI ANY two nvidia-cards? It is most effective if both are the same, but if they are relatively close standard-wise it should be an improvement over only one card. I was looking into it when I was upgrading from my 275 to 480. Unfortunately I didn’t have enough space, though.

      I am also sorely tempted to get the 680, but I’m afraid my CPU might bottleneck it.

      • zaphod42 says:

        No, this is completely incorrect. To use SLI, you must have the EXACT same GPU chip in both video cards. They must be the same series (such as 580) AND the same model configuration (GT, GTX). They must match up EXACTLY or it will not work.

        You *are* allowed to mix video cards from different manufacturers. They can have different RAM sizes, although it is highly discouraged. If one card is overclocked and the other is not, it will work.

        • DrGonzo says:

          You can crossfire two different cards though, don’t think everything is compatible with everything, but a 6850 should work with a 6870, and it would just act as though you had two 6850s stuck in there.

      • MadMinstrel says:

        Yes/no. You can put any two cards into a system and, say, use them for OpenCL or CUDA computing (assuming the software supports multiple cards). But you won’t get any benefit in actual games. SLI relies on the cards rendering alternate frames, so having them render at different speeds would be a big no-no.

    • Juan Carlo says:

      Heh.

      I still have a 290gtx. I’ve had it for like 4 years now. I would totally upgrade, but I honestly have yet to encounter a game that it can’t run. Of course, it can’t do Direct X 11 (only 10), but I don’t see anything all that great about dx11 anyway. Or not enough to upgrade.

      The fact that most PC games these days are either retro indie games (with purposely shit graphics) or games designed for 8-year-old console hardware really kind of makes upgrading pointless. I remember when I used to upgrade like every two years (back in the late 1990s and early 2000s), but the last time I remember ever feeling like I really needed to upgrade was with “Crysis.” Graphics have really only made minor improvements since (relatively speaking).

      • phylum sinter says:

        There were a bunch of games i thought i would have to upgrade from my 5850 to the new, pricy hotness – The Witcher 2, Crysis 2, Rage, Batman, The new Batman… but no, none of these games pushed the crazy (even the dx11 patches are just fine here for games that have them) to the limit, and i wonder if i’ll ever see a game that fully utilizes DX11, or that doesn’t have “Requires DX9c” blasted on it.

        Every time i see “Requires Direct X 9.0c”, i die a little bit more.

  9. CaLe says:

    The benchmarks and new architecture show why this was worth talking about. That wasn’t the case with the 7970.

  10. Spacecraft says:

    I’m thinking about selling my two 6950s to grab this. I’m tired of the noise and heat from two cards, not to mention I lost a pci slot I could be using for a sound card. But I don’t know if I’ll see any real performance gain. I already run BF3 on ultra at 1920×1080 and I never drop below 60 FPS.

    Then there is PhysX. I kind of hate Nvidia for forcing a proprietary physics solution like that, and artificially crippling AMD cards if they want to play games with PhysX. But this new PhysX seems worthwhile, and being the graphics whore that I am, I don’t want to be left behind with cool new effects.

    Any thoughts?

    • AmateurScience says:

      To be fair, with the exception of Batman and Arma3, the list of games supporting PhysX mostly reads like a list of not very good games.

      http://en.wikipedia.org/wiki/PhysX#PhysX_in_video_games

      • The First Door says:

        I really enjoyed it in Mirror’s Edge, too. Lots of paper fluttering around as you run by, from what I remember. Oh, and plastic scaffolding coverings getting torn to bits by a helicopter shooting at you as well. Anyway, my point is it can help increase the physicality of the world a little, but it isn’t really essential normally.

        • Spacecraft says:

          I’m wondering if more developers will use PhysX since it seems pretty advanced compared to what it does in Mirror’s Edge or the Batman games.

          • The First Door says:

            I think the problem is using it in a way which is actually fun for the game. There were a few additional levels in UT3 which used it for destructible walls and a fan-made level where you could destroy basically everything in the level, but it just sort of got annoying after a while. If you are using it for actual environments it can make moving so tedious and fiddly and it is probably more difficult to code. I guess that is why many developers just use it to make things look prettier.

          • DrGonzo says:

            Unless they make it truly cross-platform it will never be an integral part of any/many games, unfortunately. But it’s impossible to justify it when half of your customers won’t be able to see it/run it smoothly.

          • AmateurScience says:

            That’s the real kicker. I imagine that adding all the nifty PhysX effects to a game takes no small amount of dev time, even with the extra expertise from the ‘The Way It’s Meant to be Played’ assistance. And it’s only something that’s going to be available to (according to Valve) 55% of your potential userbase. If the PhysX card had stayed as a ‘third party’ additional, separate card I wonder if it might have taken off a bit more with developers.

          • Amun says:

            And if Nvidia weren’t such buffoons about it and allowed an nvidia card to be used for PhysX in the same system as an AMD card, they’d have tons more money. Arbitrary and meaningless divisions for the sake of profit are bad!

  11. DeanLearner says:

    Hmmm, just how many graphics can I expect to get from this card? Not to brag or anything, but I need a serious amount of graphics before I am interested in playing any of these “video games”.

    • stele says:

      I’m sure you’ll be very happy with the shear amount of graphics you’ll get from this card.

      • DeanLearner says:

        Thanks stele, I can only hope you’ve not underestimated my need for many graphics.

        • stele says:

          I enjoy many graphics myself. Not as much as boobs in my face. If I had to choose it would be the boobs. But the high amount of graphics is a close second!

          • phylum sinter says:

            I’m going to say spending $500 on boobs in your face is definitely the better investment, faced with just this as the other option.

            $500 on many graphics may require additional investment in development companies to generate $500-worth of graphics per game, which means you’d be totally without boobs in your face for a very long time.

    • stahlwerk says:

      It’s up to 4 times more!

      • DeanLearner says:

        4 times? That’s almost 400%! Thanks, I’m sure this will be adequate.

        • stahlwerk says:

          But I heard most of the graphics are processed, so be wary of aliases.

          • GraemeL says:

            Aliasist!

          • HothMonster says:

            Processed graphics? Great high-fructose corn syrup finds its way into another one of my daily consumptions.

        • Tams80 says:

          Are you sure? I’ve heard rumours these “video games” consume large amounts of graphics.

    • fallingmagpie says:

      This card supports all those games that take migs and megs of memories just to play!

    • jonfitt says:

      I don’t want to add to your confusion about if this will provide enough graphics, but I have a warning:
      .
      If you choose the wrong brand of graphics your life quality will be irrevocably reduced. One card brand will bring peace and prosperity, the other misery.
      .
      Please consult a forum to determine the correct brand.

      • DeanLearner says:

        Ok I will consult impartial graphics forums to make sure I get it right. Also I want to make sure the card can handle web 2.0 graphics and cloudy graphics.

    • bill says:

      The 680 will naturally give you more graphics than the 780, and almost double the 560. Of course, the 560M will give you a little more graphics, but whatever you do don’t get the 2600 or the 560i as those will be obsolete in terms of graphics when they launch.

    • Octopi says:

      With this card, you’ll get so many graphics you won’t even know how to hold them

  12. Mike says:

    Still waiting for that day everyone told me would come when graphics advanced so far that people would stop ploughing ridiculous amounts of money into it and actually start funding AI development.

    Stiiilll waiting.

    At least we’ve got ten times as many hairs on furry stuff now though, eh?

    • jonfitt says:

      The mans now duck behind the waist high walls and stand up to shoot. AI !

    • zaphod42 says:

      It’s happening even now. It’s just that the curve is much longer; you’re going to keep waiting for a few more decades, and anybody who told you otherwise was crazy.

      If you compare the curve of market price and sales and complexity for sound cards and video cards, it’s the EXACT same, except scaled. Video cards are following the same trend, and we’re already on the downward slope now. Most people can buy $100 cards and play almost any PC game on high settings. Cards like this are fun, and they do it to push the envelope and get attention for their lower-grade products, and make some sales to the “enthusiasts”.

      But make no mistake, video cards’ days are numbered. It’s only a matter of time until on-board video is all anybody ever needs.

      It’s just that that amount of time is like some 20 years. It doesn’t happen overnight.
      But still, it’s a lot of development for 20 years.

      • jonfitt says:

        He was referring more to the development effort “money” not card cost. The theory was/is that once we reached, say, Crysis, people would think “Well that’s as good as we’re ever going to need things to look, let’s start burning all these spare ever increasing processor cycles on making our space marines care deeply about each other”.
        Instead we still see the next Unreal Engine telling us that what we really need is additional graphics. More graphics!
        .
        I don’t think giving up aiming for additional fidelity is necessarily the right thing to do, but I do think that tech demo shows me nothing I want. I’d much rather they tell us that their new cards have libraries that natively implement projectile physics of zillions of bullets at once, so we never have to play another hitscan game.

  13. stahlwerk says:

    I read the article on anandtech, and I’m not quite convinced by some of the design decisions they made (namely reintroducing static scheduling to Fermi), but if it helps them save precious wattage I’m okay with it, I guess. Actually I think it would be really nice if they ported the Tegra mobile architecture to the desktop, like Intel did in the Pentium M -> Core transition. Sure, the benchmarks anandtech conducted indicate that the GTX 680 is an excellent 1080p card, but really anything > Radeon 5800 or Geforce 200 based is, so it would be cool if we could get 1080p at a respectable framerate while staying below 20W TDP.

  14. iHavePants says:

    Wow that fur demo looked awful. Nvidia should hire some decent artists instead of relying on programmers to show their tech off.

    • stahlwerk says:

      It really did. I can see what they aimed at, but the reality is that this particular lighting model just doesn’t cut it for hair.

  15. jimmm25 says:

    But will it blend?

  16. Torgen says:

    I’d rather drop $500 (!) on a new iPad, really. I’d get more utility from it. ($500 is about what I spent building my entire PC)

    • againstthagrane says:

      how is $500 spent on a laptop replacement with the speed of a laptop from 4 years ago better than the BEST GPU on the market? how is that even a valid comparison in the first place? a pair of shoes also has more utility than a gpu for 99.9999% of people. doesn’t mean anything.

      • Wisq says:

        If you’re spending $500 on a pair of shoes, you probably already have an iPad.

  17. Klarden says:

    Well, nVidia does seem to feel the danger if it’s asked every single gaming site on the planet to write about the 680. At least this stuff doesn’t feel like a simple press release like it does on most sites. Then again, this is RPS I’m talking about.

  18. stele says:

    I’d like to see some more of that, um, “tee tee” dance. Um, yeah.

  19. trjp says:

    I have massive respect for anyone who can applaud and appreciate hardware like that but then DARE to say that you’re an “oddball” if you care about FPS over and above that you’re getting “enough” – because I fully expect an explosion to come from a section of our community on how their eyes simply freeze in their sockets if their games fall below 120fps even for 1ms ;)

    It’s good to see some sense coming to GPU design in terms of scaling the tech tho – one of the major issues, for me, is that my PC is on anything between 16 and 24 hours a day and I don’t want my office turned into a sauna (or my electricity bill made into a telephone number either!!)

  20. PoulWrist says:

    Was going to get a 7970, but then this comes out and is more powerful. I need the power to drive my silly resolution monitor :( but will wait for custom PCBs to hit the market. Preferably something pre-overclocked. In my old age I have gotten weary of toying too much with such things.

  21. GT3000 says:

    So..Chop chop on that 7970 report, Daddy’s 6800HD isn’t getting any younger.

  22. Retro says:

    Any word on mobile versions of this chip?

  23. Advanced Assault Hippo says:

    When GPUs at half the price can play games at max settings with no slowdown, what sane person is actually going to buy this card for gaming?

    • GT3000 says:

      The same people who buy SCAR-Ls when an M-16 does the job just fine.

    • trjp says:

      A Fiat Panda will do 70mph – hell one of those Citroen/Peugeot/Toyota portaloos does it whilst returning 60mpg – why would ANYONE buy a Nissan GTR/Porsche/Ferrari eh??

      • TheWhippetLord says:

        The obvious answer: “To pull attractive members of their preferred gender,” would seem unlikely in most cases* with a graphics card.

        *Although anyone pullable by graphics card is by definition awesome.

    • Zenicetus says:

      If you expand the concept of “gaming” to include flight simulation, then the answer is “lots of people.” A high-end flight sim like X-Plane 10 will drive this puppy into overload, and you still won’t be able to max out all the settings (which is basically the developer’s goal… trying to stay a step ahead of the hardware).

      It’s true that for most of the “game” market this would be overkill, and won’t improve much until the next console generation. But there are a few exceptions. I’m running a GTX 560ti, and still can’t quite get the highest level of eye candy on Witcher 2.

    • Shooop says:

      People who don’t realize that.

    • bill says:

      People who like bragging about and comparing the size of their graphics on web blogs and forums.

    • phylum sinter says:

    I’m with you, but yeah… luxury items exist everywhere. I’m kind of against exorbitant value placed on clearly less-worthwhile things (i.e. rolex), but when it comes to high-priced GPUs, there’s at least the hope that you might one day get the quality you paid for.

      …Except by the time that finally rolls around, there will be another card for $500 that will again just futureproof you for another few years. Early adopters man, they keep the coffers full and perhaps fuel innovation a little, too bad it doesn’t spread as quickly among developers as it should.

      Remember when voodoo graphics and the TNT2 first came around? Didn’t it seem like every new game took advantage of one or the other? I miss that, in fact if there were as few as 6 games announced today that would take advantage of tricks PURELY available on this card, i might be swayed from my crystal palace of meh on these, but nahh… never happen :

    • Wisq says:

      People who use high-res monitors, or multi-monitor setups.

      Seriously, my dual-GPU ATI 5970 card was the fastest thing on the market (and cost $500) when I bought it, but it’s powering a 2560×1440 monitor. 2560 and 1440 are each only 33% more than 1920 and 1080 respectively, but square that and you’ve got a 77% increase in pixels, and it can’t always power that many pixels without some noticeable slowdowns.

      Or how about multi-monitor setups? Throw in a couple of extra monitors, even at the more nominal 1080p resolution, and you’re talking about a whole lot of pixels to deal with.

      Realistically, I’ll almost certainly be upgrading my CPU before I upgrade my GPU again. But these high-end options do exist for a reason.

  24. TheWhippetLord says:

    This power management / auto tuning stuff scares me. I tend to turn off power management on my PC since my trust in such things has long been eroded by various bits and pieces over the years turning themselves off when I would prefer them to be on (like hard drives, network cards and my capacity for love.) Mind you all that was years ago. Is that kind of thing reliable and happy now? Or am I right to be a luddite?

    • ankh says:

      If you are what they consider to be normal then all that stuff will work fine. If you are not what they consider to be normal then it won’t. (It still doesn’t work properly)

  25. jezcentral says:

    PLEASE let there be a new console generation soon. I can see the future from here, and it’s going to take a step by Microsoft and Sony to get us there.

    • phylum sinter says:

      Exactly, until there’s a new Gamebox out, i don’t think we’ll see a substantial leap in our PC gfx quality. Kind of sad, really.

  26. Tom says:

    Frame rate limiting’s nothing new – you can enable it using NVInspector and it’s totally awesome.
    It also resolves input lag, and the benefit is immediately obvious.

    It’s very much a set-and-forget kinda thing. Works wonders with my 570.

  27. Innovacious says:

    I kind of want one, but am tempted to get one imported from Americaland if possible. On average they are about £110-120 cheaper over there.

  28. Bobtree says:

    According to Nvidia tech support on their driver forums, FXAA, adaptive vsync, and the frame rate limiter will be available on 500 series cards in future drivers, maybe in April. GPU Boost and TXAA are Kepler specific.

    I really want the limiter and adaptive sync. Triple buffering beats vsync alone, but it isn’t sufficient IMHO.

    There are ways to enable some stuff hidden in the current drivers (since 285.62 IIRC), but there are bugs and issues (like adaptive vsync making some games load very slowly), so I haven’t bothered with it.

  29. Jim9137 says:

    I’m glad games don’t have to care about this silly wooshy wooshy stuff anymore.

  30. deadly.by.design says:

    I bought a Big Boy Flagship card once. It was the 7800GTX and it cost me around $500.

    All-in-all, while it was fun for a little while, I regret the purchase. The 8800GTX came out barely a year later and performed much better at half the price.

    Yeah, sure… late ’06 was an awkward PC phase where the C2Duos hadn’t quite taken us to where console ports ran flawlessly (was running an AMD x2), but it still left me ‘once bitten’ about buying enthusiast models.

  31. zaphod42 says:

    Rock Paper Shotgun: YOU MADE A MISTAKE!

    The GTX 680 is NOT the high-end card. This is confusing on Nvidia’s part, so not entirely your fault; you just assumed that they followed their normal convention. But they didn’t, so that’s lazy journalism! :P

    The 680 is actually the MID-RANGE card. But with all the advances of the new Kepler architecture (which is the architecture code name, not the video card’s “previous name” as you said – as with the previous generation’s Fermi cards, Nvidia names their architectures after famous scientists), their card is so much more energy efficient and so much more powerful than the current market top dog, ATI’s card, that they realized they could release their mid-range part to compete with ATI’s top dog. Then they can drive ATI’s price down, and later release a REAL high-end card which blows everything away. In the meantime, they can make high-end card money on a mid-range card, and make some extra profit to offset Kepler’s design costs.

    • Delusibeta says:

      I’m inclined to perform a duck test on this one: if it looks like a high-end card, performs like a high-end card and is priced like a high-end card, it’s probably a high-end card.

      • AmateurScience says:

        But is it made of wood?

        In response to the OP. It’s £430. Thus it is high-end, considering that one could build a system (sans monitor/peripherals etc natch) capable of playing most games in some fashion for approximately that much.

      • Wisq says:

        It performs like a high-end card … up until they release their actual high-end card and blow their own mid-range card out of the water.

        I gather that was the point the OP was making. Don’t treat this as “this is the next high-end GPU”. Treat it as “this is a mid-range card being misrepresented as a high-end card”. So if you’re looking for the absolute best in performance at any price, don’t be confused by this sudden advance in GPU technology, and instead hold out until the real high-end arrives.

        For the rest of us … does this mean that this card will significantly drop in price as soon as the real high-end arrives? Or is nVidia going to keep the $500 price tag on this one and then set a new record ($800ish?!) for high GPU prices when their real high-end card arrives?

  32. phenom_x8 says:

    “So, for most of us, the 680 specifically isn’t going to be something we’d ever consider buying, and instead acts as a guideline to what this generation of GPUs is going to offer PC games”

    Agree with this. As usual, I’m gonna skip this GPU generation and wait to see what the next two years bring (going from Radeon 9550 -> HD3xxx -> HD6850).

  33. MerseyMal says:

    Definitely looks a nice card but will probably stick with my crossfired HD 6870s for now. I suspect I’ll be going back to nVidia when I do change though.

  34. phenom_x8 says:

    And yeah, before I forget, this review seems much more balanced than any other site’s review:

    http://techreport.com/articles.x/22653/

    (the edit button was missing from my post before)

  35. Soon says:

    I want lighting that doesn’t half-obscure things in complete darkness even outside in daylight. But this may be more of a game engine/dev thing.

    • Wisq says:

      I dunno, shadows do a pretty good job of almost completely obscuring things in complete darkness IRL on a bright sunny day. At least until you get up close.

  36. 2late2die says:

    I’m looking forward to the free kittens personally.

    Btw, in regards to Skyrim – I wouldn’t mind giving it a second look after getting this card and installing the variety of visual enhancement mods: textures, lighting, shadows, character details and more.

    • Shooop says:

      The problem with Skyrim is its bottlenecks are the CPU and video card’s frame buffer.

  37. Dozer says:

    What I want to know is, is this going to cause the price of the last-generation midrange nVidia card to drop?

  38. Radiant says:

    450 POUNDS.

    FUCK OFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF

    • Radiant says:

      Incidentally if any of you do wish to spend 450 pounds enriching your pc. I have a kickstarter I’d like to point you towards.

      It’s called Radiant’s Private Mind Garden.
      I have 3 months to raise 20 million pounds LETS GO.

    • Shooop says:

      Just sit back and relax for a few months. These things are always ludicrously expensive when they first appear because of high demand.

  39. hemmingjay says:

    I need an RMA for my kitten. I tried to overclock it :(

  40. rockman29 says:

    Can’t wait for the laptop equivalent!

  41. Fox89 says:

    I have ordered this in my new PC! I haven’t updated my graphics card since my GTX 285, which is running in my Mac, so I felt it was time to get a proper dedicated gaming PC and what better to equip it with than one of these?

    I’m not sure what game to try first… on the one hand The Witcher 2 is probably where I’ll see the biggest difference. On the other hand that’ll be one of the longest installations and I am impatient. Perhaps Battlefield 3?

  42. Fwiffo says:

    There is a typo in this article, but I’m not going to point it out because it makes it read like a circa-1988 Bomb The Bass record and has put me in a good mood for the night.

  43. bsplines says:

    As a CUDA programmer, this looks absolutely awesome and a significant upgrade with 3 times as many cores.
    As someone concerned about power consumption, this is a step in the right direction.
    As a gamer, it’ll probably be at least 2-3 years before it has any effect for me.

    • alilsneaky says:

      Then you should look up the GPGPU performance before you jump to conclusions based on the number of cores; this card is trash for that purpose.

  44. Shooop says:

    Early test results are in and it’s not as powerful as a GTX 580. But it is much more power efficient and quieter.

  45. Demiath says:

    “It can run 3D Vision stuff on a single card, if you’re deluded enough to want that.”

    I’m no tech wizard so I’m not quite sure what that’s supposed to mean. I run 3D Vision games on my single GTX 580 card and am quite happy with it. The truly demanding titles such as Witcher 2 might hover precariously around 30 FPS at times (according to the objective FRAPS counter, at least, subjectively it can be hard to make out), but most games work really well in 3D performance-wise. Or maybe 3D Vision itself is the “delusion” being referenced?

    • AmateurScience says:

      I think it was the latter ‘why would anyone want 3D vision’ reason. Having never played a game in stereoscopic 3D I couldn’t possibly comment.

  46. felisc says:

    my good mr meer, i’d like to point out that this card costs 500 euros in Europe (in france at least), not 420.
    not like it makes a big difference, that’s still in the red zone of my official “fuckthisisexpensive-o-meter”

  47. macaco says:

    I’m sorry, but Battlefield 3 running at 70 fps in a dark, detail-less hallway of a carrier? That is the best they can do? My 560ti448 can do that. Hell, my 560ti448 can do 60fps in the middle of a battle with half of Karkand falling down around me, on ultra at 1080p.

  48. Sarkhan Lol says:

    Look at that fucking thing. I half expected the video to be a bunch of chimps pawing at it and then throwing a bone into the sky.

  49. Branthog says:

    Still just waiting for the 690 (the dual GPU card) or the 7990 (AMD’s dual GPU). Unfortunately, I don’t know when the 7990 is coming and the 690 isn’t likely to come out until the third quarter or later. As soon as I feel satisfied with either one, I’ll be buying two of ’em so I can go quad-SLI/Crossfire.

    Until then, news of the 680 and 7970 only serves to give a glimpse of what I might expect whenever the fuck they get around to finally releasing the god damn things.

    • phylum sinter says:

      I’m wondering what sort of monitor and gaming setup you have that could possibly warrant that much power.

      Maybe partially to subdue my inner jealousy (i only have a 5850), but i haven’t run across anything with it that makes me think i need a few $500 videocards. What games need even a 580 to be enjoyed?

  50. jellydonut says:

    NVIDIA and ATI need to invest in PC development if they want me to do anything but go ‘oh that’s neat’ at their new technology and then proceed to ignore it completely.

    The only reason I have a relatively new card is because the other ones mysteriously failed. (both were NVIDIAs)

    Anyway no one cares because a four year old card will still run every new game at decent settings, thanks to the idiot boxes holding us back to 2005 levels. (idiot boxes which run NVIDIA and ATI chips, respectively. so I guess supporting PC development would be a conflict of interest for these Quisling companies)

    • Zarx says:

      They do – any game with Nvidia’s “The Way it is meant to be played” or AMD’s “Gaming evolved” branding has been partially funded and its development supported by the respective manufacturer. Usually it amounts to the devs being paid to add a super-demanding feature that works best on their cards tho, like the insanely wasteful use of tessellation in Crysis 2 DX11, or PhysX effects.

    • phylum sinter says:

      I agree completely – I wish there were more reasons to BUY a $500 video card… but i haven’t seen any in many, many years.