NVIDIA’s Most Preposterous Graphics Card Ever

By Alec Meer on February 19th, 2013 at 5:00 pm.

I'm a bit scared

While the next round of consoles waits in the wings, potentially setting the general game-graphics bar higher even if, as rumour has it, their actual hardware lags behind top-end PCs, NVIDIA's just done a new thing to help sew up the PC's status as Top Pixel Dog. If there's to be a similar card from ATI, we don't know much about it yet, other than that it has the rather pretty codename "Sea Islands" and isn't due any time soon, but we'll shout once we know more.

GeForce GTX Titan is the hilarious name of NVIDIA's new flagship card. It has been ripped from the heart of an IRL supercomputer (sort of), and £827 is its terrifying price tag.

While benchmarks aren’t doing the rounds yet (hopefully Mr Laird will be able to tell us more soon), official word is it’s the fastest single-GPU card available. It may be that multi-GPU cards like NVIDIA’s own GTX 690, or SLI/Crossfire multi-card setups, have an edge, but they’re also a pain in the bottom.

Not as much of a pain in the bottom as £827, of course. It's far beyond the reach of the vast majority of humanity, and no matter how much I might hunger for it, it is impossible that I will ever own it.

What you get for the best part of a grand is a card which originated in a bona fide supercomputer, the Cray Titan. There, the card was known as the Tesla K20, carried a $3500 price tag, and worked in tandem with 18,687 identical cards. Later this month, mere (but exceptionally wealthy) mortals can own one.

Not all graphics cards get their own soundtracks, you realise. It’s like a wrestler.

Now reconfigured for desktop PCs, it bears 2688 CUDA cores, 75% up from NVIDIA's previous flagship single-GPU card, the GTX 680, and a GK110 GPU (as opposed to the 680's GK104). It has 6GB of memory. SIX GIGABYTES. Christ almighty. If you want teraflops, those are 4.5 single precision and 1.3 double precision. If you want a release date, it is February 25th.
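
If you fancy checking the arithmetic, the single-precision figure falls straight out of cores × clock × two ops per cycle. A back-of-envelope sketch in Python – the ~837MHz base clock is our assumption here (widely reported, not part of NVIDIA's official line above):

```python
# Back-of-envelope check on the Titan's headline numbers. The 837 MHz base
# clock is an assumption (widely reported, not quoted above); FP32 throughput
# is cores x 2 ops per cycle (one fused multiply-add) x clock.
cuda_cores = 2688
base_clock_ghz = 0.837  # assumed

fp32_tflops = cuda_cores * 2 * base_clock_ghz / 1000
print(f"Theoretical FP32: {fp32_tflops:.1f} TFLOPS")  # ~4.5, matching the quoted figure

gtx680_cores = 1536
print(f"Cores vs GTX 680: +{cuda_cores / gtx680_cores - 1:.0%}")  # +75%
```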

February 26th is my birthday. Just sayin’.


136 Comments

  1. Dethnell says:

    the music makes this card one million times more tempting. EPIC.

  2. Smashbox says:

    Why, that’s just the thing for my Pentium IV!

  3. Lamb Chop says:

    I feel like I’m watching Batman.

  4. Forceflow says:

    But I just got my GTX 680 … grmbl. MOAR MONEY.

  5. pupsikaso says:

    Pray tell, what game will ever require this graphics card in the next 5 years?

    • Zenicetus says:

      X-Plane pushes the limits of current hardware pretty hard, at full settings. The DCS guys and Lockheed's Prepar3D are still moving forward with their flight sims. Aside from a few FPS titles like Far Cry here and there, it's always been the PC flight sims that have been the best justification for buying top-end CPUs and graphics cards.

      I’m happy with my GTX 560ti for now, but if there are some baby brothers spun off from this new card at a lower price, then I’ll be interested.

      • wodin says:

        No, I don't agree. The issue is that poor optimisation and coding are what push cards this hard.

        Before we had upgradeable PCs, developers kept improving games, in both gameplay and graphics, through new optimisation tricks. These days they don't have to, because new hardware keeps coming out.

        I reckon our current hardware has so much untapped potential. Sad, really.

        • Zenicetus says:

          It’s not a question of poor coding, but of what flight sims (the civilian ones, anyway) are attempting to do now, with their scenery modeling. The hot new thing is using OSM (Open Street Map) data to auto-generate ground scenery, which is incredibly data-intensive.

          Here’s a test shot from a post in the Avsim forum, where someone experimented with importing OSM data in X-Plane. I think this is just north of downtown Seattle:

          http://img844.imageshack.us/img844/1755/xplane2013021705550373.jpg

          That isn’t an orthophoto (2d image) down there. Every building is a full 3D object. See that big yellow “3” in the upper right corner? That’s the frame rate at this data density: 3 frames per second. Scenery like this can be stripped down to load faster, but you’ll still need one hell of a CPU cluster and GPU to render something like this at 60 frames per second while flying. This is where scenery is going in the flight sim world. We’ll need better hardware to keep up with it.
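
          To make the data problem concrete, here's a minimal sketch of the basic idea behind OSM scenery generation – every 2D building footprint gets extruded into its own 3D prism. The footprints below are made-up stand-ins, not a real OSM importer:

          ```python
          # Minimal sketch of OSM-style building extrusion: each 2D footprint polygon
          # becomes a 3D prism (walls plus a flat roof). A real importer would parse
          # OSM XML/PBF; the footprints below are hypothetical stand-ins.

          def extrude_footprint(footprint, height):
              """Turn a 2D footprint (list of (x, y) vertices) into prism triangles."""
              n = len(footprint)
              triangles = []
              # Walls: one quad (two triangles) per footprint edge.
              for i in range(n):
                  (x0, y0), (x1, y1) = footprint[i], footprint[(i + 1) % n]
                  a, b = (x0, y0, 0.0), (x1, y1, 0.0)        # bottom edge
                  c, d = (x1, y1, height), (x0, y0, height)  # top edge
                  triangles += [(a, b, c), (a, c, d)]
              # Roof: naive fan triangulation (fine for convex footprints).
              top = [(x, y, height) for x, y in footprint]
              for i in range(1, n - 1):
                  triangles.append((top[0], top[i], top[i + 1]))
              return triangles

          # Even a toy 1 km x 1 km grid of simple buildings piles up geometry fast:
          footprints = [[(i, j), (i + 8, j), (i + 8, j + 8), (i, j + 8)]
                        for i in range(0, 1000, 10) for j in range(0, 1000, 10)]
          tri_count = sum(len(extrude_footprint(fp, height=15.0)) for fp in footprints)
          print(f"{len(footprints)} buildings -> {tri_count} triangles")  # 10000 -> 100000
          ```

          And that's with four-vertex boxes; real footprints have far more edges, and from altitude you see far more than one square kilometre.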

          • ChromeBallz says:

            Geometry is not a problem for modern PCs, and those buildings don't look to be textured (much).

            Even then, with some clever texture reuse and tiling, along with today's already brutal hardware T&L, I don't see how a static model like that would make the framerate go that low – even on PCs a few years old. I can only imagine that they're trying to update the entire model with each frame, since that would generate the indicated FPS.

            X-Plane is awesome for realistic flight, but it's hardly a benchmark of optimised (and good-looking) graphics. Ace Combat looks quite a lot better if you ask me >_>

          • Zenicetus says:

            "I can only imagine that they're trying to update the entire model with each frame, since that would generate the indicated FPS."

            Think about the dynamic lighting. It's a little hard to see at that angle and time of day, but every one of those separate 3D buildings is throwing its own little shadow, and the shadows have to be updated with every frame as the plane moves. It would be more obvious early or late in the day, with longer sun shadows. That may be sucking up a big part of the processing, but I think you might also be underestimating the sheer number of separate 3D objects in that view.

          • FriendlyFire says:

            I’m sorry, but that’s just dumb. This is just trying to go brute force and then wondering why it doesn’t work.

            Using parallax mapping with self-shadowing, this could be generated at 30+ fps and it’d look just as good if not better. Only the taller buildings need to be modelled fully, the smaller models can be approximated and distant objects can be removed entirely.

            There are probably more ways around it too if you bothered to look. Just generating tons of geometry and sending it to the GPU directly is wasteful. It also means you can quite clearly see the cutoff range where no geometry is being generated.

            @Zenicetus: It’s not a lot more expensive than rendering the scene itself is. It’s just that there’s a lot of geometry, but the shadowing itself probably is something fairly standard. Honestly, considering the specifics of the game (directional light, distant objects), you could probably get away with a lot of approximations.
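
            Something like this, to put that in code – full mesh up close, a cheap box stand-in at mid range, culled entirely past a cutoff. The thresholds are invented purely for illustration:

            ```python
            # Sketch of distance-based level-of-detail selection: full building mesh
            # only when near, a cheap extruded-box approximation at mid range, culled
            # beyond a cutoff. All thresholds are illustrative, not from any real sim.
            import math

            FULL_DETAIL_M = 500.0  # hypothetical: full mesh within 500 m
            CULL_M = 3000.0        # box stand-in out to 3 km, then nothing

            def lod_for_building(camera, position, height_m, tall_threshold=50.0):
                dx, dy, dz = (p - c for p, c in zip(position, camera))
                dist = math.sqrt(dx * dx + dy * dy + dz * dz)
                # Tall buildings stay fully modelled further out; they dominate the skyline.
                if dist < FULL_DETAIL_M or (height_m > tall_threshold and dist < CULL_M):
                    return "full"
                return "box" if dist < CULL_M else None  # None = culled

            cam = (0.0, 0.0, 300.0)  # camera 300 m up
            print(lod_for_building(cam, (300, 100, 0), height_m=20))   # full
            print(lod_for_building(cam, (2000, 500, 0), height_m=20))  # box
            print(lod_for_building(cam, (2000, 500, 0), height_m=80))  # full (tall)
            print(lod_for_building(cam, (9000, 0, 0), height_m=80))    # None
            ```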

          • Zenicetus says:

            @FriendlyFire: I think you (and maybe others) are still missing the point.

            Sure, it's brute force. But consider that flight sims like X-Plane let you fly anywhere in the world, yet outside the hand-made areas all you see are runways. It takes tremendous amounts of hand labour to produce the kind of handmade 3D scenery that companies like Orbx release commercially for FSX. The worldwide coverage for hand-made scenery like that is pitiful. It looks great where it's been done, but that's it.

            The whole point of using OSM scenery is that no hand-optimising is required. You could fly anywhere in the world where enough open-source data has been published, and just suck in the raw data. It bypasses any need for the kind of optimisation normal computer games get – *if* you have enough brute-force processing to handle the raw data.

    • dangel says:

      Crysis 3, Friday.

      Some folks are already running it – just try it maxed out at 2500x… res.

      Me, I just want 1920×1200 at 60fps maxed out – and for that, 670s SLI'd will suffice.

      “OK just booted up Crysis 3 with max settings and TXAA High, 15 fps at 1440p on 680 SLI. Dropped AA down to SMAA and got 30 fps. AA off and got 45 fps.

      This is all on the first level.

      It uses 99% of both my cards. I put shading, shadows and particles down from very high to high and now I get 60 fps. This game is a beast.”

      http://forums.overclockers.co.uk/showthread.php?t=18489583&highlight=crysis

      • Delusibeta says:

        So, in other words, it's badly optimised, as per usual for the Crysis franchise.

        • Low Life says:

          No, like the original Crysis it’s just a ridiculously good-looking game.

        • dangel says:

          Got something substantive to back that one up? Aside from the lack of SLI support in the first game, I can't say I've had much to grumble about in terms of visuals and optimisation – or are you simply someone who believes uber eye candy is a no-cost item? CryEngine 3 seems to scale remarkably well, once Crytek let it go high end (witness C3 and of course http://maldotex.blogspot.co.uk/ for C2).

          Of course, if you want to grumble about the stupid politics of Crytek (the DX10 debacle, the consolitis of C2), that's another matter, but these 'arguments' over poor optimisation seem to me to stem from 'OMFG my rig can't take everything on ultra, Crytek suck'. That sort of thing got them to dial it back in C2, and then they got lambasted for that too.

          Frankly, I'd like someone to push the envelope…

          • FriendlyFire says:

            If those numbers are correct, something is definitely fucked up.

            TXAA and SMAA are post-processed antialiasing. Even 4x TXAA shouldn't be able to drop TWO 680s by 30 fps from the non-AA'd value. Those algorithms were designed to be lightweight. At worst I'd expect maybe 10 fps from maxed TXAA and 1 or 2 from SMAA (there's an SMAA injector out there which drops your fps by about that much, and since, again, it's post-processed, it CANNOT be affected by the scene's complexity or the rendering engine's capabilities).

            I'm guessing this may be a case of bad SLI optimisation; the numbers make no sense.
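
            The crux of it: a purely post-processed filter only ever sees the finished framebuffer, so its cost scales with pixel count and nothing else. A toy sketch of the luma edge-detection stage this family of filters starts with (not the actual SMAA algorithm, just its general shape):

            ```python
            # Toy sketch of the first stage of an FXAA/SMAA-style post-process filter:
            # luma edge detection over the finished frame. Note the input is just an
            # HxWx3 array, so the cost depends only on resolution, never on how complex
            # the scene that produced the frame was.
            import numpy as np

            def luma_edges(frame, threshold=0.1):
                """Boolean mask of pixels sitting on a luma discontinuity."""
                luma = frame @ np.array([0.2126, 0.7152, 0.0722])  # Rec. 709 weights
                # Differences against the left and top neighbours.
                dx = np.abs(np.diff(luma, axis=1, prepend=luma[:, :1]))
                dy = np.abs(np.diff(luma, axis=0, prepend=luma[:1, :]))
                return np.maximum(dx, dy) > threshold

            frame = np.random.rand(1440, 2560, 3).astype(np.float32)  # a 1440p "framebuffer"
            edges = luma_edges(frame)
            print(f"{edges.mean():.1%} of pixels flagged for edge blending")
            ```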

        • ResonanceCascade says:

          Crysis games are well optimized for what they offer. The first Crysis still looks and runs better than many new games.

          • SuperNashwanPower says:

            Warhead ran much better than Crysis. I heard they acknowledged issues in C1 and did a lot of optimisation to get better frame rates in Warhead, so it's a fair comment that C1 at least lacked optimisation.

            God I loved those two games, soooo much :*)

          • DK says:

            Crysis 1 looks better than any of the sequels – and it's better optimised, with lower hardware requirements.

            Warhead ran faster still, but sacrificed some advanced effects to do it (simplified bloom is the most obvious thing it changed).

      • fish99 says:

        Is that the same Crysis 3 that's being released on seven-year-old consoles and won't look as good as Crysis 1?

        • ChromeBallz says:

          Actually, Crysis 3 looks to finally beat Crysis 1 in the looks department. Crytek also went balls out on the PC version, making it the lead platform this time around. Console versions are an afterthought to please EA.

          • DK says:

            Crytek made the same claim with Crysis 2. They’ve lied repeatedly.

          • Low Life says:

            Luckily at this point we have screenshots and video footage from reviews so we don’t need to rely entirely on what they’re saying.

    • bear912 says:

      I plan on playing Quake at 9600 x 5400 and 480 FPS. It seems obvious that this is the intended purpose.

      • Low Life says:

        Meanwhile, Counter-Strike players are still playing at 800×600 on minimum settings.

    • PoulWrist says:

      I wouldn’t mind it… 2560×1440 and up is pretty hard, even on my Asus GTX 680 :(

      • dangel says:

        Which is pretty much the thing that keeps me off that res / 27″ monitors – well, that and the huge jump in input lag etc. I'd rather have a solid 60fps at 1920× because it keeps things far more fluid and playable.

    • grundus says:

      With 120Hz and 4K coming up, we’re going to need something to push all those pixels. Even at 60Hz and 1080p, triple monitor gaming exists and is a thing, hopefully something that will become more widespread (ah, who am I kidding?).

    • Joshua says:

      Running Arma 3 at the maximum view distance (which is probably 10+ km)

    • x1501 says:

      Very few games really require high-end graphics cards at the moment of their release. If you're asking for ways of utilising all that power without worrying about the price tag, you could already use it to add more eye candy (4096×4096 texture packs, maximum forced AA, PhysX, AF, etc.) to a whole bunch of existing games today. And that's even before the next wave of poorly coded "next-gen" console ports, with their inflated requirements, begins to flood the market in a year or two. Of course, by that time you'll probably be able to grab this card at 1/3-1/4 of its current price, so…

  6. caddyB says:

    Now you can play Xbox ports in full HD with max graphics!
    Yay!

    It’s not meant for gaming, obviously. I hope.

    • GenBanks says:

      It’s meant as an upgrade for the tiny niche of gamers who are investment bankers and like to snort cocaine off their mousepads between psychotic episodes of screaming obscenities into their mics while playing competitive games. Most of them are going to buy 4.

      • Guvornator says:

        The combination of name and power does rather scream penis substitute, don’t it? “Honey, I’ve got a Titan in my bedroom and I’m not afraid to use it.” Imagine the disappointment…

        • Wret says:

          Somewhere around the bit showing the front of the card, I realised: "This is a sexy car commercial, for a car you stick in a dark garage and never see except through the door window."

          I’m sure sane compensators will just go with a nice car.

      • GiantPotato says:

        Incorrect. We’ve all migrated to tablets now since they’re more convenient to snort cocaine off of.

    • -Spooky- says:

      *lol* Made my day ..

    • darkChozo says:

      It is very specifically meant for gaming. This is the GTX counterpart of an existing computing GPU, meaning that it’s being marketed towards gamers. Well, gamers who desperately want to benchmark on their 120 Hz 3D 4K triple monitor setup. And to make sure that the nuclear reactor powering it can handle the load.

      • PoulWrist says:

        It only draws 250 watts :/

        • dangel says:

          he meant a nano-nuclear reactor :)

        • tigerfort says:

          That’s still more than most people’s entire houses.

          • dangel says:

            Well they don’t exist either but I wouldn’t want to be seen as a pedant.

        • darkChozo says:

          The monitors are made of starstuff and output thought instead of images. The reactor provides the pure gamma energy required to fabricate the multiverse.

    • hypercrisis says:

      who is Max Graphics?

  7. Cinnamon says:

    I only use graphics cards inspired by hyper computers.

  8. GenBanks says:

    I like how they added helicopter noises to the fan… They’re not marketing this as being cool and quiet then?

    • Hoaxfish says:

      followed by Ride of the Valkyries, and the card dispensing napalm all over your motherboard

  9. almostDead says:

    Oh, come on people, it must still be funny to say it…

    You know you want to….

    But, can it play Crysis??

  10. whexican says:

    This will go great with my special edition Intel Hal CPU.

  11. Leprikhan says:

    I can’t tell if I just watched a trailer for a GPU or the new Mass Effect

  12. Njordsk says:

    Yes, but does it run Minecraft?

    • Scumbag says:

      Or can you engineer it in Minecraft?

      • RaiderJoe says:

        Can you run Crysis in minecraft with it?

        • Scumbag says:

          Can it run Crysis, being rendered on the card made in minecraft while being run on a machine that is running minecraft, made in minecraft?

  13. Artista says:

    I hear if you have this card, you play as a ball in Thomas Was Alone.

    • Naum says:

      That’s antialiasing done right! Away with those filthy edges everywhere!

  14. Guvornator says:

    “hopefully Mr Laird will be able to tell us more soon”

    I think Mr Laird will have one of his rage-spasms. nVidia/Intel artificially holding back tech, and overpriced cards, are his buttons – and I think this card just mashed its gaudy fist onto them…

  15. teamcharlie says:

    I hope that, like pure ethanol, the Nvidia Titan also spontaneously combusts at room temperature.

    • GiantPotato says:

      No, like other Nvidia cards it will instantly heat itself to 200 degrees upon being plugged in and THEN spontaneously combust.

      • dangel says:

          …I think I remember an nVidia driver update that facilitated that ;)

        • GiantPotato says:

          Yes, there was one, but it didn’t come in time to save my beloved 6600. Or the ethernet card above it.

          It sure was nice not having to heat the house in winter, though.

      • PoulWrist says:

        Old ones, I guess. From when they had the vacuum cooler.

    • almostDead says:

      Wait, but, what… pure ethanol doesn’t do that though.

      Colour me confused.

  16. Ginga121 says:

    “worked in tandem with 18,687 identical cards”

    RUN ALL THE GAMES!!!

    ….

    AT THE SAME TIME!!!

  17. almostDead says:

    It’s not the 800 quid that’s the problem, it’s the liquid helium being cooled by the liquid nitrogen that’s the problem. Helium is expensive.

    • ThTa says:

      Not with the US government still acting like complete and utter idiots with their stockpile.

  18. int says:

    Oh I get it, you need that to play Blizzard’s Titan.

  19. Greggh says:

    I say we make a fundraiser to get RPS one of those, out of pure kindness.

    After it’s done, we’ll send you the piece and you FIGHT FOR IT! IN AN ARENA!!! IRL GLADIATOR-STYLE!!!! EXCLAMATIONS!

  20. jimbonbon says:

    It definitely IS expensive, but you can certainly see why from the technical spec and build quality alone. I'm also sure the benchmark results (which will become available once the NDA expires on Thursday) will further justify its price positioning.

    It's a really damn niche product, but you can still guarantee it will sell out immediately. Hell, if I had £800 lying around I'd buy one, but then that's probably why I have loads of hardware and no money.

    • Joshua says:

      You'll probably need more than that. This is one of those cards that you don't build into your PC. You build your PC around it.

      • jimbonbon says:

        This is very true, although in my case I currently have 3x GTX 680s, so there wouldn't be too much of an issue. These do actually struggle in several games running a surround setup (5760×1080), so in theory the Titan would be a nice step up. But at about £800 ×3, that's not going to happen!

        The Titan is a very unusual card. That much VRAM is hugely excessive for single-monitor gaming, but really good for surround. But in surround you would probably need more than one of them (in terms of GPU performance) to run at high settings and resolutions.

  21. jimmydean239 says:

    Want want want want want want want want want

  22. Havok9120 says:

    The music sounds like it’s pulled directly out of Deus Ex: HR.

  23. Bishop149 says:

    I still find it slightly amusing that these stupidly expensive things, marketed as "ultra high-end bleeding-edge type stuff", are still cooled by a poxy little fan.
    Off-the-shelf watercooled graphics cards do exist, but are so barely marketed you'd be forgiven for thinking they didn't. I'm surprised they don't go for "Look at our super awesome Peltier-cooled GeForce GTX MEGA$$".

    • Narzhul says:

      "There are notable differences between the two premium boards, though. Whereas the 690 needed an axial-flow fan to effectively cool two GPUs, GeForce GTX Titan employs a centrifugal blower to exhaust hot air out the back of the card. No more recirculating heat back into your case."

      Doesn't sound bad at all.

    • jimbonbon says:

      To be fair, it’s far more than a simple fan and heat-sink on that thing. It’s also supposed to be quieter than the GTX 680… which was really damn quiet anyway!

  24. N1kolas says:

    It will sell out. While the price tag is extremely high even for a top gaming card, the uncapped FP64 performance makes it an actual bargain for researchers/professionals who need it, at almost a fourth of the price of a real K20 GPU.

    Which makes getting one of these and writing it off as a business expense a very real possibility. Hmmm….
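
    Rough numbers on that, with two loud assumptions – a ~$1.55/£ exchange rate and the article's 1.3 FP64 TFLOPS figure used for both cards (the K20's own spec is in the same ballpark):

    ```python
    # Rough price-per-FP64-GFLOP comparison behind the "bargain" claim.
    # Assumptions: $3500 K20 list price (from the article), £827 Titan converted
    # at ~$1.55/GBP (early-2013 rate), and ~1300 GFLOPS double precision for both.
    titan_usd = 827 * 1.55  # ~$1282
    k20_usd = 3500
    fp64_gflops = 1300

    print(f"Titan: ${titan_usd / fp64_gflops:.2f} per FP64 GFLOP")  # ~$0.99
    print(f"K20:   ${k20_usd / fp64_gflops:.2f} per FP64 GFLOP")    # ~$2.69
    print(f"Titan costs {titan_usd / k20_usd:.0%} of a K20")        # ~37%
    ```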

  25. SuperNashwanPower says:

    I might buy one

  26. Sicram says:

    Waaaait, I know what this is… it’s an ad for their new SPACESHIP

  27. SuperNashwanPower says:

    Wouldn't you need an absolute God of a CPU to avoid bottlenecking this thing? I was reading AMD and Intel comparisons on Tom's Hardware and they talked about how selecting the wrong CPU can effectively throttle your GPU – so, given that, does a CPU exist that could actually run alongside this without holding it back?

    • barcharcraz says:

      Well, for gaming I don't think there exists a game that could possibly saturate this, but if you're doing HPC you can start a "job" and let it crunch for a while, since you don't have to push stuff to the GPU every frame or few frames.

      I wonder if it could handle several virtual machines all using it as a RemoteFX vGPU.

  28. PopeRatzo says:

    Now I’m fully engorged.

  29. PopeRatzo says:

    My wife does fluid dynamics, specifically numerical simulations of waves.

    I bet I can convince her she needs one of these for one of the machines she uses to work at home.

    Once it's in the house, I make up an eSATA drive with Windows 7 and Steam on it, and I've got myself a kickass Crysis 3 rig that will also crunch the hell out of some Fourier transforms.
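
    For a flavour of the workload: spectral wave models spend most of their time in big 2D FFTs, exactly the sort of thing cards like this chew through. A toy CPU-side numpy sketch of one spectral step – the deep-water dispersion relation is standard physics, everything else here is made up:

    ```python
    # Toy sketch of the FFT-heavy core of a spectral wave solver: transform a
    # height field to frequency space, advance each wave mode by its
    # dispersion-relation phase, transform back. Deep-water dispersion
    # (omega^2 = g*k) is standard; grid, timestep, and initial field are made up.
    import numpy as np

    N, L, g, dt = 256, 100.0, 9.81, 0.05        # grid size, domain (m), gravity, timestep (s)
    k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)  # wavenumbers (rad/m) along one axis
    kx, ky = np.meshgrid(k, k)
    omega = np.sqrt(g * np.hypot(kx, ky))       # deep-water dispersion relation

    height = 0.01 * np.random.randn(N, N)       # initial surface height field (m)
    spectrum = np.fft.fft2(height)

    for _ in range(100):                        # 5 simulated seconds of phase evolution
        spectrum *= np.exp(-1j * omega * dt)

    # A production solver would advance +k and -k modes conjugately so the
    # surface stays exactly real; here we simply take the real part.
    height = np.fft.ifft2(spectrum).real
    print(f"RMS wave height after 5 s: {np.sqrt((height ** 2).mean()):.4f} m")
    ```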

  30. So Many Eggs says:

    The video does a great job of showing just how pretty a video card can be. Because that's why people buy video cards, right?

  31. F3ck says:

    Too bad it’s an evil nvidia card…waiting for the answer from AMD…at which point I’ll sell my car to afford…

    …but the shadows in Columbia will look spectacular…

    Seriously, for a moment – what sickens me here is that I actually know douche-bags who are going to purchase this. I'll get the call the night before they do – "Well, I heard it's the best one." "Yes, it likely is…" – and since money means nothing to them, they'll buy this monstrosity to match their $1000 i7 and SSDs… all so they can open Firefox in the blink of an eye.

    Meanwhile, I'll be overclocking my 965 to 6GHz so's not to bottleneck…

  32. newprince says:

    Ah, I remember my old Voodoo 5 5500, then seeing nutso videos of the 6000 which was never released. Thing got in the way of my hard drive, it was so big. Glad to see silly stuff like this still going on.

  33. Fox89 says:

    Hmm… I think I’ll wait for an Overclocked version.

  34. Xzi says:

    If I’m dropping over 1K on a single piece of hardware, it damn well better have more than one fan. Eight plus watercooling seems more appropriate.

  35. SuicideKing says:

    @Alec: There is no ATI anymore; ATI was bought by AMD and has since been fully merged in, so all Radeon HD 6000 series and later cards are AMD cards.

  36. Parrot says:

    I wish I could store my brain in it and fly to some distant star… Did NV reveal whether it surpasses the speed of light?
    Oh well, it'll cost a couple of bucks in a few years.