Nvidia release the most powerful GPU ever for the fourth time in under a year – but why?

It hungers for new blood

I appreciate that headline inclines a little towards melodrama, but this is really the situation: with AMD having spent the past year as something of a sleepy giant, Nvidia have been engaged in serial one-upmanship with themselves. Just under a year ago, their GTX 1080 GPU became handily the world’s most powerful consumer graphics card, followed shortly afterwards by the even beefier 2016 Titan X, which was then marginally pipped by the comparatively affordable GTX 1080 Ti last month. Now they’ve leapfrogged themselves once again with a new $1,200/£1,180 brute known as the Titan Xp.

Specs’n’that below, but I think the bigger question here is ‘why are they doing this?’ Are they scared of AMD’s long-delayed riposte, or are they trying to trounce yesterday’s reveal of Microsoft’s new 4K Xbox?

Here’re the main numbers for the already-available Titan Xp:

NVIDIA CUDA Cores: 3840
Boost Clock (MHz): 1582
Memory Speed: 11.4 Gbps
Standard Memory Config: 12 GB GDDR5X
Memory Interface Width: 384-Bit
Memory Bandwidth: 547.7 GB/s

It’s a higher-spec refresh of the existing Pascal architecture rather than new-new, but the main thing it does is re-establish Nvidia’s ‘Titan’ brand as their biggest dog – the 1080 Ti having recently trumped (sorry) 2016’s Titan X for around two-thirds of the price.

Here’re the Ti’s numbers for comparison:

NVIDIA CUDA Cores: 3584
Boost Clock (MHz): 1582
Memory Speed: 11 Gbps
Standard Memory Config: 11 GB GDDR5X
Memory Interface Width: 352-Bit
Memory Bandwidth: 484 GB/s

So it’s a leap, but maybe not a gigantic one – the cores are the most meaningful gain there, as the other improvements can be obtained via surprisingly pain-free Ti overclocking. Make no mistake, the Titan Xp is more about bragging rights than anything else. A status symbol both for the company that makes it and for well-monied PC owners. Sucks to be anyone who bought a Titan X last year though (other than all the ways it doesn’t, because they’re surely swimming in gold).
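If you want to put that leap into rough numbers, here’s a quick, purely illustrative sketch (figures taken from the spec lists above; the bandwidth line is just the usual data-rate × bus-width arithmetic, so treat it as back-of-envelope rather than gospel):

def peak_bandwidth_gb_s(memory_speed_gbps, bus_width_bits):
    # Theoretical peak bandwidth: effective memory data rate x bus width in bytes
    return memory_speed_gbps * bus_width_bits / 8

titan_xp   = {"cores": 3840, "mem_gbps": 11.4, "bus_bits": 384}
gtx_1080ti = {"cores": 3584, "mem_gbps": 11.0, "bus_bits": 352}

core_gain = titan_xp["cores"] / gtx_1080ti["cores"] - 1
bw_gain = (peak_bandwidth_gb_s(titan_xp["mem_gbps"], titan_xp["bus_bits"])
           / peak_bandwidth_gb_s(gtx_1080ti["mem_gbps"], gtx_1080ti["bus_bits"]) - 1)

print(f"CUDA cores:     +{core_gain:.1%}")   # roughly +7%
print(f"Peak bandwidth: +{bw_gain:.1%}")     # roughly +13%, before any Ti overclocking

Single digits to low teens, in other words – hence ‘a leap, but maybe not a gigantic one’.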

In many respects, the Xp is a repackaging of the $5,000 Quadro P6000 workstation card, which makes it a relative bargain if you’re a professional rendering sorta person. But though it’s based on the same Pascal GP102 chip, RAM is halved from 24GB to 12GB.

Anyhoo: while it’s undeniably an impressive and desirable slab of technology, especially so soon after the last time Nvidia did this, no-one outside of that industry actually needs this card. There are very, very few usage scenarios where all that power will provide a meaningful benefit over what the £700 GTX 1080 Ti can do – multi-monitor gaming and absurd amounts of anti-aliasing at 4K, perhaps. It might be slightly longer before you ‘need’ to upgrade again, but if you’re the sort of person who can even begin to countenance spending $1,200 on a graphics card then I imagine that’s an academic issue anyway.

Onto the why. I have three theories for you.

1) Because they can. 2016’s Pascal has clearly been a particularly fruitful architecture for them and, whether long-term planned-for or not, they seem able to keep on raising its ceiling. In the absence of new top-end competition from AMD, who have stuck to smartly-priced mid-tier cards for the past year or so, they’ve had an open goal for free PR. They could possibly have jumped straight to 1080 Ti-level specs for the initial 1080, but this way they get more rounds of publicity (hello, sorry).

2) Because they’re terrified of AMD’s much-anticipated Vega GPU, and are doing all they can to sell cards and/or overtake it in the performance stakes before its planned Q2 release (which, since it’s not clear whether they mean real Earth quarters or AMD financial-year quarters, could be any time from now until deep into summer). No-one outside of AMD and their partners (including Bethesda, who they recently announced something of an alliance with) knows the full capabilities of Vega, but it has been seen running maxed-out 4K Doom at super-high frame rates, which at the very least suggests it can go toe-to-toe with the 1080, in the right games at least.

It may be that it whips anything Nvidia has to offer, or it may be that it offers similar performance for a fraction of the cost – either would be a big shake-up for the market. Hence, speculation that Nvidia have been putting out new cards like there’s no tomorrow and released the Titan X-beating 1080 Ti for ‘just’ $700 because they’re running scared of Vega is understandable, if as yet unproven.

If that has been the thinking, I imagine it’s been successful too. Anecdotal, I know, but having held out for Vega to power my 3440×1440 FreeSync monitor for quite a while, I gave up waiting and, though it’s cost me adaptive sync, now have a 1080 Ti all but guaranteeing me my sixty frames for the next couple of years and hopefully beyond. I very much doubt I’m alone, particularly because the vanilla 1080 saw a price cut in the wake of the Ti.

Though the AMD vs NVIDIA war is fought loudly and constantly on the internet, for most of us there’s no brand loyalty – it’s just about which card will best offer us a good whack of performance for a good amount of time, and that’s been a conversation Nvidia have been able to control for the past year. Although, conversely, AMD’s smartly-priced RX 480 has done a grand job of tying up what is now the lower tiers of the mid-range. Anyway: Vega’s going to be fascinating, as either it’s an enjoyably rug-pulling strike back or Nvidia’s stream of recent new cards has set the bar impossibly high.

3) Yesterday saw Eurogamer/Digital Foundry’s world exclusive reveal of the hardware powering Microsoft’s Xbox One Point 5, aka Project Scorpio, aka whatever they actually end up calling it. You should read their piece for full details and oh-so-many numbers, but the long and short of it is on-paper PS4 Pro-beating 4K performance (and a boost to 1080p performance to boot).

Scorpio uses a custom GPU that reads like a turbo-charged Radeon RX 480 and is, in practice, liable to fall somewhere between the capabilities of a GTX 1060 and GTX 1070. (Though actual game performance is likely to be significantly better than that might imply, thanks to games being optimised for a fixed console spec, something that PC gaming does not enjoy.)

More to the point, Scorpio’s GPU uses AMD architecture, which means the red team get to brag that they’re powering the world’s beefiest console. The Titan Xp can thus be read as a “yeah, but check this out” on Nvidia’s part.

Or, all of the above. Or, this was always in the pipeline. Alls I know is that, given how much has happened since, last year was, in hindsight, a bad time to buy a graphics card, even though it seemed like the opposite was true at the time.

72 Comments

  1. phuzz says:

    Or 4) Their top management has secretly been replaced by AI and they are trying to hasten the rise of the robots and the end of the human race.
    allegedly

  2. causticnl says:

    This is not a response to AMD because they have no competition in this segment. It’s simply the Titan variant of the 10xx line of cards.

  3. Sakkura says:

    Once the GTX 1080 Ti arrived with slightly better gaming performance than the Titan X Pascal, it was just a matter of time before a new Titan was launched. They need an overpriced top dog card to fleece the “whales” who will pay any price for the best.

    It’s the same as Intel’s Extreme Edition chips.

  4. Snargelfargen says:

    5) The gpu market is increasingly split between low budget gamers and the extreme high end. Lots of well-to-do gamers in their 30s and 40s willing to buy the next best high margin GPU.

    • Buuurr says:

      This. Yes, correct. Just because $1200 is pricey to you does not mean $1200 is pricey to me. There is for sure an aging segment of gaming society that have high-end salaries that demand high-end products.

      Personally, the only reason I have not jumped on this wagon is because I am waiting on the release of the X299 series and chips to make a $1200 purchase make sense monetarily as well as having a board that can handle this throughput properly.

      • automatic says:

        Most of the times, unless you’re using CUDA cores to process something that gives you back some of the money spent on it, it doesn’t make sense to buy high end hardware. That is because the performance difference between cards from different generations is barely noticeable on the big majority of titles available on the market. I quit this upgrade nonsense about 10 years ago when I realized the industry pattern. No sensible person will keep up to it. It’s not really a matter of being able to buy it or not, it’s a matter of being able to use it.

        • Nauallis says:

          Oh, you think buying habits are sensible? Ahahahahahaha

          • automatic says:

            I do realize it’s like a religion to some people. But that’s what I’m criticizing. What’s the point of buying something you don’t get any noticeable improvement from? Faith that you have something better?

          • Nauallis says:

            Like you said, for some it’s a religion. Or a hobby, or a strong interest. Even if for you and me, and probably most people, there’s no discernible true difference, somebody (the polite term is an enthusiast) is getting some measure of utility out of it. I mean, who am I really to say the improvement isn’t noticeable?

            Using the analogy mentioned below, when you already have a lamborghini, why get another one? I dunno.

            I decided years ago to stop trying to figure out people’s motives for always needing better hardware and upgrades. If they have the money and the will, and it doesn’t hurt me, more power to ’em.

          • automatic says:

            No,they are not getting utility from it. At least no practical utility. Think of it as a lamborghini on a ghetto. There’s no practical difference between having 600 hp and 650 if there’s no room available (game titles) for you to put it to use. Unless you’re a dev or a beta tester it doesn’t matter if your hardware is the best available, it will probably not make any difference until there’s software that pushes it hard enough to put it to use.

          • Thants says:

            That’s flat-out wrong though. There are lots of games out there that can put high-end graphics cards to use. With a 1440p 144Hz monitor and a 1080 you still need to turn down the settings a bit in many games. At 4K you’re just starting to get near 60FPS in maxed-out modern games with these new cards.

          • automatic says:

            We’re talking about differences between high-end cards. Even if you do push it to a strain, there’s little or no practical difference between a 600 horsepower lamborghini and a 650 one to the average user.

          • automatic says:

            ‘Enthusiast’ is just a convenient label industries give to ppl who consume stuff religiously.

        • Buuurr says:

          “automatic says:
          Most of the times, unless you’re using CUDA cores to process something that gives you back some of the money spent on it, it doesn’t make sense to buy high end hardware. That is because the performance difference between cards from different generations is barely noticeable on the big majority of titles available on the market. I quit this upgrade nonsense about 10 years ago when I realized the industry pattern. No sensible person will keep up to it. It’s not really a matter of being able to buy it or not, it’s a matter of being able to use it.”

          Not sure what I did to make my post so incomprehensible to you. If you note: I wrote that I would buy this… I would buy this IF. It is a big IF. You will note that I wrote this —– “Personally, the only reason I have not jumped on this wagon is because I am waiting on the release of the X299 series and chips to make a $1200 purchase make sense monetarily as well as having a board that can handle this throughput properly.”—- as a closing to my statement.

          • nearly says:

            Not sure what made the reply so incomprehensible to you, but it made the point that buying the 1200$ card won’t really give you any tangible improvements over the 700$ card.

            Your “if” is utterly irrelevant to the point that, regardless of what else you have in that system, you’re almost definitely never going to see a meaningful difference between the two cards in any aspect other than price. The throughput is not going to change significantly but the price will.

    • golem09 says:

      Exactly, it’s because they can, and there is demand. You might as well ask “Why is Lamborghini coming out with a new car?”.

  5. jellydonut says:

    The reason for this is that the Titan needs to be the top of their lineup, they can’t have the Ti card upstaging the Titan card.

    As for choruses of who the fuck buys this, the whales do. The same people who keep buying Intel’s $1k ‘extreme’ platform with outdated microarchitecture.

  6. Drib says:

    A few years ago I got a new computer pre-built in a black Friday sale. It came with an AMD ATI Radeon something or another.

    Card never worked for anything. It would crash playing games, crash randomly, overheat, and the drivers went three years without an installable upgrade. It was “a known issue” that you couldn’t update drivers, but they didn’t care.

    I will never touch AMD again.

    It’s not so much brand loyalty to Nvidia as being burned by AMD, but that’s what happens.

    That said I’m also not the type to drop $1200 on a graphics card. Geez, my car only cost me $1500.

    • horsemama1956 says:

      I really doubt you give up on other brands after one failing like that. Usually when people say it’s not brand loyalty, it’s just ignorance..

      • Drib says:

        I would if other brands required me to shell out hundreds of dollars for a product, and then dicked around not supporting it for years.

        But I suppose I see where you’re going with that.

      • Optimaximal says:

        No, really, you do. I, personally, won’t touch AMD graphics cards or Android after having owned devices featuring both.

        I unfortunately have to deal with fairly recent business versions of AMD cards in my job and the Catalyst software is still as dreadful as it was back when I owned them personally.

    • Buuurr says:

      I hear the drivers, man. I really do. I used to be a die-hard AMD fan. I wasn’t a big fan of their GPUs but bang for buck they could not be argued with. Not the fastest but money is nice.

      That was until I had two boards in a period of three months literally blow their tops. Took a GPU and RAM the first time. The second time it was the whole lot except the PSU and HDDs. Never went back since and have no intentions of doing so with the support I got. Which was none. They said I was overclocking and junk. I was not. They had a known cap issue and were not budging on it.

      • Sakkura says:

        Motherboards blowing up has nothing to do with AMD, it’s all on the motherboard manufacturers.

        Which are the exact same companies for Intel and AMD.

        • Buuurr says:

          Yeah… except that the chipsets are different. Gigabyte board AMD chipset – blown caps. Asus board same chipset – blown caps. Asus board Intel chipset – no blown boards since… 10 years and counting now (the replacement Intel/Asus board from Newegg is still being used to this day).

    • Sakkura says:

      That’s an issue with buying shitty prebuilts, not AMD.

      AMD has better drivers than Nvidia these days.

    • rochrist says:

      I had the same experience. Won’t go near AMD cards now.

    • Ryuthrowsstuff says:

      AMD bought ATI in ’06, and retired the branding in ’10. Meaning if your card was branded that way it was made in that narrow band of time, 11 to 7 years ago. Meaning it’s a decade ago rather than a few years back (or you bought a system with an old-ass card in it). A lot changes in tech in that time span. AMD/ATI has had its driver weirdness over the years. But so has Nvidia, and AMD’s drivers are widely considered the better of the two (or at least that’s what I hear, not having used an Nvidia card recently).

      Personally I’ve been on ATI/AMD cards pretty much since they’ve existed. I started with a used Nvidia card, then a 3dfx model of some sort. After that, ATI straight through the buyout and the switch to AMD branding. They always seem to give me more for what I had to spend at the times I happened to be buying cards. I’ve certainly had driver issues (especially early on) at various times. But nothing severe. And no more often than anyone I know with any brand of GPU over the same time span. And major driver issues seem to have become a thing of the past over the last decade for both manufacturers.

      Which is to say “AMD cards have bad drivers” is outdated common wisdom. It was generally true at a time recent enough that it just stuck for a lot of people. They’ve had bad drivers at various points in their long history. But so have Nvidia cards. And about equally.

  7. Voxel_Music_Man says:

    Maybe one possibility is that they are targeting VR users? They need all the power they can get as VR games need to run at 90+ FPS with Full AA and high resolutions. VR games are a generation or more behind in graphics because of these limitations. At the moment even if HTC or Oculus released a high resolution headset, no-one would be able to render games for them at acceptable framerates with modern graphics.

    Although the tech is still first generation, if high quality VR catches on, graphics card manufacturers might have an opportunity to make a lot of money selling high-power hardware.

    It might not be the main reason, but I feel that this could at least be a factor.

  8. Don Reba says:

    There are very, very few usage scenarios where all that power will provide a meaningful benefit over what the £700 GTX 1080 Ti can do

    Well, playing Nier: Automata at 1440p would be one (don’t hold your breath for 4K). It’s an 80% jump in pixel count from 1080p, so the 50% increase in performance the 1080 Ti provides over the 1070 is not quite enough.
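    For reference, the rough arithmetic behind that figure (a purely illustrative check, assuming standard 16:9 resolutions):

    pixels_1080p = 1920 * 1080   # 2,073,600
    pixels_1440p = 2560 * 1440   # 3,686,400
    print(pixels_1440p / pixels_1080p - 1)   # ~0.78, i.e. roughly an 80% jump in pixels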

    • LazyAssMF says:

      Believe it or not i can play Nier Automata maxed (+2x AA) NP with my r9 390x.
      I know Nvidia’s cards are suffering in this game but it’s probably just drivers so Nvidia needs to get off their asses and optimize them for Nier. ;)

      • Don Reba says:

        Why do people who brag about how well Nier runs never tell their FPS and resolution? It’s endemic.

        • Ghostwise says:

          And then there’s injectors, and driver injection…

        • LazyAssMF says:

          Well, since you were mentioning 1440p it’s at 1440p obviously, duh.
          And since i said NP that means at 60fps (v-synced and no drops), duh. :P

          I’m not braggin, just telling how it is since you seem to think you need a Titan X to run it properly at 1440p. ;)

          • Ghostwise says:

            But how, exactly, are we supposed to run the calculations if you do not specify the wind speed and humidity ?

          • LazyAssMF says:

            Hahahahahaaaa! XD

            Well, the GPU has 2 fans so it produces wind speed of 25 GFlops/parsec with “moist” humidity.

            That should make your calcs a lot easier. :D

          • Don Reba says:

            Ok, thanks. :)

  9. LazyAssMF says:

    I have a feeling Nvidia are afraid of AMD’s Vega release, which is not too far off, and are trying to cash in as fast as they can before its release.
    Vega is “supposedly” around the 1080 Ti’s performance and, if AMD nails the price like they did with the RX 480, we could have a 1080 Ti-performance GPU for 400-500 bucks.

  10. geldonyetich says:

    I think that having moar power is probably enough reason for a certain tier of gamers with too much money to buy the card, and NVIDIA has little reason to not provide every opportunity for people to buy $1,000+ video cards from them.

  11. Aerothorn says:

    Can someone explain to me why the consoles are obsessed with 4K? Like, this seems like a tremendous use of processing power; most consumers can’t even tell the difference between 1080p and 4K, and for ‘more savvy’ gamers surely 2.5K gets you almost all the way there at much less of a performance hit? It just seems like there are way better uses of beefy hardware than absurd resolutions.

    • mpk says:

      Well, Sony want to sell 4K-capable TVs, and Microsoft will do their best to make sure their consoles are exactly-the-same-but-different to Sony’s, just in case they lose sales.

    • hemmer says:

      There are innumerable more useful things that could be done with the processing power, but all of them are harder to market than a single number being higher.
      Marketing is a big factor that is often overlooked.

      Personally I just hope that the mainstream resolution chase at least finally stops at 4K and, since the 3D and curved gimmicks have reportedly failed, we’ll see more practical solutions soon.

      hashtag optimism and all that

    • Don Reba says:

      1080p is not some special golden middle resolution that’s “good enough”. Sure, you can downsample your picture to 2560p or 1080p, but the quality drops in proportion.

      I choose the highest settings I can get at my monitor’s native 4k resolution, because the extra shaders never make up for downsampling. And if I could get an 8k monitor, I would.

    • Korakys says:

      Yes, I’ve been wondering that too. However if you think about it the whole concept of consoles is basically a dupe to get you to pay a low up-front cost in the short term for a higher long term cost (of games and such) and vendor lock-in.

      Those who have been duped once are easier to dupe again I suppose. Probably the same people who think 1440p phones are a good idea and then complain about battery life.

      • agentd says:

        I know this is a PC site but is ‘hahaha consoles’ still the best you can come up with? Was it really worth adding that thought to an otherwise interesting debate??

        To clarify, people who buy consoles often don’t have the money to spend on a custom whizzo super PC, and the price point for the level of hardware is very low. You’re also guaranteed to be able to run the game without tweaking settings, something people who aren’t super technical quite appreciate.

        Does that make sense to you now?

    • agentd says:

      I think it’s being led by TV manufacturers rather than consoles – though there’s an obvious cross over there with Sony. Not sure why the industry standard jumps from 1080p to 4k, not seen any 2.5k or 3k screens for sale.

    • Ryuthrowsstuff says:

      In a very real sense resolution is less important than pixel density. So higher resolutions are more important as screen sizes get bigger. 4K vs 8K is readily apparent in cinema projection, where the screen sizes are measured in feet. And the same principles apply when you talk about monitors and TVs. There’s little apparent benefit to going above 1080p in monitors below 30ish inches. But above that size things start to look bad (meaning lower resolution/fuzzy) at 1080p at typical PC viewing distances. So you move to 1440p. Get even bigger, maybe beyond 45″ (I’m not sure of the actual cut-off), and even 1440p will look weird at PC viewing distances. So a higher resolution is needed. That’d be 2160p (or UHD-1, the base standard for 4K).

      Now TVs and consoles are different. The pixel density you need to look smooth and clear goes down with distance from the screen. At TV viewing distances you can get away with lower density and still look clean. Go sit within a few feet of an HD TV, especially a cheap one or a very big one, and you’ll see what I’m talking about. It gets fuzzier the closer you get, till you can eventually see the individual pixels (pixel size is also a factor here). Or remember the rear-projection TVs back in the SD days. The really large ones, over 55″, looked like shit if you were sitting within like 10 feet of them. A 480i signal stretched over a massive screen meant very low pixel density, very large pixels, and a very long viewing distance for best clarity.

      Consoles have more reason to be concerned about 4K because the size of a TV is typically much larger than a PC monitor. IIRC the top-selling models are in the 50″+ range these days. And screen size has been on a solid upward trend as prices come down. As screens get larger you need higher resolutions to maintain a clear image at regular viewing distances. And especially if you’re sitting any closer. It won’t ever matter as much to the PC player who games close to a monitor, because a 72″ monitor is never really going to be practical. But anytime you get TVs or projection involved it certainly will.

      As an example, I know someone who uses a 1080p projector in place of a TV. Typically set up for a screen size of 8 or 10 feet in diagonal. This is fine. But you definitely see the quality drop if you’re sitting within 8 feet of the wall/screen. And it has a much more apparent impact on gaming. And it’s much more apparent on console gaming, where a lot of the older games are locked to like 720p. So he would like to trade up to a 1440p or 4K projector, and there it would be appropriate. It would not be appropriate for anyone involved to pick up a 19″ 4K screen for their gaming PC.

      • agentd says:

        That’s an excellent explanation, thank you!

        • Ryuthrowsstuff says:

          No problem. It never ceases to amaze me how little most people understand about how video or film works. You can ask a basic question like Aerothorn’s and get thousands of words without the actual answer coming up. It’s pretty much the first thing they teach you in film school.

          But as a general rule of thumb for monitors: below something like 17″ or 19″ you won’t be able to tell the difference between 720p and 1080p. Under 27″ you won’t be able to tell the difference between 1080p and 1440p. And no clue what the cut-off is for 4K – it was something in the mid-thirties or forties. Same idea with TVs, but accounting for viewing distance and the generally larger size of things; you’re talking about larger numbers overall and wider spans between them (plus no 1440p TVs). So 30-something″ through 40-something″ (and smaller) TVs are probably fine at 1080p, and over that 4K should definitely start getting noticeable at TV distances. Start sitting closer, as people often do for gaming, and the numbers will get lower – the closer you are, the tighter you want the pixel density.
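          To put rough numbers on those rules of thumb, here’s a small, purely illustrative pixels-per-inch calculation (the sizes and resolutions below are just examples, not figures from the thread):

          import math

          def ppi(width_px, height_px, diagonal_inches):
              # Pixels per inch from a screen's resolution and diagonal size
              return math.hypot(width_px, height_px) / diagonal_inches

          for name, w, h, diag in [
              ("24in 1080p monitor", 1920, 1080, 24),
              ("27in 1440p monitor", 2560, 1440, 27),
              ("55in 1080p TV", 1920, 1080, 55),
              ("55in 4K TV", 3840, 2160, 55),
          ]:
              print(f"{name}: {ppi(w, h, diag):.0f} PPI")

          # Roughly 92, 109, 40 and 80 PPI respectively - which is why a big 1080p TV
          # looks coarse up close while the same resolution is fine on a small monitor,
          # and why the density you need falls the further back you sit.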

  12. mpk says:

    I recently bought a 1050 and it’s the quietest graphics card I’ve had since… well, I can’t remember.

    Games like GTAV, Skyrim, XCOM2, Ark are all looking better than previously (my old card was a GTX 560Ti) but, crucially, I am no longer listening out for the sound of Richard O’Brien whenever I boot something up.

    I’ve spent 10% of the price of that card up there, and can hear myself think when playing games. This is A Good Thing.

    • soul4sale says:

      Same here. I just picked up an ever-so-slightly beefed up 1050Ti, and I’m loving it. It chugs a bit on ultra settings, but it uses board power and makes no audible noise. For me, this is the 1080p price/performance sweet spot.

  13. qfwfq42 says:

    The card is very likely not geared towards current gamers: beyond professional rendering, there’s a lot of demand for powerful graphics cards from people trying to run neural-net AI algorithms, which can soak up effectively unlimited amounts of processing power (the PCWorld article on the release confirms that this is probably what it’s about).

  14. napoleonic says:

    Last year was still a good time to buy a graphics card in the UK, thanks to the post-Brexit currency collapse.

  15. Coretex says:

    From a business perspective it doesn’t seem to make sense (though I’d wager there is a sound reason somewhere).

    Even so, this article just shows how much everyone has bought what businesses are selling.

    For how much we talk about the evils of business stifling progress this sort of thing should be lauded as an improvement. It SHOULDN’T be just about one upping the competition at a fixed rate that keeps you just ahead. It SHOULD be about one upping yourself and improving everything as a whole. I don’t really think that is why they are doing this, but reactions like the ones in this article certainly don’t encourage it.

    Progress for progress’ sake should be encouraged (as long as it does not come at the cost of people, or the earth, of course)

    • Slazia says:

      It could be so many reasons. It may be as simple as starting production runs in smaller numbers before moving to mass production. It may be a contract with memory suppliers specifying how much they would buy every quarter. Either way, it is a good thing they are pushing themselves.

      • Malcolm says:

        I suspect what is happening here is that their manufacturing process has settled down and is now producing “fully enabled” chips in larger quantities than they can sell as Quadro P6000 cards. So they either artificially limit them and sell them as Titan X or 1080 Ti cards, or introduce a new “faster” model for more money.

  16. Unsheep says:

    Think of all the games you could buy for €/£ 1200; some 20 or so triple-A titles, or 60+ Indie games. That’s a high price to pay for marginally better graphics.

    • Don Reba says:

      This argument wouldn’t make sense to people who value their time. $1200 might not be a lot of money to spend, but at the same time 20 AAA titles could be impossible to get through.

  17. KenTWOu says:

    I’ve been holding out for Vega, but gave up waiting and now have a 1080 Ti

    Correct me if I’m wrong, but it seems that the whole RPS team is sitting on nVidia graphics cards now.

  18. RitaDayBeley says:

    This article is disappointingly obtuse. NVIDIA is turning its attention to markets beyond gaming. Just take a look at their website. They are targeting deep learning researchers and sensor fusion applications, in particular for autonomous vehicles. Presumably they are also aiming to be the hardware of choice for learning-intensive data centers. I don’t know the current size of these markets relative to gaming, but long-term they could be enormously larger (e.g. multiple Titans in every car), and they are certainly more exciting and prestigious.

    There’s a very interesting story to be written here about the coming boom in graphics hardware, driven by massive deep learning applications. Although these products won’t initially be marketed towards gamers, gaming should eventually reap the benefits of rapid technological improvement and mass production.

  19. estebanlives says:

    Surely it’s got something to do with wanting to corner the processing power market for VR early on too?

    Lots of people want to build VR machines right now and that’s going to take a lot of whack, so they might as well try to get more and more machines built with their cards while they’re the sole supplier. And one-upping themselves creates face competition amongst those rich and competitive enough, as is said.

  20. Buuurr says:

    “nearly says:
    Not sure what made the reply so incomprehensible to you, but it made the point that buying the 1200$ card won’t really give you any tangible improvements over the 700$ card.
    Your “if” is utterly irrelevant to the point that, regardless of what else you have in that system, you’re almost definitely never going to see a meaningful difference between the two cards in any aspect other than price. The throughput is not going to change significantly but the price will.”

    So to be sure here. We are saying that if I am running a 1080ti and I upgrade to the aforementioned Titan that I would not see any difference in data folding? None? No noticeable difference? Really? So NVidia just released a card that will do nothing different except cost way more?

    Yeah, no, thanks.

    Some of us do more than game with video cards. Some of us use them when not for gaming for useful things.

  21. UK_John says:

    This story is a waste of space, or just the same hype as though we’re still in the 90s! The fact is we now have practically all AAA PC games being converted from console, and we have a much smaller AAA games market and a much larger indie market. Surely most of us can see that AAA games like Skyrim in its day, Fallout 4 and Mass Effect Andromeda will run perfectly fine on high settings with a medium-level PC! Then you have the fact that if you’re like me, you now spend more on indie and less on AAA. All this means my 970 runs Witcher 3 and everything since at the highest settings. The first game that might need something higher than a 970 will be Cyberpunk 2077. But even then, I expect to be able to run it on medium or high.

    If you have ANYTHING more powerful than a 970, you may not have to upgrade EVER AGAIN! Because indie will take more and more of your money as there are fewer and fewer AAA games.
