Why Nvidia is overcharging us all off, just a bit

As I was saying, Intel’s CPU strategy has gone into meltdown. As a consequence, the cynicism of its approach in the face of weak competition – right up until AMD pulled its new Ryzen out of the proverbial – has been laid bare. But it’s not just Chipzilla that’s worthy of your scorn. For some time now, Nvidia has essentially been ripping us all off just a little bit. Here’s why.

The basis of my argument goes something like this. Nvidia’s GeForce GTX 1080 isn’t based on a high-end GPU. It’s based on a mid-sized GPU with a 256-bit memory bus that’s cheap to manufacture.

I could go through all the technicalities comparing GPU die sizes over the years, dissecting the implications of bus width, drilling down into the history of the G**04 series of chips dating back to the Kepler graphics architecture. But for those who know the details and don’t agree, it won’t make any difference. For everyone else it’s just dull.

Anyway, up to a point, it’s all academic. The GeForce GTX 1080 Ti and the Titan boards are not only based on Nvidia’s biggest GPU but also pitched higher up in the market. What, exactly, is the problem?

It comes down to this. The way these things are flogged, the essential ruse is that Ti and Titan are part of a new, even higher-end segment added on top of what used to be the high end. The regular 1080 is high end. The stuff above is something newer, something even higher end. But that’s bollocks.

Admittedly, it’s all been enabled by the brilliance of Nvidia’s engineers. The transition began with the Kepler generation of cards, AKA the GeForce 600 series. But it was Maxwell, which underpins graphics cards like the GTX 980, where the new paradigm really got locked in. Nvidia had come up with a graphics architecture so effective that its mid-sized GPU could trade blows with or even beat AMD’s biggest GPU.

Suddenly, Nvidia could pitch a middling GPU into a premium market segment and style its true high end GPU as something even more exotic.

There is, of course, an extent to which this is all academic. What matters is the performance of these things and the gaming experience you get, not any of the crap I’ve been talking about, right?

I don’t think so, no. Because the net result is that Nvidia’s high-end GPUs, and in turn the best gaming experience, get pushed up into ever more exclusive territory, and we all end up paying, to put it in the historical terms of an Nvidia GPU family that predates all this nonsense, GeForce GTX 280 money for a GeForce GTX 260 gaming experience. And that is what makes me grumpy.

The GeForce GTX 1080 Ti: High end times a thousand

It’s also where the parallel with Intel’s current woes comes in. As the Linus Tech Tips guy puts it, Intel got into this mess because it lost sight of the simple strategy of making the best products it can and offering them for the best prices it can. For far too long, Intel has put far too much emphasis on trying to triangulate its market opportunities rather than just aiming to produce great products. It’s in just that kind of territory that Nvidia has been operating of late, too. The net result is that we as customers get more and more screwed over.

At this point, you might argue that a BBC-esque pursuit of even-handedness, not to mention a certain sense of rotational narrative symmetry, demands we stick the boot into AMD. There is indeed plenty to moan about when it comes to AMD. Indeed, it’s AMD’s failure to compete at certain times in the respective GPU and CPU markets that’s been the underlying enabler in all of this.

What’s more, AMD currently seems to be conspiring to swap its dysfunctionality from CPUs to graphics. The new Ryzen CPU is a huge improvement and has Intel in a major flounce.

Unfortunately, there seems at least a danger that its Radeon graphics tech is going south just as fast. The new AMD Vega family of GPUs grows ever later, and what indications we do have of its performance, which come courtesy of some early-access Frontier Edition cards, suggest that the big Vega GPU is if anything not quite on pace with Nvidia’s middling Pascal GPU in the form of the GeForce GTX 1080.

Problem is, Pascal is shortly going to be replaced by Nvidia’s new Volta technology. So, that situation is only going to get worse. To put it another way, AMD is a full generation behind Nvidia and it can’t really compete fully with Nvidia’s last gen tech, let alone the new stuff coming through.

Let me repeat that. I hope I am wrong. But it looks like even if AMD Vega was fully available right now it wouldn’t be all that stellar. By the time it does eventually limp onto the market, Nvidia will have something even faster on offer.

The point about all that, however, is that it’s about failure to deliver. Call it incompetence if you want; what it’s not is anything remotely like the cynicism and implied belligerence towards customers that Intel and Nvidia share. Hell, maybe AMD would be beating up its customers too, given half a chance. But it never tends to do well enough for long enough for the abusive practices or cynical product positioning to kick in. So who can say.

Either way, as it stands, it’s the difference between being frustrated and offended. AMD may be frustrating. But Intel and Nvidia are a little bit offensive.

93 Comments

  1. Luciferous says:

    Okay, maybe it’s just me that is supertired and a bit unfocused… But is anyone else finding this article poor written and more than just a bit rambling?

    • Nauallis says:

      Yes. I’ve had to reread a bunch of paragraphs because of jargon and slang that seems unrelated. Not least the title.

      Also the whole article is bitching about being ripping off using similes and metaphor, but none of it is addressing the actual hardware capabilities that aren’t providing an increase in output for the premium price. Jeremy, I’m getting that you think the 1080 Ti and Titan GPS are akin to over-priced, slightly better 970/980’s but why? Is it just the hardware capability, the software available, the inability for the average person to see a marked difference? What’s so similar about the chipset?

      • Nauallis says:

        Great, now I’m doing it. Tired, me.

      • xyzzy frobozz says:

        It was explained in the story….

        … Nvidia is selling mid-range chips – Gxx04 – as high end. In doing this it moved its high-end chips – Gxx00 – into a higher price bracket that previously didn’t exist.

        So essentially they have raised prices due to a lack of competition, not because their products cost more to produce.

        So as Nvidia’s market share has increased, they have increased prices, rather than reduced them as is usually the case when a company wins market share and increases volume.

        The argument is therefore that Nvidia is abusing its market dominance to rip off consumers.

        • poliovaccine says:

          Right, as well as adding on that, “the worst you can do to blame AMD for any of this is to insist on an entirely dog-eat-dog scenario whereby we deserve to pay more to Nvidia due to AMD’s inability to compete.” Whether this is actually a problem to you depends on your philosophy to some extent, though I can safely say I know what I think about it. Suffice to say, this sort of thing is why America is at least supposed to have legislation protecting against monopolies.

    • fray_bentos says:

      You are correct. Waffle, with no quantitative evidence to back up the headline. Even a price/performance ratio graph over time would have done that.
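
      For what it’s worth, the chart fray_bentos is asking for is easy to mock up. Below is a minimal matplotlib sketch; the launch prices are Nvidia’s published MSRPs for the x80-class cards, but the performance index is a dummy placeholder (all 1.0) that would need to be replaced with a real benchmark aggregate before the chart says anything:

      ```python
      import matplotlib.pyplot as plt

      # Launch MSRPs are Nvidia's published figures for the x80-class cards.
      # The performance index is a DUMMY placeholder; substitute an aggregate
      # benchmark score before drawing any conclusions from the plot.
      cards = [
          # (name, launch year, launch MSRP in USD, relative performance index)
          ("GTX 580",  2010, 499, 1.0),
          ("GTX 680",  2012, 499, 1.0),
          ("GTX 780",  2013, 649, 1.0),
          ("GTX 980",  2014, 549, 1.0),
          ("GTX 1080", 2016, 599, 1.0),
      ]

      years = [year for _, year, _, _ in cards]
      perf_per_dollar = [perf / price for _, _, price, perf in cards]

      fig, ax = plt.subplots()
      ax.plot(years, perf_per_dollar, marker="o")
      for name, year, price, perf in cards:
          ax.annotate(name, (year, perf / price))
      ax.set_xlabel("Launch year")
      ax.set_ylabel("Performance per dollar")
      ax.set_title("x80-class price/performance over time")
      plt.show()
      ```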

    • Chorltonwheelie says:

      Yes, like the last one.
      Poor us. It’s hard paying £700 for a graphics card. I might have a little cry. Sniff.
      Pretty poor form of nVidia having an even better architecture up its sleeve too… just not fair.

  2. Yasha says:

    Small point worth noting: Kepler was the name of the microarchitecture found in Nvidia’s 700 series cards. The 400 series were based off of the Fermi microarchitecture.

    I’d typically refrain from being the poindexter who makes a big stink out of semantics, but it is important in the context of what you’re saying. Fermi was somewhat unsuccessful for several reasons, namely performance and temperature, and one could argue that Fermi’s failings provided a compelling reason to pursue the business strategies outlined in the article.

    Kepler was the first time that Nvidia tested the market strategy of having “high-end” and “something greater” (think 780, 780 Ti, Titan). Titan’s unexpected success was a prelude for what was to come, and a precursor of the modern GPU market’s situation.

    • Asami says:

      8000, 9000, 100 and 200 series were Tesla. 300 series was mobile-only Tesla. 400 and 500 series were both Fermi. 600 and 700 series were Kepler. 800 series was mobile-only Kepler. 900 series was Maxwell, and now 1000 series is Pascal.

      Anyway, I get where Jeremy is coming from with this, and sure, you can argue that what used to be the GX102 range is now the GX104 range, and what used to be the GX100 range is now the GX102 range. But at the same time they’ve still been jumping roughly the same amount in performance, sometimes more, between generations. You can call it just “brilliant engineering” on Nvidia’s part, but that comes from somewhere, and usually that somewhere is money. Not to mention it seems Jeremy doesn’t quite understand die shrinks: what was a 7 billion transistor chip can become a 12 billion transistor chip over generations, meaning that even if GK100 and GP100 sound like they should be similar product ranges, there can be a massive jump in the complexity and cost of those chips, which makes it more sensible to put them in different market segments.

      Though I will agree that Nvidia seems to needlessly choke their memory busses for cost saving.

  3. gi_ty says:

    I too share an apprehension about the Vega line; it looks middling at best. However, if it can drive a 1440p widescreen at decent frames, that’s where I am going to spend my money. It may not get Nvidia to rethink the way they flog their cards, but if it’s competitive in the space where I can actually afford to buy a card, AMD will get my money every time. Especially considering the difference in cost between G-Sync and FreeSync.

    • Fade2Gray says:

      I’m in pretty much the same boat. I just can’t stomach shelling out for a g-sync monitor. Here’s to hoping Vega isn’t terrible…

      Man I wish I could be more optimistic.

    • ravenshrike says:

      It will. More importantly, the reason nVidia suddenly announced it’s looking into MCM tech with that MIT whitepaper is Navi. Navi is rumored to be 7nm Vega in MCM form, and GPUs are significantly better suited to an MCM design once you have good enough interconnects, which Infinity Fabric definitely qualifies as. Consider that Vega was meant to compete directly with the 1080 almost a year ago, that it was only delayed because of HBM2 issues (which Navi won’t be using, if AMD’s slides are any indication), and that in the limited testing GN has been able to do it can considerably outrun the 1080. That’s bad news for nVidia once Vega is shrunk down to 7nm, drawing 40-60% less power (GloFo’s theoretical 7nm power reduction is 60%, but I assume it could be up to 20% less than that), and put into an MCM configuration, which it was virtually certainly designed to do easily. nVidia’s Volta won’t be able to keep up in price or performance no matter how much better its underlying architecture is, at least in the gaming scene. Hell, if AMD wanted to compete directly on performance and ignore the insane power requirements, they could go for the 30-40% performance increase you can get at 7nm and forgo the power savings.
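
      A rough back-of-envelope sketch of the power arithmetic being claimed there, treating every percentage as the speculative rumor it is and assuming roughly 300 W of board power for the big Vega chip (about the air-cooled Vega FE’s TDP):

      ```python
      # Back-of-envelope for the 7nm claims above; every figure is speculative.
      VEGA10_BOARD_POWER_W = 300.0  # assumed, roughly the air-cooled Vega FE's TDP

      # GloFo's theoretical 7nm power reduction is 60%; the comment hedges that
      # the realised figure could be up to 20 points lower.
      for reduction in (0.40, 0.60):
          watts = VEGA10_BOARD_POWER_W * (1 - reduction)
          print(f"{reduction:.0%} power cut -> ~{watts:.0f} W per die")

      # The alternative trade: spend the node on speed instead of power savings.
      for speedup in (0.30, 0.40):
          print(f"or ~{speedup:.0%} more performance at similar power")
      ```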

  4. kalirion says:

    Is “overcharging someone off” a British expression I haven’t encountered before?

    Anyway, tl;dr is “NVIDIA gives us the best bang for the buck, but it could afford to give us the same amount of bang for even less buck, which makes me upset because I like my bucks.” right?

    • gi_ty says:

      Essentially yes. However, it does seem very cynical for Nvidia to restructure the high-end market in order to charge extravagant prices for the same stuff that cost $500 only a few years ago. Since it is a luxury item, perhaps it’s even somewhat justifiable. That doesn’t change the fact that they are knowingly selling at a 100% price increase versus just a few years ago. So it’s still a bastard move.

      • TormDK says:

        “Everything is worth what its purchaser will pay for it. “

        • gi_ty says:

          Indeed, that doesn’t mean it’s morally justifiable though. That is of course unless “morally justifiable” ends with how best to milk your customers for every last cent (pence for you Brits, right?). Which seems depressingly common among the business folks.

          • Buuurr says:

            Morals have no business in business. If you are looking for morals from a Corp I have a farm in Sudan. Cool temps. No famine. Just a touch of war.

          • General Ludd says:

            How do you calculate a moral price? In a situation where the customer has a choice – and with graphics cards there just isn’t an imperative to upgrade at the moment – then any price that the customers will buy at feels fine.

        • vahnn says:

          I’m getting Quark vibes from this comment. Tingling lobes and all.

      • Sakkura says:

        Eh, go back and check what the 8800 GTX cost. $599 in 2006 money, which is about $727 today.

        The 1080 Ti has an MSRP of $699.
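
        That inflation adjustment is just a ratio of consumer price indices. A quick sanity check, using approximate US CPI-U annual averages, lands within a dollar of the figure above:

        ```python
        # Adjust the 8800 GTX's 2006 launch price into 2017 dollars via the
        # CPI ratio. The CPI-U annual averages below are approximate.
        CPI_2006 = 201.6
        CPI_2017 = 245.1

        msrp_8800_gtx = 599.0  # launch price in 2006 dollars
        in_2017_dollars = msrp_8800_gtx * CPI_2017 / CPI_2006
        print(f"8800 GTX in 2017 dollars: ${in_2017_dollars:.0f}")  # ~$728
        ```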

    • elevown says:

      Nope – you can’t say that. Unless it is word play on what you CAN say, which is ‘ripping us all off’.

      • Sarfrin says:

        I’m guessing it was meant to be changed from “ripping us all off” to “overcharging us all” to make the headline less controversial, but the off got left in by mistake.

    • poliovaccine says:

      Actually it’s cockney slang for angry masturbation.

    • FurryLippedSquid says:

      I reckon it’s a play on mugging us all off.

      Being mugged off is being taken for a fool.

  5. DavesDesk says:

    Yup, Mr. Laird, it was all down to that first Kepler card, the GTX 680. That was the first time Nvidia represented the second-tier GPU as their high-end desktop card. They could do it because the frame rate boost was enough to present a proper generational performance uplift over the struggling Fermi architecture, but it also meant they could generate huge margins from manufacturing mid-tier GPUs and selling them for high-end prices.

    And funnily enough, they’re rolling in cash now…

    • try2bcool69 says:

      I went from a 680 to a 1070 in my 4 year old PC, and it was like getting a brand new computer for a mere $430. That, at least, was a bargain to me.

      • fish99 says:

        That’s not really the point though, is it? You would have been even happier if there was some actual competition in the GPU sector and you’d paid $100 less for the same card.

        • nitric22 says:

          Exactly! Personally, my 750 Ti will sit inside my computer for as many years as need be, until the next quality “leap” in performance can be had for only about $200. I may be a few years behind, but I just can’t justify $400, $500, $600+ cards in my mind. While Nvidia are not the only players on the block, knowing that they “own” the block has given me a clear sense that those high-end price points may not be fully justified.

          All this being said, I also see a general peaking in traditional tech. Way too long of a discussion to be had fully, but the fundamental leaps in technology happened decades ago, and we have been on the path of refinement for so long that growth and evolution will most certainly become more incremental as we reach the height of our current technological foundations. That is, if and until the next great leap takes place (e.g. quantum processing or some such).

      • Papageno says:

        Unfortunately, thanks to the bleeping cryptocurrency miners, you can’t touch a 1070 for that price any more. What happened to the days when you’d buy a nice video card to make your games look good?
        The alternative seems to be to shell out nearly 600 bucks for a 1080, which seems pretty extravagant.*
        *my current card is a 970 with the infamous “1/2 GB of slow access VRAM” issue. It works well enough for 1920×1200, but if I ever want to get a 1440p or greater monitor I’ve gotta get something newer.

  6. lifeboat says:

    A big difference between Nvidia and Intel is that Nvidia has been generating huge performance gains each generation, and it has been pushing the limits of technology to do so; the V100, for example, is the largest chip ever created.

    Intel’s chips have been providing only small gains each year and therefore were a ripe target for AMD. On the other hand, if AMD had any success with a new graphics card architecture, Nvidia could simply release cards with slightly bigger chips in them and win the battle again.

  7. zulnam says:

    I got an RX 480 Nitro+ and it works like a charm, even with The Witcher 3 on ultra high.

    The point of the matter is, as a gamer, unless you’re one of those people who play on a 100-inch screen or a triple monitor system, the kind that doesn’t mind spending £2,000 on an upgrade, it won’t matter whether you’re ultra high end, mega high end or plain old regular high end.
    There’s not much point in getting a 1080, much less a 1080 Ti, unless you’re a hardware enthusiast.

    • TormDK says:

      Sure there is, for anyone playing at resolutions higher than the standard 1080p while still enjoying more than 60FPS.

      • Fuhckerschite says:

        …and the people like me who like running games with ultra settings above 100fps.

        • that_guy_strife says:

          I’m pretty certain that makes you a hardware enthusiast.

    • phuzz says:

      The Witcher 3 is pretty well optimised though; I get a solid 45+fps on ultra with an R9 290 (not X).
      There’s not been much need to upgrade CPUs for a few years, but I’m not seeing much reason to upgrade my four year old GPU yet either.

  8. Nolenthar says:

    Unfortunately, our world is increasingly based on “perceived” value rather than actual value, and GPUs and CPUs are no different. What doesn’t help is the absence of competition: both markets have only two players, and they actually share one. Given that AMD has a smaller R&D budget than either Intel or Nvidia, it’s still a miracle that the situation is not somewhat worse.

    Now, unlike Intel, Nvidia has been offering a tremendous performance increase with every new generation (Pascal was between 60 and 100% faster than Maxwell), so admittedly I don’t think it’s comparable. People have been very slow to upgrade their CPUs recently; you could ignore three generations completely without any issues. Not so much in the GPU department.

    Does that make it OK to charge $1,200 for your top card? Probably not, but there is little we can do. A pity we don’t have 4 or 5 companies competing.

    • Buuurr says:

      “Given that AMD has a smaller R&D budget than either Intel or Nvidia, it’s still a miracle that the situation is not somewhat worse.”

      Likely the reason that AMD has not been steamrolled into the dirt is that they provide NVidia with the illusion of competition (just like Intel v AMD). This in turn provides NVidia with anti-trust protection they otherwise would not have.

      • Otterley says:

        Exactly what I was thinking. Even if Intel or Nvidia were inclined to offer us their best development efforts at a fair price, they would probably be ill-advised to do so. Wiping out the last vestiges of competition would do them more harm than good.

  9. Flappybat says:

    This is where markets go wrong: a single company has too much market share and doesn’t have to be competitive. If they can’t use planned obsolescence, which I feel Intel sort of does by constantly changing sockets, the next best thing is to milk your customers for every penny when you don’t need to offer them more for less.

  10. Baines says:

    Mind, what the cryptocurrency mining industry has done to GPU availability implies Nvidia could/should be charging even more than they already do.

    • panfriedmarmot says:

      Crypto miners need to take their darkweb fake money and the drugs and kiddie porn they’re buying with it and jump off a bridge.

  11. brucethemoose says:

    “based on Nvidia’s biggest GPU”

    That’s not quite true. The 1080 Ti is a cut-down version of the GP102 GPU. The king of Nvidia’s lineup is the monster GP100 with HBM2, which for now you’ll only find in Quadros or Teslas.

    You should at least add a qualifier in the article. It is technically the biggest gaming GPU.

    __________________

    They also revealed the successor, GV100, a while ago.

    Part of the problem is money. Taping out a chip is expensive, and if you’ve noticed, AMD is not taping out very many:

    There are 8+ separate Intel desktop/server/laptop chips (3 server die sizes, Xeon Phi, Xeon D, the quad- and dual-core parts that desktops and laptops share, as well as the laptop parts with beefed-up graphics). All are separate designs.

    AMD has one Zen CPU die. One. They might have another next year (the APU).

    For Pascal, Nvidia has GP108, GP107, GP106, GP104, GP102, and GP100.

    Meanwhile, AMD has Polaris 10 and Polaris 11.

    Even worse, for the next generation, Vega 10 has to pull double duty as a compute chip AND a gaming card, while Vega 11 is a ghost so far. They don’t have the money to tape out 2 dedicated big chips like Nvidia does.

    • Sakkura says:

      The only difference between GP102 and GP100 is memory interface and FP64. The parts relevant to gaming performance are the same. So GP102 really is a full-fledged high-end gaming chip.

      It’s true that the 1080 Ti is cut down a little. But the Titan Xp has the full chip enabled.

      • lifeboat says:

        “So GP102 really is a full-fledged high-end gaming chip.”

        The GP102 is smaller than the GP100. So the parts they removed from the GP100 to create the GP102 could have been replaced with more graphics power. Nvidia could easily make a faster graphics chip that would be the size of the GP100. And they would have done this if AMD was providing more competition.

        • brucethemoose says:

          They could. That’s precisely what they did with Maxwell.

          But to be fair, it probably wouldn’t be worth it. The first GPUs on new nodes tend to be smaller, hence GP102 is only a bit bigger than GM204. A monster like GP100 is only possible if margins are ridiculously high.

        • Sakkura says:

          GP102 is 471mm^2 – that’s a big chip. That is NOT a medium-sized chip. It’s almost exactly the same size as the G80 that Nvidia charged a huge price for a decade ago.

          It’s not comparable to the Geforce 600 series, where the GK104 in the GTX 680 was just under 300mm^2. That’s where you can validly say Nvidia was selling a midrange chip as the top of the line.
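
          The die-size numbers being traded in this thread, collected in one place (approximate published figures):

          ```python
          # Approximate die sizes (mm^2) for the chips discussed in this thread.
          die_sizes_mm2 = {
              "G80 (8800 GTX, 2006)":      484,
              "GK104 (GTX 680, 2012)":     294,
              "GM204 (GTX 980, 2014)":     398,
              "GP104 (GTX 1080, 2016)":    314,
              "GP102 (GTX 1080 Ti, 2017)": 471,
              "GP100 (Tesla P100, 2016)":  610,
          }
          for chip, area in sorted(die_sizes_mm2.items(), key=lambda kv: kv[1]):
              print(f"{chip:28} {area:4} mm^2")
          ```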

  12. omf says:

    “ripping us all off just a little bit” is kind of the whole basis of capitalism, n’est-ce pas?

    • Chorltonwheelie says:

      The basis is paying us what they can get away with rather than handing us the full financial fruits of our endeavors.

  13. Jenuall says:

    Meh, people will pay what they want to pay and if they perceive that the performance increase isn’t worth the price then the market will need to adapt.

    If a company finds a way to make a product that meets their improvement roadmap but does so at a cheaper cost than anticipated they aren’t obliged to then sell it for less. It would be nice, but it’s more of a “hats off to them” if they did do it, rather than being justified in saying “boo!” when they don’t.

    Also, I paid less than £200 for a 970 that’s still going strong 2.5 years later; I think nVidia gave me a bargain rather than ripping me off.

    • fish99 says:

      I don’t think anyone is denying the GTX 970 was good value for money. Its replacement, the GTX 1070, is £120 more expensive though, due IMO to lack of competition.

  14. Vitz says:

    It’s indeed a bit shitty how the enthusiast customer is being screwed, but let’s not forget that the gaming industry grows with the mid-range and high end, not the absolute top end that costs thousands. We’re still getting very good performance increases with every generation. Nothing’s changed in that respect. Also, these “underpowered” chips have allowed Nvidia to integrate them into ridiculously small form factors like the Razer Blade.

    Ethically, it’s sketchy, but if they’d refrain from doing this, they’d just delay their new technologies instead of releasing them as a more expensive product.

  15. TimRobbins says:

    I thought for sure this article would be about G-Sync.

  16. konondrum says:

    I am sorry, but the entire premise of this article is just plain wrong.

    Calling the GTX 1080 a “middling GPU” is just plain silly. I know it’s GP104, not GP100, but that is a really dumb argument. It was the undisputed fastest GPU in the world on release; in what way is that middling? AMD, a year later, still has not released a product to compete with it. Unless of course you count the Vega FE (which AMD doesn’t), and it uses almost twice the power to do so.

    How can you call a product overpriced when it’s faster, more efficient and all-around better than any other product on the market? Video games are not a human right, they are a luxury. This entire idea is flawed. Nvidia creates new market segments because it can; no one is forcing you to buy a Titan.

    I bought my GTX 1070 9 months ago for $400, and AMD still doesn’t have a product in its class at any price. Nvidia has market-leading or competitive products at every price point. How are they exploiting anybody?

    • Buuurr says:

      “ex·ploi·ta·tion
      [ˌekˌsploiˈtāSH(ə)n]

      NOUN
      the action or fact of treating someone unfairly in order to benefit from their work:
      “the exploitation of migrant workers”
      synonyms: taking advantage · abuse · misuse · ill-treatment · unfair treatment · [more]
      the action of making use of and benefiting from resources:
      “the Bronze Age saw exploitation of gold deposits”
      synonyms: utilization · use · making use of · making the most of · capitalization on · [more]
      the fact of making use of a situation to gain unfair advantage for oneself:
      “this administration’s exploitation of the fear of crime”
      synonyms: taking advantage · abuse · misuse · ill-treatment · [more]”

      They are not exploiting anyone. They are the only game on the block. Is that their fault? No. Everyone else has an opportunity to do what they have done and succeed. Others just aren’t. As you stated, they are the only gig in town. It’s hard to complain about prices when there isn’t a gun to your head. Crying over ‘wants’ is just that – crying. If we were talking food, water and shelter this would be a different story.

    • Vitz says:

      You either didn’t read the article or you didn’t understand the point.

    • Ravenine says:

      *sigh*

      Here’s the thing: you’re paying top dollar for something that isn’t pushing the most performance it could, hence making it a middling card. It could do more, but it doesn’t, yet it’s priced as if it did.
      It’s the best performer out of the bunch, but it’s not performing to the best of its ability. Think Usain Bolt jogging down the track because, well, nobody else is even remotely close, so why bother sprinting?

      • konondrum says:

        Vitz

        I read the whole article, thank you, and I think I understand this stuff at least as well as the author does. What is it you think I don’t understand, huh?

        Ravenine

        “you’re paying top dollar for something that isn’t pushing the most performance it could, hence making it a middling card. It could do more, but it doesn’t, yet it’s priced as if it did.”

        You are just showing that you don’t know what you are talking about. GTX 1080 is not crippled in any way. Hell, it even used a memory technology (GDDR5X) that had never been used before to extract the most performance out of it.

        The author is claiming that it’s a “middling” GPU because it doesn’t meet his criteria for GPU die size, which is entirely arbitrary. We are not paying for the cost of the silicon, we are paying for the cost of R&D.

        • Ravenine says:

          You seem to have a case of mental disability, which I can’t fix. You don’t seem to comprehend that Nvidia gave you a chip that costs them fuck all to produce, but charged you as if it was top of the line. Sure, it wasn’t crippled, but it was (and is) overpriced for what it could have been. Having seen your responses, I’m peacing out, because I don’t argue with idiots. It’s unproductive.

          • konondrum says:

            So I corrected your inaccurate statement and your reply is to suggest I’m mentally disabled? How am I the rude and unreasonable one?

            And while they may cost “fuck all” to produce individually (not really, but okay), they cost billions in research and infrastructure.

            I think this all proves my point that this article was poorly considered, because it didn’t contain the information necessary to even try to argue its case. Now you’ve got a bunch of people jumping into the comments who don’t know what they are talking about, who just want to rage at Nvidia.

  17. poliovaccine says:

    Basically, within a block of Central Park, hot dogs cost upwards of three dollars. Anywhere else in the city, they cost one to two. But Central Park is tourist-heavy, it’s midtown Manhattan; there’s money there, and anyway how far are you willing to walk when you’re hungry to save one dollar on a hot dog?

    Does that make the hot dogs actually worth $3.50 just cus they’re near the park? Depends who you ask – customers or hot dog vendors? Though at some point, as hot dogs become $4, $5, $6, and the example becomes more extreme, you have to start thinking in terms of, “Well, no… one of those wimpy cart dogs is, like, maximum three bites… even with all the toppings, that’s only worth so much when it takes two or three to get you to stop feeling hungry…”

    I think it’s entirely in our best interest to see consumer watchdog pieces on this level, and I’m grateful for it. I don’t know about anyone else here, but I don’t have the kind of disposable income to be using it for congratulating Nvidia on their current technological superiority. I’d like it if they didn’t award themselves a bonus markup just cus they can, if at all possible. I don’t exactly feel outrage or anything like that, but that’s only because the thing is so difficult to articulate, it defies primal reaction.

    In any case, I’m glad to see an article like this.

    • euskalzabe says:

      Well put. This is the reason I got an RX 480 instead of an Nvidia card for the first time ever. I hope I can keep avoiding NV until they come back down to earth (AKA when there’s actual GPU competition to speak of).

    • Mezelf says:

      I could not have said this better myself (this entire post is golden tbh).

      “I think it’s entirely in our best interest to see consumer watchdog pieces on this level, and I’m grateful for it. I don’t know about anyone else here, but I don’t have the kind of disposable income to be using it for congratulating Nvidia on their current technological superiority.”

      Me neither, and even if I did have crazy amounts of money, it would STILL be in my own self-interest to criticize Nvidia for their actions, because what affects the mid-to-high tier affects the high-to-upper tier.

      Don’t ever be a corporate apologist. Even if, or ESPECIALLY if you are an enthusiast-level consumer of their products.
      Corporations need us, they want us. They want our juicy bodies. As a collective, we have final say in who gets to take us home at the end of the day and treat us to some hot coffee.
      I know Nvidia may be the big boy with the biggest muscles, but that doesn’t mean you have to take it up the ass every time. If you let it be known that you want to be treated right, and you say it publicly, some other corporation will overhear and step up to the plate, willing to provide whatever you could possibly desire.

  18. KingDP says:

    I get the point, but does no one see the irony in complaining about the price of a luxury good that is objectively better than anything you could buy before it and costs almost as much as the yearly income of half the world’s population? I mean really, no one owes you this product. F***ing first world problems FTW.

    • ColonelFlanders says:

      “Children are starving in Africa, so you should bend over and get ripped off.”

      • KingDP says:

        I’m glad you understand

      • Mezelf says:

        Kinda makes you wish all these poor children in Africa would go ahead and starve already, so that we could finally move on and improve ourselves instead of resorting to logical fallacies to argue against better things.

    • Otterley says:

      I’d be surprised if I’ve ever had a single problem in my life that couldn’t be called a “first world problem”, but I never perceived them as entirely irrelevant and dismissible, just because they weren’t particularly existential.

      So, you can perhaps imagine how happy it would make me if people could just drop this “first world problem” crap (though, admittedly, people calling problems out as “first world problems” is possibly one of the best examples of a first world problem ^^)

  19. TotallyUseless says:

    I just want a good PC with good parts; sadly, AMD can’t provide those. With Intel and NVidia I can at least get decent things on a decent budget.

  20. Chill_Rasta says:

    Perfect article, we need more of these. I do find it funny how many people seem to be okay bending over and taking it directly from Nvidia while saying please and thank you for it.

    I use my gaming PC for 6-7 years before upgrading, and I’m saving up for a Ryzen RX 580/Vega build right now; very happy that they are back with a strong CPU when I’m due for an upgrade. And yeah, I am never going to pay over €300 for any GPU when my CPU is €250-270. Keep a balance and common sense.

  21. Rikard Peterson says:

    On the other hand, if NVidia were to sell at the prices you’re arguing for, AMD might become completely unable to compete and we’d end up with a single graphics card manufacturer, so maybe this isn’t the worst behaviour from them?

  22. Slinkusss says:

    Proofread please.

  23. herpderp says:

    People who look at $500+ GPUs really lose sight of anything resembling “bang for the buck”.

    The $200-300 range is fairly competitive.

    Sure, there isn’t anything besides the 1070 currently at $400-500, and that’s why Vega is so eagerly anticipated.
    For myself, I care about the open/free technologies AMD is spearheading: Vulkan (a Mantle contribution), FreeSync, OpenCL, etc. I could say Nvidia needs to get off their high horse and start supporting these open standards.

  24. kse1977 says:

    These types of articles serve their purpose and are important. Educated consumers make better choices, when they are able to. Having said that, this is also capitalism, and NVIDIA is in a position in which they are able to set their prices and folks are willing to pay them. In much the same way, Apple sets the prices for phones and tablets that everybody else tries to follow. One could argue that some of Apple’s upgrades were minimal, and obviously the RAM upgrades are a joke given the cost of RAM, but Apple dictates the cost of its products and the market has enough consumer demand to allow for it.

    The only thing that would get NVIDIA in line is for AMD to make major strides in the GPU market, enough to keep NVIDIA honest. Yes, it would be nice to see NVIDIA concerned only with releasing the best new products at the best prices, but that is not the world we live in. Competition will always dictate prices, and for far too long NVIDIA has not had enough realistic competition.

  25. Kasjer says:

    I think all NVIDIA products are overpriced a bit, from low end to ultra high end, but the value (power per dollar) diminishes the higher you go in GPU pricing brackets. I bought a 1050 Ti this year to replace a 750 Ti; I’m a low-end peasant who games on a 1080p TV with an X360 controller. I bought the card on sale, about 20% cheaper than the usual price, so it was a no-brainer to go with it instead of an AMD card. But truth be told, I would choose it over the closest AMD offering even at the standard price. It simply has more consistent performance, consumes less power (no need for me to change my power supply unit), has better driver support, and the control panel is a powerful tool for forcing options like AA, AF and so on in older titles that do not fully support them in game. Dead Space, which I bought cheap as chips in a GOG sale, looks crystal clear thanks to this. Installation of the new card was easy, with no driver issues, which I cannot say about the AMD card swaps I did back in the day, when I bought with a pure bang-for-the-buck approach because I was a poor student. I just like how convenient Nvidia cards are to use, and I’m willing to pay a little bit more for that.

    And to be fair, people who go for best-in-class cards are the same people who drop tons of cash on top-tier i7 CPUs, 2TB SSDs, ultrawide, high refresh rate or 4K gaming monitors, and DDR4 RAM. If someone is going for a 1080 or higher, the other components will also be expensive. This is not mainstream gaming; mainstream gaming is still stuck in 1080p and 30-60 fps land thanks to consoles, and the 1070 is the highest card that makes sense to buy there. Premium products always cost much more than their hypothetical value because they are aimed at people who just want the best, no matter the cost. Nvidia does charge a lot for top-tier cards, but look at its mainstream cards: from the budget 1050 2GB to the powerhouse that is the 1070, they all have pricing that is okay. But if you want the next-gen experience here and now, to be at the forefront of new technology, or simply to futureproof, that’s going to cost you. If you don’t have the money, just wait a couple of years for the tech to come down in price and become mainstream and more affordable.

  26. mercyRPG says:

    I have a GeForce GTX 280 just sitting there unplugged, but still in the water loop, and it has a 512-bit memory bus!!

  27. panfriedmarmot says:

    Can we talk about the REAL issue of overcharging? How sellers are hiking prices because the damn Ethereum miners bought up all the graphics cards to make their fake money? How, when they get some back in stock, they are marked up 30-50% or more? My wife built a new PC about a year ago, when GTX 1060s were brand spanking new, and got hers for $250. The exact same card is now listed at nearly $400 on Newegg.

    • Otterley says:

      How can that be a case of overcharging? Demand has escalated beyond supply causing the prices to surge. That’s about as close to economics 101 as you can get.

      Don’t get me wrong, I’m not saying that we should be pleased about this price spike, but speaking of “overcharging” just seems wildly unrealistic.

  28. 4004 says:

    Well, they want to recoup their past R&D expenses and invest in future ones. And the market is willing to pay.

  29. Kamikaze-X says:

    This article is opinion stated as fact, and needs a disclaimer. AMD just is not competitive. They are in the ‘Fermi’ years – just keeping up, but at the cost of heat and power consumption, and unless AMD do something radical with Vega they will be trailing for at least the next 2-3 years. nVidia and AMD are innovators – some of the products they brought us in the past have been fantastic, but AMD seem to have drained anything exciting out of the old ATi branch recently. Where are the mad dual GPU gaming cards? There was a glimmer of hope with the Fury and Fury X, but they failed to get traction because of weird marketing, aiming it at prosumers. I was ATi/AMD up until the 6870, when I switched to the 770 and now have a 1070. AMD just do not compete, and that is why nVidia can charge as they do.

    • brucethemoose says:

      AMD was competitive against Kepler (just look up 7970 vs 680 benchmarks now), it’s after Maxwell where things really started falling apart.

  30. DWRoelands says:

    Honestly, I’m happy to pay extra for nVidia because I don’t need to perform arcane rituals to update their drivers.

    AMD driver updates were developed by the Prince of Darkness to sow despair.

    • Buuurr says:

      I actually knew the ‘guy’ back in the day that used to help code a majority of those drivers. It was when they were still ATI. I attended many a party with much drink, as did he, when he was on a deadline that he couldn’t give the time of day to. He always bragged about how much he made… not so much about what he did.

  31. twixter says:

    When deciding to upgrade, the primary question I ask is whether the performance gain justifies the cost. Is the gain going from a 980 to a 1080 equivalent to the gain from 780 to 980, or 680 to 780? If so, then a similar price point seems warranted. Of course, the price for an x80 card has always been too steep for me, so I’ve always bought AMD or the latest x60 card.

  32. Moraven says:

    And it does not help AMD that with every digital currency mining craze their cards sell out, making them unavailable at MSRP for people building PCs for gaming.

  33. zat0ichi says:

    £1,000 for a 1440p G-Sync monitor and a 1070 (granted, the GBP exchange rate is a joke at the moment).

    If that seems right to you then fair enough but to a lot of us it doesn’t.
    We don’t want free stuff.
    We just want a fair price, not a market optimised one.

    • oliseo says:

      Your “wants” aren’t essential, they’re luxury items.

      Crying about the cost of luxury items and stomping your feet because you’re angry at how hard you have to work to get them is something you should have grown out of by primary school.

  34. bfar says:

    Make no mistake about it: if AMD could make a competitive product, Nvidia could and would drop prices. However, the people I blame are the idiots who buy $1,000 graphics cards year on year, aka the Titan range. Talk about having the wool pulled over your eyes over and over again. At least Nvidia still push great performance out of their products; what Intel have been doing is simply shoddy.

  35. antszy says:

    “Why Nvidia is overcharging us all off, just a bit”

    Is that even a real sentence?

  36. oliseo says:

    Completely disagree with the Author. Here’s why.

    If you make the best product in the world, in this case the fastest GPUs, how is it “wrong” and “ripping people off” to charge the best prices for the best products?

    “Ripping people off” would mean you’re charging them the best prices in the world for average or mediocre products. Basically you’re charging ABOVE and BEYOND what that product is worth.

    How is nVidia ripping people off in this situation, exactly? When you buy a 1080 Ti, isn’t it the 2nd fastest card in the world? When you buy a Titan, isn’t it the fastest card in the world? How exactly are you being ripped off if you’ve just purchased the number 1 product in the entire world?

    Perhaps the author doesn’t actually understand what “being ripped off” means.

    Let’s say you buy the fastest CAR in the world. Should you feel ripped off because it doesn’t cost the same as a second-hand Ford Escort? Or does the author BELIEVE HIMSELF ENTITLED to get the best in the world at mid-range prices?

    The author just comes across reeking of entitlement.

    Who amongst us, having made the best product in the world, wouldn’t charge accordingly for our product?

    Some random dude crying about the cost on the Internet isn’t going to shame us if we make the BEST PRODUCT MONEY CAN BUY. Which, by its nature, is the antithesis of ripping people off.

  37. directorguy7 says:

    Sorry, but I kind of have to disagree. Yes, graphics cards are expensive. But.
    If the price is too high, people will not buy your product. Plain and simple.
    Especially “luxury” products like GPUs (you can always use the integrated GPUs in Intel’s processors). They have found their market, and they are doing what businesses solely exist to do: make money.

    And it’s not like they are making sub-par products. Because of this they rule over the GPU market politely. I will never use anything but Microsoft, but take them as an example of ruling over something (their market) in a more authoritarian way, with shitty, un-thought-out, non-progressive products (Windows 8, Windows Phone, anyone?). Nvidia uses its R&D better than a lot of companies and yes, they are moving faster than AMD can handle. Make a better product then. BE competitive.