Nvidia’s Turing graphics cards apparently delayed AGAIN

Not Nvidia's Turing

Normally, delays are considered a Bad Thing, but as the great graphics card price crisis rumbles on, it’s not like any of us actually have any money to upgrade our PCs anyway, so the later, the better, really, when it comes to hardware.

Indeed, the latest gossip appears to suggest that Nvidia’s Turing graphics cards won’t be here until the autumn now, after previously being tipped for a reveal at the end of this month during Nvidia’s GTC conference, and then later for a mid-June release date once that initial rumour had been well and truly busted.

Well, DigiTimes has apparently done some more myth-busting today after speaking to ‘some market sources’ about the ongoing cryptocurrency mining fiasco. Despite prices for the best graphics cards remaining sky-high of late, the demand for mining-friendly GPUs is finally starting to fall, according to those in the know, and both Nvidia and AMD have reportedly been slowing down their GPU production lines as a result to try and get things back to normal.

Consequently, Nvidia’s next-gen Turing cards will now not enter mass production until sometime between July and September. If true, it means Nvidia will definitely miss the global PC bonanza show Computex in June, and maybe even Gamescom in August, depending on how quickly they can get the cards turned around for launch.

The cards in question are the successors to Nvidia’s current GeForce GTX 10-series, which will either be called the GTX 2070 and GTX 2080, or the GTX 1170 and GTX 1180 – no one can quite make up their mind.

For a while, no one could decide on their codename either. Previously we thought Nvidia’s next-gen cards would be codenamed Volta, using the same architecture that’s currently inside its ludicrous Titan V, but then there were murmurs of something called Ampere. As it turns out, all the evidence points to Ampere being Nvidia’s new pro-business line, with Turing reserved for their new consumer cards.

As is often the case with ‘hot rumours’, though, all this could be instantly blown out of the water once Nvidia’s GTC conference starts at the end of March, so we’ll have to wait and see what actually happens before we find out the truth. We might get a small glimpse of what’s to come, we might get nothing. All will (or won’t) be revealed come March 26.


  1. Drib says:

    So, some doubtlessly overpriced cards might be coming out late, or might not, and they might be called Turing, unless they aren’t, in which case they might not be.

    Man. Games hardware rumors aren’t exactly hard and fast, are they?

  2. iainl says:

    Still finding it bizarre that Ampere is supposedly the name for the serious math-crunching cards, while Turing is the consumer range.

    • airmikee99 says:

      From Wikipedia:

      “André-Marie Ampère was a French physicist and mathematician who was one of the founders of the science of classical electromagnetism, which he referred to as “electrodynamics”. He is also the inventor of numerous applications, such as the solenoid (a term coined by him) and the electrical telegraph. An autodidact, Ampère was a member of the Académie des sciences and professor at the École polytechnique and the Collège de France.

      The SI unit of measurement of electric current, the ampere, is named after him. His name is also one of the 72 names inscribed on the Eiffel Tower.”

      Ampere is so badass you don’t even know his name, while Turing is recognized by consumers.

      • JustCallBen says:

        And yet, somehow, Ampere never saved our Yankee asses from Germanic conquering as Turing did. So, um, there’s that.

  3. sk0sH says:

    Lol– not like any of us can afford to upgrade our PCs right now anyway…

    Truer words have never been spoken. Seriously, I mean most people, at least in the U.S., can’t afford a $1000 emergency. What makes nVIDIA think they are going to strike it big with their newer cards? Unless it can devour anything I throw at it, it’s probably not worth buying…and they need to find their sweet spot in terms of pricing.

    But, I would imagine all the cryptocurrency miners are going to buy up all the cards again, and then re-sell them after they run them into the ground for a massive profit.

    I’m still clinging to my GTX 970, now overclocked a bit to handle PUBG better… all my other hardware is future-proofed, but if the 2080/1180 costs more than $600, forget it.

    The 1080ti is only slightly faster than my OC’d 970 and it’s still over $1,000. LOL! People must be fools.

    nVIDIA really lost their customer base over the last generation of cards, I think. Barely anyone I know has a 1080, much less a 1080ti. Most of my friends have 980s or a 1060. Whilst I would like to go all-out on a super powerful card, I would rather have a second monitor, or a new gaming mouse with a lot of extra side buttons, like one of the more recent ones from Corsair.

    Looking forward to whatever nVIDIA has in store, but for the love of god, don’t rake us gamers over the coals with the crypto mining people that will buy the first 10 zillion cards.

    • Skiddywinks says:

      “The 1080ti is only slightly faster than my OC’d 970 ”

      Well that is clearly a load of horseshit, looking at any number of benchmarks out there. What games do you play?

      “nVIDIA really lost their customer base over the last generation of cards, I think. Barely anyone I know has a 1080, much less a 1080ti. Most my friends have 980s or a 1060.”

      I kinda agree and disagree. I mean, yeah, I’ve been feeling the same for years with card prices getting insane. But you just said that your friends still own nVidia cards at the end of the day. Even if they keep spending the same absolute amount on an nVidia card, nVidia haven’t lost anything: they still have the market share, and have the halo products around to claw money from the people who are willing to spend it.

    • Foonshanks says:

      @sk0sH 1080ti close to an OC’d 970… That’s not even close to true. I’ve got both cards, Sonny Jim. My 970 can’t do 100fps in demanding games on my 3840×1440 ultrawide monitor, and it sure as hell can’t do 4K 60fps on my 4K TV. My 1080ti can do all of the above. You don’t know what you’re talking about.

    • Foonshanks says:

      Hell, I’ve got a 1070 as well in another build, and it can’t hold a candle to the 1080ti. You’re extremely full of shit, saying an overclocked 970 is barely under a 1080ti’s performance. You obviously don’t have the proper equipment to have any idea whatsoever how ridiculous that claim is… And no shit most of the people you know are running 1060 cards. That’s the average middle-ground card; maybe 20% of new card buyers would have gotten a 1080, and 10% or less will buy a 1080ti, so yeah, you’re not going to see many people with them. I’m the only person I know personally who has one. I’m an enthusiast. You’re not going to see the average gamer shell out $600+ on a single graphics card. Lastly, nVidia has been proactive about not raising prices because of cryptocurrency. The cards will be expensive, because they’re powerful as all hell and are using the latest version of VRAM, which is very new.

  4. Raoul Duke says:

    “the demand for mining-friendly GPUs is finally starting to fall, according to those in the know, and both Nvidia and AMD have reportedly been slowing down their GPU production lines as a result to try and get things back to normal”

    This makes no sense. Prices are high because demand is outstripping supply, so if demand is finally going to ease, why would reducing supply “get things back to normal”?

    • Pogs says:

      It’s their ‘normal’ level of supply. They like to maintain the price of their cards at what people will pay for them. If they see demand from the bitcoin miners falling, they’ll reduce supply to avoid a glut, an oversupply, and a price drop below their ‘normal’. They cashed in on the boom by upping supply a bit, but not by enough to undercut that ‘normal’ price point.

      Now they don’t want to suffer if the bottom drops out of the market while they’re still running at bitcoin-boom production levels.

      Upshot, they don’t want us having cheap cards.

      • MajorLag says:

        That’s pretty much my analysis as well. It’d be really great if a nimble new competitor entered the GPU scene.

      • Raoul Duke says:

        My concern is that they now regard ridiculous price gouging as ‘normal’ and so the drop in production is designed to maintain current prices, rather than get things back to where they should be.

  5. pistachio says:

    I really, really need a price drop.

    Just upgraded CPU / MB / memory last week, in part because the battery on my old motherboard ran dry so it wouldn’t wake from sleep properly (it was a very high-end 1st-gen i7 config finally starting to get old after about 6 or 7 years). I could run at 5GHz with my new setup if I wanted, but there’s no need, since my graphics card is… a GTX 960. How’s that for a bottleneck? But I’d rather run some games at 40fps than pay the insane price of current cards. It’s not about money, but value for money. 800+ euros for a card that fits my new build is just not acceptable.

    I can wait, my last system lasted me for over half a decade.

  6. hEadSpike says:

    Whatever the performance of an old or new card is or will be, the price is effectively already set: it’s however long it takes the card to mine back its own value – say, 10 months. I don’t see why it’s so hard to game and mine on the side to offset the outrageous prices. Just a thought.
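For what it’s worth, the “mine its own value” break-even idea in that last comment can be sketched with some simple arithmetic. All the figures here (card price, daily mining revenue, wattage, electricity rate, and the function name itself) are illustrative assumptions, not real market data:

```python
# Rough break-even sketch: how many days of mining it would take a card
# to pay for itself. All numbers below are hypothetical examples.

def days_to_break_even(card_price, daily_revenue, power_watts, electricity_per_kwh):
    """Days until net mining income covers the card's purchase price."""
    # Electricity cost of running the card 24 hours a day.
    daily_power_cost = power_watts / 1000 * 24 * electricity_per_kwh
    daily_profit = daily_revenue - daily_power_cost
    if daily_profit <= 0:
        return float("inf")  # the card never pays for itself
    return card_price / daily_profit

# e.g. a $700 card earning $3.50/day in coins, drawing 180 W at $0.12/kWh:
days = days_to_break_even(700, 3.50, 180, 0.12)
print(round(days))
```

If daily revenue falls below the power bill (as it can when coin prices crash or difficulty rises), the function returns infinity, which is the scenario the miners in this thread are presumably betting against.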