Maybe Nvidia won’t be releasing their new Ampere/Turing graphics cards at GTC 2018 after all

Wrong kind of Turing

Earlier this week, the hot goss on the graphics card grapevine was that Nvidia was going to launch its new, next-gen line-up of GeForce GTX graphics cards at this year’s GTC 2018 conference later this month. Dubbed Ampere, or maybe even Turing (no one can quite decide between the two, it seems), these cards would replace Nvidia’s current range of 10-series cards, such as the Nvidia GeForce GTX 1070 and GTX 1080, with a brand-new, potential 20-series or maybe even 11-series of cards that would go something like the GTX 2070 and GTX 2080, or the GTX 1170 and GTX 1180, making some of this generation’s best graphics cards even better.

However, despite several outlets confirming with lots of ‘sources close to the matter’ that this will in fact happen, a new report from Tom’s Hardware suggests that all this is actually a load of hogwash and Nvidia won’t be launching anything of the sort at GTC this year, or indeed GDC, while we’re on conferences beginning with the letter ‘G’.

According to Tom’s Hardware, ‘multiple independent sources’ have told them that Nvidia may well give us a tiny teaser of what they’ve got coming at GTC, but the proper launch of their next-gen cards, including concrete details on their specs and everything else, will come sometime later. So don’t get your hopes up, essentially, as it may well end up all being a bit disappointing.

The good news, though, is that the whole confusion around what kind of architecture/code name/silicon wizardry might succeed its current Pascal tech seems to be getting clearer. For a while, we all thought Nvidia’s next set of cards would use its souped up Volta architecture, which has already been deployed in its top-end incredi-cards like the Titan V. Then we started getting wind of something called Ampere that was meant to be the next Volta, and then another thing altogether called Turing.

Well, it would now appear that Volta will continue to be rolled out across Nvidia’s higher-end products, but going forward there will be a much more distinct line between Nvidia’s work and play products. Indeed, Volta’s successor, Ampere, will be confined to servers and more businessy applications, while Turing will likely be whatever we end up with in consumer-based graphics cards.

Provided Tom’s Hardware’s info is correct, we might have to wait until mid-June before we start seeing Turing cards go on sale, as card partners have yet to receive Turing’s actual specs. It’s currently expected this information will arrive in May, with a proper launch pegged for some time around July. That means Nvidia will also likely miss the big Computex show in Taipei in June, and may even wait until Gamescom in September before pushing them properly.

Of course, there’s still a lot of speculation surrounding these new-fangled graphics cards, so we’re unlikely to find out anything concrete until GTC starts at the end of this month on March 26th. Either way, with graphics card prices the way they are right now, I’m sure we can all stand to wait just a bit longer before we start thinking about any potential upgrades.


  1. Sakkura says:

    Hardware rumors tur(n)ing out to be inaccurate? Say it isn’t so!

  2. Ghostwise says:

    It’s a conspiracy to make me finish my Baldur’s Gate playthrough by endlessly creating reasons for me not to upgrade my old GPU. It’s all my fault and I’m sorry.

  3. TormDK says:

    Boo I say, Boo!

  4. waltC says:

    Who cares?..;) Existing state-of-the-art GPUs are all (AMD RX480 and newer/nVidia 1080 and up) more than capable of running the gamut of existing 3d games without so much as a squeak of protest. It’s an especially poor market because of x-coin mining, which is driving up gpu prices to ridiculous levels. I won’t pay even $50 more than MSRP for a 3d card–and certainly not double the Manufacturer’s Suggested Retail Price…;)

    • TormDK says:

      That’s like, your opinion maaaaan.

      I care, I have a 1080Ti now – it isn’t enough for 1440p, and it certainly isn’t enough for true 4K.

      • ColonelFlanders says:

        Pretty much everything my friend runs on his 1080ti runs at 60fps in 4k provided he turns off Aa, which let’s face it you don’t need if you’re on a 4k resolution. Unless of course you have a 45 inch monitor or something stupid.

      • Vandelay says:

        Well, that’s not entirely accurate. My 1070 can run most things at max 1440p. Those it can’t normally only need a few options dropped down.

        If you want VR though, current cards are only just about okay at pushing 90fps. Higher resolution headsets and improved fidelity in VR games is really needing a new bunch of cards to come out.

      • Malarious says:

        Same boat. I can get most games running ~80fps at 1440p at max, on my 1080TI, but that’s not really good enough. Hoping next-gen’s TI-replacement will let me run at 120+ without drops. And 4k at a playable framerate is still a ways off.

      • Chomper32 says:

        I’m able to run plenty of games at 1440p and I have a 980. Maybe not a stable 144fps for my 144hz monitor on games like PUBG (I can have a stable 65 on medium though) but things like CS:GO which is what I mainly play work at 144+ fps

    • Bimble says:

      Nonsense? Show me a single card GPU that can run GTA5 maxed at locked 1080P/60. Yes 1080p. Yes locked 60. I’ll be over here…

      • ColonelFlanders says:

        You can bloody well stay there too. Every card as good as or better than a 780ti will piss all over GTAV. It’s a well optimised game.

        • Bimble says:

          Oh please. You missed the word maxed. Go and read up on the settings and the advanced settings. There’s a good kid.

          • Vitz says:

            How about you drop the act of being a sanctimonious ass and start realising that you’re plain wrong? Just to make sure, I booted up GTA V for the first time since I got my 1080 Ti and guess what, it’s a stable 60 at 1440p. Nope, not 1080p. 1440p. Now I CAN drop my FPS to about 18, but that involves “Frame Scaling Mode”, otherwise known as Supersampling. It literally upscales the resolution of your screen and then displays it at your native resolution. In other words, it’s not rendering at the 1080p resolution you like to cling onto.

            Current video cards are able to max out GTA V, and then some, at 1080p & 60FPS. At this point I should probably be the one to tell you to go sit in the corner like a good little kid.

          • Farden says:

            Yeah sorry mate but this is just wrong. My 1080 and old ass 4770k can run GTA V maxed at 2560×1440 at solid 60+ (around 70-80). If I crank a few pointless things like AA down a smidge I can run it at 165fps which looks amazing on my 165hz screen.
            GTA is super optimised, if you’re struggling with GTA V running 60 at 1080p on a 1080ti (that naming convention…) then you have something pretty wrong with your pooter.

          • givingtheheat says:

            What the hell are you on? I have to use VSYNC maxed out in 4k to lock 60fps. If I dont use vsync im well above. I never go below 60 maxed out in 4k on a 1080ti and 8700k. Dumb people.

      • fsgeek says:

        I have an i7-3770K with a ~6 year-old GTX 670 and I can run GTA5 at a fairly consistent 1080p/60 on medium-high settings. That game is very well optimised.

  5. Amphetamine says:

    Last 2 upgrade cycles on GPUs were both ~18 months, and at this point I’ve had my 1080 for close to 24 months. So if these rumours are anything like correct then we could be looking at a 30+ month upgrade cycle this time around.

    Given that Jensen said there would be 7 new desktop card SKUs released this year, does that mean we could see xx30, xx50, xx60, xx70, xx80, xx80Ti, and Titan T released in the last 4 months of the year? That seems unlikely….

  6. MajorLag says:

    I’m finding it really difficult to summon any give-a-damn about GPUs these days. Seems like, going forward, there’s a good chance even the mid-tier will be locked out to anyone unwilling to pony up as much for the card as the rest of their PC.

    Future looks pretty grim for 4k and VR gaming.

    • hungrycookpot says:

      IMO all it’s going to take is someone releasing a dedicated mining card, something more cost-effective that’s built exactly for mining specs so you’re not overpaying for gaming features you don’t need for mining, and then it won’t make sense to buy gaming gpus anymore. Either that or the coin market crashes, I’m fine with both.

  7. dethtoll says:

    They’re probably waiting for the next big cryptocurrency crash. As soon as you hear the keening wail of sad nerds losing their houses you’ll know it’s time to buy a new GPU.
