Nvidia’s New GeForce GTX 1080 Graphics

Hate to say I told you so. Or rather, I don’t and so I’m going to gloat. Contrary to numerous comment protestations, Nvidia’s 2016 graphics awesomeness has begun in the shape of its new GeForce GTX 1080 and 1070 cards. Based on the new Pascal architecture and teensy 16nm transistors, the new GPUs are exactly as expected. And yet also quite different. Meanwhile, AMD has dropped some hints regarding the shape of Radeons to come. It all adds up to an exciting summer for PC graphics and a very good reason to put your GPU purchases on temporary hold, especially if VR is your bag…

So, yes, many doubted this would be happening so soon. But Nvidia has announced its first Pascal graphics chips for gamers, the GeForce GTX 1080 and 1070. Obviously, this is initially a paper launch, but I can assure you cards are incoming and I’ll report back when I’m eyeballing actual hardware myself.

So, what’s this new Nvidia clobber all about? The non-surprise aspects involve the shrunken 16nm FinFET transistors made by production partner TSMC, and the Pascal architecture. Long story short, that all adds up to more than a double-generational leap in chip production technology. And it’s ultimately that technology that underpins progress and improved performance for computer chips of all kinds.

The key specifications for the 1080 version start with 2,560 of Nvidia’s CUDA cores (shaders to you and me), 160 texture units (this particular figure isn’t fully confirmed) for, you know, processing textures, 64 render outputs for squirting pixels in the general direction of your display, and 7.2 billion transistors.

For reference, the existing card that the 1080 replaces, namely the GTX 980, rocks in at 2,048 shaders, 128 texture units, 64 render outputs and 5.2 billion transistors. Based on those numbers alone, the 1080 looks like a disappointment. A double-generational leap in production tech and not even 50 per cent more transistors or shaders? Really?

Nvidia’s lean, mean pixel-pumping machine

Well, here comes the mitigation. For starters, Pascal shaders are not directly comparable to the Maxwell shaders in the old 980. Pascal shaders probably do more, er, graphicsy stuff. But more importantly, these new Pascal chips run way, way faster. The maximum boost clockspeed for the 1080 is 1,733MHz, compared with the mere 1,216MHz of the old 980. In both cases, we’re talking speeds for standard ‘reference’ cards. Board makers like Asus, MSI and the rest will inevitably cook up overclocked specials.

The rest of the spec list includes a 256-bit memory bus and GDDR5X memory with a data rate of 10Gbps – note these first Pascal boards do not boast fancy HBM memory, though GDDR5X is a minor novelty and features four data transmissions per cycle to the two of GDDR5. IE it’s twice as fast at any given operating frequency.
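
If you want to sanity-check the headline bandwidth number, the arithmetic is simple enough. A quick sketch in Python – my own back-of-the-envelope sums, not official Nvidia figures:

    # Peak memory bandwidth = bus width (in bytes) x per-pin data rate
    bus_width_bits = 256    # the GTX 1080's memory bus
    data_rate_gtps = 10     # GDDR5X, gigatransfers per second per pin
    bandwidth_gb_s = bus_width_bits / 8 * data_rate_gtps
    print(bandwidth_gb_s)   # 320.0 GB/s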

Anyway, the upshot is a near doubling in raw computational performance over the 980, from 5TFLOPS to 9TFLOPS. Actual gaming performance will not be twice as fast, I suspect, but something in the region of 50 per cent faster in many cases… though with one critical exception. With Pascal, Nvidia has cooked up some special sauce for VR gaming.
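
For the curious, those TFLOPS figures fall straight out of the specs, assuming the standard two floating-point operations (a fused multiply-add) per shader per clock:

    # Peak FP32 throughput = shaders x 2 ops per clock (FMA) x boost clock
    def tflops(shaders, boost_clock_mhz):
        return shaders * 2 * boost_clock_mhz / 1e6

    print(tflops(2048, 1216))  # GTX 980:  ~4.98 TFLOPS
    print(tflops(2560, 1733))  # GTX 1080: ~8.87 TFLOPS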

Simultaneous Multi-Projection, for it is ye, is a new feature that allows for rendering of multiple viewpoints – up to 16 – in a single pass. At this stage, the technical details are a little foggy. But the key implication is a huge leap in VR performance.

Virtual reality rendering, of course, requires stereo images – one for each eye. The basics of Simultaneous Multi-Projection involve rendering the geometry for both eyes just once, then discarding pixels that won’t be visible in the final VR image before they are rasterised. As I said, the full details aren’t clear, but Nvidia is more or less implying that you get two rendered images for the price of one, which is exactly what you want for VR.
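
Nvidia hasn’t published how Simultaneous Multi-Projection works under the hood, so treat the following as a toy sketch of the general idea only – made-up scene, made-up eye offsets – contrasting two full per-eye passes with a single shared geometry pass:

    # Toy illustration only, not Nvidia's implementation (which is undisclosed)
    def transform_geometry(vertices):
        """Stand-in for the expensive per-frame vertex/geometry work."""
        return [(x, y, z) for (x, y, z) in vertices]

    def project(vertices, eye_offset):
        """Trivial perspective projection, shifted per eye."""
        return [((x + eye_offset) / z, y / z) for (x, y, z) in vertices if z > 0]

    scene = [(0.0, 1.0, 5.0), (1.0, -1.0, 3.0), (-2.0, 0.5, 8.0)]
    eyes = {"left": -0.032, "right": 0.032}  # ~64mm eye separation, in metres

    # Naive stereo: the heavy geometry work runs once per eye
    naive = {e: project(transform_geometry(scene), off) for e, off in eyes.items()}

    # Multi-projection idea: geometry runs once, only the cheap projection repeats
    shared = transform_geometry(scene)
    single_pass = {e: project(shared, off) for e, off in eyes.items()}

    assert naive == single_pass  # same images for roughly half the geometry work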

See, it’s faster. But note the ‘VR’ small print…

Certainly, Nvidia is claiming that the 1080 will be twice as fast as even a GeForce Titan X for VR rendering. That’s stunning if true and would make the 1080 an absolute, instant no-brainer for VR fans. Simultaneous Multi-Projection also has beneficial implications for multi-monitor setups.

Elsewhere, further interesting features include the fancy new ‘Ansel’ screenshot tech that Alice mentioned previously. Again, this is effectively a technology that takes a snapshot of the whole 3D engine rather than a 2D frame. In other words, once captured, you can navigate around the frame in full 3D, apply filters and effects and even up the resolution and quality settings dramatically. So you can take a screenshot on a feeble £100 GPU and make it look like you’re running a 5K panel and triple GTX 1080s. Apparently, the final output can extend to 61,440 by 34,560 pixels. Madness.
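
To give that resolution some scale, here’s what a single capture that size would weigh uncompressed – again my own sums, assuming bog-standard 8-bit RGBA:

    # Uncompressed size of a maximum-resolution Ansel capture (8-bit RGBA assumed)
    width, height = 61_440, 34_560
    bytes_per_pixel = 4
    size_gb = width * height * bytes_per_pixel / 1e9
    print(f"{size_gb:.1f} GB")  # ~8.5 GB per frame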

Inevitably, there’s a whole bunch of other stuff including what’s being billed as ‘ray-traced’ audio designed expressly for VR gaming. You can grab Nvidia’s own overview here.

As for the slightly cheaper, cut-down 1070, full details haven’t been released yet. The 1080 officially goes on sale May 27th; the 1070 follows on June 10th. Pricing is officially $599 and $379 for the 1080 and 1070. Call that £450 and £270. For the record, the 1080 is $50 more expensive than the 980 was at launch, which is disappointing given that it is very likely a smaller, cheaper GPU to manufacture.

It’s arty. But is it farty?

Overall, these new chips are intriguing. I was expecting a bigger and more complex GPU, but a slower-running one. The net result in terms of performance is probably about level with expectations, but with that VR-performance kicker. In that context, the 1070 could well be the just-about-affordable sweet spot for VR gaming.

But what of AMD? Where Nvidia has Pascal, AMD will have Polaris, its own new family of chips. Nothing specific has been announced, but in recent interviews, AMD reps have been hinting that the target for the new GPUs is more mainstream than high end.

Wind your minds back to the Radeon HD 4000 series (cough) and you’ll remember that AMD made a big song and dance about the futility of high-end willy-waving in the graphics market – not many can afford £500 graphics cards – and instead concentrated on maximising performance at a price more people can swallow. Nearer £200/$300 than £400/$600, in other words.

I expect we’ll see a similar sales pitch for the new AMD Polaris cards. Whether this is by design or because AMD has slightly misjudged either its technology or what Nvidia has achieved probably doesn’t matter. What does matter is that it should make for real choice for us gamers. Absolute VR power from Nvidia. Bang for buck from AMD. Sounds good to me.

  1. Vesperan says:

    I’ve been waiting for this story to pop up.

    As one of the commenters asking ‘hey, what about AMD?’ in the last story: Nvidia has delivered.

    Well, not literally yet but there’s no reason to doubt the actual boards will arrive when they say they will.

    What has surprised me is that it went from no rumours, teasing or anything to a full-on announcement with (fairly) quick availability.

    Meanwhile there has been a comprehensive silence from AMD, leading the rumour mill to speculate that AMD is both delaying its Polaris chips and bringing forward its Vega chips. Which is amusing – rumours love a vacuum.

    So while the 1070 is far more than I can justify spending (if only because I’m on a 1080 screen!), this is good news.

    So hats off to you Mr Laird for sticking to your guns?

    • sfury says:

      I thought they made a counter-announcement on their own – they’re having a press event later this month? link to wccftech.com

      Probably gonna announce a soon-ish release date for Polaris, but yeah, Nvidia’s response was swift and AMD are probably still scratching their heads over whether to rush their own release dates to have some parity.

      Still – the 1080 and 1070 are pricey, enthusiast-level cards, and Polaris seems to be mid-level/performance cards. Hopefully AMD seizes the chance and releases them at more affordable prices like the 4000 series, as Jeremy said.

  2. Greg Wild says:

    If AMD can replicate the 4000 series I will be very happy. My 4850 was a great card back then.

  3. Gibs says:

    I’m so gonna buy a new videocard this summer. Even if it has to be nVidia.

  4. TheBloke says:

    Don’t I feel silly, having finally taken the plunge to update my aging graphics with a 980Ti… two weeks ago.

    I’m still holding on to the ray of hope that all these 1080 comparisons are against the base 980, not the 980Ti. Mine is the MSI, OC’d to 1,279MHz and with 2,816 cores, which is more than the 1080. But of course, running far slower – and it seems I spent more on the 980Ti than the 1080 is going to cost! Wow, this may be the worst upgrade I’ve ever made.

    Oh well, I have no plans to VR so at least I won’t notice that difference. I’d think about selling it except of course 980 prices are going to plummet, and I really don’t want to end up spending yet another hundred or two on top of the stupid amount I already spent.

    Live and learn – I should have researched better and realised how close a new generation was.

    • TimRobbins says:

      Meh, as soon as you bought the 1080, the super ultra deluxe 1080Ti specs would be announced. If you don’t care about VR, that 980Ti will probably die of regular use before optimized game development catches up to actually tax it.

      • gritz says:

        The 980Ti came out 10 months after the 980; that’s a pretty serious wait.

    • Buggery says:

      I mean, sure, enthusiast cards are always a risky gamble unless you time the purchase juuust right. Buuut… A 980ti will serve you perfectly well up to 4k (with some graphical fripperies turned off) for many years to come.

      I bought a 770 about 3 months prior to the 9X0 series being announced and it still serves as a 1080p powerhouse and will eventually be used in a couch gaming PC when I drop some cash on a 1080ti (once that comes out). No regrets.

    • C0llic says:

      Don’t be too hard on yourself. VR is the only thing you’ll need the extra power for; that’s still unproven and will take a fair while to settle (a year or two at least I think). I think by the time it does you won’t regret having a great card in the meantime.

    • Narwhalicorn says:

      Most Nvidia partners, like EVGA & Gigabyte, have step-up programs where you can buy the newer cards for the difference if you bought the older one within 90 days. Just a heads up.

    • Love Albatross says:

      Where did you buy it? If it was online and they have a generous returns policy it might not be too late to get a refund.

      This is why I like to get hardware from Amazon whenever possible…they don’t give a shit.

    • Gilmer9999 says:

      I don’t feel that bad and I just bought a 980TI to build a new computer. I imagine for my purposes a 980TI will be great over the next 2 years at least!

  5. Amatyr says:

    I am so ready for a 1080 to go with my ultrawide 1440p screen.

    • gritz says:

      A little ironic that the 1080 is best used on things that aren’t 1080p.

  6. Psihomodo says:

    I’m not sure why everyone parrots prices that are not correct for the consumer.

    The 1080 will cost $699 because only the “Founders Edition” will be available at the start, with a few partner cards. And those with added value/cooling will again cost around $699.

    Same for the 1070 and $449.

    • Maxheadroom says:

      Yeah, no matter how they dress it up, the ‘Founders Edition’ (ie reference) being $100 more than their own MSRP just seems like a ‘fuck you’ tax on early adopters.

  7. Asurmen says:

    AMD have been saying for some time that Polaris is aimed at the low-to-mainstream part of the market. Vega is high-end to enthusiast.

    There is a new rumour that they’re moving Vega up from first quarter of 2017 to October. I can’t see that being true really.

  8. J. Cosmo Cohen says:

    Wait, so, I’m still better off with a 970 for 1080p gaming? Or…?

    • gunny1993 says:

      You’ll probably be fine, if you’re okay with playing most (AAA) games at around 60fps maxed out or close to it.

      Almost certainly a bad idea financially for you to upgrade until next year.

  9. Sakkura says:

    “GDDR5X is a minor novelty and features four data transmissions per cycle to the two of GDDR5. IE it’s twice as fast at any given operating frequency.”

    That is not the case. GDDR5 already has an effectively quad data rate, by using a double data rate (=DDR) off of two offset clock signals. GDDR5X does not change that; it just tweaks the number of nibbles and adds signal pins, which helps it achieve higher clocks than GDDR5. But the performance per clock cycle is completely unchanged.

    That’s why the 1080’s 256-bit GDDR5X running 10 GT/s only gets 25% more memory bandwidth than the 1070’s 256-bit plain GDDR5 running 8 GT/s.
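
    The arithmetic is easy to check – plain bus-width-times-transfer-rate sums, nothing vendor-specific:

        # Peak bandwidth = bus width (in bytes) x transfer rate
        gddr5x = 256 / 8 * 10  # GTX 1080: 320 GB/s
        gddr5 = 256 / 8 * 8    # GTX 1070: 256 GB/s
        print(gddr5x / gddr5)  # 1.25, i.e. exactly 25% more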

    • Jeremy Laird says:

      link to anandtech.com

      “At a basic level, GDDR5X increases the overall capacity of the memory bus by moving from double-pumping (DDR) to quad-pumping (QDR), pairing this change with a larger memory prefetch to feed the faster memory bus, all while avoiding a more painful/difficult increase in the memory core clock.”

      link to monitorinsider.com
      “QDR mode: data can be transferred at up to 4 times the rate of the word clock. (New!)”
      “The most important feature is the new QDR mode, but there are some interesting aspects about the other points as well. We’ll have a deeper look at them below.”

      link to extremetech.com

      “GDDR5X accomplishes this in two separate ways. First, it moves from a DDR bus to a QDR bus.”

      And loads of other similar sources. I was going to check the JEDEC whitepaper, but you have to register to download it and I couldn’t be arsed.

      • Anti-Skub says:

        The real question is how much difference the HBM version is going to make. I heard some pretty wild claims about the card coming in 2017 being the same step up again over the 1080. Is this likely to be true? Is it worth waiting for that? Is the HBM card going to replace the 1080 next year or come in at a Titan-style premium price point?

        • TacticalNuclearPenguin says:

          Minimal as per usual, with more opportunities to shine at higher resolutions but still not a game changer; as AMD taught us, 90% of the legwork is still done by the chip.

          The Ti and Titan will be fast mostly because of the fat chip and the double-generational leap in power efficiency; the 1080 is extremely laid back considering it requires a single 8-pin connector.

          I’m not saying HBM does nothing, of course it does, but it’s never going to be remotely comparable to the hype it generates. Still, it’s the way forward and it’s factually better than GDDR, so I’m obviously glad they’re using it.

        • ScubaMonster says:

          I think if you keep waiting for something better that’s rumored to be coming you’ll end up never buying anything. There will always be something better on the horizon. If you really need an upgrade right now I’d just go ahead and jump on 1080/1070. If you already have something like a 980 then I’d say you could just wait.

          • ScubaMonster says:

            Well okay “rumored” is a bad word. They WILL be coming. But I stand by my statement.

      • Sakkura says:

        Thing is, GDDR5 is already running an effective Quad Data Rate. Look at the GTX 980 for an example; 1753 MHz memory giving a 7010 MT/s data rate.

        What you’re doing is parroting some technically inaccurate marketing from Micron, who understandably want their new memory to sound impressive. “Quad data rate” is a little more understandable to people than “more nibbles” or “doubled prefetch” – which is, as your sources explain, what’s actually happening.

        GDDR5X is still fundamentally using DDR operation, just achieving an effective 8X data rate by going wider than GDDR5. That’s why they show 4 offset clock signals in their diagrams.

      • TacticalNuclearPenguin says:

        Honestly even Nvidia markets the 1080’s bandwidth as something like 320 things/thingers, which is not that impressive.

  10. caff says:

    This Nvidia generation represents a reason for me to upgrade everything – but mainly my i5 2500K which has served me so well for 5 years now. I’ve upgraded bits around it, but this CPU and mobo have been a great investment.

    However, bring on either Haswell-E or 6700K and let’s smash up the 4K graphics arena with this Nvidia shizzle.

    • Unclepauly says:

      If you’ve clocked yours to 4.5 or higher there’s almost zero reason to upgrade (for gaming) unless you need something that newer mobos have. IPC improvements are on average only 10-15% with Skylake.

      • grundus says:

        As an ex-4.5GHz 2500K owner, this has not been my experience at all. I was playing GTA V a lot before and after my switch to Devil’s Canyon and the difference was huge – I went from a stuttery, vomit-coaxing nightmare to an unbelievably smooth dream even at the stock speed. I don’t know what the 4690K has that the 2500K hasn’t, but I was very surprised by the difference.

        • TacticalNuclearPenguin says:

          There’s definitely something weird with your experience, but the difference in IPC is indeed really starting to add up – not remotely as low as the 10-15% the OP suggested.

          2600K here at 4.6GHz; I’m starting to feel the weight with CPU-bottlenecked stuff like Black Desert and other games that are showing some remarkable advantages with new tech. I guess it’s not only about IPC either but also many other little intricacies that are helping the situation, especially if games are now being developed on the new stuff without much importance being given to how they translate to older (even if still performing on paper) stuff.

          I was trying to wait for Skylake-E, but honestly I’m not sure I want to spend that much, and I’m not going for Broadwell-E since it still sits on an obsolete chipset. I’m most likely going to wait for Kaby Lake (Skylake refresh) and the new 200-series chipset.

          • PenguinJim says:

            Their experience sounds correct to me. The i5-2500K had the HD3000 IGP – moving to the HD4600 would certainly feel as grundus describes.

  11. Astatine says:

    I’m in two minds about this… it’s nice to see technology advancing, but I’m a FreeSync-owning AMD fan, and it’s frustrating that it looks like they’re going to cede the high end to Nvidia for the next few months. Their market share is already low enough these days that it makes me worry they’re going to drop out of gaming graphics entirely.

    Going to wait and see how all this plays out. My 2-year-old Radeon 290X still seems more than adequate for all my regular games and everything I’ve tried on my shiny new Vive… :)

    • Asurmen says:

      Nah, they’re not dropping out. They’re just focusing on efficiency and the mid-market with Polaris. They’ve made a big hoo-ha about hitting the minimum spec for VR at a cheap price and low power level.

      • jezcentral says:

        One of the problems with the more mainstream part of the market is the margins are a lot less. I want to see AMD flourish (to keep Nvidia pushing forward), and that is only going to happen when they make a nice profit on their products.

        • bhauck says:

          Couldn’t you just as accurately say:

          “One of the problems with the [high-end] part of the market is the [volumes] are a lot less. I want to see AMD flourish (to keep Nvidia pushing forward), and that is only going to happen when they [sell a lot of their products].”

    • Little_Crow says:

      It’s been a long time since AMD/ATI have stolen a sizable march on either Intel or NVidia so rumours of AMD in collapse are not unusual. News of their continued financial losses while ‘restructuring’ only add fuel to the fire.

      AMD dropping out of the high end graphics market would be disastrous, we rely on the 2 companies pushing each other to drive tech advances and keep prices down – a monopoly will only hurt consumers.

      Your comment on market share made me look at the Steam Hardware survey, and you’re right, it’s stark reading for AMD. Nearly 60% NVidia and growing, whereas AMD are at 25% and shrinking – it’s no wonder the rumours never go away.

  12. Det. Bullock says:

    Mmmm, I wonder how Polaris will do. At the moment nothing under 200 euros is that much of an upgrade for me; perhaps because I mostly play games on the cheap, my Radeon HD 7770 has been more than enough.
    I just wonder, with so many console ports that eat up so much video RAM, if we are going to see low-ish range cards with 3GB or more even if the chipset itself wouldn’t be that much beefier than my current video card.

  13. PenguinJim says:

    “For the record, the 1080 is $50 more expensive than the 980 was at launch, which is disappointing given that it is very likely a smaller, cheaper GPU to manufacture.”

    Of course, you’re not buying just the GPU, you’re buying the graphics card. But you’ve piqued my interest – what is your source for GDDR5X having the same (or cheaper) cost to manufacture as GDDR5?

    “Pricing is officially $599 and $379 for the 1080 and 1070. Call that £450 and £270.”

    Ooh, I haven’t seen that £270 figure elsewhere. Given that the 970 launched at $330/£260, I’m surprised to see a $50 increase in US RRP has only impacted the UK’s price by a tenner. Did Nvidia themselves confirm the £270, or have you found a pre-order at that price?

    • Unclepauly says:

      I have a feeling you latch onto errors and mistakes like a shark to a bleeding seal pup.

      • PenguinJim says:

        Your feeling has misled you in this instance, I’m afraid. While there are several definite errors and mistakes in this article, if you’d read through it you’d have seen that I have completely ignored them. Anyone who cares already knows, so what’s the point?

        But I have no idea about GDDR5X manufacturing costs, nor do I have any idea about Nvidia’s relative investment in software over the past three years, so it’s probable that Jeremy knows something on that score. Otherwise why bring it up?

        I’m also not clear on why he arrived at that figure for the 1070 price. I know you’re saying he’s just pulled it out of his arse, but that doesn’t seem likely to me – doesn’t he get paid to write these articles, regardless of their timeliness?

    • fish99 says:

      I think it’ll be closer to 290-300. And that’s once the launch price gouging is over and stock becomes plentiful.

  14. AutonomyLost says:

    I fear I will not be able to resist IMMEDIATELY purchasing a pair of EVGA 1080 Hybrids when they inevitably release… It won’t be remotely necessary, given what I’m already running, but I know myself all too well.

    I’m somewhat surprised the announcement of this new line came as it did, with such little preemptive fanfare on Nvidia’s part. I for one figured we’d be getting the follow-up to the 980Ti and Titan X this summer, but was second-guessing myself after Papa Pascal was unveiled just a short while ago. Thought we’d have to wait until they figured out how best to strip it down for the enthusiast market.

    I’m pleased it is here, undoubtedly, but am not EXACTLY looking forward to parting with roughly $1500… Such is life, though, and my compulsion to own the best gear. Regardless, I’m happy those that have been yearning for, and patiently awaiting, a proper reason to upgrade to a new GPU now have one. This is a great time to be gaming.

  15. AlexStoic says:

    As someone who’s not an expert on this stuff and is interested in a 1080, is there any use for my current 980 (not 980Ti)? Or is my best bet to toss it up on eBay?

  16. FabriciusRex says:

    When Vega is out I’ll consider buying a new GPU. Either way my next card will be from AMD. (I’m getting tired of Nvidia’s use of proprietary tech and refusal to collaborate with open standards like FreeSync.)

  17. Dale631 says:

    From the specs comparison alone (link to versus.com), I don’t see how the 1080 would be able to push twice the performance relative to the Titan X. Though if the 1080 really can do what Nvidia claims, it would be amazing to see how efficiently it uses its resources.

  18. FCA says:

    I’m so excited for a new videocard, it’s a bit unhealthy. My plan was to buy a new computer this summer (have done so every 2 years since around 2002), but given the relatively small increase in CPU power, I might just spend it all on a videocard.

    As a bonus, I can tell the missus it’s for better “machine learning performance”, not just gaming ;). It’s sad, but OpenCL does not seem to have been adopted much for high-performance computing. All the libraries I see are either CUDA or no GPU acceleration at all.

  19. Maxheadroom says:

    Anyone else confused by some parts of the presentation?

    Like when he was showing off that VR funhouse tech demo and saying “Notice how when I push this balloon into these other balloons how they all react realistically and bounce off each other!” like this is the first card to feature any kind of realistic physics.

    And the ‘fake out’ with the image of the soldier and “One day we’ll be able to render this in real time. Well, one day is today!” looked more like a Crysis loading screen than anything ground breaking.

    I’m not trolling, I’m genuinely confused why they were showcasing Pascal doing something my 5-year-old graphics card can do. Am I missing something?

    • TacticalNuclearPenguin says:

      Nah, it’s just you not being sensitive enough when it comes to fine detail, I guess.

      Still, it’s marketing; rendering something admittedly extremely complex against a blank background is not the most impressive thing ever.

  20. sirdavies says:

    I just wanted to say, thank you so much for that blog you posted around New Year’s predicting these changes. I’m sure if I hadn’t read that I would have made a silly investment around Dark Souls 3’s launch!

    • Themadcow says:

      Same here, although I’ll probably wait until next Jan to upgrade. The upcoming SSD innovations are also very exciting but God knows how they’ll be priced.

  21. Solidstate89 says:

    Should be a very nice upgrade from my GTX 780. That just doesn’t cut it anymore at 3440×1440.

  22. Replikant says:

    I really, really want to replace my GTX 460 and finally be able to be overwhelmed by VR (one of the few proper pieces of sci-fi to arrive in this new millennium).
    Still, I am so not going to buy a 1070 or 1080 card. A shrunken die for improved yields for the manufacturer, but at higher cost for the customer, with a somewhat underwhelming performance increase? It really does sound like a move to milk early adopters.
    I fully expect NV to lower prices/release new models when AMD finally releases their next-gen models.
    The big player is starting to cash in on its market-leader position; we need more competition. I’ll be switching back to AMD. At the very least, I’ll wait for Polaris to be released.

  23. Carra says:

    Been waiting for the new generation. Should be a nice upgrade over my 670 (doesn’t seem worth it to upgrade my i5 3570K). I’ll finally be able to play the Witcher 3 at 2560×1440 with all bells and whistles.

  24. fish99 says:

    I wanna see some real-world numbers, benchmarks, power consumption, etc… then I’ll consider replacing my 970 with a 1070 if they’re under £300. I do want one though.

    My (EVGA) 970 SC is going to be hard to sell though because it has a minor design defect. The balcony in the first room in Witcher 3 makes the card power off, and it’s also happened running Unigine Heaven. Other than that it’s solid, but it’s gonna knock a lot off its resale value.

  25. Cinek says:

    Odd that it took RPS so long to report on this GPU. It’s the most exciting thing in terms of GPUs since Titan, only unlike Titan, it’s decently priced.

    The 1080 so far in every benchmark seems to be significantly ahead of every GPU on the market, including the Titan X and 980Ti. Not to mention that for those invested in VR this generation seems to be a must-have. Hopefully Radeon will catch up, cause nVidia just raised the stakes more than ATI has for years now.

  26. kud13 says:

    I’m currently running a GT 740. Looking to upgrade this summer, within a $250 range.

    What’s my best option?

    I’m running AMD FX 4800, 16 GB DDR3 RAM and a 520 watt PSU.

    • kud13 says:

      May have gotten the CPU wrong – off the top of my head, 8-core @ 4.1GHz.

    • Sakkura says:

      Wait for AMD to launch their new cards, check if there’s any news on a lower-end Nvidia launch (like a GTX 1060 or 1060 Ti), and pick based on reviews.

  27. Niente says:

    I’ve got an i7-4790K and a GTX 980. I game at 1920×1080 and I’ll be sticking with this rig for the time being. I will probably upgrade when the 980 can no longer max out everything at 1920×1080 or when a new generation of consoles are released.

    Until then I’m happy with what I’ve got.

  28. ShatteredAwe says:

    Despite all of these benchmarks, I think I might still be getting a 980Ti for my first build.

    After all, August is when all the prices drop… right?

  29. Benratha says:

    But will it run Crysis?