Week in Tech: Don’t Buy A New Video Card

Actually, do. But possibly don’t. Or probably do. The problem here is partly ye olde NDA or non-disclosure agreement and the threat of legal immolation at the hands of sharp-suited lawyers and their homicidal liability clauses. I’m not actually under NDA, but I’ve seen things that are and there’s little value in getting people into trouble for the sake of 24 hours. And apparently Nvidia doesn’t fancy shifting its global PR campaign to suit RPS’s Thursday hardware slot. Short sighted as that may be, we must make do.

Nvidia is outing some new GPUs tomorrow and they’re definitely going to shake things up. In fact, they already have in terms of the pricing of existing graphics cards with some conspicuous bargains popping up – on this side of the pond at least. Meanwhile, there’s some interesting LCD screen news, including high refresh IPS on the horizon, and the Beast of Redmond officially brings the Xbone’s controller to the PC. Yay! But there’s no wireless support. Boo!

So, we’re right in the heart of the graphics silly season with Nvidia about to unleash some new high-end graphics cards on Friday – GeForces GTX 980 and 970 – and rumours building around AMD’s next uber GPUs. And yes, that does mean Nvidia is largely skipping the GeForce 800 Series in terms of branding on the desktop.

We’ll have to wait for the official announcement to get into the details, but these new GPUs are based on Nvidia’s new Maxwell technology as already seen in the GeForce GTX 750Ti. That’s enough to tell you they’re likely to set new standards for efficiency. Trust me on this, these things are phenomenal, especially when you consider that Nvidia has been forced to use skanky old 28nm transistors once again instead of simply leaning on the usual efficiencies and performance gains that come with a shift to smaller transistors. When Maxwell GPUs get a die shrink they are going to be outrageously good.

Anyway, as far as I’m aware AMD doesn’t have any imminent product to respond to the new Nvidia GPUs, so price cuts it is for at least the next few months. Which is obviously great for you and me. There are already some tempting bargains on some of the usual retail outlets here in the UK, with Scan offering an XFX branded Radeon R9 290X for just £270 – less than its cheapest R9 290, in fact.

Doesn’t make much sense until you factor in the awful reference cooler on this particular 290X. It’s ridiculously noisy and not very good at cooling. You have been warned. Thing is, the 290X is a beast with a 512-bit bus and for £270 is bloody tantalising, even with a borked cooler. I probably still would.

Whatever, my advice for anyone in the market for a new GPU right now is keep your scanners peeled for some great deals in the next few days. No doubt there will be interesting deals wherever you’re based.

If this is the Jesus panel, think of the upcoming IPS equivalent as the gaming monitor Richard Dawkins would choose…

Next up, high-refresh IPS panel tech for gaming PC monitors. IPS is, of course, the LCD screen tech du jour. It’s higher quality in terms of colours, contrast and viewing angles than cheaper TN screens, but has so far been slower in terms of response. That’s why we haven’t seen 120Hz-plus IPS monitors to date.

The latest news involves good old AU Optronics, maker of an awful lot of the actual LCD panels you find in many affordable PC monitors. It’s announced a new 27-inch 2,560 by 1,440 IPS-type panel with native 144Hz refresh support.

I say ‘IPS-type’ because it’s an AHVA panel which is AU Optronics’ take on IPS – a bit like PLS is Samsung’s version. Anyway, I have an AU Optronics AHVA panel in my laptop and bloody nice it is, too.

So, before you dash out and buy that Asus ROG Swift with its slightly muddy 27-inch TN panel, you might want to ponder just how good something similar but IPS powered sounds. Pretty much perfect, if you ask me. Not sure exactly how long it will take for the AU Optronics panel to pop up in actual retail monitors, but I’m hoping to see something before the end of the year.

On a related note, VESA has nailed down the DisplayPort 1.3 standard. As ever, the big news is even more bandwidth and thus support over a single cable for irrelevancies like Dell’s new 5K monitor. That’s 5,120 by 2,880 on a 27-inch panel and nearly double the pixel count of a 4K display. Via Windows. Using today’s GPUs. Ha ha. Er, ha.

Of marginally more interest is the prospect of 4K at 120Hz. Again, it’s a slightly comical notion given the catastrophic GPU load that represents – your GPU would need to process and render roughly a billion pixels per second (3,840 by 2,160 at 120Hz is about 995 million) to drive a 3D game at 4K 120Hz. But then my 30-inch panels have lasted me nigh-on a decade, so if I was buying a 4K panel today, having 120Hz support ready for GPU technology to catch up would certainly be attractive.
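
To put a number on that claim, here’s a back-of-the-envelope sketch (illustrative only: it ignores blanking intervals and the link-encoding overhead a real DisplayPort connection carries):

```python
# Rough pixel throughput and raw bandwidth for a 4K 120Hz display.
# Illustrative only: ignores blanking intervals and link encoding overhead.

def pixel_rate(width, height, refresh_hz):
    """Pixels the GPU must deliver per second at a given mode."""
    return width * height * refresh_hz

def raw_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed video bandwidth in gigabits per second."""
    return pixel_rate(width, height, refresh_hz) * bits_per_pixel / 1e9

print(f"4K @ 120Hz: {pixel_rate(3840, 2160, 120) / 1e6:.0f} million pixels/s")
print(f"Raw 24-bit bandwidth: {raw_bandwidth_gbps(3840, 2160, 120):.1f} Gbps")
# 4K @ 120Hz: 995 million pixels/s
# Raw 24-bit bandwidth: 23.9 Gbps
```

By the same arithmetic, Dell’s 5K panel at a mere 60Hz is already around 885 million pixels per second, which is why it needs DisplayPort 1.3’s extra headroom in the first place.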

DisplayPort 1.3 also serves up a few additional frills including support for higher resolutions when daisy chaining multiple screens through a single connection as well as 4:2:0 pixel sub sampling. Yeah, that one’s got me whoop-whooping up and down the corridor, too.

In traditional Microsoft style, they’ve cocked up the initial Xbone-for-PC controller implementation

And finally, the Xbox One controller on the PC. As a grown up, I use a keyboard and mouse. Proper analogue joysticks are obviously suitable in the right context, too. But I’m told some PC gamers occasionally dabble with these control pad things. Ghastly.

In all seriousness, control pads are looking increasingly relevant as game streaming to multiple devices begins to take off. So official support for the Xbox One’s controller on the PC is certainly welcome. It’s a little outside my area of expertise as I’ve never really jived with the relative lack of precision you suffer with a pad. But I’m reliably informed that it’s a quality bit of kit.

The major snag, however, is that wireless operation isn’t supported for now, which does rather compromise utility for game streaming. Hopefully a dongle like the one Microsoft released for the Xbox 360 controller is in the works. Whatever, if you hadn’t already heard the news, you have now. Toodle pip.


  1. Penguin_Factory says:

    I bought an Xbox 360 controller for my PC a while back and I absolutely love having it on hand for the occasional platformer or third person action game (can’t imagine playing Dark Souls with a mouse and keyboard). I’m seriously tempted to give the Xbone controller a go, since I’ve heard it’s very comfortable.

    • mattevansc3 says:

      I’d personally wait until there’s software out there making use of the XboxOne controller’s improvements over the 360 controller such as the trigger rumbles. The controllers otherwise are practically identical with the differences being more down to user preference than any actual improvement.

      That and it’s also easy and “cheap” to pick up a 360 controller with a wireless dongle. Personally I’d stick with the 360 controller.

      • David Bliff says:

        Yep, I’m sure the joystick feel and d-pad are much better as well, to the point where it’d at least be a little difficult to go back to the 360 pad from the One pad, but really without support for the trigger rumbles there’s no way it’s worth the asking price. I guess if you really think the 360 pad feels cheap, but to me it still feels really great.

        • Ryuuga says:

          How are the shoulder buttons (aka bumpers)? My current 360 controller has seen a lot of use, and the left shoulder button is starting to be bothersome.

          Also, any word on whether the xbone controllers are easy to take apart to remove the rumblers? I detest rumble, and it also makes the controller pleasantly light.

          • Tatty says:

            The bumpers are the one part of the Xbone pad I’ve not gotten used to yet. They require rolling your knuckle on to them rather than having your fingers already placed – once I get used to it, I’ll be fine.

            Other than that the controller is an improvement on the 360’s in every way and I’d definitely recommend buying one now instead of a 360 pad. Official drivers have been out for a while and every game I’ve tried it with recognises it as a 360 pad (some newer ones see a One controller) and it operates as such.

            It’s worth it for the d-pad alone in platformers. The 360’s ‘Pog floating on soup’ d-pad never really did it for me…

          • TacticalNuclearPenguin says:

            From what I heard around the interwebz, the new one still has some unreliable bumpers.

            I was interested in the new Xbone controller since Dark Souls destroyed the right bumper of my 360 pad, but then again I figured it was simpler not to risk it and go for another 360 one. The wired version is just 30-ish Euro after all.

            I could only justify the new one if PC games started using its new pseudo force feedback features for the triggers and stuff, but as far as I know the drivers currently available are just there to fool Windows into thinking you have a regular 360 pad connected.

          • Ryuuga says:

            Thanks, good to hear some feedback on the xbone pad! Seems it isn’t a big enough upgrade to be worth it. The improved analog sticks still seem rather tempting, though.

            Those trigger force feedbacks are what worry me. I’ve been doing a bit of googling but haven’t found any instrux for removing them. I guess rumble etc is for some people, it just isn’t for me. It was nice & easy to remove on the 360 pad. Well, fairly easy at least.

            (As for turning rumble off? Well. There’s the games where that doesn’t work, like Borderlands 1. Which I spent ages playing with gamepad. There’s an option for turning it off. It just doesn’t do anything.)

    • fish99 says:

      You can also use a Dual Shock 3 or 4 on PC with the charging cable and xinput wrapper drivers, and I personally prefer it to the 360 pad because the sticks have more range of movement and aren’t as stiff, plus the d-pad is much better.

      • Jason Moyer says:

        DualShock 3 has awful triggers and cramp-inducing analog stick placement. I don’t even use the DualShock with my PS3, having instead purchased a PS3-compatible pad that mimics the Xbox gamepad layout. The only advantage I can see the DualShock having is the D-Pad, and how often do you actually use that as the primary control?

        • fish99 says:

          I use the d-pad to play Spelunky. For games with digital movement, analogue sticks aren’t quite precise enough. The triggers aren’t an issue if you buy the little clip on attachments.

          I had the experience of playing half of Dark Souls 1 on the 360 pad (at the time it was my favourite pad) and then switching to DS3 for the second half, and the DS3 was clearly superior for that game. The analogue sticks are too stiff for quick movements on the 360 pad.

          As for placement of analogue sticks, pretty much user preference but I’ve never had major discomfort with either layout.

        • albertino says:

          I always thought the DualShock 3 pad was outclassed by the 360 pad (owning both) but the DualShock 4 is a vast improvement and now trumps both (for me). I think it has something to do with the sticks being further apart, and I like the weightier feel.

          I agree with the poster further above too – despite loving K+M for some games (eg: CS:GO), I much prefer third person games with a pad.

        • VelvetFistIronGlove says:

          I’m one of those weirdos that finds the DS3 miles better than the 360 pad. It’s lighter; the sticks are more responsive and more comfortably placed (for me); the battery lasts much, much longer. Plus it’s bluetooth so I don’t need any custom dongle hanging off my USB ports when I want to use it with my laptop. I’d probably use it with the 360 too if I could.

      • pullthewires says:

        Everyone I know who’s tried using a PS controller for PC gaming has reported just enough input lag to be a problem, I guess because of the extra layer required. Is this a universal problem?

        • Dare_Wreck says:

          I’ve never noticed any lag when using my PS3 controller on my PC, but since there’s no official driver for it from Sony, I can imagine your mileage may vary, depending on how you set it up.

    • Ejia says:

      I find that the 360 pad is better for platformers like Super Meat Boy, yes. But I tried it for Kingdoms of Amalur: Reckoning, a third-person actiony game, and found that even though it was built with controllers in mind (you can’t even aim the bow, unless there’s a way to turn off auto-aim that I haven’t found), I still defaulted to kb/m.

    • phuzz says:

      A gamepad is pretty damn handy if you’re using an Oculus Rift, because you don’t have to look at it to find all the buttons.
      Hunting for WASD on a keyboard when you’re not entirely sure where the keyboard actually is can be tricky.
      Mind you, I’m also looking at picking up a proper flight stick.

      • CookPassBabtridge says:

        It’s especially nice for standing VR, which I find more immersive. Definitely get a flight stick, with as many buttons as possible. DCS in the rift is utterly incredible.

        • Harlander says:

          How do you do all the little buttons in the rift? Something like the X52’s weird little mouse thumbstick thingy, or something else?

  2. Sp4rkR4t says:

    Does DisplayPort 1.3 also include their freesync implementation?

    • MrAvalier says:

      The DisplayPort 1.3 specification does indeed have dynamic refresh rate support, obviously you’ll have to spare some change on a video card which supports this feature before you depart from Green Team’s Gsync. AMD refers to their implementation as FreeSync, so it may be that both Gsync and FreeSync are supported by DP 1.3, it could be that Nvidia doesn’t support DP 1.3 with Gsync at all, or it could be that space toasters are popping in and out of existence every second.

    • Sakkura says:

      Adaptive-Sync (the new name for Freesync) was already added in the DisplayPort 1.2a specification.

  3. Pich says:

    Meanwhile, after finally getting a decent graphics card for the first time in my life I managed to fry both my PSU and motherboard, and I’ve been waiting almost two weeks for a replacement :(

  4. feffrey says:

    My old 570 needs putting out to pasture. I think a new 970 will be quite nice.
    Would it be worth keeping it going just for PhysX processing?

    • nrvsNRG says:

      Go for the 980. It’ll be a nice jump up for you (in between the 780 & 780Ti performance-wise).
      …Personally I’ve never bothered with or needed separate PhysX cards.

    • FurryLippedSquid says:

      You can do that? Huh.

      PhysX doesn’t work with AMD cards does it? Booooooo. Hisssssss.

      • TacticalNuclearPenguin says:

        It might still currently be possible with hacked drivers, but even that way you’ll have to get some Nvidia GPU to go along your AMD one regardless.

        Then again, even a cheap one can do the trick on most games. Something like that is already overkill on 90% of the supported games. It might not be perfect for the remaining 10%, but still decent.

    • TacticalNuclearPenguin says:

      Ever since I upgraded to the 780ti, my “old” 670 is sitting in the second slot as a PhysX card.

      Got a 25% max usage of that in stuff like Metro Redux, but you can go far higher in something like Borderlands 2, and of course it’ll serve me just fine for the Pre-Sequel (ugh) as well.

      And yes, your 570 has enough meat to be used like that. Many on the interwebz will tell you that you can use a very crappy PhysX card, way less powerful than the 570. While this is true for most titles, the same doesn’t apply to the heaviest of them, like the latest Batman for instance. Your card should be a nice sweet spot.

      Do also bear in mind that, while a dedicated PPU can do all the maths needed, your main card still has to draw the actual extra effects, so you’re not completely safe from the extra strain demanded by the highest PhysX settings. Especially true of the least optimized effect of all, the super special volumetric smoke in AssFlag.

      Considering that you probably won’t be able to sell your card for decent money, you really should try this option.

      • Tatty says:

        Do you need an SLI mobo for that or just one with the requisite second PCIE slot?

          I’ve got a 780 in my PC but have a 560ti lying in a drawer.

        • TacticalNuclearPenguin says:

          Nothing like that, you don’t even need to use the SLI connector. As you said, it simply needs to be there.

          The next step is going to the Nvidia Control Panel, then to the PhysX tab, where you select your 560 and tick the “dedicate to PhysX” box.

          Also, as you might imagine there’s little problem if that thing runs on 8x or even 4x lanes.

          Oh, and you don’t need to touch any extra drivers.

  5. Asurmen says:

    The monitor will still require messing around patching drivers, surely?

    I’m unsure what to do with GPUs. I’m on a 110Hz 1440p IPS monitor on an AMD 7970. Tempted to Crossfire with a 280X.

    • Dale Winton says:

      That is what I do, drivers are fine and your frames will go up 40% on most games.

      • Asurmen says:

        What I meant by drivers is that both AMD and Nvidia software-lock the pixel clock of their GPUs to certain limits. In order to overcome that, you need to apply unofficial patches to their drivers to unlock the pixel clock. You have to do that because the pixel clock (in Hz) is determined by the bandwidth required to drive a particular resolution at a particular refresh rate, and high-refresh 1440p is beyond the pixel clock limit imposed by both companies.
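
        As a rough illustration of that relationship (the flat 15% blanking overhead here is an assumption purely for the sketch; real CVT timings vary):

```python
# Approximate pixel clock for a given display mode.
# Real timings add horizontal/vertical blanking; a flat 15% overhead is
# assumed here purely for illustration (actual CVT-RB figures differ).

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.15):
    """Estimated pixel clock in MHz, including an assumed blanking margin."""
    return width * height * refresh_hz * blanking_overhead / 1e6

print(f"1440p @  60Hz: ~{pixel_clock_mhz(2560, 1440, 60):.0f} MHz")
print(f"1440p @ 144Hz: ~{pixel_clock_mhz(2560, 1440, 144):.0f} MHz")
# 1440p @  60Hz: ~254 MHz
# 1440p @ 144Hz: ~610 MHz
```

        Roughly 2.4x the pixel clock of a standard 60Hz 1440p mode, which is why the stock driver limits get in the way.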

        • Smoky_the_Bear says:

          Expect that to change in Catalyst/GeForce Experience when these 1440/144hz panels become readily available. You won’t be running at 1440 and 144hz for newer games with many older cards anyway unless you have a cutting edge SLI setup.

          Early adoption for this sort of thing is rarely a good idea, I’d encourage anyone to wait a while, competition will appear, prices will drop and the web will be able to tell you the superior products in the bracket.

  6. bee says:

    HDMI 2.0 please? It feels like I’ve been waiting forever for it!

  7. nrvsNRG says:

    “I’ve never really jived with relative lack of precision you suffer with a pad.”

    What precision do you need when playing 3rd person ARPGs, beat ’em ups or racing games that is better on KB/M?

    • fish99 says:

      Depends what sort of game you’re playing. A serious flight sim is better on a joystick, the greater range of movement allows for more accuracy. Same thing with a steering wheel/pedals for serious racing sims. For fighting games a fight stick is probably the best (although expensive) option.

      3rd person action games, a gamepad is usually the best choice, although I did finish Kingdoms of Amalur on mouse/keyboard and it was fine. I also played Gears of War and GTA4 on mouse/keyboard and both benefitted from the extra aiming precision. Part of the problem is 3rd person games often don’t support the mouse well or have control schemes that don’t adapt well to mouse/keyboard, but it can be done.

      The Arkham or Souls games, or a platformer like SMB, a pad is the best choice IMO.

  8. 7vincent7black7 says:

    I bought a XFX ATI Radeon HD 6870 graphics card not too long ago for about 160 USD from a guy who owned a local computer repair place in town (which was a shadow of its former self when the business had opened 2 years prior with 7 friends and was now run by one).

    The graphics card worked, except for the fact that it was bigger than my ATI Radeon HD 5570 graphics card. Despite my utilizing the onboard GPU cooling fan, replacing my NIC with a USB network adapter to clear some airflow, and keeping the fan speed at maximum with different 3rd party programs, the computer would overheat at the bottom around the GPU when I ran games and crap, and eventually the GPU would crash and I’d have to restart the system.

    I eventually had to switch them back out, and it’s collecting dust in a ziploc freezer bag in my computer desk. I’ll not be buying another XFX Radeon graphics card any time soon. They are just a little too big for my computer’s ATX form factor in relation to where all the other components are in my tower, and the relative lack of airflow or cooling in the bottom area. I have only one GPU slot, so I didn’t have much of a choice in the matter but to ultimately remove it.

    • Sakkura says:

      Why would you spend so much money on such an old card? That makes no sense.

      Seems likely you also got screwed with a partially faulty cooler on the card.

    • Person of Interest says:

      To contrast, I use an XFX ATI Radeon HD 5850 and it’s been going strong for 4+ years in a machine on nearly 24/7. I put an Arctic Cooling Accelero S1 heatsink on it, zip-tied an undervolted fan underneath, and it’s given me zero problems (and virtually zero noise).

  9. TacticalNuclearPenguin says:

    One thing I can’t seem to find about those new high-refresh AHVAs is whether that 144Hz actually suggests the presence of G-Sync or not.

    Probably not, or it would probably be mentioned with some serious enthusiasm. Then again, one can hope.

    With that on the horizon, my not-final choice of the newest 32-inch AMVA panels @1440p is being seriously challenged.

  10. steves says:

    “the awful reference cooler on this particular 290X. It’s ridiculously noisy and not very good at cooling. You have been warned. Thing is, the 290X is a beast with a 512-bit bus and for £270 is bloody tantalising, even with a borked cooler. I probably still would.”

    There is a solution – I put one of these on an old 6950 (which was starting to sound like a helicopter due to its cheap & nasty fan), but it’ll fit almost anything modern:

    link to quietpc.com

    Not for the faint of heart when it comes to mucking about with hardware though. It’s pretty fiddly to set up, requires a dab hand with the thermal paste, and the instructions are a bit opaque. Works a treat though, especially if you can wire it up to a fan controller, which is all of another £3.

  11. DanMan says:

    Care to explain the “4:2:0 pixel sub sampling” bit? I have some vague idea from JPEG compression, if that’s related, but that’s about it.

    • TacticalNuclearPenguin says:

      It has some similarities.

      A good article about it is this: link to nag.co.za

      It basically explains how Nvidia uses the same trick to force crappy HDMI to enable 60Hz at 4K. In most uses you shouldn’t see much difference, especially if the source you’re viewing is not that good to begin with. With better material to watch, however, and a properly calibrated monitor, you are bound to lose some fine shade variety and so on.

      It’s mostly a problem for very discerning eyes watching the right content on the right monitor though.

      JPEG compression is even more crude though, it simply grabs all the “close enough” colors and merges them into a single shade, which is why it’s rather horrible for photo editing since you’re simply going to permanently lose some “hidden” details that could otherwise pop up depending on what you do.

      A popular example would be a seemingly white sky that in truth hides the right shades provided you bring down its luminance values. In RAW you can recover that; with JPEG it’s lost forever.
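
      To make the 4:2:0 idea concrete, here’s a toy sketch: luma stays at full resolution while each 2x2 block of pixels shares one pair of chroma samples (real encoders filter the chroma planes properly rather than naively averaging as this does):

```python
# Toy 4:2:0 chroma subsampling: keep luma (Y) per pixel, average chroma
# (Cb or Cr) over each 2x2 block. Real encoders use proper filtering.

def subsample_420(chroma):
    """Average a full-res chroma plane (list of rows) over 2x2 blocks."""
    out = []
    for y in range(0, len(chroma), 2):
        row = []
        for x in range(0, len(chroma[0]), 2):
            total = (chroma[y][x] + chroma[y][x + 1] +
                     chroma[y + 1][x] + chroma[y + 1][x + 1])
            row.append(total // 4)
        out.append(row)
    return out

# 4:4:4 stores 3 samples per pixel; 4:2:0 stores 4 luma + 2 chroma per
# 2x2 block = 1.5 samples per pixel, i.e. half the raw bandwidth.
cb_plane = [[100, 102, 200, 198],
            [101, 103, 201, 199]]
print(subsample_420(cb_plane))   # [[101, 199]]
```

      The halved sample count is the whole point for DisplayPort and HDMI: it lets a mode fit through a link that lacks the bandwidth for full 4:4:4.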

  12. Horg says:

    Just a little PSA as you mentioned Scan; do not buy from Scan. If nothing goes wrong with your purchase, they are like any other shop, but if something goes wrong they will do everything they can to avoid a replacement / refund when they are at fault. For such a well known UK hardware company they have shockingly bad customer service. I got a GPU from them a few years back that was DOA, waited 2 months on the RMA, and was eventually told that I couldn’t prove the GPU was the same one I bought from them. After another month of arguing I sent them a copy of my small claims papers, and was one day away from submitting to the court before a brand new GPU just arrived out of nowhere. No apology, no warning it was coming, just 3 months of stress because they sent me a broken product and refused to admit fault until a court appearance was on the table.

  13. rcguitarist says:

    The writer of this article obviously doesn’t play racing games or platformers on his pc. Those are awful to play with a mouse and keyboard. Gotta have the right equipment for the right job.

  14. OmNomNom says:

    I own an Asus PG278Q and have owned an Eizo FG2421 (the premium VA gaming panel to date). I would be very surprised if the new VA panel (not really IPS) is as good as the PG278Q. The PG278Q colours are really almost comparable to the VA panels as well.

    It’s a nice idea but IPS and VA are always so blurry even with tech like Turbo240 :|

    • TacticalNuclearPenguin says:

      The biggest issue with these new high refresh panels is that they might use 6bit+FRC instead of native 8bit color, it happened on some PLS monitors as well, mostly the budget ones of course, but it’s also worth noting that such a solution improved the response times.

      That’s the biggest issue with LCD panels, you simply can’t have a near perfect one on all counts. Besides, LCD’s biggest flaw is its dependency on backlighting: you’ll never have great blacks, not even with an AMVA panel. Black crush when viewed straight on forces a corrective calibration for the darker shades or they’ll disappear, but then again if you do that you’ll lose a good chunk of the contrast advantage.

      Just pick what seems to be the best offer for your preference, there’s nothing we can do before OLED finally comes to save the day.

      Also, AHVA might sound misleading but it has nothing to do with VA, it’s actually another IPS-like alongside PLS.

    • Jeremy Laird says:

      The AU Optronics panel is not VA. It’s an IPS-type panel. Which I highlighted clearly in the post!

      The ‘VA’ in AHVA stands for Viewing Angles, not Vertical Alignment as per MVA and PVA panels.

      • OmNomNom says:

        Sorry my mistake with the VA business! As for the PG278Q it is 8-bit as opposed to the normal TN 6-bit.

        In regards to OLED, I have yet to see an OLED / AMOLED that doesn’t blur like hell. Do these even exist? I’m sure it’s not uncommon knowledge that phone / laptop ‘retina’ + displays etc are awful for real gaming.

        • TacticalNuclearPenguin says:

          It’s just that phones use some seriously crappy panels regardless of the technology used; you don’t see the same problem in the integrated displays of professional cameras and the like. It might seem that mobile devices are advancing the most with their incredible resolutions relative to screen size, but in truth this doesn’t happen without serious compromises.

          OLED is indeed not ready to be put in a big monitor for many reasons, cost included, because it’s simply just too early for that. A proper implementation of the technology has flabbergasting potential though, especially when it comes to contrast and color. Every pixel can emit its own light, so the color gamut is not limited by the kind of backlight used, nor does it need to reduce the backlight intensity to improve the black point on very dark scenes (at the expense of contrast).

          Imagine a scene in which there’s a very, very dark road and a single lamp post in the distance. The lamp post pixels could be extremely bright while the rest of them could be almost shut down. That is a situation an LCD panel simply can’t recreate.

      • Sakkura says:

        All Hail Viewing Angles! (AHVA)

    • samsharp99 says:

      I bought the PG278Q as well, had it for about a week now and I’m really impressed with it. The difference in the resolution is remarkable compared to my BenQ 24″ 1080p and I can run most of the games I play (LoL, TF2) at native 1440p resolution at 144Hz with my AMD 7950. I didn’t really get what the fuss was about with 120/144Hz refresh rates, but now that I’ve tried one (just grab a window and shake it on a 60Hz monitor and then on a 144Hz one and you can immediately see the difference) I doubt I’ll go back.

  15. ZombieJ says:

    C’mon 980, daddy wants a cheap 780ti! (good support for 4k gaming is 2 years away still guys)

  16. Person of Interest says:

    Where’s a place I could see these fancy new monitors in action? Might an electronics mega-store have 4k / g-sync / 144Hz / LightBoost / ULMB displays running with useful demos? Or do I need to go to a trade show?

  17. chaywa says:

    Kinda fallen out of the GPU scene recently but looking for an upgrade to my ageing 5850 in the near future for ~£200, any good recommendations to consider?

    • Person of Interest says:

      I’m in the same situation with my 5850. I want to stay under 150-175W TDP for noise reasons. I wasn’t paying attention when the GTX 670 came out, and I can’t imagine buying it now that I’ve seen the 750Ti reach 2/3rds of its performance with 1/3rd its power consumption.

      Speculation: will higher-end Maxwell parts have the same performance/watt as the 750Ti? That would mean a 28nm Maxwell part with 150W TDP (“960 Ti”) could perform as well as a GTX 780.

      Next speculation: how much will power consumption go down with the transition to 20nm? I think Ivy Bridge was about 25% more efficient than Sandy Bridge, and that was a 32nm->22nm change. Should GPUs scale the same way?

    • fredc says:

      Just replaced my 5830 with a Radeon R9 270X.

      We’ll see what changes over the next few weeks with these new chips, but the 270X seemed to be in the sweet spot of price-to-power. Much quicker than the 5800 series, uses the same 2×6 pin power connectors and shouldn’t draw more juice, capable of doing pretty much anything at 1900xwhatever resolution on a single monitor.

      You can do a 280 for under 200 quid if you’re not that price driven. I didn’t bother with NVIDIA because for comparable single-card performance it appeared I’d be paying £200+. Also, ATI drivers haven’t given me any issues for the last 5+ years and I have read multiple opinions that NVIDIA currently has “issues”, with people using third party software etc.

  18. Brothabear says:

    RPS isn’t bullshitting. DO NOT buy any new PC parts yet. Too much is going on with new GFX cards coming out, better CPUs, new OS, etc.

    If I were to save up for a 1TB SSD, you can be damn sure I don’t want to install an OS on it only to redo the process later.

  19. golem09 says:

    Gotta buy a new GPU for The Witcher 3 next January, and I think it will be the 980. Now there is one thing I’m wondering about, and I can’t seem to find good info on it:
    Does it matter if my board only has PCIe 2.0 when buying such a high-end card?

    • Person of Interest says:

      Anandtech tested GTX Titans in SLI on PCI-E 2.0 and 3.0 (in x16/x16 configuration) and mostly saw no difference. I assume a single card will be less dependent on PCI-E speed.

      link to anandtech.com

  20. fish99 says:

    Would like to see some price drops on the 4gb mid range nvidia cards like the 760 if there’s replacements on the way.

  21. Jake says:

    Literally just placed an order for a new graphics card then came to this site. I’ve cancelled it for now I hope. I really need a new card to replace my aged 5850 (which actually works just fine for the games I play but it’s so noisy that I am not sure you’ll be able to hear this comment over the sound of the goddamn helicopters. I think something has gone wrong with it).

    I was about to order a Palit Geforce GTX 750Ti Kalmx (it looks quiet). Worth waiting? Or is this more relevant for high end cards?

    • Person of Interest says:

      If you’re a little adventurous, you can install an aftermarket cooler on a new videocard and, depending on the amount of heat you need to dissipate, make it completely inaudible. SilentPCReview used the Prolimatech MK-26 to cool a 175+ watt video card (Radeon HD 5870) at only 15dBA [1].

      I just glanced at some leaked reviews for the 900-series cards, and all I will say is: they aim a lot higher than the 750 Ti. But if you want a faster video card, I expect some GTX 970 cards will ship with near-silent coolers, no modification required.

      If you’re just looking to match your current card’s performance, the 750 Ti will slightly exceed it in performance while using half the power. It’s still up to the manufacturer to pair it with a quiet cooler, though. (Edit: I just looked up the KalmX. That will certainly be a quiet card!) If you care neither about power consumption nor performance, but only noise, then any of Arctic Cooling’s aftermarket products can cool a 5850 quietly (for my 5850 I use an Accelero S1 and quiet fan).

      [1] link to silentpcreview.com

  22. The Dark One says:

    AMD hasn’t said a word about a 285X chip, but the 285 has a bunch of disabled cores that hint pretty strongly at the higher-clocked, fully-enabled X version coming in the future. The real question is whether it’ll be anywhere near as good as the GeForce 970.

    The 285 falls in between the old 280 and 280X in terms of performance, but Nvidia’s 970 outperforms the 290. That’s a pretty stiff challenge.

  23. HadToLogin says:

    Can’t wait for the new Batman, as that will probably be the first real test for GPUs – if a card can run it in full HD with details at max, then it should be enough for playing at console standards until the next gen comes.

  24. yonsito says:

    This is all very nice, but I think I will be waiting for a 960 at around 220€.
    I find it a bit difficult to justify spending more money on a graphics card. Also, all games I’ve been playing lately are running just fine on my 3 year old 560ti.
    What kind of games need that kind of processing power anyway?

    • SuicideKing says:

      At 1024×768? Not many. At 1080p? Most in the last 2-3 years.

  25. danijami23 says:

    I really don’t think I’ll EVER buy a new monitor, unless 4K IPS displays get silly cheap. I still have my 23in Apple Cinema Display from when I owned a Mac, and it’s the best screen I or anyone I know has ever owned. Perfect color, perfect clarity.

  26. SuicideKing says:

    Reviews are out, the 970 is a killer card at a killer price.

    link to techreport.com

    Also check out AnandTech and others.

  27. Shooop says:

    Reports say the 970 is a good upgrade if you’re still using a 600 series card. It’s absurdly power-efficient too.

    Also I want that monitor you described yesterday. I haven’t left my Dell 1080p IPS because nothing else has decent pixel response and isn’t a crappy TN panel, and the only logical upgrade available for it today is over 1000 USD.