Week in Tech: Buy A Decent Screen, That’s An Order

While I slave away gathering all the bits for our upcoming home-build vs factory-built PC comparo extraordinaire, here’s something to think about and even get on with in the meantime. Buy a decent screen. I’ve touched on this before, but some recent shenanigans with 4K monitors and Laird Minor (little brother) being in need of a new screen have reminded me of something. My main PC display is seven years old. My secondary PC display is eight years old. And it’s only now that I’m beginning to even think about upgrading. Imagine trying to game on an eight-year-old CPU or graphics card. Nasty. Meanwhile, the skinny is out on Intel’s new anniversary-themed CPUs and the rumour mill is building up for the next wave of high-end graphics cards.

Re the buying screens thing, admittedly my monitors were pretty high end back in the day. And, OK, I didn’t pay for one of them. But I had a sudden moment of clarity the other day when I realised just how old they were. For the record, we’re talking a Samsung XL30 for my main rig (30-inch, 2,560 by 1,600, PVA) and a Dell 3007WFP-HC (30-inch, 2,560 by 1,600, IPS) for my secondary system.

Imagine I just had the Dell, for which I paid £800 eight years ago. That’s just £100 a year for the privilege of running a glorious 30-inch IPS display. Personally, I think that’s a bargain. It’s still a bloody nice screen.

4K pixel pitches are marginal even on uber 32-inch panels

Even now, having seen most of the new 4K displays on the market, there’s nothing I’m absolutely convinced I’d take over my current displays. We’ve covered some of this before, but my issue with current 4K panels is a combo of pixel pitch and software. I absolutely hate using the Windows scaling settings, and everything is just too bloody small at 100 per cent, certainly with the 28 and 24-inch 4K panels and probably even with the 32-inchers.

I’d need a 34 to 36-inch 4K panel to have a pixel pitch I’d be happy with at 4K resolutions. Then there’s the whole problem of driving 4K resolutions in games and the epic GPU power you need to do that. I can’t be doing with multi-GPU and even if I could it would mean a hefty additional investment.
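The pixel-pitch arithmetic behind that is straightforward if you want to run your own numbers — a back-of-the-envelope sketch, not gospel:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from a panel's resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# The 30-inch 2,560 x 1,600 panels above sit at a comfortable ~101ppi.
print(round(ppi(2560, 1600, 30)))

# 4K needs a much bigger panel before the pitch relaxes to anything
# similar: roughly 184ppi at 24in, 157 at 28in, 138 at 32in, 122 at 36in.
for size_in in (24, 28, 32, 36):
    print(size_in, round(ppi(3840, 2160, size_in)))
```

Even a 36-inch 4K panel is noticeably denser than those old 30-inch 1,600p screens, which is why 100 per cent Windows scaling gets painful.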

Short version: I’m still probably years away from a screen upgrade, and it may well be a decade overall before it actually happens. That seems extraordinary to me. In that context, if you can manage to put a big wad of cash up front into a great screen, I think it pays off long term more than any other PC component by a mile.

Asus’s ROG display. 1440p. 120Hz. Good to go for years and years.

I suppose the danger is that suddenly big OLED screens get super cheap or 36-inch 4K panels become affordable. But honestly, if you bought something like a nice 1440p 27-inch with high refresh support today, I doubt you’d have any urge to replace it for years and years.

On a related note, it looks like some additional GPU power is coming that will edge us closer to single-GPU 4K gaming.

AMD is said to be prepping a new version of the Hawaii chip currently found in the Radeon R9 290 series. Allegedly, the chip has had 3,072 stream processors all along, not the piffling 2,816 we’ve suffered so far.

You’ll get a few more texture units, to boot. Branding for the new card is supposedly 290XT or perhaps 290XTX, and a repeat of the 290’s crappy cooling débâcle is not expected.

Cheaper Radeon R9 290s should be the big upside to a new high-end AMD board

It’s a plausible enough rumour given common industry practice to have redundancy in-chip to increase yields. The performance increase will be more of a PR coup than a dramatic real-world game changer. But as ever the real benefit should be to push down prices on the 290 and 290X. I’m pretty hot for the 290, so if that gets cheaper, it will be even more attractive.
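If the rumoured shader counts are right, the size of the uplift is easy to put in perspective:

```python
# Shader counts quoted above: shipping Hawaii (R9 290X) vs the
# rumoured fully enabled chip.
current_shaders = 2816
rumoured_shaders = 3072

uplift = (rumoured_shaders - current_shaders) / current_shaders
print(f"{uplift:.1%}")   # 9.1% - a PR bump more than a game changer
```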

As for Nvidia, the rumours here involve the Maxwell architecture that’s so impressive in the GTX 750 making it into the high end. Apparently, Nvidia is still stuck at 28nm rather than making the leap to 20nm.

Normally, I’d say that was disastrous for a new high-end board. But Maxwell is so impressive in the 750, it augurs pretty darn well for the high end, even at 28nm. I’d expect the GTX 870 and 880 (as they are expected to be known) to deliver the sort of performance jump that normally comes with a new architecture and a new manufacturing process.

The new Nvidia cards will be based on the GM204 chip, so not the absolute uber Maxwell chip. That will be the GM110, which I suspect will have to wait for 20nm.

Nvidia’s Maxwell was mighty enough in the measly GTX 750, so it could be killer for high-end graphics, even at 28nm

While we’re talking Nvidia, there’s been some noise regarding added 4K-60Hz support for Kepler-generation GPUs via a driver update. It’s not worth getting too granular with the details here, but suffice it to say it’s a bit of a kludge that involves chucking out some image data (chroma subsampling, in other words) to squeeze a 4K-60Hz signal through the bandwidth limitations of HDMI 1.4.

Given that 4K is all about uber image quality, for me it makes no sense to kludge it, though I haven’t seen exactly what the impact is of binning some of the colour data.
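To put rough numbers on that bandwidth problem — these are the nominal spec figures, and the exact packing Nvidia’s driver uses isn’t documented here:

```python
# HDMI 1.4 tops out at a 340 MHz TMDS clock, while standard CEA-861
# timing for 3840x2160 @ 60Hz (active pixels plus blanking intervals)
# needs a 594 MHz pixel clock.
HDMI14_MAX_TMDS_MHZ = 340
PIXEL_CLOCK_4K60_MHZ = 594

# Full RGB / YCbCr 4:4:4: one TMDS character per pixel. Doesn't fit.
print(PIXEL_CLOCK_4K60_MHZ <= HDMI14_MAX_TMDS_MHZ)      # False

# YCbCr 4:2:0 carries half the data per pixel on average (12 bits
# rather than 24), so the effective clock halves. That just fits.
print(PIXEL_CLOCK_4K60_MHZ / 2 <= HDMI14_MAX_TMDS_MHZ)  # True
```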

Anywho, back to those new GPUs. Dates-wise this is all rumour, but the AMD GPU feels pretty imminent, while the Nvidia stuff looks like landing towards the end of the year. Just something to factor into any graphics buying you may be considering.

Finally, Intel’s new we-still-love-PC-enthusiasts CPU models are out. I haven’t had a chance to play with said chips due to an impromptu and unavoidable 2,500-mile traipse around Europe.

Intel’s K Series Devil’s Dumplings (not actually pictured here, before someone complains) apparently don’t deliver

As it happens, the dual-core Pentium G3258 does hit the 4.5GHz I’d hoped for, despite not having the improved thermal gubbins that the renewed Core i5 and i7 K-series chips (codenamed Devil’s Canyon) receive.

At that speed, it performs pretty similarly in a lot of games to one of the cheaper locked quad-core Core i5s, which is pretty bloody interesting for just £55 in Blighty and $75 Stateside. As for those Devil’s Canyon chips with the new thermal materials, the Core i7-4790K and Core i5-4690K, I’m told they run a smidge cooler but are barely any better for overclocking. Pretty pointless, then. Oh well.


  1. Deepo says:

    Regarding the ASUS: I can not and will not agree that a TN panel is any kind of decent screen. I know people love it, so I’m obviously an idiot, but I just can’t bring myself to enjoy a TN panel.

    • sandineyes says:

      I’m not sure I understand the reasoning behind this. If viewing angles are an issue, then yes, there is plenty to complain about, but in terms of contrast ratio and color accuracy, there are TN panels that are as good (or bad, depending on your point of view) as IPS screens are.

      Not trying to be confrontational; I haven’t had the opportunity to compare a TN monitor with an equivalent IPS one, so all I can see are performance metrics.

      • TacticalNuclearPenguin says:

        TN panels can be calibrated as well, yes, and they can be color corrected.

        Most “facts” about panels are a myth. IPS gets the best treatment because it’s used on professional monitors due to perfect viewing angles straight on, which negates contrast/color shifts and thus is needed for color critical work.

        IPS: Average contrast, perfect viewing angles and consistency
        TN: Average contrast, crappy angles and fast performance
        AMVA: Mythologically high contrast, worse angles than IPS but better than TN

        I still wouldn’t go back to TNs though, I really notice the off-center crap too much. Mileage may totally vary, obviously.

        It doesn’t help that some TNs are of horrible quality and leave the factory with wrong gamma curves, just as some IPS panels are astounding and better calibrated out of the box, which further enables the myths to run rampant. The reality is that quality varies widely between monitors using the same panel technology.

    • clorex says:

      For anyone waiting for an affordable, IPS-like, 27″ 4k display, there’s the Asus PB279Q.
      link to guru3d.com

  2. trjp says:

    I bought a new monitor last year – nothing special, it’s an LG 1920×1080 23″ jobbie.

    Just after that I was given an older 1440×900 19″ – matching (in size and res if not brand) the monitor I’d replaced – so I stuck those 2 either side of the new one.

    I have since become quite addicted to multiple monitors for work purposes (less so for gaming, mainly because Nvidia’s ‘Surround’ system is laughably shit if you don’t have 3 identical monitors!)

    More relevant to this though is the STAGGERING improvement in brightness/contrast/crispness this LG has over the older monitors. One of those is a Samsung which was considered a ‘best buy’ when I bought it (about 6 years ago?)!!

    I’d take 2 more of these LGs in a heartbeat – hell I think I might even take a slightly larger monitor now I’m used to a 23″ :)

    p.s. it’s also the first ‘summer weather’ since I got this LG and it’s most excellent at not generating enough heat to replace a small radiator too!!

  3. FurryLippedSquid says:

    Oh, woe is you, Jeremy!

    My screen is also about 7 years old, but it is a lowly 1280×1024 19″.

    Happily I only game about a foot and a half away from it so it has served me well, and of course requires less power from my GPU/CPU to push pixels around. When I get a new one, I’m probably going to go for the same again, though they are becoming rarer and more expensive. I just don’t see the need for anything more.

    • Aerothorn says:

      Almost all contemporary games are designed with the assumption of widescreen – so your issue there isn’t so much the resolution as the extreme verticality of 5:4, and the fact that few developers are likely to go to lengths to make sure the game plays well in that format.

      • FurryLippedSquid says:

        I have not experienced any problems yet.

        Tell a lie, there are a select few games that force me into 1280×800, but they are usually a few years old, and they are but a few. It doesn’t appear to affect gameplay in any way.

    • LionsPhil says:


      Although 5:4 is the devil’s form factor. (No, I care not for this widescreen folly, either; give me 4:3! A sensible ratio, for sensible computing.)

      • PopeRatzo says:

        4:3 is a pleasing ratio. But I wonder why the golden rectangle was never used for monitors. Is it because 1024×768 is 2^10 x (2^9 + 2^8), three consecutive powers of two, which is somehow easier on processors than having to use the 1.618 : 1 golden ratio?
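The powers-of-two observation checks out, for what it’s worth — a quick sanity check, nothing more:

```python
# 1024x768 really is built from three consecutive powers of two.
assert 1024 == 2 ** 10
assert 768 == 2 ** 9 + 2 ** 8

# And its aspect ratio is exactly 4:3...
assert 1024 / 768 == 4 / 3

# ...whereas the golden ratio is irrational, so no integer resolution
# could ever hit it exactly.
print((1 + 5 ** 0.5) / 2)   # 1.618033988749895
```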

        • MistaJah says:

          I would like a 26:16 monitor. That’s 1.625:1. Its superiority to 16:10 is negligible. Or 5:3, 1.666667:1… I have a 19″ 5:4 too. So square.

        • Zenicetus says:

          The 4:3 aspect ratio goes back to 35mm silent films in the 1920s, which was then adopted for 20th-century television. I assume the reason that computer monitors never went to the Golden Rectangle or other ratios before the current digital widescreen HD phase was that it was just easier to adapt the TV hardware of the time.

          I’m still on a 4:3 display because I need wide gamut and color correction for the part-time graphics work I do. I’m using a NEC 2190UXi at 1600×1200. A widescreen upgrade would be something like the NEC PA242W-BK-SV that sells for around $1,300 USD, and I don’t have that kind of money right now for an upgrade that would be mostly for gaming.

          Color gamut and accuracy don’t tend to rate highly on gaming sites like RPS, but it’s astonishing how good game colors look on a display like this.

          Once in a while, I’ll run into a game that has some quirk about running in 4:3. But I don’t think most game devs are ready to drop 4:3 support for at least a few more years. Lots of us 4:3 users are still out there.

          • TacticalNuclearPenguin says:

            That’s because AdobeRGB is actually detrimental to anything that’s not a completely, 100% AdobeRGB workflow.

            You can be color correct in sRGB too, but generally no wide-gamut monitor can simulate that color space decently. That’s why wide-gamut monitors are only for professionals with the right printers, because they work from top to bottom in a different color space. Color correctness comes naturally in that case, but only because it’s a necessary feature of such products, not because you can’t have that with other gamuts.

            Likewise, if you shoot with your ultra-expensive reflex in AdobeRGB and then use that photo in ANY way other than what I mentioned above, you’ll end up with a turd on your hands.

            Anything else in the world works with sRGB, and that WILL appear oversaturated on your monitor, just like anything else that is not in your color space, such as games, which might explain why you think they “look great”. You can be color corrected in your native, target gamut, but you won’t be in others, unless you find a way to make the sRGB simulation mode work properly (almost never happens) and you have a good spectrophotometer on your hands.

            You don’t have to believe me, just go to TFT Central and read reviews. There’s no other site on the whole internet as scientific about monitors; once you read even a single review you’ll understand why I’m telling you this.

            read also this: link to kenrockwell.com

            But beware that this guy is a jerk and the whole rant is mostly about photography. I don’t know what you do.

          • Zenicetus says:

            I haven’t tried one of the newer NECs, but for what it’s worth, I don’t see games running in sRGB space on this monitor looking over-saturated. I’m fairly sensitive to that look when it’s overdone with photo processing, and I think I’d probably notice it.

            Edit to add: And yes, I’m familiar with the problem of Adobe RGB published online without color conversion to sRGB. That’s an easy step, and I don’t have trouble doing projects in either color space depending on the client application (print or online).

          • TacticalNuclearPenguin says:

            Of course it doesn’t look like the overblown amateurish post-processing of some photos, otherwise an enthusiast like you would probably have already committed suicide.

    • Gap Gen says:

      My monitor was bought in 2007, and I’ve never thought about replacing it. My CPU is nearly 8 years old, too (Q6600, which was launched in Jan 2007). Come to think of it, my keyboard is also that old and probably hella disgusting.

    • Rikard Peterson says:

      Mine is also in that resolution, and though it’s even older and only 17″, it still is a decent screen and I feel no need to change it. (LG Flatron L1730P) And I don’t have that kind of money to spare either, so I’m glad it’s still working just as well as the day I bought it.

  4. subedii says:

    Still on a 1680 x 1050. I think I bought it 6-7 years ago.

    Honestly, I still don’t feel the need to upgrade. Unless something really majorly groundbreaking comes along (and then becomes quickly affordable) my money’s probably better saved for other things.

    As things currently stand, I’m not even sold on 4K as of yet. Even on the presumption that it would be a massive boon to me, and prices come down a good degree more, hitting 4K would just mean that all my core hardware would be out of date again if I wanted to push 4K worth of pixels around without sacrificing all the other visual settings.

    It just doesn’t seem like a worthwhile trade-off right now. I’m in a happy place where even with the newest gen of consoles, my hardware still keeps up with the releases (that may change come Witcher 3), and having to push more pixels around just seems like it would be a net detriment if I didn’t actually spend a good deal more on upgrading my main rig just to stay largely in place.

    • FurryLippedSquid says:

      Couldn’t agree more, sir.

      (see my similar comment above)

    • Rindan says:

      Upgrading your monitor is a fool’s errand. Do you want to see the polygons better at a lower frame rate? If so, buy a 4K. Otherwise, it is a waste. If the resolution of the game is vastly lower than the monitor you are playing on, you are just wasting your time to get a visual improvement that your brain will filter out after 20 seconds of use.

      Honestly, I feel the same way about TVs too where the resolution might actually technically matter. Higher and higher resolution is an utter waste if after a minute your brain decides it doesn’t give a fuck and starts filtering out erroneous details. Sure, you can appreciate the difference when you really focus and try and compare, but that isn’t how you watch TV. Better contrast at this point is probably more valuable than higher resolution. A blacker black probably means more simply because your eye will strain less than slamming a few more pixels at the edge of your theoretical detection limit.

      The classic example: link to youtube.com

      If you want to drop money on a computer to make games look better, the biggest win is a better graphics card. Your eye might not give a shit if there are a few more pixels stuffed in there, but it will get annoyed at any frame rate under 30 FPS, and be downright pissy at anything under 10 FPS.

      • TacticalNuclearPenguin says:

        Want contrast and black depth? LCDs in general are a failure for that, be it IPS or TN, but they have at least one good panel type for that, AMVA:

        link to tftcentral.co.uk

        link to tftcentral.co.uk

        This is a real monitor’s static contrast, measured with proper tests, for those who are still confused by the stupid marketing numbers of dynamic contrast, which should be the first thing you turn OFF in any monitor/TV you buy anyway.

        EDIT: My bank is gonna die, first ever AMVA 1440p : link to tftcentral.co.uk

    • Toupee says:

      Yeah, I loved my 1680×1050. Viewsonic, 2006. Worked gorgeously until last year, when the on-screen display stubbornly started sticking around on the screen. I think it’s a button issue, but I cannot for the life of me figure out how to solve it. I even tried taking it apart, and when I put it back together that seemed to fix it for a couple of days, so I thought maybe it was a piece of dust or something sticky making a contact. Then it came back with a vengeance and never left. :(

      (If anyone has any ideas of how to conquer that, I would love to get that monitor back up and running next to – yup, a 23″ 1080p LG! [which is nice and all.])

  5. BTAxis says:

    Everything I have is 1920×1080 at 60Hz, but unless someone suddenly decides to give me something better I doubt that’s going to change anytime soon. I can’t justify spending a lot of money on a new screen, and Doom 2 looks fairly decent on what I have anyway.

  6. Chorltonwheelie says:

    I asked the RPS hivemind (well, the forum) whether to go for a 1080p monitor and spend the left over spondoolicks on a GPU or take a chance on a grey import Korean 2560×1440 jobbie.
    The answer that caught my eye was “Pixels, pixels and more pixels” and they were dead right!
    If you’re ok for bread get a 1440p monitor…you will not regret it I promise.
    I’m driving mine from a single OC’d GTX 780 (£350 used) and it is a wondrous thing indeed.

    (The thread’s here if you’re interested.
    Obviously, go for a local branded product if you can afford it, but I “won”…I love it to bits)

    Did I mention PIXELS?

    • Nemon says:

      I do believe you mentioned pix… sorry PIXELS yes. I also got myself one of these fancy 1440p korean screens and it is truly glorious. GabeN’s gaze has never looked so clear.

    • TheBuff1 says:

      Woo! I have been tempted by those ever since reading the original article on RPS a while back. Could I trouble you to link to which one you got from eBay, as a quick search throws up similar but different Korean goodies? But my, for that price it is very tempting, especially judging by your pics!!

  7. dahools says:

    Jeremy, what do you think of the ultra-wide 21:9 monitors? I’m liking the look of the new LG ones. Just waiting to see when this FreeSync jobby comes out; any news on when that might be? Think any current monitors will be backward compatible when it does? Don’t want to buy something, then not long after find it’s not as good as others for gaming.

  8. Rozza says:

    You hit the nail on the head with the 4K problems of 1. Windows scaling and 2. needing an uber-powerful GPU. I’d rather play on a 1080p 32″ with all the settings turned up than on a 4K 32″ with everything off. Especially as (dons flame-proof suit) anti-aliasing makes the edges of pixels less noticeable.
    On an unrelated note, is it me or does this article read like a bit of a brain-dump? I guess that on RPS I’ve become used to articles that read like they’ve seen a bit more planning. Hope I don’t sound mean by saying this.

    • FurryLippedSquid says:

      Doesn’t 4K kinda negate the need for antialiasing?

      • FriendlyFire says:

        Depends. If you’re shooting for the pixel pitch Jeremy is talking about, then the effective DPI is similar to 1080p, you just have a much larger display, so you don’t get any aliasing reduction.

        If you’re going for a 24″ 4K screen, then you definitely reduce the need for AA significantly, but a lot of people can’t afford to strain their GPUs so hard just for less need for AA and sharper text.

        • TacticalNuclearPenguin says:

          You still get a better rendered image that is more capable of finer details before getting clogged by aliasing, though.

          Either way, yes, it’s too soon for 4K. I only use it for downsampling in a limited selection of games, and I have a manually, MUCH overclocked 780 Ti.

          We’re still transitioning to 1440p, which is getting close to being the next gold standard. Even in my case I still can’t always guarantee a steady 60 FPS, even though I’m almost always close.

  9. nrvsNRG says:

    link to extremetech.com
    ….not too shabby.
    I’m very tempted to swap out my i5-4670K for the i7-4790K in my “game cube” rig. My main desktop already has an i7-4770K, so no point there.
    As far as monitors go, I can’t wait to get my hands on the ROG Swift.

  10. Scandalon says:

    Does anyone know of a good “resolution vs video card” roundup?* Tom’s Hardware and AnandTech and such have good price-vs-performance roundups for individual components, but trawling through individual benchmarks to see at what resolutions various cards start to break down is tedious at best.

    * Likewise a CPU vs video card, for instance, I have a feeling my G860, while a great bargain on sale and planned to be replaced, is starving my Radeon 7850…

    • Scandalon says:

      (Follow-up comment to allow me to enable notification emails.)

  11. dangermouse76 says:

    For myself, I have a Syncmaster 940 (1440×900) and a projector at 1280×800. I sit 2 feet from the monitor and about 15 feet from the projector, throwing about 250″. Don’t feel the need for an upgrade in res at all. I can run games at a solid 60fps or above with cheaper hardware.
    The only niggle I have is that, as a photographer, I feel a higher res would be better for editing shots. But that’s more aesthetic, I think, for me as a user.
    I would of course love a very high-res setup, but I can’t justify the cost. New lens (£800), new monitor……. Lens, I think.

    • Juke says:

      Someone may correct my logic, but I don’t see how you can run 1280×800 on a 250″ area without it looking like you’re watching a particularly active game of Connect Four. I mean, if I’m to believe that your projector image is 250″ diagonal with a 3:2 ratio, then your viewing area would have to be like 17 feet wide x 11.5 feet high. Fitting that in an average room at all, much less with the technical limitations of the average projector boasting a lowish resolution, I just don’t see it. But you do say you’re sitting 15′ away. Do you have particularly strong prescription eyeglasses?

      All this to say, I suspect you would see dramatic differences in image quality and brightness going to a modern Full HD projector. Heck, I used to run a 1280 x 720 projector at only 90″, about 10-12′ from the screen, and I could definitely make out the pixel grid on PC sources (movies were OK). When I upgraded to a 1080p projector, I was much happier with the use of my PC at that resolution and size.

      Maybe you just have a huge room and different priorities, though. Possible, but surprising. Sorry if that is the case. I just jumped when I saw those claims. A math alarm went off in my head somewhere, incredulous, determined to be acknowledged.
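The geometry being questioned above is easy to check. Note that 1280×800 is actually 16:10 rather than 3:2, though that barely moves the numbers:

```python
import math

def screen_dimensions(diagonal_in, aspect_w, aspect_h):
    """Width and height in inches for a given diagonal and aspect ratio."""
    scale = diagonal_in / math.hypot(aspect_w, aspect_h)
    return aspect_w * scale, aspect_h * scale

w, h = screen_dimensions(250, 16, 10)        # 250-inch diagonal, 16:10
print(round(w / 12, 1), round(h / 12, 1))    # 17.7 11.0 (feet)

# Each pixel on that wall would be about a sixth of an inch wide.
print(round(w / 1280, 2))
```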

  12. Asurmen says:

    I’ve got an Overlord 27″ 1440p IPS running at 100hz. I can’t imagine moving from this monitor, well, anytime this side of turning 40 tbh.

  13. Zekiel says:

    One thing always puts me off upgrading my monitor from my piddly little 19″ thing – which is that (unless I’ve completely misunderstood how it works) as soon as I get a bigger monitor, I’ll need to upgrade my graphics card too – or suffer a comparative decrease in quality for gaming. And getting a new monitor and a new GPU always seems to be too expensive to contemplate, given a very limited budget for gaming (which I prefer to spend on games).

    • Wedge says:

      The size of the monitor itself doesn’t matter so much as the resolution it’s running at… though if you have a 19″ monitor, that is probably a lower resolution than anything else you could (or would want to) buy these days.

    • FurryLippedSquid says:

      I would like to ask, how far do you sit from your screen while playing?

      I have a massive office corner desk but I only sit about a foot and a half away from my screen. My 19″ 5:4 does me just fine. It’s like sitting in the middle rows of a cinema, just right. Anything bigger and my eyes wouldn’t be able to take in the whole screen, it’d be like sitting at the front of the cinema with my eyes darting around the screen trying to take in things in my peripheral vision. That just doesn’t appeal to me.

    • Carra says:

      It depends on your resolution. If you want to play on the native resolution of your screen and your new screen has a higher resolution then yes, your FPS will slow down.

      • Zekiel says:

        Thanks for the advice people! Yes I missed out the important resolution question – I play everything at 1280×1024 resolution which is native to the monitor. And I sit about 2.5 feet away.

        I imagine eventually I’ll get a 2nd-hand monitor from somewhere and upgrade and be amazed that I managed like this for so long!

  14. golem09 says:

    Gotta admit I’m envious of real monitors, but I’m using my PC from a couch, and thus have used HDTVs for the last 8 years. I don’t even want to know what I’d pay for 32″+ monitors…

  15. CookPassBabtridge says:

    How much grunt would you need to run Metro Last Light at 1440p, 60fps and max shinies?

    • FurryLippedSquid says:

      I would say at least an upper-end i5 and a 780ti.

    • Gap Gen says:

      You mean the Complete version, or the older version I played recently and enjoyed just fine?

      (Protip: If you have an AMD GPU, turn off PhysX or whatever in the game menu, it makes the animation horribly jerky)

    • Chorltonwheelie says:

      I’m managing it on a GTX780 (Asus DirectCU2OC), i5 3570K @4.6 and 8G ram.
      Looks absolutely beautiful on 27″ 1440. No need for any AA (maybe just a bit of FXAA, no real performance hit).
      Go for it!

    • CookPassBabtridge says:

      Thanks chaps. Wasn’t aware there was a new version. TBH I am (still) planning a new rig and that was just my estimate of the modern “can it run Crysis” game. I know I want 2K and 60+ FPS, and definitely want 75+ at 1080p as I have a Rift on order. I got a price for a “shop build” for rig and 2K monitor and they came up with £2,400 give or take (i7-4770K, 16GB RAM and 780 Ti). Haven’t sat down yet to do a self-build comparison. To be honest I am in a paralysed state with all the wiffle surrounding PC tech these days. Anyway, your recommendations seem to chime with what I was thinking.

      • TacticalNuclearPenguin says:

        The Complete version is not new, it’s just a DLC/pre-order content/Ranger mode bundle. What Gap Gen probably meant is the Redux version which, as far as I know, is still to come.

        Basically, both Metro 1 and 2 will get redone, both in the new engine, with Metro 1 seeing the biggest difference and 2 having some upgrades here and there.

        An excuse to port them to the next gen, obviously. Then again, I feel sorry for those who played 2033 on a 360, so I guess I can’t complain.

  16. Wedge says:

    Jeez, over three years later and still no real new GPUs. Of course 4K monitors aren’t happening anytime soon when we don’t seem to be anywhere close to affordable hardware that can support them.

    • Stochastic says:

      The rate of advancement in PC hardware these days seems quite sedate. The days of GPU performance doubling every 18-24 months are over. We haven’t had a major bump in CPU performance since Sandy Bridge in January 2011 (and the upcoming Broadwell doesn’t seem like it’ll push the performance envelope either, plus AMD doesn’t have much interesting in store until at least 2015-2016). SSDs are getting cheaper and faster but don’t make an appreciable difference in day-to-day use versus current SSDs, and other components are just evolving iteratively. This means that if you get a midrange PC today it should easily last you 5 years before you need a substantive upgrade, but it also means the industry is a little bit boring. Ah well, at least the display industry seems to be waking up.

  17. rcguitarist says:

    I play on a 24″ Hanspree 1080p monitor that I got on sale at Staples a few years ago for $60 and it has served me wonderfully. I can’t imagine upgrading it until 4k is so mainstream that I can pick up a 24″ 4k monitor for $150 and a 4k level GPU for $200.

  18. Darth Gangrel says:

    Well, I’ll be damned! That’s the first ever article I’ve seen here without any tags.

    An eight-year-old CPU is no problem if you like eight-year-old games and I play even older ones on my gaming laptop, but it’s only three years old.

    • Gap Gen says:

      My Q6600, which is over 7 years old, runs modern games just fine. I’ve never had a game where I thought about upgrading it.

      • FurryLippedSquid says:

        I must admit, I had a Q…um…8200? Until very recently which served me well. Sadly, I have a DayZ fetish, and that can only be reasonably accomplished with a decent CPU so I upgraded to a 4th gen i5.

        I’m a happy DayZer now.

  19. Mungrul says:

    I still love my, what, 6-year old Dell 2407.
    I love that it’s 16:10, and I love its clarity and consistency.
    Also, my main rig is an i7-2600K with 16GB and a 580GTX.
    It plays everything I choose to play absolutely perfectly on that monitor, but I suspect it would start crapping out with more pixels to fill.
    And I am in no way convinced by 4K in the slightest.
    It reeks to me of HD displays reaching market saturation point, 3D displays being a flop and manufacturers desperately flailing around trying to sell us shit we don’t need and will not be able to appreciate unless we sit uncomfortably close to the screen.

  20. Carra says:

    I bought a shiny new PC two years ago. Seeing how much I spent on it, I figured that I could just as well spend a small portion of that on a new screen to replace my 22″ 1680×1050 screen, which I had used for over five years.

    My new 2560×1440 27″ IPS screen has served me well. The first thing you notice is that, of course, it’s a *lot* bigger. And after playing some Diablo 3 on both screens, the difference was enormous. The colours are just right now, and they look much, much better. It didn’t bother me before, but after seeing the difference…

    It depends on your budget of course, but if you do spend some serious money on your PC, you’d be nuts not to spend a bit more and get a good monitor. I’m planning to reuse this monitor when I buy my next PC; it should easily be viable for 5+ years.

  21. spleendamage says:

    I love my Samsung 305t, which I got in 2007. I guess it could die any day and then I’d probably go get a 4K.

  22. DanMan says:

    One does not simply buy a monitor. I’m still satisfied with my HP LP2475w and agree that spending a lot but rarely on monitors is a wise thing. Don’t insult your eyes.

  23. waltC says:

    If you enjoy gaming as a main activity (among several) on a PC (the only *real* way to game), the most important components, imo, are your monitor & gpu. With resolutions from 1920×1200-2560×1440, the cpu, today, is almost a negligible factor. At the highest resolutions, games are pretty much 100% gpu bound, meaning there’s no difference at all in frame-rate performance between AMD and Intel cpus, for example, and the higher the resolution the more true the statement is. The cpu’s performance, most especially in non-gaming benchmarks @1024×768 or lower, tells you nothing about how your system will run games at 2560×1440 and up. At that point it’s strictly the gpu that makes the difference. My advice: run an AMD FX-6300 or FX-8xxx (6 or 8 core) @ ~4.5GHz (where my FX-6300 is running on stock voltage/air) and take the money you save over the fastest Intel cpu and plow it back into the gpu(s) of your choice. The ~$150-$200 you’ll save will put a nice little dent in the cost of a nice gpu and monitor combo. When it comes to cpus for gaming, “bang-for-the-buck” is the highest and best consideration–imo.

    The gpu you buy is 50% of the gaming graphics equation and the monitor you choose is the other half of the picture. Both are equally important.
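[Ed.: waltC’s “higher resolution means more GPU-bound” argument can be sketched as a toy model: treat per-frame CPU work as resolution-independent and GPU work as proportional to pixel count, so frame time is set by whichever of the two is slower. All figures below are made-up illustrative numbers, not benchmarks.]

```python
# Toy frame-time model: a frame is ready when both CPU and GPU have
# finished their work, so frame rate is limited by the slower of the two.

def fps(cpu_ms, gpu_ms_per_mpix, width, height):
    """FPS when CPU work is resolution-independent and GPU work
    scales with pixel count (both assumptions of the toy model)."""
    mpix = width * height / 1e6
    frame_ms = max(cpu_ms, gpu_ms_per_mpix * mpix)
    return 1000.0 / frame_ms

# Two hypothetical CPUs, one 30% slower than the other.
fast_cpu, slow_cpu = 5.0, 6.5   # ms of CPU work per frame
gpu = 6.0                        # ms of GPU work per megapixel

for w, h in [(1024, 768), (1920, 1080), (2560, 1440)]:
    print(w, h, round(fps(fast_cpu, gpu, w, h), 1),
          round(fps(slow_cpu, gpu, w, h), 1))
```

With these made-up numbers, the slower CPU costs frames at 1024×768 but makes zero difference at 2560×1440, which is the shape of the claim being argued.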

    • TacticalNuclearPenguin says:

      Nobody claims the contrary and everyone knows the GPU is the most important thing.

      You’re still downplaying CPUs though. In a perfect world, you are 100% right, but when it comes to badly multithreaded games there’s still only one way to go, which is a good overclocked i5-i7.

      One day AMD will probably find a way around their obnoxious fake-8-physical-cores architecture and allow for decent IPC and single threaded performance. That day is not today.

      • waltC says:

        Lol…”fake-8-physical-cores”….? Whuuuut? Heh…;) Oh, I get it–because AMD uses modules, 2-genuine-cores-to-the-module, you think there aren’t really two separate cores in each module. Trust me–AMD *only* has slower single-threaded performance when you compare 1 AMD core to 1 Intel core. If we use your measure and pretend that both cores on each module count as only a single core, then AMD’s “single-core” performance wipes the floor with Intel’s…;) But then that’d be unfair to Intel because we’d be comparing two AMD cores to one Intel core. Nope–building in two cores “per module” doesn’t mean it’s “fake” by any means…

        If anything is fake, it’s dual-core Core 2’s with Hyperthreading turned on telling Windows that it is using 4 cores…;) Now *that’s* what you call “fake”…;)

        • TacticalNuclearPenguin says:

          Yes, hyperthreading is even faker, I’m not denying that, but 4 strong cores are far more valuable in most gaming scenarios than 8 weaker cores with architectural limitations. You can research that yourself: a single “core” is not a complete one; it can’t do floating point operations on its own, for starters.

          And no, they aren’t weaker only if compared to Intel, single threaded performance on Bulldozer was weaker than their previous gen, the Phenom, which was no champion by itself.

          Sadly, that level of performance has proven to be a bottleneck in some gaming situations.

          HT is worse for multithreading than AMD’s architecture, but the issue with the latter is that it’s decent only on that scenario, which is uncommon in games. Also, don’t think the world will change that much, game engines will get smarter and more cores will be better used, but NOTHING will ever change the fact that any game still runs on a main thread that requires everything else to be in synch with it.

          While the other auxiliary threads will get benefits from more cores, as long as you don’t ALSO have good per-core performance your main thread will hinder you.

    • fish99 says:

      I know when I upgraded from an I5-760 to an I5-3570K (stock) I saw a flat 20% framerate increase in almost every game, even though the vast majority of games were not CPU limited on the I5-760.

      Now that upgrade also included an upgrade from DDR2 to DDR3, and PCI-E 3.0, but you can’t just say GPU is everything and other upgrades don’t matter. The old tech on the mobo, the old CPU, the old ram, were all holding back my GPU.

      • waltC says:

        I can guarantee you that at the very high gaming resolutions I was talking about, say, 2560×1440, that you would never see that kind of performance difference between those two cpus. Sure, at much lower resolutions you can see some performance differences–no question. The lower the resolution the larger the performance differences will be. But the fact is, the higher the resolution, the less difference the cpu makes (talking about all fairly recent cpus, of course) to the performance of the game, and the accordingly greater difference the gpu will make. For *high* resolution gaming, it’s the gpu, not the cpu, that makes the greatest performance difference.

        • fish99 says:

          But hamstringing your GPU by running it with slower ram, a slower cpu, a PCI-E 2.0 slot, an older chipset etc makes even less sense when you’re dealing with higher resolutions. It’s like when I see people saying “I have a Q6600 running at 4GHz therefore I don’t need to upgrade”, but the fact is, yes you do – firstly, you can’t compare CPUs just on frequency – each generation gets way more transistors and a more efficient design, so they do way more work per clock cycle. Then add quicker ram, a quicker chipset, PCI-E 3.0 etc and you’ll find the latest tech will blow that Q6600 out of the water, and that’ll translate into better framerates across the board.

          The point here is that there’s more to the equation than whether you’re CPU or GPU limited. Like I said when I went from I5-760 to I5-3570K (without changing GPU) there was a framerate increase across the board in games which weren’t CPU limited on the I5-760. And if you’re not CPU limited then the only thing determining framerates should be the GPU right? So why would framerates increase? Like I said it’s because the equation isn’t that simple – running your GPU with older tech is reducing its performance.

  24. Dr I am a Doctor says:

    I fucking love my apple thunderbolt display.

  25. TacticalNuclearPenguin says:

    “Finally, Intel’s new we-still-love-PC-enthusiasts CPU models are out.”

    Nobody told me that Haswell-E was released!

    • Jeremy Laird says:

      Haswell-E are rebadged server chips.

      • TacticalNuclearPenguin says:

        And DDR4, and a lot of real PCI-E lanes, possibly some decent overclocking too, etc etc. Either way, mine was mostly wordplay, simply because Intel likes to call only the E segment “enthusiast level”. If you’re trying to tell me that they should just call it “insane level”, though, we might be in agreement, I guess.

        ‘sides, this whole Devil’s Canyon deal is just Intel rehashing in a proper form what was a stupid idea in the first place.

  26. markerikson says:

    My entire PC is over 7 years old now, and it still runs most games just fine. I’m currently enjoying Borderlands 2 with zero issues.

    This, I think, is one of the benefits of consoles dominating the gaming market these days. Hardware does not need to be upgraded nearly as frequently. I am planning to buy a whole new PC soonish, now that the next-gen consoles are out.

  27. PC-GAMER-4LIFE says:

    21:9 aspect ratio ultrawide is the new hotness, & cheap as well compared to 4k. Why no mention of them, RPS?

    • Jeremy Laird says:

      Well, I’ve mentioned them in the past. This wasn’t intended to be a comprehensive ‘what to buy guide’, more a call to arms.

      Also, I haven’t yet seen one of the new 34-inch superwides, so it’s a bit early to recommend them.

  28. caff says:

    I currently use an old Sharp 32″ TV with 1920×1080 at an unknown refresh rate (it says 60Hz in display settings but I don’t really believe it). The colour balance is poor but generally the large size suits me and it performs ok in games.

    If I wanted to upgrade to a similar or better sized monitor (32-40inch) with good resolution, good refresh and nice quality – but not so ridiculous that I couldn’t run games on it (I’m thinking 1440p might be suitable?) – is there a good model out there right now?

  29. gunny1993 says:

    Bought a 1440p Korean Qnix import monitor for 280 quid, not looked back since, such an improvement over my old BenQ 1080 gaming whatsit.

    Now to wait for 4k

  30. Emeraude says:

    Finally exchanged my old faithful 21″ CRT for a BenQ XL2420T at the end of last year – couldn’t find a way to repair it – and frankly, after some months of use, I must say I’d rather have kept the old one.

    I don’t really see any gain of worth for the inconveniences it brings.

  31. Scilantius says:

    Any 16:10 brothers and sisters present? I jumped straight from 1280×1024 to a 1920×1200 Dell u2410 – and I /love/ the display, even though it heats up like a small sun at times.

    • Fry says:

      Here. I have no idea why anyone buys 16:9 when 16:10 is available. 1920×1200 is the shit.

      4K is silly right now. Give it a few years.

    • tehfish says:

      I’m still rocking with my 1920×1200 Dell WFP2408

      Was expensive, but the best tech buy I’ve ever made by a long shot :)

    • Juke says:

      Representing 16:10 for as long as it’s available. I’ve had 1920×1200 displays at home and office for a while now. Been thinking about moving up, but 2560×1600 is a rarer panel res than 2560×1440, and is often a step up in overall screen size (& more importantly to me, cost.)

      Korean import options are compelling, though part of me wonders if it’s worth waiting a bit to see whether this G-Sync (and comparable) tech arrives with enough of a difference to gaming to justify including its support in my next display purchase. Might it bridge the gap in display tech until the arrival of an affordable 4K display + GPU pair?

  32. Whelp says:

    I got no money for a better GPU, so I’m keeping my shitty displays.

  33. Shooop says:

    link to tftcentral.co.uk is the most comprehensive site I’ve ever seen for monitor reviews. They even cover response time for gaming.

  34. Low Life says:

    Seiki announced new 4k monitors in multiple sizes with VA panels: link to pcper.com

    It’s going to be a while, though, since they’re shipping next year. After using an IPS monitor for about six years I’d rather take VA as my next one.

  35. Smoky_the_Bear says:

    Upgrading to a good monitor is fine if you are covered for the future; as you said, your Dell at £800 cost only £100 a year. With these new 4k panels, the top end is very expensive, the lower end still expensive but more affordable. The worry I have is: are these panels good enough quality to last long enough to get the most out of them and still be working at the point where 4k resolutions become the standard?

    My last screen just flat out died after about 4-5 years. If I shell out for a 4k screen now and that happens, it’s a complete waste of my money. I may as well buy a cheaper 1440p panel now and then in a few years upgrade to a 4k screen, which will invariably be much cheaper by that point, costing me probably about half as much. The only way it’s cost-effective is if the 4k panel I buy now lasts for 8+ years like your Dell has. Information on the longevity of these screens just doesn’t exist because they are so new.

  36. Lomaxx says:

    I’m still using my Eizo L557 17″@1280×1024, which I bought on 10.09.2003 for 550 euros (including a zero-deadpixel-guarantee). Onscreen-info says it has run for 27661 hours. That’s ~50 euros per year, or, if you really want to calculate it, ~2 cents per powered-on hour. ^^
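[Ed.: Lomaxx’s cost-per-use sums check out; a quick sketch, where the ~11 years of ownership is my assumption based on the 2003 purchase date:]

```python
# Sanity-checking the cost-per-use arithmetic in the comment above.
price_eur = 550.0      # purchase price, from the comment
hours_on = 27661       # powered-on hours, from the on-screen info
years_owned = 11       # assumption: bought late 2003, roughly 11 years ago

print(round(price_eur / years_owned, 2))      # ~50 euros per year
print(round(price_eur / hours_on * 100, 1))   # ~2.0 cents per powered-on hour
```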

    But yeah, I’ve been looking for a suitable new monitor for some time now. :P

  37. choie says:

    Coincidentally, just last week I completed my hemming and hawing about whether to spend some dough on a new monitor and finally upgraded my 2007 19″ 1440×900 Viewsonic to a 24″ 1920×1200 ASUS PA248Q. At first I had been thinking, “hmm, maybe I should go for a 27″, will a 24″ really be that much of a difference?” But the price was too high for a 27″ that was also 16:10 (my old monitor was 16:10 and I really don’t like the shorter versions–for my non-gaming life I edit documents and do graphic design, so I need more height than a widescreen can afford me). And now that I have this new baby on its own cart (I have a studio apartment and can just roll the monitor from desk to couch when I wish), I am so incredibly happy! This screen is HUGE. And crystal clear and immersive as hell since I sit only about 18″ away. The first things I tried on it: DX:HR and my “The Two Towers” DVD, both of which were splendid. Anyway, I have to agree that a screen upgrade makes a great deal of sense if you are super out-of-date (as I was). It’s lovely.

  38. mandrill says:

    I have two monitors: a Dell 17″ (not WS) that is nearly a decade old, came bundled with a desktop and is still going strong, and, until recently, an Iiyama 22″ widescreen. The Iiyama died last year and needed replacing, so I did so with a 23″ LG monitor.

    Only replace your screen if you need to. Be nice to the one(s) you have and they’ll last, as the Dell proves.

  39. aperson4321 says:

    4K useless?

    Unless you are a half-blind old guy, playing Elite in medium/high with TrackIR in 4K is utterly, ridiculously amazing. And that is coming from a guy with a 24-inch/1920×1080 triple-monitor surround setup.

    The internet and its ability to pull endlessly baseless assumptions out of its ass 24/7.

    If you focus on the whole screen then yes, 4K will not have a big effect beyond removing jaggies. But if you use that mystical ability to focus on a small part of the screen – using those focus lenses for looking at distant things that your human eyes have, that you didn’t know about – then you will realise that the blurry distant things in games are now perfectly detailed, and that the textures and small details right next to you in games are now so much more detailed. 4K compared to FHD and XHD is just more: more detail, more visual information.

    The feeling of immersion you get in games like BF Heistday/RO is just unique in gaming when, instead of pushing a button to focus on something distant, you simply use the normal human way of focusing on that small something in the far distance. The emergent gameplay of being flanked because you looked at something small in the distance on a 4K monitor is rather exciting, and adds depth I did not predict.

    • Emeraude says:

      It may surprise you, but people *have* different priorities.

      I mean, I’m among the people who still think the whole HD fad broke what I was perfectly satisfied with to give me what I find perfectly superfluous. The only thing 4K is to me is yet another norm bringing nothing of worth, and one to which I’m probably going to be forced to upgrade sooner or later anyway – as if audiophiles had managed to force SACD as the new norm on a population that, for the vast majority, does not in the slightest care about that level of audio quality (hence why the mp3 format won: convenience was deemed better than superfluous quality).

  40. phuzz says:

    I was planning on getting a new monitor this year, as my venerable old Sony 19″ is really starting to have problems with ‘screen burn’*, and dead pixels.
    Then I ordered an Oculus Rift, which basically takes all the money I was going to spend on a nice screen, but it is kind of a screen itself.
    Fortunately my 1680×1050 Dell is still going strong, and only cost about £150 new, so that was a real bargain.

    *I can’t imagine how a TFT is getting screen burn, but if I leave a static image on screen for 20mins then it’ll be faintly visible for the next half hour or so.

  41. acu says:

    Still don’t feel like getting a new monitor. I’m waiting for good G-Sync/FreeSync monitors, because with the current one I have terrible screen tearing. It doesn’t even matter if I turn on V-Sync in some games, and I have no idea why. I tried forcing V-Sync in Skyrim via the config and with the Nvidia control panel. Still screen tearing. :(

  42. Advanced Assault Hippo says:

    If your PC is mainly used for gaming and some non-important browsing/regular stuff, a well-calibrated quality TN screen can still be the way to go – assuming it’s just going to be you using it head-on.

    Iiyama make some cracking TN monitors for gaming.

  43. UKPartisan says:

    I’m waiting for my new Samsung 1080p to be delivered as I type this. I’ve been using an old Philips 190C 1280×1024 monitor since 2004; occasionally I hook up my PC to a Toshiba 40” TV. Great for watching films, but overkill for gaming. I went with 1080p, as I don’t think 4K gaming is particularly sustainable for me just yet. The yellow triple-fanned card pictured in the piece isn’t an R9 290… it’s a Sapphire R9 280X Toxic Edition 3GB. I just recently purchased one, hence the new monitor to do it justice.

  44. Frank says:

    I usually have no idea what the columns are about, but I get rotating screens! Yes, people, get one. I got a cheap (< $200) 1050×1680 Acer and keep it permanently rotated for reading websites and documents, while my main screen, 1920×1080, runs all my games in 1600×900 windows for easy switching (and because my video card is weaksauce).

    Wait, I just searched the page for "rotat*" and "pivot" and found nothing. Oh well.

  45. hatseflats says:

    Once again, I find myself disagreeing with Jeremy. First of all, I don’t think it’s worthwhile to buy a high-end monitor. Yes, screens can last you a long time. But the same holds true for a mid-range screen. You could’ve bought a decent 1080p screen for £200 7 years ago rather than an £800 1440p screen. Is the difference in price for the higher resolution (and some other advantages) worth the 400% premium? The time element makes no difference here, as both last for a long time; if you think the price differential is worth the benefit, then buy it; if not, then don’t. I personally think it’s much better to spend £200 on the screen and £300 twice on GPUs, as you get much better image quality at the same price.

    Secondly, I think 1440p is rather pointless, 4K is going to be great, and now _is_ the time to wait. The market for monitors had been stagnant for years. Suddenly, there are two exciting developments: freesync/gsync and 4K. 4K is actually far more useful than 1440p: it is a really significant increase in resolution (rather than the half-baked 1.33× increase in vertical and horizontal pixels), which is great when reading text on a screen. Yes, for gaming it’s less useful, due to performance issues, but 4K is easily downscaled to Full HD (by simply making 4 pixels 1 pixel). On LCD screens, up- and downscaling otherwise often doesn’t work very well, as somehow 3 pixels have to be made into 2, or into 1.4, or anything else, if you change the resolution.

    Actually, for gamers on a budget, a 4K screen is a far better choice than 1440p. Although 1440p offers only 1.33 times as many pixels in width and height, the total number of pixels goes up by 80%, and this affects performance in games dramatically for a relatively small gain. Again, due to problems with downscaling, going back to 1080p isn’t a great option either. And some (especially older) games don’t even support 1440p, so you get the same issues. If you can’t afford to buy a high-end GPU every new generation, you’re better off buying a 4K screen, enjoying it in everything but gaming, and switching to 1080p in games. 4K becomes even better if you realise that new 4K screens are hardly more expensive than 1440p screens.
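[Ed.: the “4 pixels become 1 pixel” point is why 4K-to-1080p scaling is clean: every output pixel is the average of an exact 2×2 block of input pixels, with no fractional pixel boundaries to interpolate across. A minimal greyscale sketch, my own illustration rather than anything from the comment:]

```python
# Halve width and height by averaging each exact 2x2 block of pixels.
def downscale_2x(image):
    """`image` is a list of rows of greyscale values; both dimensions
    must be even (as they are for 3840x2160 -> 1920x1080)."""
    out = []
    for y in range(0, len(image), 2):
        row = []
        for x in range(0, len(image[0]), 2):
            block = (image[y][x] + image[y][x + 1] +
                     image[y + 1][x] + image[y + 1][x + 1])
            row.append(block // 4)  # integer average of the 4 pixels
        out.append(row)
    return out

# A 4x2 strip becomes 2x1.
print(downscale_2x([[0, 0, 100, 200],
                    [0, 0, 100, 200]]))  # [[0, 150]]
```

Non-integer ratios (e.g. 1440p shown at 1080p) have no such exact block mapping, which is the blurry-downscaling problem the comment describes.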

    And then there’s the fact that 4K is obviously going to be the new standard: 1440p has been available for years now and yet hardly anyone actually has a 1440p screen. Yes, Windows doesn’t scale very well. But 4K has only been available for a short time. As the market share of 4K increases, developers will start to adapt their software to make use of the higher resolution. Prices will also go down.

    Then there’s freesync/gsync. These technologies not only make for a much more pleasant experience, they also make lower framerates less of an issue. So it’s worthwhile to wait for monitors supporting freesync: even if gsync is better, the free alternative will push down the price difference between gsync and other monitors. But freesync is still some way off.

    So at this point, it’s absolutely worthwhile to wait a bit: in a year, maybe two, 99% of all software will work well with high-resolution displays, prices of 4K monitors will have come down significantly, and you’ll get freesync with it as well. If you buy a 1440p monitor today, you’re going to hit yourself over the head in two years’ time for not buying 4K.

  46. Foosnark says:

    My pair of monitors may be a decade old and 1680 x 1050, but my eyes are 42 years old and 20/50.

    It would be extremely silly to upgrade these monitors, and then have to upgrade my video card to drive them, and then probably have to upgrade my whole computer to drive the video card, and probably have to rebuild my desk to fit bigger monitors (I also have audio monitors for music production, and they need to be placed just so) when I’m perfectly happy at the resolution I’m running now.

    …even if my Nexus 5 pushes more pixels.

  47. Radiant says:

    You have to talk about lag and screen delay when you’re providing monitor advice on a pc gaming website.
    It’s a gaming website. Lag and screen delay are incredibly important.

    • Jeremy Laird says:

      We haven’t just spoken about it here:
      link to rockpapershotgun.com

      I’ve even done a video:
      link to youtube.com

      As I mentioned above, this post wasn’t about providing a detailed buying guide, which is available in many forms in recent posts covering all the relevant topics. It was a call to arms.

      • Rikard Peterson says:

        If you don’t mind me saying so, it’s a pretty misguided call to arms. Are you saying a monitor should be replaced just because it’s a few years old? Ordering us to buy a new screen simply on the basis of age? That’s silly. As long as it’s working properly, it stays.

        And I find “That’s just £100 a year for the privilege of running a glorious 30-inch IPS display. Personally, I think that’s a bargain.” almost offensive. £100 a year is a lot of money. If it’s a bargain to you, good for you, and I’m not suggesting that you should be ashamed of that or anything, but I would like you to think a little more about your words the next time. (Reading the following paragraphs makes me wonder if you maybe were making some kind of joke rather than being serious? In that case, it’s simply bad taste.)

  48. Kittim says:

    I want a graphics card that will play games AND do 10 bit per channel for when I’m editing photos. AMD and Nvidia both force you into the “Pro” range of cards for 10 bit per channel yet they are worthless for playing games with.
    Why can’t I have the best of both worlds? As far as I can tell I can’t even house a gaming GPU and a “Pro” GPU in the same PC.

    My camera has 14 bits of data per channel, yet I can’t get an affordable graphics card that can display that colour depth and play games.

    AMD and nVidia need to step up, the first one that offers such a thing has my purchase hands down!

    • joa says:

      The human eye can barely distinguish 8 bits of colour per channel, never mind 10-bit or 14-bit. That’s why most monitors are 6-bit. The fact that you have bought a camera that reproduces more colours than humans (who I assume are your audience) can observe is a bit silly.

  49. fingerboxes says:

    Suggesting that people buy a new monitor before it is realistically possible to buy anything with an adaptive sync implementation? What madness is this?

  50. TheBuff1 says:

    I am so tempted to buy one of the ultra-cheapo Korean 1440p monitors that are available on eBay, but I find that there are many different sellers offering what seem to be near enough identical products. Can any kind RPS readers who have gone ahead and bought one point me in the direction of the sellers they have used? Also, how reliable are they? Has anyone been using them for a few years yet?