Week in Tech: Why PC Monitors Aren’t Going to Get Better

Equitable though Her Majesty’s United Kingdom of Great Britain and Northern Ireland may largely be, a few isolated injustices still stalk the land. That I have to work for a living hardly seems fair, for instance. But even more odious is the fact that consumerist tat like smartphones, ultrabooks and tablets now has better screens by many metrics than our hallowed PC monitors. What gives? A recent interview I did with monitor maker Iiyama for ye olde PC Format mag dug up some answers. I also discovered why things aren’t likely to dramatically improve any time soon. Meanwhile, the roller coaster ride for AMD’s fortunes continues. This week, I predict survival!

Panel prognostications
First, display tech. As our very own Alec ‘Fingers’ McMeer will confirm, I’ve an unhealthy fetish for flat screens. I want them bigger. I want them better. I’m never really satisfied. So the state of smartphone, ultrabook and tablet screens has got my dander up.

Take high-end smartphones. 1080p is now the default resolution for top handsets. That’s as many pixels as any PC monitor up to and including 24 inchers. OK, there are a few 1,920 by 1,200 panels around. But it’s very much the same ballpark.

Odds are this has at least as many pixels as your desktop panel…

Meanwhile, tablets and ultrabooks are now popping up with 2,560 by 1,440 and 2,560 by 1,600 panels. That puts them up with 27-inch and 30-inch monitors. Google’s Chromebook Pixel takes things a step further with a 2,560 by 1,700 panel.

Obviously pixel density isn’t everything. Especially not for gaming where big resolutions give graphics cards a pummelling. But mobile displays are often simply better quality thanks to newer panel tech, like the latest iteration of IPS or screen technology you just don’t get in monitors, most obviously AMOLED.

Inevitably, it all comes down to money. According to Steve Kilroy, UK manager for display specialist Iiyama, part of the problem is that the market for monitors is shrinking as both private punters and businesses plump for portable devices over traditional desktop PCs.

…and this very likely has a lot more

So, with all the money sloshing around in the mobile market, the bulk of the investment in terms of modern panel manufacturing is focussed on laptops, tablets and smartphones.

What’s more, even if sales of portables and desktops were more evenly matched, there’s an incentive to major on mobile. “Screen substrates are produced in long sheets. And you can cut more screens for tablets out of a single sheet than for monitors,” says Kilroy. More screens per sheet means more money.
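To put a rough number on Kilroy’s point, here’s a back-of-envelope sketch. The substrate and panel dimensions are illustrative assumptions on my part, not real fab figures:

```python
# Back-of-envelope: how many panels of a given size fit on one substrate
# sheet, assuming a simple grid cut with no kerf or edge waste.

def panels_per_sheet(sheet_w_mm, sheet_h_mm, panel_w_mm, panel_h_mm):
    """Whole panels cut from a sheet in a plain grid layout."""
    return (sheet_w_mm // panel_w_mm) * (sheet_h_mm // panel_h_mm)

# Hypothetical Gen 8-class substrate, roughly 2200 x 2500 mm
SHEET_W, SHEET_H = 2200, 2500

# Approximate active-area sizes in mm: a 27-inch 16:9 monitor panel
# versus a 10-inch tablet panel (both rough figures)
monitors = panels_per_sheet(SHEET_W, SHEET_H, 598, 336)
tablets = panels_per_sheet(SHEET_W, SHEET_H, 217, 136)

print(monitors, tablets)  # far more tablet screens per sheet
```

Even this crude cut yields an order of magnitude more tablet screens than monitor screens from the same sheet, which is Kilroy’s “more screens per sheet means more money” in miniature.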

Don’t ask, don’t get
Then there’s the fact, according to Kilroy, that there’s not much demand from mainstream consumers for high res or high DPI monitors. The general upshot is that panel makers reserve their best technology for phones and tablets.

Kilroy also doubts we’ll see any major changes in basic screen technology. “There’s no sign of OLED making it into the mainstream,” he says.

Indeed, according to Kilroy the trends most likely to transform the monitor market in the near term are things that aren’t necessarily of much interest for gaming or trad desktop computing. Like interactive touchscreen displays for Windows 8. Ghastly.

All pretty grim, then. But there is one glimmer of light among the gloom. Kilroy sees IPS completely taking over from TN in the next few years.

Apple’s 13-inch MacBook Retina packs a potty 2,560 by 1,600 pixels

“Once IPS panels become capable of 2ms to 3ms response over the next two to three years, you can expect to see IPS completely replace TN technology for all but the most price sensitive parts of the market,” Kilroy reckons.

Of course, the snag with that notion is that the new generation of cheap 6-bit IPS panels isn’t all that wonderful. Whisper it, but they’re not that much better than the latest TN efforts. It’s really only viewing angles where cheap IPS retains an obvious advantage.

It’s not great news but I thought it was worth sharing. At least you know that buying a monitor today doesn’t expose you to much risk of waking up a few months later and finding the market has been revolutionised and you’d have been better off waiting.

A couple of promising panels
Having said all that, I just so happen to have had a play with two new 27-inch screens recently. I suspect both are based on the same 2,560 by 1,440-pixel IPS panel, probably from LG. Anyway, I’m talking about the Iiyama ProLite XB2776QS and ViewSonic VP2770-LED.

The latest 27 inchers from Iiyama and ViewSonic are perfectly peachy

Both are roughly £400 panels, so not exactly cheap. But I think they may just be the most beautiful screens I’ve ever seen for pure image quality. Unlike earlier 27 inchers, they have nice, smooth anti-glare coatings rather than that sparkly gunk. I reckon the whites are a bit cleaner and brighter than before. And the contrast looks improved, too.

Anyway, they’re stunning, so there is still progress being made. If you can afford them and have a graphics card that can handle the heat, I very much doubt you’ll be disappointed.

AMD, again
So, AMD. Have reports of its imminent demise been greatly exaggerated? One intriguing way of tracking the fortunes of a company is to observe which employees are walking through the door and the direction they’re going.

A few years ago, AMD suffered a major brain drain as Apple snapped up some of its finest and brightest CPU and graphics engineers. Well, it appears at least some of those people are returning to AMD.

Last year Jim Keller, who worked on chips for iPhones and iPads while at Apple and was the lead architect on the Athlon 64 (AMD’s last really successful CPU design), returned to the AMD fold. Then late last week came news that graphics guru Raja Koduri has also returned from Apple.

Ultimately their identities and occupations aren’t the point. What matters is that they clearly judge AMD is enough of a goer to make returning worthwhile. And you have to assume they’re in a pretty good position to make that call. Ditching Apple for AMD says something.

I know I go on about this AMD roller coaster a bit and that it entails a certain amount of flip-flopping. But it really is critical for the health of the PC to have at least two players in both the high performance CPU and graphics markets. And that means AMD must survive.


  1. Koozer says:

    :( We still have a long wait for the day when AA is obsolete then…

    • Meat Circus says:

      The PC is dying, so why waste effort bringing retina displays to the form factor?

    • elfbarf says:

      You can already downscale from 1440p to 1080p. It makes games without proper AA (Dishonored for example) look great.

    • buxcador says:

      AMD must survive, but Orcs must die… I mean, iZombies.

    • Apocalypse says:

      AA has been obsolete for me for three years now. 27″ and 30″ monitors have been really cheap for a while, and while they offer no significantly higher DPI, they offer perfectly fine resolutions with their 2560×1440 or 2560×1600 displays.

      They are dirt cheap as well, and they work fine in pairs, or even better in setups like 20″/30″/20″ with 1200×1600, 2560×1600, 1200×1600 resolutions.

      4960×1600 is a very fine resolution, it is not really expensive anymore, and if you prefer single-display 4K solutions, you can get some for about 1000€+VAT soon as well.

  2. Ignatius J. Smiley says:

    Is the main reason not because Windows can’t really handle the high pixel density?

    If we had a Chromebook-like 2,560 by 1,700 display on a 22″ monitor, everything in Windows would be too small to read.

    I know there are settings in Windows to increase font size but in my experience this doesn’t work properly, with dialogue boxes truncating text and some applications simply ignoring the setting.

    • Malcolm says:

      The stupid thing is that Windows has had support for arbitrary DPI settings for many years, but with the occasional exception of “Large fonts mode” (120dpi) it’s very poorly supported. All the APIs have been there for an age, but using them is difficult. For most UI frameworks (WPF/Metro excepted) it requires significant effort to achieve good results, and since so few people have high-DPI monitors, developers rarely bother.

      It’s a bit of a chicken and egg thing I expect.

      • LionsPhil says:

        Yeah. It doesn’t help that most developers don’t even test with non-default colourschemes, let alone sizes.

        The Linux world gets a slightly easier job of it here, since GTK+ and Qt both offer dynamic spring-based layouts as their normal way of building a UI, and have offered and lived in a world of varying themes (like light-on-dark colours) from the get go.

        Mac development, IIRC, is all static sizes like Windows. Longer translation strings? Larger font? You’re scuppered. In theory, linearly scaling literally every UI measurement should work, but I’ve yet to see a non-mobile/non-everything-is-the-web platform do that.

    • PoulWrist says:

      No, it’s not. Windows uses a bunch of vector graphics for its interface and even without any scaling it’s still easily usable on a high DPI monitor. Perhaps even more so than on a 22″ 1920×1080.

      link to youtube.com win 7 on a 2800×1800 Macbook Pro.

    • stahlwerk says:

      Windows and HiDPI are just a mess, and arguably worse since Metro, because it freed developers from the onus to actually care about desktop DPI settings and introduced two scaling factors at 1.4 and 1.8, which seem to be determined by astrology. There was a post by Steven Sinofsky on the Building Windows 8 blog about a year and a half ago detailing Win8 and HiDPI [1], but for the life of me I cannot figure out how to get my 1080p TV to even run Metro apps in the 140% mode, not to mention the 180% scaling.

      Apple did it just right by requiring straight 2x resolved images from devs wanting a retina “certification”. You get a performance hit from rescaling the 2x images to your current display DPI, but it works now and looks gorgeous, and the scaling hardware and software will get there eventually. Hell, I can force my 2007 Macbook on Lion to output to the TV in HiDPI mode (960×540) and it is workable, even on the crappy GMA 950 iGPU!

      [1] link to blogs.msdn.com

      • LionsPhil says:

        2007 Macbook on Lion to output to the TV in HiDPI mode (960×540)

        …are you missing a pair of zeroes there?

        • stahlwerk says:

          Not at all, good sir.

          The HiDPI mode effectively halves the resolution visible to the OS’s clients, but forces a 2x-overdraw. Hence, 1080p turns into 540p, yet viewing content in a retina aware app will output as if overdrawn, for example I can still view 720p movies in full resolution in VLC. It’s all pretty clever, since it’s mostly transparent to the developer (provided standard system calls are used for display related stuff). No mucking about with 1.0, 1.4, 1.8… scaling factors.

          • LionsPhil says:

            Hunh. So it’s a fake-the-measurements trick, like PalmOS used to do yonks ago. Didn’t realize that was available on desktop OS X.

  3. Lenderz says:

    I can’t agree more on the importance of monitor DPI and quality. I went all out and bought a Dell monitor a few years back (before 1080p was as ubiquitous as it is now) with a res of 2048 x 1152. Whilst not a massive leap over 1080p, when I attempted to buy a second last year I found they’d stopped producing the model in favour of 1080p monitors in the same price bracket, so I was forced to get one of those instead.

    Edit: the model of my 2048 x 1152 is the Dell SP2309W – a lovely 23 incher.

    Having the two side by side there’s no contest. I’d love to get another one. :(

  4. ukpanik says:

    Bought the VP2770-LED last month, it’s a lovely monitor.

  5. Strabo says:

    You are sitting 30-50 cm from your desktop screen away, which makes high ppi-numbers less important. I’d rather see more RGB-LED, 2560×1440/2560×1600 screens with a big colour space, better response time and less energy consumption, maybe even higher refresh rates than an increased resolution, and I say that as someone who owns and loves high-resolution devices like my iPad 4, Nexus 10 and Macbook Retina 15″ (and if it ever is available HTC One unlocked).

    • Caerphoto says:

      > You are sitting 30-50 cm from your desktop screen away, which makes high ppi-numbers less important.

      Somewhat less, sure, but the difference would still be noticeable, particularly with things like text, which is something almost every computer user spends a lot of time looking at.

      On the other hand, I have a 27″, 2560×1440 screen and find that antialiasing isn’t really needed in most games (that my system often can’t really handle it is of course a factor, too).

      I do still want a 27″ 5120×2880 screen though. That would be fabulous.

      • phuzz says:

        If I’m sat in my normal ‘playing games’ pose, the closest bit of my monitors is 59cm away from the tip of my nose, up to about 70cm from my nose to the edges.

  6. wuwul says:

    There will be affordable 4K monitors when 4K TVs become affordable.

    Also, of course you can buy 4K monitors today if you are willing to pay $5000+, and it seems 8K TVs should come out soon (likely for 50-100k$).

    But yeah, it kind of sucks that we don’t already have (or even expect to have) cheap 16K 48” monitors.

  7. Muzman says:

    Interesting. I’ve been on 2,560 x 1,440 IPS for two years or so now and wouldn’t go back. When I bought it everyone said 120Hz was the thing and all of these would be dumped pretty soon. I guess not then.
    All of the production and TV world is talking 4k screens now, so obviously someone is making big high res panels.
    Hard to imagine someone won’t jam one of those panels into a PC monitor sooner or later

    • Malcolm says:

      A 55″ 4K TV panel is only 80dpi. Practically the same pixel density as a 32″ 1080p panel – it’s just a larger slice from the same sheet.

      • Muzman says:

        Yeah but if the standard gets adopted (which is the plan) you’ll presumably see a variety of screen sizes.

    • fish99 says:

      Personally I would like to see all monitors transition over to 120Hz, rather than having to choose 120Hz TN or the better colours and viewing angles of 60Hz IPS. It’s not just the extra smoothness, it’s that 120Hz means you never have to worry about the whole tearing / framerate-drops-from-v-sync thing ever again (Triple Buffering isn’t a perfect solution since it causes extra lag).

      For the record I have both a 60Hz IPS (Dell U2312HM) and a 120Hz TN (Asus VG236HE) and I game on the 120Hz. Luckily it has the best viewing angles I’ve seen on a TN panel. 3D is a nice bonus too when it works well, via nvidia 3D Vision.

  8. Mr. Mister says:

    Some time ago (probably years), I read that the next thing after OLED would be OLET or something like that, with transistors instead of diodes. Was there any truth in that?

  9. Kobest says:

    Weird coincidence, but just two hours before I read this article my good, 7-year-old, 19″ Samsung LCD (900NW, 1440×900) decided to give up on me. :(

    A quick recommendation for a replacement for 150 €? My computer can handle 1080p. I would highly appreciate it! :)

    • Hypernetic says:

      I’m not sure about that price range, but link to asus.com is an excellent monitor and well worth the extra investment. It’s an IPS panel rather than TN so it has excellent viewing angles and much better colors. It comes factory pre-calibrated and is great for gaming, watching movies, and graphic design/photo editing. It’s also 1920×1200 so higher than 1080p.

      • Kobest says:

        Sadly, not within my budget range, but thank you, good sir! :) I think I’ll go with a Dell U2312HM for 180€ here, in the home of Huns.

        • Silver says:

          Def go with the Dell U23, it is awesome :)

        • Sampy says:

          I received my U2312HM yesterday. Great looking monitor but had terrible backlight bleed in the bottom left corner, so I’m sending it back. Some googling revealed that it was a fairly common issue, and I found a post on the Dell forum where someone with the same issue (the photo they posted matched my monitor’s problem exactly) was told that it was still within Dell’s quality guidelines. Apparently they only check the central 2″ circle of the screen and if that passes then the monitor is good to go.

          I’m currently looking for something else, although most recommendations I see in that price range are for the Dell!

      • Casimir's Blake says:

        Hypernetic: Thank you for posting that recommendation. I currently have an older 24″ NEC monitor with a 1920×1200 res, but it’s not IPS. I may well move over to that Asus one eventually, unless they make something similar with a larger res.

        I bought that NEC monitor 2 years ago. Finding a 24″ monitor with that screen res, and not 1920×1080, was tremendously difficult. It’s reassuring to find SOME 16:10 ratio IPS monitors out there!

      • LionsPhil says:

        …why would a monitor carefully calibrated for visual designers then have “a color engine that automatically analyzes and determines the nature of the user’s current task, thereafter adjusting the display’s parameters” crammed into it, with a little before/after thumbnail of it oversaturating a photo of some scenery? :|

        • Hypernetic says:

          Because it’s also intended as a consumer monitor, and that feature can be turned off? It’s not perfect, but it’s only $300; if you want a perfect monitor you will have to spend $2K plus.

          • LionsPhil says:

            It just seems a strange combination of features, like trying to advertise a muscle car that’s also an economical town-and-city cube.

  10. BobbyKotickIsTheAntichrist says:

    Seeing that gaming in anything other than native resolution tends to lead to bleeding eyes, why would I want a monitor with more than 1080p resolution if I don’t have the funds for, or don’t want to spend the money on, an appropriately potent rig? Not to mention that ultra-high resolutions tend to suck if you sit a foot or two away from the display (which would be standard for PC) and UI scaling worth a damn seems to be low priority.

    • Jekhar says:

      With the two displays mentioned you should in theory be able to safely drop down to 1280×720 without much of a quality hit, as this is exactly half the native resolution in each dimension.
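Jekhar’s clean-halving point can be expressed as a quick divisibility check; `integer_scale` is a hypothetical helper of mine, not from any real API:

```python
# A render resolution scales cleanly when the native resolution is the
# same integer multiple of it in both axes: each game pixel then maps to
# a whole block of display pixels, with no interpolation blur.

def integer_scale(native, target):
    """Return the integer scale factor if target divides native evenly
    in both dimensions, else None."""
    (nw, nh), (tw, th) = native, target
    if nw % tw == 0 and nh % th == 0 and nw // tw == nh // th:
        return nw // tw
    return None

print(integer_scale((2560, 1440), (1280, 720)))   # clean 2x
print(integer_scale((2560, 1440), (1920, 1080)))  # no clean factor
```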

    • Muzman says:

      Why do ultra high resolutions tend to suck?

  11. Lord Custard Smingleigh says:

    Why, oh why, is nobody buying these super-high-pixel-density monitors they aren’t manufacturing because there is no demand, to increase demand to the point where they make super-high-pixel-density monitors for us to buy?

    • stahlwerk says:

      One could even argue that 1366×768 TN-Panels in notebooks are doing just fine in the market, so why even bother?

    • LionsPhil says:

      Because they’re expensive.

      I will be sad when I finally run out of CRTs.

      Also, can we please get over widescreen already and go back to 4:3? Please?

      • Lord Custard Smingleigh says:

        Widescreen TVs have shown me how many of the people I formerly respected and cared for are perfectly happy to watch a 4:3 aspect picture stretched out sideways.

        Now I have new friends and family.

  12. mont3core says:

    Why in the world would you even mention a monitor with a 12ms response time?!

    • stahlwerk says:

      Let’s math!

      Frequency f = 1 / T (Length of refresh interval in seconds)

      given f = 60 Hz, solve for T.

      • LionsPhil says:

        Given response time is a duration during which pixels are changing, wouldn’t that mean 60fps content would be a continual smear, though? A white/black flash would get smoothed out to a constant greyscale fade up/down. It’s the bare minimum; anything slower wouldn’t actually be able to display the signal.

        (I am aware that in practice “response time” is a more complicated and less constant affair.)

        • stahlwerk says:

          Ha, you’re right, as is Hypernetic. A misunderstanding on my part, thanks for the heads-up.

          So, the ramp at the refresh signal change would be steeper for lower response times?

          • LionsPhil says:

            I would certainly assume so, but I seem to remember that in practice there’s a lot of very non-linear oddness going on.

      • Hypernetic says:

        That’s not what response time is. Response time is typically measured as the time it takes for a pixel to change from one shade of gray to another, or the time it takes a pixel to go from black to white to black (g2g is the standard now, usually because it’s a lower number). Refresh rate has little to do with it. Anything over 5ms g2g and you are going to notice it in gaming.

  13. Bobtree says:

    > Why PC Monitors Aren’t Going to Get Better

    Someone is confused about how headlines work.

  14. jimbobjunior says:

    Where are you seeing the VP2770-LED for £400? All the ones I see are at around the £550 mark.

  15. roryok says:

    I downres most games for best performance. Gimme more frames and textures over more pixels any day

    • Hypernetic says:

      Why not have both?

    • LionsPhil says:

      Are flatscreens back to nearest-neighbour upscaling again yet, then? Downrezzing was fine for CRTs and old laptops, but for over a decade the blasted things seem to be doing crappy linear interpolation, which turns everything into blurry smears. (Doing fancy interpolation risks introducing latency. Just give me pixels. If I didn’t want to see pixels I wouldn’t run you at a lower resolution.)

      • TechnicalBen says:

        Some screens scale internally (on their boards?) but most rely on the graphics card/drivers to scale. My ATI card can be set for either scaling (with or without aspect ratio lock) or centered (bordered/pillarbox).
        Lets me play classics without them being a postage stamp on the screen. The only annoyance is when they don’t scale well with the pixels. Being able to select “double size only” etc. would be a nice additional option.

        • LionsPhil says:

          That’s where nearest-neighbour scaling comes in; it was a bit ugly back in the day of doing 640×480 to 800×600, but as the scaling factor becomes larger (because the target resolution is), and the display DPI gets higher, the fact that there’s some quantization noise becomes less and less noticeable. Meanwhile your pixels all have nice, crisp borders.

          And it’s really, really simple and fast to implement.

          I’m not sure I’ve ever seen an “always really output as native display resolution, and upscale when the OS tries to modeswitch” option in my nVidia drivers. In theory, the GPU has the power to maybe do smarter upscaling tricks without adding latency since it is a big fat parallel processor.
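A minimal sketch of the nearest-neighbour idea LionsPhil describes, assuming nothing more than a 2D grid of pixel values (an illustration, not how any scaler chip or driver actually implements it):

```python
# Nearest-neighbour upscale: each output pixel copies the closest source
# pixel, so edges stay crisp instead of smearing into interpolated grey.

def nearest_neighbour_upscale(pixels, out_w, out_h):
    """Scale a 2D grid of pixel values up to out_w x out_h."""
    in_h, in_w = len(pixels), len(pixels[0])
    return [
        [pixels[y * in_h // out_h][x * in_w // out_w] for x in range(out_w)]
        for y in range(out_h)
    ]

# A 2x2 checkerboard blown up to 4x4: each source pixel becomes a crisp
# 2x2 block, exactly the "double size" case mentioned above.
src = [[0, 255],
       [255, 0]]
for row in nearest_neighbour_upscale(src, 4, 4):
    print(row)
```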

  16. phenom_x8 says:

    Regarding AMD, this two-part story will make you appreciate the innovations they’ve made (and that we enjoy today):
    1st Part :
    link to arstechnica.com

    2nd part :
    link to arstechnica.com

  17. uh20 says:

    It’s even worse for touchscreen monitors: by simply adding a touchscreen, manufacturers use it as an excuse to ramp the price up extremely high.

  18. Kreeth says:

    According to TFTCentral that ViewSonic monitor is using a Samsung PLS panel, not an LG IPS one. Has anyone seen both of these panels to compare them?

  19. TT says:

    The 1080p 16:9 “norm” was one of the cheapest tricks the market imposed on us. FHD! Crap.
    The minimum should be 1920×1200 pixels, 16:10 and up.

    Constraining the usability of desktop monitors to a 16:9 strip is one of the reasons the industry is down.

    PS: As for gaming on big monitors, you can always “trim” the image (a black frame around it) by playing at lower than native resolution without losing fidelity.

    • fish99 says:

      It’s even worse on laptops. 1366*768 is now the standard for 15″ and smaller laptops and it’s getting harder and more expensive to find higher res screens. It’s the vertical resolution that’s the issue – trying to do anything productive like coding, web design, photoshop, 3D modelling, music production etc with a screen height of 768 is just unpleasant. You’re pretty much forced to connect an external screen.

      • Apocalypse says:

        My about 10 year old g4 macbook still has a 1440×900 display iirc ;)
        That worked great. Would not accept anything less when buying a new one.

    • Tams80 says:

      Don’t get me started on 16:9! Ahhhhhh!!!

      1920 x 1200 is hard enough to find (especially for laptops), let alone 2,560 x 1,600. I’d wish for 3840 x 2400 (double 1920 x 1200), but it’s not going to happen.

  20. barney says:

    I really don’t understand why people here are acting like it’s a tragic economic necessity that screen PPIs aren’t doubling overnight. Pixels still too big? Is this really anywhere near the top of the video games industry’s list of problems?

    Carmack has talked about the fallacy before when discussing 3D: people complain that games have a realism problem when faced with the huge pixel densities already in existence, and then we ask for things like doubling the number of output pixels (hardware still struggling!) for a barely perceptible at best (nauseating at worst) bonus.

    Then of course, with this ludicrously high level of possible detail, occasionally you’ll notice that the textures seem to look a little off, a little meta-pixelated, almost. So studios have to employ even more manpower in the sisyphean journey towards photorealism, and can only afford to output generic shooters if they’re expecting to recoup the astronomical costs.

    The net benefits, other than the bizarre intellectual satisfaction that the God of Progress must be pleased by the objective notion that The Numbers Are Getting Bigger, elude me.

    • Low Life says:

      This is a gaming website, sure, but high resolution displays have other benefits than providing the ability to render a game at higher level of detail.

    • Dark Nexus says:

      > I really don’t understand why people here are acting like it’s a tragic economic necessity that screen PPIs aren’t doubling overnight.

      Overnight? My 1920×1200 24″ LCD is over 5 years old, and has a PPI of ~8900. The 20″ CRT I had before that did 1600×1200, was made in 1993, had a PPI of 10k.

      A brand spanking new 24″ 1920×1080 monitor (as a 16:10 is hard to find these days) would have a PPI of ~8425.

      Not only are the PPIs not going up, they’re going down.

      • LionsPhil says:

        I don’t disagree, but I probably worry about colour reproduction more.

        Not even from a fancypants visual designer angle; just having blacks that are black and a proper 24-bit true colour range rather than quietly degrading it and dithering (in some cases over the time domain, apparently). My graphics card’s been capable of it for almost 20 years…

        • Bigmouth Strikes Again says:

          Yeah. If I could pick one thing to magically change in my monitor (Asus VL229HR — IPS), it would be making the blacks deep. It was a huge disappointment when I first saw it, coming from a CRT.

          As for the resolution – this being a gaming site – without the GPUs to feed them, what would be the point? Having the privilege of experiencing the lowest settings in all their glorious detail?

      • Caerphoto says:

        > My 1920×1200 24″ LCD is over 5 years old, and has a PPI of ~8900. The 20″ CRT I had before that did 1600×1200, was made in 1993, had a PPI of 10k.

        Um, somewhat pedantic but I think you have your PPI numbers wrong – 8900 PPI is more than most high-end laser printers. LCD screens are generally in the 90–130 PPI range.

        A 1920×1200 screen at 8900 PPI would be about 5mm wide.

        PPI is pixels-per-inch, not… imperial mile or whatever you used for your numbers.
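For reference, the arithmetic Caerphoto is applying is just pixels along the diagonal divided by the diagonal length in inches:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count over diagonal length."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1200, 24)))  # a 24" desktop panel: ~94 PPI
print(round(ppi(1920, 1080, 5)))   # a 5" 1080p phone: ~441 PPI
```

Which puts typical desktop LCDs squarely in the 90–130 PPI range Caerphoto mentions, and nowhere near four-digit figures.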

  21. HardDominator says:

    It’s all fun and games that the smaller screens are getting higher and higher resolutions but the fact is still that I’d rather watch something happening on a 24″ monitor than something on a tablet or phone sized screen. What’s the use of having even higher resolutions on a screen that size in the future? I can’t think of any.

  22. GreatGreyBeast says:

    The improving-yet-still-pathetic black levels of all flatscreens are the bane of my existence – I actually still keep a CRT around because sometimes it just bothers me that much. Consequently, I’ve considered OLED the Holy Grail of monitor (and HDTV) tech, and I’ve been slobbering over it for years and years. Very sad to hear that manufacturers still aren’t pushing that direction. I’d hoped its success on smartphones would help accelerate production in larger sizes. Blah.

  23. noelkd says:

    Smartphone news is good for those awaiting the Oculus Rift – maybe PC screens will be obsolete for gaming in 5 years?

  24. geldonyetich says:

    Personally, I’d like to see stereographic monitors getting really popular. All these 3D games and we’re still living in the 2D age in terms of the average consumer’s monitor.

  25. wodin says:

    I have a 19-inch 2ms LG monitor and would love something a bit bigger… mainly because with wargames you need lots of real estate. The picture quality is great, but oh for a 24-inch screen… even a 22 or 23 would do… just haven’t got the money… one day though… one day.

  26. waltC says:

    Anyone who compares 13-inch-or-smaller “monitors” with 27″-30″ monitors, regardless of pixel density, is sorely in need of glasses…;) The larger displays are far better, imo.

    Yep, AMD is on a roll – of course, it *is* baffling why some i-pundits seem oblivious to the awarding of the PS4 contract, and possibly the new Xbox contract too, to AMD. Over the lifetime of these consoles it will mean billions of dollars in income for AMD.

  27. edwardh says:

    Well… as long as I can’t get OLED monitors or some technology just as good, I won’t buy a new one and will always be on the hunt for big CRT ones.
    I know that since the vast majority of people don’t see a difference, they’re not going to lose a lot of money, but hey… it’s at least one new monitor fewer they’ll sell.

  28. Syneval says:

    Bought a Dell UltraSharp U2711 two years ago for 600 euro.

    Best IT purchase I ever made, although maybe it could be tied with the Radeon 9700 Pro from way back ; )

    I wouldn’t bother with other displays using the same LG panel as the UltraSharp (i.e. all other 1440p as far as I know).

    – It comes decently calibrated straight from the factory
    – A full three-year next-business-day dead-pixel guarantee
    – Built like a tank, with more ports than you can use

    Quite why anyone would try to save 150 euro by going for something cheaper, with lesser build quality and without the best guarantee on the market, is beyond me…

  29. Don Reba says:

    If virtual reality takes off, there will be even less demand for large high-quality displays. What we will need is tiny super-high-density, low-latency VR helmet screens.

  30. Clavus says:

    Some good news though: since the best display tech is focussed on the mobile market, the Oculus Rift indirectly profits from this. If there’s one thing that really needs higher pixel density, it’s the Rift.

  31. Carra says:

    So since there are no affordable, mainstream 27″ IPS screens I had to buy one from Korea…

    • Apocalypse says:

      Dell UltraSharp U27xx monitors have been available for ages. And they are quite affordable.

      • Carra says:

        It all depends on your definition of affordable. I paid about half of what the Dells go for.

  32. harbinger says:

    I don’t think you could be any more wrong with this article. This is especially curious since it comes mere days after Seiki put their 50″ 4K display SE50UY04 on sale for only $1300 and PC Perspective tested it with games like Battlefield 3, Crysis 3, Skyrim and Tomb Raider: link to youtube.com

    It’s also mere weeks since Sharp put their 4K display PN-K321 on sale: link to youtube.com albeit as “new tech” it seems to still be rather expensive at around $4500 and ViewSonic and several other companies announced their own models.

    • Jeremy Laird says:

      I was aware of the Sharp. Consider this: 30-inch panels haven’t enjoyed enough demand to make them affordable since they appeared seven / eight / however many years ago. They’re only really a bit cheaper today than they were soon after LG did the first 30-inch 25×16 panels. They never truly caught on.

      So what makes you think people are suddenly going to buy $4,000 39-inch panels in droves? No doubt prices will come down, but they’ve a long, long way to go before they become remotely relevant. Personally, I doubt a 39-inch screen will ever become mainstream, or even close to it.

      Obviously anything can happen, but I don’t see any reason to think the new 39-inch 4k segment is going to become remotely affordable any time soon. As Kilroy says, where’s the demand?

      As for the 4k TV, I didn’t know about that, but then HDTVs have never provided much insight into PC monitor pricing. Just observe what £400 currently buys you in the TV market vs the monitor market…

      • harbinger says:

        The Sharp isn’t 39″; don’t know how you got that idea. It’s a 32″-er (technically 31.5″), and the price is solely because it’s the first model of its kind in recent years and it’s obviously only meant for the high-end market right now (Battlefield 3’s Rendering Architect for instance got three of them, apparently :P link to twitter.com ), but I expect prices to come down rather soon.

        And I disagree about your TV remark; I very much think it has an influence on where the market is heading. (God knows it had an influence in suppressing 16:10 in favor of 16:9, so it might as well do something positive for once.) Remember, around CES earlier this year there were news stories about these “new 4K TVs” coming out that cost “as much as a car” at $25,000 and over: link to theverge.com with TVs like LG’s 84LM9600. Soon after we got TVs like Sony’s XBR-55X900A for $5,000, and just a few months down the line we’ve already arrived at bottom-end products around $1,300.

        You have to remember that this isn’t some sort of amazing new technology that has to be developed or necessarily has low yield rates like OLED, they’ll still produce LCD/LED panels with the same old technology for the most part, just a higher density of pixels and probably some factory restructuring. The only thing that needs to change is the market demand for it.

        I very much believe that by this time next year we’ll have various ~30-32″ monitors at 4K resolution possibly at around $1000.

        • harbinger says:

          Here’s also an AMD tech playing around with 3 of the Sharp panels. :P
          link to pbs.twimg.com

          And another review for the 50″ 4K TV wondering about how quickly it got so cheap: link to gizmodo.com

        • Jeremy Laird says:

          Sorry, the 39-inch thing was in my head because Asus told me last week they’re about to launch a 39-inch 4k monitor. No idea on pricing yet.

          What you have to do is explain why 30+ inch 4k is going to experience enough demand to push prices down but 25×16 never did. 4k on the desktop in a year for $1,000? That’s a huge reach. What do you know that the manager of Iiyama in the UK doesn’t?

          And the point I made re HDTV relevance was pricing. Which it isn’t. Relevant, that is.

  33. Rian Snuff says:

    As far as I’m concerned the Rift is an example of PC “Monitors” getting better.
    Plus honestly, I’ve seen a lot of little advancements lately..
    Thinner bezels, AMAZING price-drops on 2-5ms screens.. All kinds of lil’ things.
    Ultra-wide screen monitors.. There’s a good selection of really high resolutions at mixed price-ranges.

    What percentage of users are even utilizing 120hz right now? Probably not many..
    And when people don’t really seem to adopt superior technology, why expect huge pushes?

    Yes, I’d love to see wicked DPIs and what-not, but it’s not even that necessary as far as I’m concerned. I’m more than happy with my monitors and won’t need to be purchasing anything that will vastly change or improve my gaming experience till..

    My body is ready..

    • FuzzyPuffin says:

      The Rift, at least at the moment, has a terrible resolution, though.

      I agree with you though that high-DPI screens aren’t the best thing for gaming; 120Hz, or a (higher-res) Rift, would be better. When I get one I expect I wouldn’t be playing at a HiDPI res because my graphics card would puff smoke.

      That said, I would very much rather have a high DPI screen because a) I do a lot of writing and reading on my computer and b) once you experience one, going back to an inferior screen is like looking through a foggy window.

      No more low-DPI screens for me. I’m using this 20″ Dell until it breaks or HiDPI screens are affordable, and my next laptop will be a rMBP.

    • Low Life says:

      “What percentage of users are even utilizing 120hz right now? Probably not many..”

      The main problem with current 120 Hz monitors is that they have cheap TN panels. It might be worth it if I only ever gamed on my computer, but I’ll stick with my IPS for now, thanks.

      Those cheap Korean displays have shown that it’s possible to reach higher refresh rates with IPS panels, too, but I’d rather not gamble with them.

  34. macks says:

    I would easily drop a grand on a high-density monitor. I have a 2560×1440 24″ which is okay for games/motion/movies, but for text and static images, it’s sub-par.
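
    [Ed: for context, the pixel densities being argued over in this thread are easy to work out. A quick sketch, using the diagonal sizes and resolutions the commenters mention; the exact printed figures will vary slightly with rounding:]

    ```python
    import math

    def ppi(width_px, height_px, diagonal_in):
        """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
        return math.hypot(width_px, height_px) / diagonal_in

    # Panels discussed in the thread
    print(f'24" 2560x1440:   {ppi(2560, 1440, 24):.0f} ppi')    # macks's monitor
    print(f'27" 2560x1440:   {ppi(2560, 1440, 27):.0f} ppi')    # Dell U2711 class
    print(f'31.5" 3840x2160: {ppi(3840, 2160, 31.5):.0f} ppi')  # Sharp PN-K321
    print(f'50" 3840x2160:   {ppi(3840, 2160, 50):.0f} ppi')    # Seiki SE50UY04
    ```

    [So a 50-inch 4K TV is actually less dense than a 27-inch 1440p monitor; only the 32-inch 4K panel meaningfully raises desktop pixel density.]
    
    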