Are the latest UHD TVs any good for PC gaming?

Occasionally, it falls upon me to make solemn sacrifices in the name of empirical endeavour, the advancement of science and the betterment of mankind. It is very much in this altruistic spirit that I recently embarked upon an exhaustive and forensic investigation into the merits of the latest UHD / 4K / HDR TVs as PC monitors. Yes, I’ve been playing games on TVs. This is what I learned.

Broadly speaking, I subscribe to the notion that big screens are fun. TVs are big screens. Therefore using a TV as a PC gaming screen is a bit of a no-brainer, right? Would that it were so.

In reality, TVs have historically represented a flawed proposition as gaming monitors. What they gave in sheer scale they often took away in resolution, accuracy, response or practicality. But TV technology has changed out of all recognition of late, especially in the last 12 months.

TVs now routinely offer native resolutions to match or better most PC monitors. There are no overwhelming interface issues. And in many ways their primary panel tech is superior – on paper, at least. It’s time to revisit the TV as PC gaming device.

Back in ye olden times, hooking up your PC to a TV via S-video was fun for about five seconds. Once the novelty of seeing your favourite games on a very large goldfish bowl (28 inches, gasp, etc) wore off, the ghastliness of the blurry, compressed image quality was all too obvious.

Then the LCD revolution happened, digital interfaces became the norm and you could at least drive an HDTV with a clean, digital signal. But even back when 1080p seemed exotic, stretching out 1,920 by 1,080 pixels over, say, a 40-inch panel made for ugly, chunky pixels.

What’s more, crap colour accuracy, awful input lag and obvious image compression were often in the mix. Superficially big and beautiful but fundamentally borked was pretty much what you got.

My funky 40-inch Philips next to my puny old 30-inch Samsung…

As I’ve learned, that’s no longer the case. As it happens, for a couple of years my primary production PC has been running what is really a UHD TV in disguise: a Philips BDM4065UC, a 40-inch 4K panel originally designed for TVs but fitted with some PC-friendly electronics, including DisplayPort. It’s flawed, but I love it.

However, my main muse for this particular ruse is a Samsung UE49KS9000 9 series beast. With quantum dots, HDR support, 4K native and a curved panel it’s pretty close to, if not quite at, the very cutting edge of current LCD TV tech (OLED is another bag of subpixels).

Anyway, initial impressions reveal that it’s painless right out of the box. There are no issues driving it at 4K and 60Hz over HDMI 2.0 via an Nvidia Pascal GPU, in this case a GeForce GTX 1080. It’s likewise immediately obvious that the screen’s basic calibration is more than merely acceptable, it’s good.

The colour balance is pretty much in line with a PC monitor, there’s no silly oversaturation, no hideous compression in gradients and, perhaps the biggest worry, no discernible input lag.

And then there’s the sheer giddy scale of the thing. I’m used to 4K. But this thing is spectacular; it’s genuinely exciting even for a desiccated seen-it-all-before hack like me. It makes you really want to play games. It makes you excited about playing games. This, I think, counts for something.

There’s also bona fide utility. The combination of scale with a big pixel count is fantastic in strategy games like Total War. At one and the same time, you can observe the action in fairly close detail while also maintaining a pretty broad overview of the battlefield. And all the while with oodles of space for menus and tools. It lends a heightened sense of all-seeing omnipotence to proceedings, which is the point of this kind of game, after all.

What’s more, as good as all this sounds, it could have been even better. This TV is HDR capable, but I wasn’t driving it with HDR content.

Anyway, 50-inch gaming at 4K is spectacular for pretty much any genre. And yet that’s not to say it actually makes sense. For starters, you still have to make sacrifices. As things stand, you can’t have 120Hz-plus refresh or adaptive frame syncing with this type of screen, for instance.

But it’s really the PC’s multi-purpose remit where the proposition stumbles. Over the years, I’ve been more tolerant than most as regards the practicality of really big screens. I remember how frequently people used to dismiss 30-inch monitors as simply too large for the desktop. That seems quaint now.

But 50 inches and beyond? Even with the benefits of a curved panel, there are undoubtedly ergonomic issues when sitting in front of a screen this big at monitor-ish distances. Most obviously, an awful lot of head movement is implicit – eye swivelling doesn’t get the job done, not even close. That’s exacerbated by the fact that TVs usually sit on static stands, though various mounts and arms can be used to optimise your setup, at a cost.

Then there’s the fact that with anything much beyond 40 inches, the pixel pitch is becoming a little coarse for desktop work even with a 4K native resolution. It’s the same old problem – what’s the point of a bigger screen in productivity terms if it doesn’t bring any extra pixels?
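The pixel-pitch arithmetic is easy to check. Here is a rough Python sketch (the panel sizes are illustrative): pitch in pixels per inch is just the diagonal pixel count over the diagonal size.

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal resolution over diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# 4K UHD (3840x2160) stretched over ever larger panels
for size in (28, 32, 40, 50, 55):
    print(f'{size}" 4K: {ppi(3840, 2160, size):.0f} ppi')
```

A 40-inch 4K panel works out to about 110ppi, still denser than a classic 24-inch 1080p monitor at roughly 92ppi; by 50 inches 4K dips to about 88ppi, below that baseline, which is where the coarseness starts to show on the desktop.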

While 6K or 8K would solve that problem, it would exacerbate another, which is driving the bloody thing smoothly in games, a hard enough task for even the latest graphics cards at 4K. I should also point out that your mileage may vary when it comes to metrics like input lag with these things. I certainly wouldn’t assume any given TV didn’t suffer from input lag, that’s for sure.

This is when it stops making sense…

All of which means that a good UHD TV is a fabulous gaming device, but not a great PC monitor. And that makes me a little uncomfortable. Anything that bifurcates the PC makes me uncomfortable, in fact. It’s the PC’s multi-tasking prowess that for me makes it such a compelling platform. But a PC running through a UHD TV is really only optimal for games and even then not without its shortcomings.

TL;DR
– The latest UHD / HDR / 4K sets are the best TVs yet for PC gaming
– But they’re still not suitable for multi-purpose computing

68 Comments

  1. Person of Interest says:

    Really though, what’s the difference between a 28″ 4K screen half a meter from your face, and a 55″ 4K screen one meter from your face?

    • po says:

      Absolutely right.

      When it comes to screen size, what really matters is how much of your field of view it occupies (24″ on your desk > 40″ the other side of the room).

      Meanwhile resolution should be thought of as ‘pixels per degree of field of view’, where too few is blurry or blocky, and too much is pointless as it’s beyond what the eye can perceive (unless you happen to be Legolas).

      Perhaps monitors should be rated by optimum viewing distance, and FoV covered at that distance.
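That ‘pixels per degree’ rating is easy to put numbers on. A rough Python sketch (the sizes and distances are illustrative, loosely matching the 28-inch-at-half-a-metre versus 55-inch-at-a-metre comparison above):

```python
import math

def pixels_per_degree(h_pixels, diagonal_in, distance_in, aspect=16 / 9):
    """Approximate horizontal pixels per degree of field of view."""
    # Screen width derived from the diagonal, assuming a 16:9 panel
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    # Horizontal field of view subtended by the screen at this distance
    fov_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return h_pixels / fov_deg

# 28" 4K at ~0.5m (20") vs 55" 4K at ~1m (39")
print(pixels_per_degree(3840, 28, 20))
print(pixels_per_degree(3840, 55, 39))
```

Both come out around 61 pixels per degree, within a fraction of each other – which is the commenter’s point: at those distances the two screens are nearly interchangeable in angular terms.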

    • Siimon says:

      A 28″ looks tiny when I’m sitting on my living room couch :)

    • brucethemoose says:

      The article touched on that: 10/12-bit VA panels, 1000-nit+ backlights with local dimming, quantum dots.

      Theoretically 28″ 4k panels should have all this too. The unfortunate reality is that they mostly don’t, not right now anyway. The advantage doesn’t have anything to do with size, but that’s the reality of the market atm.

    • Jeremy Laird says:

      Actually, no, the whole point of this article is that with PC monitors, you sit more or less the same distance away regardless of size. You are close to the screen. With a TV, you sit at distance, relatively speaking.

      For games, a 50-inch screen at monitor distances makes for a spectacular experience: it utterly dominates your field of view in a way a 27-inch screen simply does not.

      The problem comes with multi-purpose computing where the fact that you’re not getting any additional pixels or desktop real estate is much more apparent.

      Moreover, by your rationale, screen size is irrelevant. That’s very obviously not true.

      • Person of Interest says:

        That’s not my personal experience, though: whenever I upgrade to a larger screen, I tend to set my monitor back a bit further, so my viewing angle doesn’t increase at the same rate as my screen size. (To answer my own original question: the difference between the two screens is eyestrain.)

        Eyestrain notwithstanding, I do enjoy pressing my face to the monitor for some games, which I think gives the same impression as sitting close to a large TV, but with the added benefit that I can still lean back to return to a normal viewing angle. When using my TV as a monitor, I have to move my furniture around to make any significant viewing angle changes.

        Slightly different topic: are your opinions on curved monitors for desktop work any different than they were two years ago? A lot of good screens now on the market have no “flat” option.

      • LexW1 says:

        That assumption is just false, John. I’ve used a 37″ HDTV as my main monitor for the last seven years, and I obviously put it further away than a normal monitor because it would be bonkers not to.

        But I sit a hell of a lot further away to watch it as a TV.

        It’s also untrue to say 1080p has “ugly chunky pixels” on 40″, well, certainly not on a 37″. Visible pixels maybe, but chunky? Pull the other one mate, it’s got bells on.

        • Jeremy Laird says:

          I’m afraid I would say the opposite: what’s bonkers is to buy a larger screen and then situate it further away to the detriment of apparent size. Totally pointless. And I am afraid the pixel pitch of 1080 on anything beyond about 24 inch would be unacceptable for me, let alone 30 inches and beyond.

          • speedrcr says:

            What’s more bonkers is ruining your vision by having the monitor/TV closer than you need, or closer than what’s suggested. Having objects or the screen too close messes with how your eyes naturally focus. That “eyestrain” some speak of is because you’re not letting your eyes focus the way they were meant to be used. So in effect a 55″ TV, like in my case, at 1080p @ 120Hz = fantastic performance at 2m, or 5-6ft, from the screen.
            I use my XBR55850C and it works famously. No, I don’t need to run fanboy 4K, because my nose isn’t on the screen and I still get great visuals at 1080p at that distance. I can run games with HDR and Ultra/High without having to push my system, and most detect the 120Hz. Granted, you have to customise your driver, though Nvidia has native support for this now without another tool.

    • riadsala says:

      Some relevant background reading: link to ncbi.nlm.nih.gov

      • Jabb Stardust says:

        Brilliant! Nothing like a pinch of objectivity and curiosity to liven up a debate. Seriously.

        So the tl;dr is that a larger display standing further away (than a smaller display closer to you, visual angle being the same) somehow makes you feel more immersed in the movie. I bet the same goes for video games.

        I might add my opinion on the eye strain effects: I don’t enjoy looking at a display at my arm’s distance. The eye is more at rest when looking further away. That’s why I use my PC with an all-in-one keyboard-touchpad on my living room couch. Super nice. No-one will convince me back to a plain office desk. :o)

  2. GenialityOfEvil says:

    Personally I’d rather have 3 32″ high refresh-rate monitors than one big TV.

    • causticnl says:

      Most UHD screens I’ve seen have 120Hz as a minimum.

      • Siimon says:

        Most are not real 120Hz, and definitely not when getting PC input.

        • GenialityOfEvil says:

          Yeah, they’re interpolated. A lot of them don’t even do that, they just use a strobing backlight to repeat the last frame.
          There are only a handful of real 120Hz TVs and they’re all 4k, so good luck getting games to run at 120 FPS.

        • speedrcr says:

          Not exactly… they also have default EDID detection, so without some configuration in your driver utility you will only get the default 60Hz. Most UHD TVs have a faux 240Hz, like what you are mentioning, but a real 120Hz.
          And yes, not all UHDs are the same; some do trickery for higher refresh rates, or even for UHD or HDR.

  3. internisus says:

    I bought a 2016 Vizio P65-C1, and it has been excellent for PC gaming. I play everything I can at 2160p/60, but if I can’t run something at such a high resolution or the game doesn’t support it then I have the option of 1080p/120. It’s a screen that offers low latency options as well; I’ve been very happy with it.

    But yes, it is of course unsuitable as a general purpose monitor.

  4. dongsweep says:

    I implore anyone who is looking for a new monitor to look at the 21:9 monitors. Ultrawide gaming is a joy and I wish I had made the purchase much earlier. The jump from 1080p to 1440p was great, but going from 1440p to ultrawide 3440×1440 was even better.

    • AutonomyLost says:

      +1

    • Xerophyte says:

      Because I’m an overpaid engineer working for a minor evil empire in San Francisco I’ve got both a 65″ 4K TV with DisplayPort (one of the Panasonic Vieras, which have absolutely spectacular image quality and so-so everything else) and a 3440×1440 ultrawide desktop monitor, so I can offer some comparisons. They both have their advantages, but I do think a nice TV with a beefy amplifier and speakers is great for the sort of games I want to play on the couch.

      I’ve been playing Dark Souls 3 on the TV, and it makes for a very impressive, operatic experience. I’ve also been playing World of Warcraft on the desktop, and for that sort of game I do love the field of view and just the extra real estate.

      I guess it boils down to what sort of games you want, and how you want to play them. I will say that a big problem with playing in 4k is just the pure framerate issues you’ll have in a lot of games, especially if you don’t have an absolutely top of the line system. Dark Souls 3 won’t run well at max settings, for one, no matter your system.

  5. Drib says:

    Every time I see stuff like this, I look at my predominant playing of OpenXCom and Dwarf Fortress and Minecraft, and I feel a bit disconnected from everyone.

    But then I realize I’m not out like four thousand dollars for these kinds of TVs, and I feel a bit better.

    • fray_bentos says:

      You still are disconnected from everyone: you can buy a 50-60″ 4k TV for less than $500 these days.

      • Drib says:

        You damn kids with your cheap TVs. I used to play NES on a black and white 10″ CRT and I liked it that way.

        • jTenebrous says:

          10 inches – !10! inches!? That’s TWICE the size of the black and white TV my dad made me use with my Atari for more than a year until I was allowed to play on the living room telly.

          • particlese says:

            It’s a slightly different proposition, but I once played Super Metroid on a friend’s 1.5-ish inch black and white travel CRT. It had a collapsible antenna for broadcast TV viewing, great pixel density, and even a loupe you could clip on to make it look HUGE for the one eyeball you held it to.

            Retina Display my eye!

    • caff says:

      I have the same 40″ 4K monitor that Jeremy mentions, and there are frequently times I fancy booting up DOS window/pixel games like DF that just look rubbish on my screen. No amount of tinkering can make some games feel right.

  6. Don Reba says:

    “I remember how frequently people used to dismiss 30-inch monitors as simply too large for the desktop.”

    You still want 30 inches or less for a 4k monitor, unless you’re happy with low-ppi text. A 30-inch 8k monitor would be even better.

    • Jeremy Laird says:

      At the moment, you are compromising whatever you choose. If you go sub-30-inches in return for high DPI, that comes at the cost of practical screen real estate, especially given how terrible both Windows and the web generally are at scaling for multiple DPI.

      In an ideal world my 40-inch monitor would be 20K. But that isn’t available, and 40-inch 4K running standard DPI settings is far preferable in Windows to a smaller screen running scaled DPI settings.

      As it happens, I chose Mac for my laptop very much for quality of fonts etc with the retina display. But then Mac OS renders fonts and generally scales to multiple DPI so much better. Even at high DPI, Windows fonts are pretty crappy, so it’s not something I chase with choice of PC monitor. Until Windows gets better at scaling, I aim for a screen pixel pitch that allows fonts to be comfortably legible at default DPI settings. It’s a compromise, but then so are all the other options.

      • Don Reba says:

        Do you really have problems with high DPI on Windows 10? Personally, I was surprised by how easy the transition to a 27-inch 4k monitor turned out to be. I do a lot of text editing on my machine, reading, browsing, playing new and old games, and everything works correctly. High DPI-aware UIs get sharper, the rest get upscaled by 150% and still look tolerable.

        • Jeremy Laird says:

          The scaling isn’t a complete catastrophe, no. But it is patchy and, like I said, I bought a MacBook with “retina” (the first laptop I’ve bought for a decade and indeed my only major PC-related purchase for a decade) primarily because Windows doesn’t do fonts terribly well, even with high DPI, and generally makes disappointing use of high-DPI displays. Mac OS renders much, much nicer fonts and indeed handles DPI scaling more smoothly all round.

          • Don Reba says:

            Wait, Windows has always had the better fonts, with ClearType. Mac’s blurry rendering is only saved by high-density displays. I know some people like it anyway, but “much, much nicer” is pushing it.

  7. fray_bentos says:

    Really nice article. I made similar contemplations recently. In the end I forked out for a 27″ 1440p 144Hz+ G-Sync monitor, which I think is the sweet spot in terms of GPU power required, general usability, and not moving my head around to look at the screen. Nowadays, I also prefer the smoothness of higher framerates of 100fps over, say, 60fps with more eye candy. I’ve had to sell my PS4 as I cannot bear movement in games at 30fps anymore.

    • Person of Interest says:

      I cannot bear movement in games at 30fps anymore.

      This is the reason why I’m not in a hurry to upgrade from a 60 Hz screen. It doesn’t seem worth spoiling all the low frame rate games for the sake of a few high frame rate ones.

      G-Sync / FreeSync sounds nice, though.

      • Ghostwise says:

        I’ve found upgrading from 60Hz to 100Hz (with flicker control and blue light filtering) to be wonderful for visual strain.

        (Though I spend 80+ hours/week looking at this screen).

      • Grizzly says:

        As someone who recently switched to 144Hz: this didn’t happen for me. It just made older games look a lot more fluid!

        • vaivars says:

          I am quite certain it wasn’t a reference to anything <120Hz actually looking worse than it did before because of the monitor, but rather that since you have now experienced 120Hz, lower refresh rates feel worse, when in fact nothing has changed.

          IMO if you are upgrading your monitor and you can get a >60Hz one, go for it. In reality you don’t lose anything – only gain. And not just for gaming; the whole PC feels “smoother” overall.

  8. Longestsprout says:

    The best decision I ever made was parking my 32″ Samsung TV on my desk and plugging it into my computer. I don’t remember how many years ago that was, but I’m still using it.

    The size may be a matter of preference, but having to use a smaller screen, like the one on a laptop, makes me feel almost claustrophobic. Like really uncomfortable.

  9. foszae says:

    The concept amuses me, but there’s a basic limit of desktop space i need to pay attention to. My gaming rig is my workstation and so it needs to be practically usable with my comfy chair and keyboard etc. Perhaps more importantly, i’m never giving up my multi-monitor setup; no matter how compelling you think your game is, it’s bloody likely i’m also watching a movie on my other monitor (and probably dipping into chat on my other computer). I don’t exactly live in a low-rent town where i can just get a spare room that’ll fit a banquet table in for my home office set-up.

    • vaivars says:

      Good monitor arms can be a sort of solution for this.
      You can have the big monitor behind your multiple smaller ones, and when its time for gaming or movies, just move the smaller ones out of the way.

  10. Tomo says:

    I bought a 48″ 1080p Sony TV last year with a great response time, and it’s been a revelation. Only cost £350. I think as 4K becomes more and more of a thing, 1080p sets are super cheap now, even good ones.

    Handy as I don’t give a shit about 4K yet – way too pricey. Instead, I sit on my sofa playing Witcher 3 and Titanfall 2 and it’s absolutely fantastic. My girlfriend has happily played through Limbo, Portal, Virginia, The Witness, and more. No chance she’d have done those hunched over a monitor. Or, 4 player Gang Beasts/Overcooked with ‘non-gamers’ using a bunch of 360 controllers. Having my PC in the living room is just the best.

  11. grundus says:

    The term you’re looking for is “pixel density”, and that’s the difference between two screens of the same resolution but at different sizes. At the same distance the 27″ will have less need for anti-aliasing, because individual pixels will be smaller and therefore jagged edges will be harder to see; but if you put one further away the result is the same, because of Father Ted’s “small… far away!” thing. This, incidentally, is why different screen sizes exist: small for your desk, large for your living room.

    I personally was looking at a new TV for racing sims; the one I have has awful ghosting, which meant that turning the wheel reduced all the graphics to a brown smear sliding across the screen. I found it’s impossible to shop for TVs by response time, as they just don’t mention it anywhere in the spec, nor in most reviews, so I figured it would probably just be better to make my desk wheel-friendly and get a new monitor, namely a PG279Q. I ordered one from Scan and waited; it never came, so I cancelled my order and bought a Vive instead. Which, by the way, is spectacular.

    Now I’m waiting for the PG27UQ’s price to be announced, because a) not everything works with the Vive and b) I’ve got a GTX 1080 running a 23″ 1080p 60Hz monitor, which feels like a waste.

    Edit: This was supposed to be a response to the first reply to the first comment.

  12. Themadcow says:

    Heh, well-timed article. My new PC build is a 10cm-wide RVZ02 case running a GTX 1070. It sits discreetly beside a 49″ Samsung KS7500 SUHD and I love it. I need to sort out the occasional bit of screen tear which appears in cutscenes, but otherwise it runs really well and looks amazing.

    • castle says:

      I’ve found turning off Vsync in the game and enabling Adaptive Vsync in the Nvidia control panel to be helpful for getting newer games to run smoothly in 4K with a 1070.

      • Themadcow says:

        Cheers – will give it a shot!

      • Person of Interest says:

        The cutscene screen tear seems to be caused by Adaptive VSync, not alleviated by it. It’s the major downside I’ve noticed to setting the global VSync setting to Adaptive in the Nvidia control panel. (The other downside is the occasional game that ties load times to refresh rate. GTA IV and Startopia come to mind.)

  13. castle says:

    I recently made this transition myself, and I’ve got to say, it made a *huge* difference.

    I have a 65” KS8000 in my living room, which at least in the USA can be found on sale for surprisingly cheap (under $1500), given that it has an excellent screen and low input lag. It’s a perfect TV for gaming. I hooked it up to a gaming PC with a GTX 1070. Playing games in 4K on the KS8000 is absolutely gorgeous, and every game I’ve tried runs a smooth 50-60fps in 4K, although newer games require some tweaking. Steam also natively supports the DualShock 4 controller, meaning that nearly every game has seamless controller support. You don’t even need to touch a keyboard: just sit down on the couch, push the home button, and Steam Big Picture pops up. I can’t overstate how great this setup is, using a DS4 to play in 4K on a 65” screen from my couch. Emulated games via RetroArch are similarly wonderful and user-friendly, if that floats your boat.

    I’m someone who spent most of my life gaming at a desk, so this was a huge change for me, but I’m loving it. For multi-purpose computing, I simply sold the GPU from my old computer and connected it to a standard 24” monitor on a desk (using integrated gfx). This is fine for work, and if I want to play a game with mouse and keyboard, I can stream it from my gaming rig using Steam in-home streaming (no lag with both computers wired). But to be honest, I haven’t played a single game with mouse and keyboard aside from a few runs of Devil Daggers after switching. Playing on the TV is just too good. This sounds as heretical to me as it probably does to you.

    TL;DR: If you can afford a gaming PC with a 1070 or 1080 and a nice 4K TV with low input lag, go for it. You can use your old desk setup for work and can still play mouse/keyboard games via steam in-home streaming.

  14. tslog says:

    Thanks Jeremy.

    Watching CES, which just passed, it looks like the 4K HDR screen battle is really gonna heat up later this year. So could you please revisit this topic again later on?

    I PC game lying down in front of a 40-inch screen (yes, that mostly means controllers, get over it) and I’ll be upgrading my 3-year-old 1080p Samsung.

  15. Catterbatter says:

    I can’t be the only one utterly intrigued by Jeremy’s cave there. It’s such an unusual room. Could we have a pic of it from the next room? In the article about the racing setup I thought I saw some natural light. It seems just perfect for a VR game where you play as Quasimodo.

  16. Thirdrail says:

    I’ve been using a 40 inch Sony Bravia as a desktop monitor for many years now. It’s fantastic. It’s always been fantastic. There are some disadvantages, occasionally. When you put the map in the corner of the screen, yes, I have to turn my head to see it, instead of it just being an eye flick, but for every minus there is at least one plus, if not more. You may be a hidden sniper in the distance on everyone else’s little screen, for example, but you’re so obvious to me it borders on being comical. Or if I buy a skin in a MOBA, I actually get all the tiny details I’m paying for. (I can see the difference when I’m stuck on my notebook, and I have no idea why anyone with a regular sized screen would pay, say, $25 for Elemental Lux, when she’s always going to be an HO scale character from their pov.) The best part, though, is in open world games. On my screen, a vista is truly a vista, not a postcard of a vista, not a framed picture of a vista, but an oh my god wow pull the car/cart/space buggy over so we can stare at the sunset type of experience.

  17. ThePuzzler says:

    “stretching out 1,920 by 1,080 pixels over, say, a 40-inch panel made for ugly, chunky pixels”
    That’s my current set up. I suppose if I put my face a few inches from the screen I can see the pixels. You guys must have great eyesight to be bothered by it.

  18. godunow says:

    Which Total War is shown on first screenshot?

  19. CaidKean says:

    I use a TV as my ‘monitor’ for my PC, mainly to save space.

    Anyway, I’d just like to chime in and recommend that anyone considering using a TV for PC (or in general) gaming should check out DisplayLag’s database: link to displaylag.com To be frank, I’m surprised RPS didn’t recommend them, given that they’re the largest site on the subject.

    They test and collect input lag data on a huge range of both monitors and TVs. Whilst it’s true that HDTVs used to have fairly crappy response times, some manufacturers have really upped their game as of late.

  20. Alien says:

    The next screen/TV I will buy is an OLED.

    I have been waiting for over a decade to be able to play dark games in a dark room. Black should not be some glowing grey mist, it should be “black”.

    The problem is: the most important thing for gaming is input lag (not only for competitive gaming but for overall immersion). And OLEDs currently have over 25ms of lag. That’s just not acceptable.

    A low input lag OLED would be the dream monitor for every gamer…

    • Dave L. says:

      I picked up a 4K OLED a couple weeks ago and I haven’t had any noticeable input lag issues playing Dishonored 2 or Mirror’s Edge on it (running it at 1080p because my Radeon 380x doesn’t have an HDMI 2.0 port, and I’m not in the mood to pick up a DisplayPort->HDMI 2.0 cable).

      It’s gorgeous, btw. Coming from an edge-lit 40″ LCD, having that stealth ‘shroud’ thing that Dishonored does when you’re crouched no longer blown out by a combination of the flashlighting at the corners and the just generally messed up blacks of the seven year old LCD is a literal gamechanger.

  21. OmNomNom says:

    For first- and third-person games I cannot stand using a TV. It’s less about the resolution, or even the refresh rate; the ghosting/blur even on the very best TVs is not currently as good as an ULMB monitor. That, and the input lag is so noticeable.

  22. pillot says:

    Not including OLED makes this a bit of a waste!

    Even the very best pc monitor can’t do a black level to save itself.

    • Asurmen says:

      This is why I can’t wait for proper quantum dot displays, but they’re a few years off yet.

      • Alien says:

        Do “quantum dot” displays have black levels like OLED displays?

        • Asurmen says:

          Present ones don’t, as they’re photoluminescent so still require an LED backlight but they’re better than current LCD screens. Future quantum dots will be electroluminescent like OLEDs, so a pixel can be completely off or on as required.

  24. The Bitcher III says:

    Received wisdom when I was buying is that a good 1080p TV is preferable to an average 4k. Sorted panel and processor trumps ‘specs’.

    I got a cheap-ish Panasonic 42″ on the AVForums crew’s assurance that it had a great panel and processor, and only skimped on the smart functions.

    Hooked up with a 10m HDMI cable, playing TW3 and GTA V @ 60fps from the couch was fantastic, especially with (very) decent surround sound.

    Ghosting and response was never an issue, and aliasing is, for some reason, generally less noticeable. Black levels easily surpass my “best in class” £600 LG 2nd gen Ultrawide UC88-B.

    Said Ultrawide is a thing of beauty in itself, however.

    Abzu was sublime. Dishonored and DX:HR were given new life, and something like BF1 is stunning to the point of potentially traumatic. It’s the only way to play racers, and RTS. What surprised me was how great pixel art games look, in particular Fez.

    And productivity-wise it’s immense. Who cares, for example, that Explorer doesn’t do multiple tabs when you can tile three or four very usable windows?

    Still, I went back to the sofa – TV – wireless Xbox pad combination for Blood And Wine.

    But anyone thinking about displays ought to make at least trying FreeSync/G-Sync their number one priority.

  25. mtcicero says:

    During Black Friday I got a Samsung KU6290 (40″ 4K TV) for ~$300. It’s amazing as a gaming monitor: it has low input lag, and the sensation of being surrounded by your games is pretty incredible when you are sitting at a normal monitor distance from it. It’s probably not good for competitive Counter-Strike, but I play Overwatch and Titanfall 2 on it no problem.

    It’s also great for productivity. I do a lot of coding for bioinformatics, so it’s amazing to be able to have so much screen space (essentially the same as four 20-inch 1080p monitors). In terms of dealing with text, it has somewhat better pixel density than my previous 1440p monitor and can handle 4:4:4 chroma (important to check!), so it works great.

    Something to note: it’s definitely worth looking into the suitability of individual TVs as monitors (especially for input lag and color), as I think a lot are pretty subpar (for example, the next level up of Samsung TV has worse input lag for some reason). I did a lot of research before picking mine up.
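    The pixel-density comparison above checks out arithmetically. A quick sketch (the size of the 1440p monitor isn’t stated, so 27 inches is assumed here as the most common):

    ```python
    import math

    def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
        """Pixels per inch: diagonal pixel count divided by diagonal size."""
        return math.hypot(width_px, height_px) / diagonal_in

    print(round(ppi(3840, 2160, 40), 1))  # 40" UHD TV   -> 110.1
    print(round(ppi(2560, 1440, 27), 1))  # 27" 1440p    -> 108.8
    print(round(ppi(1920, 1080, 20), 1))  # 20" 1080p    -> 110.1
    ```

    So a 40-inch UHD panel matches four 20-inch 1080p monitors exactly in density, and slightly edges out a 27-inch 1440p screen.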

  26. mpb says:

    You write about “obvious image compression” and “hideous compression in gradients” with some HDTVs. What causes this? HDTVs are not storing images; they are just displaying and discarding them. So why compress? Is there an internal bandwidth bottleneck inside the HDTV itself?

    Or is the compression happening in the computer in order to fit the image data over an HDMI or DisplayPort cable?

    Or is the HDTV not able to support the full bandwidth of the HDMI or DisplayPort cable?

    I’ve done some searching and found the following links, but I have not found a clear explanation of where the compression happens, and what the cost advantage is to doing compression.

    link to rtings.com

    link to en.wikipedia.org

    I have noticed blurry text on a 26-inch 768p HDTV. I never thought that it might be a compression artifact, but I guess that makes sense. I’m trying to think whether I also noticed blurry text on my 40-inch 1080p HDTV, which I purchased 5+ years ago. However, it is very rare for me to use my HDTV as a computer screen. I am pretty confident that HDMI can support 720p, 768p, and 1080p at 60Hz, so if compression is happening at these resolutions, then it must be due to some limitation of the HDTV, and not the computer or HDMI cable.

    Thanks in advance for any answers anyone can provide!

  27. Jeremy Laird says:

    Several factors can be at play here. For instance, you may have an LCD panel that’s only capable of rendering 6 bits per colour channel, which means a loss of image data, which in turn is analogous to compression. TVs also often apply fairly aggressive image processing, which can give more superficial punch at the cost of accuracy and fidelity. Etc.

    Whatever the specific reasons, the result when you pull up something like a test gradient can be really obvious banding. And that means image data has been lost or compressed.
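    The bit-depth point is easy to demonstrate. A minimal sketch: quantising an 8-bit greyscale ramp to 6 bits collapses 256 levels into 64, which is exactly the stepped banding a test gradient reveals (dithering, which real panels use to mask the effect, is ignored here):

    ```python
    # An 8-bit ramp has 256 distinct grey levels.
    ramp_8bit = list(range(256))

    # A 6-bit panel keeps only the top 6 bits of each value, so every
    # run of four adjacent input levels maps to one output level.
    ramp_6bit = [(v >> 2) << 2 for v in ramp_8bit]

    print(len(set(ramp_8bit)))  # 256 -> smooth gradient
    print(len(set(ramp_6bit)))  # 64  -> visible bands
    ```

    In practice many 6-bit panels add temporal dithering (FRC) to approximate the missing levels, which reduces but doesn’t eliminate the banding.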

  28. Owl Mark says:

    After I tried gaming on a projector with a 100-inch picture, every TV and monitor seems like a small window. link to optoma.eu