HDR for PC games is a hot mess (but it’s nice when it works)

Assassin’s Creed Origins HDR (photograph of screen)

A few of the things I have had to do in order to get a workable version of HDR – also known as high dynamic range, the new-ish display technology that significantly ramps up brightness, darkness and vibrancy – on my PC (not including the acquisition of a fancy monitor):

– Try four different display cables
– Adjust as many as seven different brightness/contrast/colour etc sliders per game. (I have spent long, unhappy hours doing this to date)
– Manually turn on HDR on the monitor, manually turn HDR on in Windows then manually turn on HDR in the game settings. Or sometimes HDR off in Windows but on in the game then alt-tab back to Windows and turn HDR on, and off, and on, and off. Or sometimes alt-tab and alt-tab and alt-tab and alt-tab and alt-tab until HDR suddenly, randomly kicks in. When I exit the game, I have to manually turn it all back off again or Windows is unusable.
– Install an unfinished preview build of Windows 10 whose HDR isn’t totally broken on Nvidia cards.
– Almost completely lose my sense of whether anything is actually different after all of this.

The egg yolks in Final Fantasy XV were a bit shinier, though.

Important note: any HDR images in this piece are photographs of a game running on an HDR monitor, purely there to very slightly demonstrate the contrast between light and dark, as an actual HDR screenshot can’t be viewed correctly on a non-HDR screen. These images cannot replicate the sense of brightness or darkness.

HDR is, for my money, a far more exciting display tech for games than is 4K, whose mammoth pixel counts often don’t truly come into their own unless you’re using an absurdly large TV. I know others – those who demand the sharpest possible image – will disagree, but I’ve found 4K simply pleasant rather than revelatory.

HDR though – if all the stars align – boosts the sense of depth and presence in games (and films) in a way that a mere pixel count cannot. It’s a tricky thing to describe, but alas even trickier to demonstrate unless you already have an HDR screen to hand – and if you do, you’re probably the one boring to tears everyone who doesn’t know what ‘peak brightness’ means.

What HDR does, in a nutshell for the uninitiated, is expand the range of brightness, contrast and colour displayed, bringing it more in line with what the human eye can perceive in meatspace. Brighter whites, darker blacks, more shades of colour and, all told, you get a more true-to-life picture. In theory.
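
For the numbers-inclined, the brightness half of that expansion is actually pinned down precisely: HDR10 encodes luminance with the SMPTE ST 2084 ‘PQ’ transfer function, which maps 10-bit code values onto absolute brightness up to a theoretical 10,000 nits, where SDR’s old gamma curve was only ever designed around a hundred or so. Here’s a minimal sketch of the published curve (the constants are the ones from the ST 2084 spec; the helper name is my own):

```python
# Minimal sketch of the SMPTE ST 2084 (PQ) EOTF used by HDR10.
# Maps a normalised 10-bit code value (0.0-1.0) to absolute luminance in nits.
M1 = 2610 / 16384        # constants as published in ST 2084
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(code: float) -> float:
    """Decode a normalised PQ code value to luminance in cd/m2 (nits)."""
    p = code ** (1 / M2)
    return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

# Most of the curve's precision sits where eyes are most sensitive:
print(round(pq_to_nits(0.5)))   # ~92 nits    - roughly SDR 'paper white'
print(round(pq_to_nits(0.75)))  # ~983 nits   - where decent HDR TVs live
print(round(pq_to_nits(1.0)))   # 10000 nits  - the format's ceiling
```

Half the available code values are spent below 100-ish nits, which is one reason everything SDR looks so odd when a screen is driven in HDR mode.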

Hitman HDR (photograph of screen)


In practice, there are four ways it can work.

  1. The above, basically, and you fall in love with it and become absolutely convinced that watching SDR, or standard dynamic range, content, is the most cruel and unusual punishment you’ve ever known, even though you were perfectly happy with it for decades before you ever heard of HDR. People who’ve gone out and dropped five grand on an enormous OLED TV often feel this way, though for me in my tiny terraced house it was the AMOLED screen on my Samsung S8 phone. (Though it probably says much that I ended up sticking with my three-year-old, non-AMOLED phone after the S8 suddenly died a few months back).
  2. You spend your time squinting at the screen in search of almost imperceptible differences in how bright the neon signs in a bar scene in Jessica Jones look, and being too distracted by this to actually enjoy the programme as a result.
  3. The neon signs in a bar scene in Jessica Jones are so blindingly bright that it’s like watching TV with a bunch of people shining laser pointers into your eyes.
  4. Everything apart from the neon signs in a bar scene in Jessica Jones looks washed-out and gloomy.

Scenarios 2-4 invariably lead to the sort of obsessive and painstaking tinkering usually reserved for audiophiles, Linux users or people who wear fascinators to posh racing events. It can, like those comparisons, end up being a gateway drug to a permanent state of gnawing dissatisfaction with the thing you’re supposed to be watching/playing/have squatting on your head like a technicolour spider.

However, over in console-land, the PlayStation 4 Pro and Xbox One S and X do at least make their end of the HDR bargain work out of the box, meaning your endless voyage through The Menus Of Misery is at least restricted to the TV’s settings.

This is presuming you’ve bought yourself a TV with the right sort of HDR, and enough peak brightness to actually make it worthwhile, and oh boy is that its own minefield – which I am going to sidestep somewhat here as we generally play our PC games on monitors rather than tellies. In short, don’t buy a telly unless it definitely, definitely supports something called HDR10, and do your research to check it’s not fibbing before you buy. If at all possible, get one with an Ultra HD Premium sticker on it, because that’s a proper standard and is currently the best thing we’ve got to separate the HDR wheat from the HDR chaff.

In terms of PC, though, I mostly need to scream this: “Dear God Jesus Christ Windows why do you have to be such a frigging nightmare?”

windows-hdr

Now, Windows 10, which I am often something of an apologist for, makes many things that used to be tricky about using a PC an awful lot easier. But that does not change the fact that Microsoft’s ship is still slow to turn when it comes to new technologies. HDR has been officially supported by Windows 10 on an operating system-wide level since last year, but it’s been restricted to merely On or Off and, frankly, looks absolutely hideous for everything except HDR games or videos even if you do get the latter stuff working well.

We’re talking insipidly grey whites and a general sense of looking at your screen through a piece of tracing paper. Loading up an HDR video from YouTube or Netflix (big caveats, which I’ll come back to) means a burst of joyous colour in the sea of gloom, and it’s hard to say if that’s entirely because of the HDR or just because it’s the only thing on the screen that looks like stuff normally looks.

HDR, then, is simply not fit for purpose on the Windows desktop yet. There are real technical reasons for that, to do with the different ranges of colour used by HDR and SDR applications that I won’t go into here because I’d probably embarrass myself, but on the other hand Android, iOS and the PS4 and XBone operating systems manage to not make everything that isn’t HDR look like crap on an HDR screen.

Resident Evil 7 HDR (photograph of screen)


Unfortunately, even some HDR stuff looks bad too. Netflix, for instance, can only run in HDR if you have the 4K package subscription, use the Edge browser and have a seventh-generation or later Intel Core processor and/or an Nvidia GTX 10-series graphics card. Throw an HDR monitor into the mix and the total package to watch 4K HDR video costs significantly more than a big 4K smart TV – and, in my experience, with far less satisfying results. There are some issues between Windows 10 and Nvidia drivers when it comes to HDR, with the latter laying the blame squarely at the feet of the former.

In the most recent public update to Windows 10, you can expect colour and brightness issues with HDR all over the shop, including a washed-out Netflix and deeply unpredictable games. YouTube HDR was OK, as were downloaded HDR videos played in VLC, but Netflix was lousy and most of the games I tried looked as pale as a student on results day. I also had to cycle through four different, increasingly expensive HDMI and DisplayPort cables before I found one that worked – no pound shop treasures will do the trick here.

My dream of just flicking a button and being presented with a glorious wash of new colour died on the spot.

There were two semi-solutions to this, neither of which was quick or easy, and neither of which was ultimately that rewarding. The first was to opt into Windows 10’s Insider builds (you’ll find the option in Windows Update), which, after a lot of waiting and worrying, installed a preview of an upcoming version of the OS, replete with improved HDR support that plays a little better with Nvidia cards. Clearly, being on a glorified beta of an operating system isn’t ideal, but, for what it’s worth, I’ve had no stability problems after the better part of a week with it.

The new build of the OS offers a new slider that enables you to boost the brightness of SDR desktop content while HDR is turned on, thus reducing that pallid grey effect. It still looks bad, but it is at least usable where the live build is not. There are a lot of issues affecting whether or not this situation can ever be made perfect, which relate to the different colour spaces of HDR and SDR, and to a sort of colour compression (chroma subsampling, to give it its proper name) required over the current version of HDMI in order to retain the bandwidth for a 4K/60fps signal – there’s a rough arithmetic sketch of that squeeze below – but I live in hope that something better could come along later. HDR is a lot newer in monitor-land than it is in TV-land, with few supporting screens available at the moment, but that’s beginning to change fast, and hopefully Microsoft will roll with it as it does.
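
To put rough numbers on that bandwidth squeeze, here is the back-of-envelope arithmetic. A sketch only: it counts payload bits against HDMI 2.0’s effective throughput and uses the standard 4K/60 pixel timing, whereas real HDMI signalling packs things a little differently:

```python
# Back-of-envelope: does a given 4K/60 signal fit down an HDMI 2.0 cable?
# HDMI 2.0 is 18Gbit/s raw, but 8b/10b encoding leaves 14.4Gbit/s for video.
HDMI_20_LIMIT = 18e9 * 8 / 10

# The standard 4K/60 timing is 4400 x 2250 pixels in total (blanking
# included), i.e. a 594MHz pixel clock, though only 3840 x 2160 is picture.
PIXEL_CLOCK = 4400 * 2250 * 60

def fits(label: str, bits_per_pixel: int) -> None:
    needed = PIXEL_CLOCK * bits_per_pixel
    verdict = "fits" if needed <= HDMI_20_LIMIT else "too fat"
    print(f"{label}: {needed / 1e9:.2f} Gbit/s ({verdict})")

fits("8-bit RGB (SDR)", 24)            # 14.26 Gbit/s - squeaks in
fits("10-bit RGB (full HDR)", 30)      # 17.82 Gbit/s - too fat
fits("10-bit 4:2:2 (subsampled)", 20)  # 11.88 Gbit/s - fits, colour thinned
```

Hence the usual compromise: keep the full 10-bit brightness data but halve the colour resolution, which is largely invisible in video and rather less so in fine desktop text.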


Netflix’s Jessica Jones

The second thing I had to do was dive deep into Nvidia’s control panel and manually fiddle, across the space of hours, with the brightness, contrast, gamma and vibrancy sliders until I found a not entirely happy medium between totally washed-out and dark or light areas being blown-out, masking all detail. You would not believe how long I spent staring at a paused image of Jessica Jones’ black hair, trying to find the teeny, tiny point at which it stopped looking dark grey but before all the follicles disappeared into a sea of obsidian. This was not, I assure you, a sexy time.

Even with this sweet spot found, the likes of the aforementioned neon bar sign certainly popped in a way it didn’t in SDR, but everything that wasn’t a light continued to look less vibrant than watching the whole thing in SDR did. I felt a bit like an 80s dad who’d spent a marathon weekend tuning his hi-fi system just so, but who then had to pretend to everyone including himself that it actually sounded any better.

The same is true of many HDR games. I know I’ve been round the houses in getting to this point, but I wanted to establish a context for how miserable setting HDR up on PC is even before you get to the point of firing up a game that supports it. There aren’t many HDR games yet – the PC Gaming Wiki has an e’er-updated list – but they are snowballing and I think, by the end of the year, we’ll be close to taking it for granted that any big new release will have at least some version of HDR. As it stands, I divide HDR PC games into three categories:

1) Those with their own internal brightness settings to help you get ’em looking suitably eye-searing without everything else looking grey or blown-out.
2) Those that have little more than an HDR on/off button and leave the brightness/colour fiddling to Windows and/or graphics card drivers.
3) Those that are HDR in name only.

In fairness, there aren’t too many in that last category – hellooo Chess Ultra – but I certainly have a background worry that, until HDR really beds in, we’re going to see a lot of cheeky stuff in which lamps and candles are a teeny bit brighter but the overall scene doesn’t really have any of the added depth or vibrancy that good HDR can bring. I’ve seen plenty of 1) and 2), however, and both have led, in their own ways, to slider-based misery.

Resident Evil 7 HDR (photograph of screen)


Those in the first category, I generally get much better results from once my maudlin task is complete, not least because their settings menus offer reference images with instructions about what part of the picture should be visible or invisible and the like, so I’m not trapped in a frenzy of alt-tabbing. Assassin’s Creed Origins and Resident Evil 7, for instance, have sliders both for peak HDR illumination – i.e. how bright the brightest sources of light can get, which is less to do with personal taste and more to do with how bright your screen can go before the source becomes a wash of white (although ACO’s slider, at least, is clearly geared toward TVs with brightness levels extending way beyond 1000cd/m2, which most HDR monitors simply can’t do) – and for overall scene brightness. The latter behaves similarly to the gamma settings familiar to us from most PC games but does more with how white whites are, and is absolutely crucial to avoiding the washed-out effect that has blighted me so.
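
As an aside on what that peak illumination slider is conceptually doing: the game tone-maps its internal scene brightness so that highlights roll off smoothly toward whatever peak you tell it the screen can manage, rather than clipping hard into a detail-free wash of white. A simplified sketch of the general idea – the curve shape and numbers here are illustrative, not any particular game’s actual tone-mapper:

```python
# Illustrative highlight roll-off: scene luminance above a knee point is
# compressed toward the display's peak instead of clipping hard. The knee
# placement and curve are mine, for demonstration only.

def tone_map(scene_nits: float, display_peak: float, knee: float = 0.75) -> float:
    """Map scene luminance (nits) to the level the display is asked to show."""
    knee_nits = display_peak * knee
    if scene_nits <= knee_nits:
        return scene_nits                  # darks/midtones pass straight through
    # Above the knee, approach the peak asymptotically so very bright sources
    # stay distinguishable instead of all reading as 'maximum white'.
    overshoot = scene_nits - knee_nits
    headroom = display_peak - knee_nits
    return knee_nits + headroom * overshoot / (overshoot + headroom)

# A 500-nit lamp and a 4000-nit sun, on a 350-nit monitor vs a 1000-nit TV:
print(round(tone_map(500, 350)), round(tone_map(4000, 350)))    # 326 348 - nearly identical
print(round(tone_map(500, 1000)), round(tone_map(4000, 1000)))  # 500 982 - real separation
```

Which is the arithmetic behind ACO’s slider being geared for 1000cd/m2-plus tellies: on a 350-nit monitor, a lamp and the sun end up all but the same shade of bright.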

In ACO and RE7, I can find a sweeter sweet spot than I can with the Nvidia settings, and additionally they seem to override whatever Windows is doing. The games that do this are doubly good because they’re not dependent on flicking on Win10’s HDR first, but instead automatically enter and leave HDR mode. This has not, however, saved me from having to find per-game settings for seven different sliders – in-game HDR peak illumination, in-game HDR brightness, in-game general brightness, Nvidia brightness, contrast, gamma and colour vibrancy – in order to land on something I’m more or less happy with. The effect is nice – the punch of ACO’s big sky, with a sun almost too bright to stare at, and the almost searing light of a lamp or fire amidst RE7’s perma-gloom – but I hesitate to say that it was truly worth the hours I put into it.

I’ve been less happy with the results from category 2 games, as they require Nvidia’s settings to get rid of the grey, and the only advice on how the scene should look comes from your own eyes. I’ve found these settings to be a bit of a blunt instrument, tipping the scene too easily from insipid to a sea of black. Hitman and Final Fantasy XV have been my main test beds here, and though I can end up with some lovely lighting effects in the smoke flares and hanging lights of the former’s Marrakesh level, everything else still looks paler than it does in SDR.

Final Fantasy XV HDR (photograph of screen)


FFXV, meanwhile, I can fiddle into offering me strong blue skies and dazzling headlights, but something’s wrong somewhere, as alt-tabbing or turning off the monitor often turns HDR off entirely and leaves the colours unpleasantly desaturated whether you’ve got HDR on or off. Multiple alt-tabs will randomly fix it, but it’s very much a crapshoot rather than a science.

I’d gone into this whole HDR thing in the first place specifically because I wanted to see FFXV’s beautiful, glistening food look as beautiful and glistening as it possibly could, and I’m not sure I can say the quest was entirely worth it. Especially not when compared to the friend who simply hooked his PS4 Pro up to his 4K TV and was immediately rewarded by super-food.

Something else I should mention is that, for any of this to work at all, I need to manually switch the monitor into HDR mode with its own controls (and we all know how horrible monitor buttons are). Otherwise all is pale and grey. Clearly, this will differ from screen to screen, and certainly in telly-land there’s more going on in terms of automatic mode detection and switching, but all told running each new HDR game is currently a gigantic faff. What I’ve tended to use more is the monitor’s ‘emulated HDR’ mode, in which it makes its own best guess on light, dark and some colours in SDR games and amps ’em up in a way that is less convincing than true HDR but is just a one-button fix. Some HDR games, like FFXV and Hitman, actually look better to me run in SDR with emulated HDR than they do after my hours of tinkering for full HDR.

Needless to say, your mileage is going to vary enormously depending on both your screen and your graphics card. Radeons, I hear, play a fair bit nicer with Windows 10 right now, although you’ll be sacrificing HDR (and 4K) Netflix there unless you have an Intel Kaby Lake or later CPU.

Meanwhile, the screen I’ve been loaned to do this test is a BenQ SW320 30″ 4K IPS monitor. It’s a very lovely screen indeed, designed for photography rather than games and with a price to reflect that, but the colours and viewing angles in SDR games are certainly some of the best I’ve ever seen, as well as its huge size making 4K far more worthwhile than it usually is on the desktop. I shall be extremely sad to say goodbye to it, and it has made my general resolve that 4K isn’t really worth it begin to falter. But it is a late-2016 panel, and as such its HDR pre-dates the tech’s move into the TV mainstream mid-way through 2017, even though it does boast the superior (and vital) HDR10 standard.

Assassin’s Creed Origins HDR (photograph of screen)

I have had some nice results in HDR – the blinding sun in ACO and the forceful glare of RE7’s infrequent light sources particularly stand out as giving me a startling sense of depth and contrast – but the problem is the screen’s relatively commonplace brightness of 350 nits. That is more than you need for any SDR use unless you’re basically playing outdoors, but, by contrast, truly decent HDR tellies start at 1000 nits and go as high as 2000, and 500 is considered the bare minimum. There are good reasons for that, in that sitting ten feet away from a very bright telly is very different to sitting 30 centimetres from an equally bright monitor. Even at merely 350, sitting with my face stuffed into the SW320 when it’s properly HDRing or its SDR brightness is cranked up makes my eyeballs throb unpleasantly. Still, more consistent settings, in the OS and the games, will help to sort this stuff out in time, as will newer monitors with more fulsome HDR settings of their own (it’s simply on or off on this one).

Unfortunately, monitor-land is currently denied the best of the best, which is to say the OLED panels used in higher-end TVs, on which HDR games and films offer an almost painted-on effect that has PS4 Pro and Xbone users in raptures. There are more HDR monitors coming out month by month – we’ve tried and so far failed to get others in for review apart from the BenQ EL2870U, which Katharine reviewed and which is currently the only 4K candidate on our best monitor list for that very reason – but I’ll be revisiting this subject once I can. Right now, though, my feeling is that all our powder should be kept dry, because the situation’s still evolving and you could very easily regret spending £700+ now when you could have had something better in a year’s time.

This is even before we get into stuff like how choosing between HDR and framerates higher than 60fps is forced on you by cable bandwidth, or how different and upcoming HDMI and DisplayPort versions change things again, and… Look, we’ll get there, OK? HDR is nice, but on PC, right now, it just isn’t showtime yet. If you’re not unhappy with how games look on your monitor right now, hold fast until later in the year and let’s see where we are then.

21 Comments

  1. Linfosoma says:

    I’ve got my PC hooked to a 55 inch 4K TV (it’s basically a really, really good console) and I’ve shared your pain in config town.

    If I want HDR to work on Windows, I need to set it to 30Hz. Even after switching to an HDMI cable with enough bandwidth for true HDR, I can’t get it to work at anything higher.
    And, as you have described, it looks worse than SDR.
    I also tried RE7 and ACO.
    ACO is the one that fared better in my tests, but I’m still not sold on HDR.

    I wish I had a way to see what a proper HDR display looks like so I could compare, but that’s hard to come by where I live, so I just have to believe online comments that “it looks better”.

    It makes the whole thing more complicated than it needs to be.
    Without a proper demo function, I simply don’t know what I’m looking for.

    For the time being, I can say that playing games in SDR but in 4k on such a big screen is a revelation.

    Ghost Recon Wildlands running on ultra, 60 FPS at 4K is basically real life.
    It looks completely different than in standard Full HD resolutions, it’s mind-blowing.
    I can see a whole range of details that I couldn’t before (have you ever zoomed into something to see its textures? Well it’s that, except that now you can see those details everywhere).

    So….all is not lost?

    I’m still hoping I’ll be able to experience true HDR on Windows at some point.

  2. DanMan says:

    Solid information. I’ve hooked up my 1200-nit TV to my Win7 PC and I can’t say I had to tinker with anything a lot. Certainly not with any image settings in the NV driver to make it look right.

    But I’ve also only played Shadow Warrior 2 and a bit of Mass Effect Andromeda in HDR so far. The former looked good after some tinkering with the in-game sliders; the latter was OK out of the box, with nothing to tinker with anyhow.

    As you say, you really need to put the money down for a good screen to get a good outcome. It does look amazing when it works, though. Looking forward to some Shadow of War.

  3. jj2112 says:

    Bah, at least the games run. I remember when we had to spend hours working on autoexec.bat and config.sys and games still didn’t work. Kids these days…

  4. Don Reba says:

    I saw properly calibrated displays at CES and talked to the VESA people — even they had trouble with calibration. As far as the effect goes, I found the DisplayHDR rating very helpful:

    – 400 is barely noticeable,
    – 600 is clearly visible, and
    – 1000 is mesmerizing!

  5. Faldrath says:

    This article was equal parts amusing and painful to read. Well done on wringing entertainment out of such a subject matter, Alec!

  6. Darth Gangrel says:

    I don’t see much difference with these alleged graphics enhancers, so I’ll say that the yolk is on you.

  7. Aurensar says:

    I’m one of the tedious people in the “HDR is transformational” category. I’m using a 7-series (2017) LG OLED TV, which makes a big difference as the article points out. The TV has been professionally calibrated.

    I haven’t changed anything in Windows. I used an HDMI 2.0 cable (Amazon Basics, no less). 4K60 with HDR works fine. I didn’t need to download a prerelease version of Windows.

    Games I’ve tested:

    Middle-earth: Shadow of War (looks good)
    Final Fantasy XV (looks OK, probably the least impressive upgrade)
    SW Battlefront 2 (looks incredible, the best upgrade)
    Destiny 2 (looks excellent, big improvement)
    ME Andromeda (big upgrade. Incidentally, this game supports the supposedly even more advanced Dolby Vision HDR format, but I can’t get this to work at all. Still looks great using HDR10)
    RE7 (looks excellent, no tweaking required, particularly enjoyed how the flames show up in the generally dark scenes)

    Other notes:
    Most games that run borderless without HDR look just fine. I play a ton of Factorio on the TV without issues.
    Wolfenstein: The New Order just straight up crashes on launch with HDR enabled.

    I think what I’ve taken away from this is that “high-end” monitors are ridiculously overpriced for the horrible pictures they produce, and LCD is just a lamentable technology at this stage. I have a ROG PG348Q, generally accounted a good monitor, and the picture looks plain nasty compared to the TV in SDR.

    Even the long-awaited PG35VQ HDR 35″ (which they will probably charge £1500 for, the same price as an EOL LG B7 OLED) will have a hard time convincing me that HDR on monitors (or HDR on non-OLED) is a thing. That monitor was announced 15 months ago at CES 2017 and still shows no sign of coming to market, and neither does the slew of others that use the same underlying panel.

    • iainl says:

      The worries about image retention on OLED are somewhat overblown for TV content, but still serious enough that full-time use with a Windows taskbar is asking for trouble; I don’t see them going mainstream for a while yet.

      Re: your Dolby Vision issue, the LG 7-series can’t handle it at 60Hz; try dropping to 30 as Alec had to for HDR10 on his screen. I -think- we have to wait for HDMI 2.1 in the 2019 sets for it at 60.

  8. asgiov says:

    Looked up the BenQ SW320. It’s worth noting that it doesn’t support any type of local dimming, and IPS inherently has greyish blacks… probably the worst of any modern display technology. It’s going to be one of the worst candidates for viewing any kind of HDR material. At minimum you want a display with some type of local dimming support, because it’s going to have to blast the backlight to maximum in HDR mode… so without local dimming, HDR will look washed out on anything except very bright full-screen scenes. This is why OLEDs and full-array local dimming displays are considered the best for HDR. You are honestly probably better off not using HDR with a display like that.

  9. caff says:

    I like Alec’s articles about modern gubbins for being refreshingly no nonsense. Like saying: “HDR is nice, but on PC, right now, it just isn’t showtime yet.”. That sums it up nicely in a simple way that makes me feel better about leaving my wallet shut and sticking with my current monitor for another year or three.

  10. BadCatWillum says:

    > I certainly have a background worry that, until HDR really beds in, we’re going to see a lot of cheeky stuff in which lamps and candles are a teeny bit brighter but the overall scene doesn’t really have any of the added depth or vibrancy that good HDR can bring.

    Reminds me a lot of ‘AGA-Enhanced’ games on the Amiga 1200.

  11. lrbaumard says:

    I tried hooking up my computer to my HDR 4K TV with a very long HDMI cable, and playing The Witcher in HDR mode.

    Turning on HDR mode in Windows and in The Witcher’s settings created a lot of artifacts on the screen: it looked like there was constant white snow.
    I think I just gave up on the faff of it after an hour and a half. Anyone had more luck?

  12. Caiman says:

    I think I can safely avoid this for a while longer then, like 4k. Plus, it remains true that you never miss what you never had.

  13. Cederic says:

    Hmm. Photographing an HDR scene is a flawed approach due to the limitations of your camera. Unless you take multiple shots at different exposures and stitch them together to create an HDR photograph.

    HDR should also not include vibrancy. That so many people creating HDR images do boost the vibrancy is their choice to add that effect, not a requisite part of introducing a high dynamic range. I feel that most HDR photographs look better without the overprocessed vibrancy anyway.

    • Don Reba says:

      Photos of HDR images convey brightness as glare, which is what a lot of games did with bloom before HDR and, well, keep doing.

  14. Jernau Gurgeh says:

    Don’t buy a cheap (sub £400) 4K TV ‘with HDR’ like I did. It fails at making HDR content (Netflix in my case) look like anything other than overly contrasted nastiness. I actually only wanted a bigger screen to replace my ailing 7 year old 1080p 37 incher, and wasn’t all that bothered by all this new fancypants tech, though inbuilt 4K Netflumps was a draw, and I was considering upgrading my PS3 to a PS4 Pro at some point in the not-too-distant future. Have been pretty blown away by the detail in some of the 4K YouTube stuff I watched, and some 4K stuff on Netfloogles does look good, but I just couldn’t get any HDR stuff to look as it probably should, and I want to enjoy watching stuff, not obsess about the image quality not being optimum.
