HDR Gaming And The PC: It’s Complicated

There was a time when all you had to worry about with an LCD display was whether you cared enough to pay extra for a monitor with an IPS panel. Well, that and its size. And resolution. And maybe its native colour depth. And brightness. And contrast. And pixel response. And inputs. OK, it was never that simple. But it’s certainly not getting any simpler: the last few years have added further unfathomables including frame syncing, higher refresh rates, new display interconnects and the 4K standard.

Now there’s more for you to worry about in the form of HDR. Or should that be UHD Premium? Or Rec. 2020? Or BT.2100? Maybe SMPTE 2084 or HDR10? Whatever, it’s mainly about colours, lots and lots of lovely colours. This is already a big thing in HDTVs. It’s coming to the PC. But what’s it all about and is there any chance of making sense of what is, currently, a bit of a mess?

HDR stands for high dynamic range and in this context, that ‘range’ chiefly refers to the range of light a display can reproduce – from its deepest blacks to its brightest highlights – along with a much wider range of colours. The net result is massively more visual pop and vibrancy. It’s a bit like that thing that’s going on with smartphone cameras at the moment. HDR isn’t about more pixels. It’s about making each individual pixel work harder and look better.

The confusing bit, right from the get-go, is that HDR capability is not the same thing as a display’s colour depth, nor the number of colours you can calculate from the bits per channel.

You’ll probably have heard of things like ‘native 8-bit’ displays. Maybe you know that the latest UHD Premium HDTVs support 10-bit or even 12-bit colour. Well, that’s not actually the same as HDR capability. For a flavour of the complexity here, try this excerpt on the new Rec. 2100 HDR standard (which is part of the UHD Premium HDTV standard, more on which in a mo’) from Wikipedia:

“Rec. 2100 defines the high dynamic range (HDR) formats. The HDR formats are Hybrid Log-Gamma (HLG) which was standardized as ARIB STD-B67 and the Perceptual Quantizer (PQ) which was standardized as SMPTE ST 2084. HDR10 uses PQ, a bit depth of 10-bits, and the Rec. 2020 color space. UHD Phase A defines HLG10 as HLG, a bit depth of 10-bits, and the Rec. 2020 color space and defines PQ10 as PQ, a bit depth of 10-bits, and the Rec. 2020 color space.”

Asus MG279Q: Once the messiah of monitors, now utterly outmoded?

Sorry, what? In layman’s terms, the distinction to grasp is that HDR as a feature in a display involves both the number of colours on offer and the intensity of light – the latter being the range from the deepest blacks to the brightest highlights. To achieve a proper HDR display using conventional LCD technology, you need both lots of colours and a backlight that can be controlled across a wide range of intensities, including brightnesses at least three times higher than a conventional PC monitor.
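
Incidentally, the ‘PQ’ mentioned in that Wikipedia excerpt is the bit that handles the intensity side. It’s a transfer function (SMPTE ST 2084) that maps a 10-bit signal value onto an absolute brightness of up to 10,000cd/m2, so an HDR signal says exactly how bright each pixel is meant to be rather than leaving it relative to whatever your monitor feels like. For the terminally curious, here’s a minimal Python sketch using the constants published in the standard – back-of-an-envelope illustration, not production colour science:

    # SMPTE ST 2084 'PQ' EOTF: normalised signal in [0, 1] -> luminance in cd/m2 (nits).
    M1 = 2610 / 16384        # 0.1593017578125
    M2 = 2523 / 4096 * 128   # 78.84375
    C1 = 3424 / 4096         # 0.8359375
    C2 = 2413 / 4096 * 32    # 18.8515625
    C3 = 2392 / 4096 * 32    # 18.6875

    def pq_to_nits(signal):
        """Decode a PQ-encoded value (0.0 to 1.0) into absolute luminance in cd/m2."""
        p = signal ** (1 / M2)
        return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

    # A 10-bit code of 1023 decodes to the full 10,000 nits; code 520 or so is already
    # around the ~100 nits of a typical SDR desktop monitor.
    for code in (0, 520, 769, 1023):
        print(code, "->", round(pq_to_nits(code / 1023), 1), "cd/m2")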

The backlight bit is simple enough, conceptually – you need a backlight that’s both brighter and also capable of detailed modulation. So it’s upping the colour fidelity that I’ll primarily focus on. More colours are better. Better as in more realistic, more vibrant, more, well, dynamic. There is, of course, a limit to all of this. And that limit is the human eye.

In simple terms, display technology is constantly striving for, and indeed converging with, that limit – what you might call the acuity of the human eye. Put another way, the human eye itself has a dynamic range, whether that’s in detail, brightness or colour. Once you have a display that matches that capability, further advances get you nowhere. Humans will not be able to discern them.

Apple cleverly hooked into this notion with its ‘Retina’ displays. The idea with Apple’s Retina displays is that if the pixels are sufficiently close together, their density when viewed by the human eye matches or exceeds the density of the photoreceptor cells in the retina. More specifically, the pixel density will match that of the fovea, a small area of the retina that is densely packed with receptor cells and enables the most detailed, central area of human vision – known as foveal vision, funnily enough.

Achieve that and you have a display that can generate shapes and curves and forms and details that are indistinguishable from real-world objects. The limitation is then the human eye, not the display.

When it comes to the impact of pixel density, much depends on the distance twixt eye and screen. Apple’s rough definition for its handheld Retina displays was 300 pixels per inch viewed at 10-12 inches. In fact, human vision can probably discern up to 900 pixels per inch at that distance. Mind you, I’m not sure if Apple really sticks to any particular definition for its Retina displays these days. But the concept is obvious enough.
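
If you fancy putting rough numbers on the ‘Retina’ idea yourself, the metric that matters is pixels per degree of visual angle, which rolls pixel density and viewing distance into a single figure. Around 60 pixels per degree is the commonly quoted ballpark for 20/20 vision, though as I say, the eye can arguably resolve rather more. A quick sketch (that 60-per-degree threshold is the textbook figure, not gospel):

    import math

    def pixels_per_degree(ppi, distance_inches):
        # One degree of visual angle spans 2 * d * tan(0.5 degrees) inches at distance d.
        return ppi * 2 * distance_inches * math.tan(math.radians(0.5))

    # Apple's original handheld 'Retina' figure: roughly 300ppi viewed at 10-12 inches.
    for dist in (10, 12, 24):
        print(f"300ppi at {dist}in: {pixels_per_degree(300, dist):.0f} pixels per degree")
    # Works out at roughly 52-63 pixels per degree at phone distances - right around
    # the 60-per-degree ballpark, which is why the 'Retina' claim gets argued over.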

When it comes to colour perception, however, forget viewing distances and pixel pitches. Instead, it’s all about colour space or the range of colours in question. Again, the two important metrics are the range of colours a display can produce and the range of colours the human eye can discern. When the former matches the latter, you have as good a display as we humans can discern.

The relevant target here is known as Pointer’s Gamut. That’s a set of colours that includes every colour that can be reflected off a real-world surface and seen by the human eye. God knows how that is calculated, or indeed how it squares with the fact that the ability to sense colour varies fairly dramatically from one person to another. Moreover, being a gamut of reflected colours, it doesn’t include luminescent colours that are emitted rather than reflected.

Battlefield 1 is set to be the first HDR PC game. Er, I think…

Whatever, what really matters about Pointer’s Gamut is that it’s much larger than the standard colour gamuts, or ranges of colours, that PC monitors are typically designed to achieve. And as it happens, the full UHD Premium specification includes a colour space known as Rec. 2020 that very nearly covers 100% of Pointer’s Gamut. For the record, UHD is often used interchangeably with 4K, the latter being simply a resolution or number of pixels. But UHD is actually much broader than that, covering everything from resolution to colours and dynamic range.

Anyway, the colour space most PC monitors aim to achieve is sRGB. That only covers a little more than two thirds of the colours of Pointer’s Gamut. Very likely the monitor you are looking at right now can’t even manage that.
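
To put a rough number on that gap, you can compare the areas of the sRGB and Rec. 2020 triangles on the standard CIE 1931 chromaticity diagram using their published primaries. It’s a crude yardstick – the diagram isn’t perceptually uniform and Pointer’s Gamut isn’t a neat triangle – but it gets the scale across:

    # Compare colour gamut triangle areas in CIE 1931 xy chromaticity space.
    # Crude measure (xy space isn't perceptually uniform), but it shows the size gap.
    SRGB    = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # R, G, B primaries
    REC2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

    def triangle_area(points):
        (x1, y1), (x2, y2), (x3, y3) = points
        # Shoelace formula for the area of a triangle given its three vertices.
        return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

    ratio = triangle_area(SRGB) / triangle_area(REC2020)
    print(f"sRGB triangle is roughly {ratio:.0%} the size of the Rec. 2020 triangle")
    # Prints roughly 53% - sRGB is barely half the size of the gamut HDR standards target.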

So, your monitor probably can’t achieve sRGB, which is a significantly smaller colour space than Pointer’s Gamut, which in turn doesn’t actually encompass every colour the human eye can perceive. In other words, the range of colours your monitor can display is, quite frankly, pants.

If you want to put numbers on all of this, actually you can. Displays that can natively achieve the sRGB standard have 8-bit per channel colour depth. There are, of course, three colour channels, and if you do the maths, that works out at a little over 16 million colours.

To achieve the Rec. 2020 standard, at least 10-bit per channel colour is required, which equates to just over a billion colours. Rec. 2020 also allows for 12-bit colour, which works out at a staggering 68 billion colours. However you slice it, that’s way more colours than whatever monitor you are using right now can handle.
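
The arithmetic behind those figures is simple enough – two raised to the power of the total number of bits across the three channels:

    # Total colours as a function of bits per channel (three channels: R, G, B).
    for bits_per_channel in (8, 10, 12):
        colours = 2 ** (bits_per_channel * 3)
        print(f"{bits_per_channel}-bit per channel: {colours:,} colours")
    # 8-bit:  16,777,216 (the familiar '16.7 million')
    # 10-bit: 1,073,741,824
    # 12-bit: 68,719,476,736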

UHD displays with that near-Pointer’s Rec. 2020 capability are the latest big thing in HDTVs. And that same technology is coming to PC monitors. Unfortunately, it will likely come in many confusing forms. Already, there are inconsistencies in the monitor market with terminology like ‘4K’ and ‘UHD’. Exactly how PC monitor makers are going to deal with marketing Rec. 2020 support as opposed, perhaps, to full HDR support, which itself is open to interpretation, I have no earthly idea. Will they all be sold as ‘HDR’ panels? Or ‘wide gamut’ panels? It’s all very early days.

Things get further complicated with the technologies needed to achieve UHD colour depths in practice. HDMI 2.0, for instance, can’t do the full 12-bit per channel at 60fps and 4K resolution. I think DisplayPort 1.4 can. But I’m not totally sure, because it’s all so confusing (there’s a rough bandwidth sketch after the checklist below). Indeed, the list of things your display probably needs in order to be considered to have basic HDR capability is almost overwhelming:

– HDMI 2.0a or DisplayPort 1.4
– 10-bit per channel colour
– Ability to display very nearly all of the Rec. 2020 or DCI P3 colour spaces
– At least 1000cd/m2 brightness combined with black levels below 0.05cd/m2
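
As promised, here’s that rough bandwidth arithmetic. The usable data rates plugged in below – about 14.4Gbit/s for HDMI 2.0 and 25.9Gbit/s for DisplayPort 1.4 without compression – are my own ballpark assumptions, and real links also spend bandwidth on blanking intervals, so treat this as indicative rather than gospel:

    # Rough uncompressed video bandwidth: pixels x frames x bits per pixel.
    # Ignores blanking intervals, so real-world requirements are a bit higher.
    def gbit_per_sec(width, height, fps, bits_per_channel):
        return width * height * fps * bits_per_channel * 3 / 1e9

    HDMI_2_0 = 14.4  # approx usable Gbit/s after encoding overhead (assumed figure)
    DP_1_4 = 25.9    # approx usable Gbit/s without Display Stream Compression (assumed figure)

    for fps, bpc in ((60, 10), (60, 12), (144, 10)):
        need = gbit_per_sec(3840, 2160, fps, bpc)
        print(f"4K {fps}Hz {bpc}-bit: ~{need:.1f} Gbit/s "
              f"(HDMI 2.0: {'fits' if need <= HDMI_2_0 else 'too much'}, "
              f"DP 1.4: {'fits' if need <= DP_1_4 else 'too much'})")
    # Full-fat 4K60 at 10- or 12-bit already overflows HDMI 2.0 (hence the chroma
    # subsampling tricks HDR TVs lean on), and 4K 144Hz 10-bit overflows DisplayPort 1.4 too.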

Currently, the closest things I am aware of to a catch-all that tells you a display has at least some HDR capability are the ‘HDR10’ and ‘Dolby Vision’ standards, both of which cover colour depth, brightness and more besides. But I’m not necessarily expecting them to be adopted by monitor manufacturers. Of course, with the latest HDMI standards supported on some PC video cards, there’s always the option of getting in early by using an HDTV as a monitor.

What’s for sure is that these massive new colour depths will be competing with high refresh rates for available bandwidth to your display – more colours means sending more data down the wire. But higher refresh rates mean more data, too. And there’s only so much bandwidth. In other words, a display that does it all – 120Hz-plus, adaptive-sync, HDR, the lot – isn’t coming any time soon. And when it does, your existing video card almost certainly won’t be able to cope!

Remember the good old days when G-Sync versus FreeSync was all you needed to know?

Then there’s the content side of things. There’s very little HDR or wide-gamut video content out there. It was only in 2014 that the Blu-ray standard was even updated to support 10-bit per channel colour. The exception, however, is games. Potentially, at least. HDR gaming is not a new notion and in many ways, games have been ready and waiting for HDR displays to catch up.

In fact, many games have been internally rendering in HDR form and then compressing that down to SDR formats for output for years. More recently, HDR capability has been a big part of the sales pitch for the latest gaming console refresh from Sony and Microsoft, although I’m not sure how good the actual implementation is at this early stage.
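
To make that ‘rendering in HDR, outputting in SDR’ point concrete: game engines typically compute lighting in floating point, where a sunlit highlight might be tens or hundreds of times brighter than paper white, then squash the result into the 0-255 range with a tonemapping operator before it hits your monitor. The classic Reinhard curve is the textbook example – a toy sketch, not a claim about how any particular engine does it:

    # Toy tonemapping: squashing unbounded HDR luminance into an 8-bit SDR value.
    def reinhard_tonemap(luminance):
        mapped = luminance / (1.0 + luminance)  # Reinhard operator: compresses bright highlights
        encoded = mapped ** (1 / 2.2)           # rough gamma encoding for display
        return round(encoded * 255)

    # Shadows, mid-tones and a blown-out highlight all get crammed into 0-255;
    # an HDR output path would hand the display that wider range far less molested.
    for hdr_luminance in (0.05, 0.5, 1.0, 4.0, 50.0):
        print(hdr_luminance, "->", reinhard_tonemap(hdr_luminance))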

As for PC games, my understanding is that there’s nothing that will actually output in HDR currently, but that Battlefield 1 will support it at launch. Feel free to correct me! However, if HDR monitors become a big thing, I expect HDR patches for existing games could well become the norm. Put it this way: of any platform, the PC is arguably the best placed to make full use of HDR as soon as compliant displays appear.

Of course, you’ll also need a compliant video card to achieve HDR on the PC. For Nvidia GPUs, that means Maxwell or Pascal families (GTX 960, GTX 980, GTX 1070, GTX 1080, etc). For AMD, its Radeon R9 300 Series cards can do HDR at 60Hz up to 2,560 by 1,600 pixels. For full 4K 60Hz HDR output, only the latest ‘Polaris’ boards such as the RX 480 can pull it off.

It’s also worth mentioning that OLEDs will do the HDR thing a bit differently and will only add to the complexity when it comes to buying and configuring a system for HDR games or movies. All of which means it’s about as confusing a technology as I can recall and it will take several years for all the competing standards to shake out.

TL;DR
HDR is coming to PC displays and PC gaming and it will mean brighter, more vibrant image quality. But you’ll probably need both a new monitor and a new graphics card, and competing standards will make choosing hardware very complicated.


107 Comments

  1. Jediben says:

    My Hp 2475W was doing 10 bit 8 years ago. All that tech is new in telly but the gamut tech has been around for yonks.

    • Jediben says:

      Only the bandwidth to push the data at higher resolution has been the problem.

      • Jediben says:

        And it’s all academic anyway because Vive v2 will arrive and render monitors (and Vive v1) obsolete.

        • BrickedKeyboard says:

          Eventually. Apparently the Vive v2 is mainly an ergonomic improvement and won’t increase resolution. But I agree that ultimately computers monitors are obsolete because HMDs can be both effectively much larger, in 3d, have better color, and of course be far more immersive. But for that to happen, we need at least 4 to 16 times as many total pixels, foveated and multi-resolution rendering to be standard, and some massive boosts in GPU power. And then once the hardware is there, several more years before there’s major software support. 5-10 years away…

          • P.Funk says:

            You’re never going to render monitors obsolete because HMDs are extremely inefficient at both viewing non 3D interfaces and non 3D content and more importantly being able to multi task and share the output between people or between tasks.

            Having to take a monitor off your face rather than merely cast your eyes in a different direction is a big issue for HMDs. HMDs are the most anti social display in the world compared to a monitor. The question “What are you watching?” doesn’t get more intrusive than when you’re using an HMD.

        • Asurmen says:

          They’re never going to render monitors obsolete.

        • Cinek says:

          What a nonsense. I got Vive myself, am a big fan of VR, but saying that Vive v2 will render monitors obsolete is an utter BS. First of all – price to image quality monitors are impossible to beat. Secondly – they don’t enclose half of your face, what makes them far more comfortable (that’s aside from the fact that next to Cardboard Vive is least comfortable VR headset on a market), thirdly – they are cheaper. And finally, most importantly – there is no GPU on market or planned to be announced that could handle resolutions needed to approach the angular resolution of monitors. Just use one of these VR desktop apps – they are a useless shit that makes you feel like a cripple, cause you need to have text in a font size 72 to make it at least somewhat readable. Not to mention that writing or using keyboard in VR is a nightmare, and motion controllers never match the input speed keyboard offers.

          Maybe, just maybe, Vive/Oculus v. 50 will make me use headset as my main view screen, but we’re really, really, really far from that, and even then – it still won’t make monitors obsolete, they’ll just have some competition.

          • Sizeable Dirk says:

            The I/O ergonomics should be solved before 1989 with the Steam Cyberdeck (made in Nippon with arcane Dragon Magitech).

        • Don Reba says:

          Vive v3 maybe. It will have HL3 as a launch title.

    • Don Reba says:

      As Jeremy said, colour depth is only a part of it. And having done a fair bit of work with 565 colour on phones, I can definitely say that you couldn’t tell the difference between 5-bit (565) and 10-bit colour with reasonably high pixel density and dithering.

      • Don Reba says:

        I want to weaken that claim a little. My experience is with mostly static UI. I don’t know if video could have special cases where dithering would be highly visible.

    • jezcentral says:

      God, the HP2475W is a masterpiece. I commend your obvious good taste. (And it’s 1920×1200’n’all).

    • frenchy2k1 says:

      HDR is a *LOT* more than just 10 bits precision.
      it’s about color mapping. Your monitor is most probably calibrated, targeting either sRGB or AdobeRGB (or a switchable mode).
      HDR adds a new calibration target, with “wider” colors.
      The problem is not that your reddest red is now 1024 instead of 256, it is based on what your reddest red looks like.
      The biggest problem with HDR is that today, even with sRGB, most monitors are rendering colors poorly (cannot display or just wrong calibration). HDR may even make things worse.
      More complicated even is color mapping software and management: we need to know what colors were targeted by the content (your movie or image) and what capabilities your display has. This is a current problem for Cell phones, particularly with OLED screens, as those have wider gamut (they can display “wider” colors), but most content is created for sRGB, so if your calibration is wrong, all colors will be “off”, meaning different than the captured image.
      Here is an older article explaining the concept, with color graphs showing coverage:
      link to anandtech.com
      and another about the problems of color management for wide color display:
      link to anandtech.com

      That HDTV will standardize to a single wide standard will be a big help going forward, but the transition will be difficult.

  2. UniuM says:

    All this 10 and 12bit color depth and i’m here sitting with my 144hz, 16.7 million color (6-bits per subpixel plus dithering) looking at banding and thinking;
    Why do i need more colors? then i look to my IPS second monitor and i remember… oh so pretty!
    Console peasants don’t buy into PC’s cause it’s too complicated, wait until you tell them about this!

    • Sakkura says:

      This same thing is going on with TVs and consoles.

    • particlese says:

      Don’t forget these console peasants are buying standardized boxes by the millions, which gives the display industry ~two essentially static pieces of rendering hardware to attempt to form standards around, plus a consumer pool which will actually make the effort worthwhile in their eyes. The TV and cellphone peasants help quite a lot with the money part, too.

      I’m happy to sit back as a PC gaming peasant, watch them all weather the storm for a bit, and save my money for something decently well developed.

      • nottorp says:

        As a both console and pc peasant, I’d like to state that it’s next to impossible to figure out what TV would work with the HDR in my ps4 right now. As for my PC, I wouldn’t even consider researching it in less than 3 years.

    • Karyogon says:

      You don’t need it. But it’ll be capable of looking nice once it does get here. Keeping in mind that afaik “regular” (sRGB) games and content apparently doesn’t quite look good displayed on 10-bit+HDR screens, at least when running in 10-bit+HDR mode.

      It’s not something I’ll worry about for the next decade or so, and it’ll probably be longer than that till both content and hardware has taken to it.

  3. Andy_Panthro says:

    I’m sure it’s nice… but it’s not enough for me to spend the extra cash.

    It sounds like TV/tech companies desperately trying to come up with a reason to buy new stuff. The move to 1080p was good, but everything since has been too little improvement for too much money.

    • Sizeable Dirk says:

      Agreed. I like to keep casually up to date on hardware but panels, not so much.
      To me it’s like when stumbling across an audiophile forum and people are going to discuss the pros and cons of gold plated subpixels if I scroll down.

  4. engion3 says:

    I’ve put off buying a new monitor for the past year because it’s so intimidating, still running the ol’ 60hz 1080p’s.

    Probably go with 2560×1440, 144hz, freesync, I don’t even want to think about all this.

    • BrickedKeyboard says:

      One viable option is to get a 4k 55 inch HDTV and use it as a monitor. Just make sure you get a model with low input lag. I’m personally waiting for a low input lag OLED 4k model to come down to a reasonable price point.

      You end up getting more for your money with a TV because of market size – a lot more people buy TVs than monitors.

    • kekstee says:

      I’m in the same boat. If I upgrade my existing 1200p60 MVA display it should offer 1440p and decent colour (100% sRGB). 144Hz would be nice. I think the MG279Q mentioned in the article is pretty much the only candidate.

      And while I wait for someone to put this stuff into a gamer-less design, they throw native 10bit depth at me to wait for :(

      • Agnosticus says:

        I think it will take about two more years for HDR to take off. Right now, it is just an enthusiast thing like VR, too young and too expensive IMHO.

        Gameplay-wise it’s much more important to get a higher refresh rate, but if you are in for the looks, HDR could still be the thing for you.

        I’ve recently bought an Asus MG279Q and am well pleased with it. I think it’ll serve me the next 5 years, and by then HDR 4k monitors will be plentiful, nice and cheap ;)

  5. valourfrog says:

    Thanks for the well written article.

    I just wanted to nitpick a little and point out that the size of the gamut doesn’t necessarily correlate with higher bit depths, so the example given here about number of colors with different standards is kind of misleading.

    To be honest, I’ve found the HDR television sets I’ve seen a bit underwhelming. During normal lighting conditions the difference is pretty hard to see and only visible in small spots on the screen because the backlight can’t blast the whole display with full brightness.

    • BrickedKeyboard says:

      Sure it can. The problem is that with LCD TVs, if you make the backlight brighter everywhere, the black levels go bad (black pixels become dark grey because the bright light is leaking through)

      This is why HDR is really meant for OLED displays, which don’t have this inherent problem. They can in theory have a searing bright patch on the screen right next to a pitch black region. In practice there are a lot of problems and they are currently very expensive to purchase but maybe in 3 years things will be different (and this standards war will have shaken itself out)

    • caerphoto says:

      the size of the gamut doesn’t necessarily correlate with higher bit depths, so the example given here about number of colors with different standards is kind of misleading

      You’re half right. Yes, bit depth does not necessarily correlate with gamut, but Rec.2020 isn’t a gamut, it’s a set of standards which includes a gamut, and also specifications about bit depth.

      The article is a bit misleading about this earlier on when it calls Rec.2020 a gamut.

      • Sakkura says:

        I don’t envy the task of getting all this across in a still human-readable article, though. It’s a mess.

        • caerphoto says:

          Certainly is. It’s hard enough getting colour management stuff straight by itself, never mind connectors, framerate, resolution, brightness range, and a whole pile of marketing bullshit.

          • cakeisalie says:

            Will be even worse when the marketing people start spewing out deliberately misleading nonsense about how product x is better than product y because it has z. This looks like it’s going to be a total minefield. One that I shall ride out until the standards war that will inevitably ensue declares a victor.

  6. petes117 says:

    NBA 2K17, Forza Horizon 3 and Gears of War 4 have HDR support first, since you asked

  7. dee says:

    i wish they’d named this something else, like hardware HDR or screenHDR or something. the first i heard of it, i thought “haven’t we had that in games since, like, 2005?”

    • GenialityOfEvil says:

      It’s also easily confused with dynamic contrast range, which is strictly a brightness technology but has existed as long as LCD displays have.

      • brucethemoose says:

        The buzzwords are colliding.

        It’s like how the same manufacturers shot themselves in the foot with “LED” branding. Most people don’t know the difference between that and OLED.

        Now they’re stuck with 2 buzzwords that sound like tech they’ve been advertising for ages.

        • Sizeable Dirk says:

          Extra fun (not really) with OLED in my language where if you put “o” in front of a word it’s a negation prefix, like “un-” in English.

          “So this screen is LED-free then?”

    • RadicalHorse says:

      But it’s all related, old “gameHDR” means that game internally use many more colours and then scale that number down for our monitors and this new “screenHDR” means screens can display more colours and games don’t need to reduce them as much.

    • Frank says:

      Yeah, as a gamer and not a technophile, I was surprised the post didn’t bother explaining what it was on about to those of us who haven’t heard of HDR since HL2 Lost Coast.

    • DanMan says:

      True. That’s why I like to call it “HDR output” and “HDR rendering”, which makes it clear IMO.

  8. Plank says:

    Just commenting to say that that GirlsX Battle ad at the top of the main page is a little troublesome.

  9. fahrenheit says:

    Just a tiny correction, HDR (at least one of the formats) is not tied to 4K displays and/or hdmi2.0(a).

    It’s possible to have HDR and 1080p over hdmi 1.4, as the recent PS4 update shows (it’s mostly a bandwidth thing).

    The issue is that there aren’t many 1080p, if any, displays that can actually do HDR.
    This not considering the vast amounts of HDR content available…

  10. Mont says:

    Well I’m on the 40′ 4k 10 bit color Iiyama monitor, so I should be good for the time coming right?
    link to iiyama.com

    • TheManko says:

      Even though it’s 10 bit, it might not be more than sRGB. There’s loads of 32″ 3840×2160 monitors with 10 bit panels which only cover 99% of sRGB, and only use the extra bit depth for nicer gradients and image processing. And even if it was wider color gamut, it might not be the correct gamut, if it doesn’t conform to the Rec 2020 standards. It’s all very confusing even to those who have read up about this stuff, so I don’t blame you if you’re stumped.

      Long story short, your monitor will be just fine, but it won’t be able to use these new bells and whistles. You need either a UHD TV for that, or future computer monitors that don’t exist yet, but might next year.

      • Mont says:

        link to hardforum.com

        Here they say that it is indeed able to do it.

        “Today the technician confirmed that the panel of the X4071UHSU-B1 indeed covers the Dolby Vision gamut (Rec.2020).
        But because the backend of the panel is not able to reach the required brightness of the Dolby Vision spec, they can’t claim Dolby Vision.
        Attainable Dolby Vision brightness today is 800 – 4,000cd/m2. With plans for up to 10,000cd/m2 in the future.”

        • Unclepauly says:

          Not only that, it won’t hit the required black level either.

          • Mont says:

            So I’ve been reading a bit and it looks like my monitor supports all of the gamut but falls a little bit short on the nits (brightness) needed for full HDR, though it looks like most of the tvs that have been sold last year only covered about 70% of the gamut required.

            What I was wondering is: even though it won’t be “true” HDR will I be able to notice improvements on my monitor when viewing HDR content?

    • caerphoto says:

      Spinal Tap would be impressed with your monitor.

    • DanMan says:

      No, because the display has to understand that it’s receiving a HDR signal and act accordingly (crank up the brightness, for example).

    • brucethemoose says:

      Either way, that is still a very nice monitor

      A 10 bit, low response MVA panel like that is just about as good as it gets until you start looking at OLEDs.

  11. a very affectionate parrot says:

    I guess it’s about time monitors had a big shakeup but I’ll be ignoring this until one standard wins over everything else. By that time hopefully 4k 144hz will also be within the price range of mere mortals.

  12. Xerophyte says:

    I’d be a bit careful about calling the Pointer’s Gamut “every colour that can be reflected off a real-world surface and seen by the human eye”, since that’s only true 1: in sunlight and 2: on diffuse reflectors. Take a piece of glass or an highly exposed chunk of lava and all bets are off, human eyes or otherwise.

    This is not to say that having a display that can cover a larger portion of the solid of reflectances * the solid of emitters is a bad thing, of course.

    Also, note that there are very significant limits to what “HDR” displays can do — as person who writes rendering software for a living I really don’t like the term as applied to anything but “literally all the colors” which no monitor is going to be capable of. Many real things, such as the sun and white-like things lit by the sun, are really, really bright and even if you could build a monitor that could smash that many lumen into your eyeball you really wouldn’t want it to. Unless you’re building some sort of monitor/furnace/nuclear device 3-for-1, in which case carry on I guess. The rest of us will need to tonemap our nice computer real-world radiances down to some LDR space, albeit hopefully not some anemic 8-bit one.

    • caerphoto says:

      Likewise with photography – it’s easy to create an image that represents an enormous source brightness range, but it needs tone mapping down into something a monitor or printer can show.

  13. Uninteresting Curse File Implement says:

    TBH it took a while for me to be sold on High Colour vs. 256. Looking at some gradients convinced me that it was all worth the performance cost. And the only reason I’m using 32 bit True Colour now is, why the hell not? There is absolutely no way in hell I would give a thought to needing even more. It seems ridiculous.

    • caerphoto says:

      I guess that’s the problem with this sort of technology – unless you can see it first hand, it’s hard to really appreciate the benefits as it’s all a bit academic. Same with large gamut displays like on the iPad Pro (DCI-P3) or the first high-DPI screens.

      • Uninteresting Curse File Implement says:

        The question is, is it going to be noticeable to a regular, non-artist’s eye, even if viewed side-by-side? Then you could at least demo the stuff in stores.

        • DanMan says:

          It totally does. The difference is remarkable, but the TVs that do it best cost you and arm and a leg.

        • KillahMate says:

          Yes. A high-end OLED running HDR-mastered video is totally obviously doing something impressive that no regular screen can’t do. But those are still incredibly expensive, and the HDR screens that aren’t incredibly expensive are much less impressive :-/

          • Unclepauly says:

            “no/can’t do” The pedant in me won’t allow me to leave this alone. :(

    • DanMan says:

      Unfortunately, the article is wrong in equating HDR with “more colors”, because HDR is first and foremost about increasing the distance between the brightest and darkest spot.
      The thing the movie industry is pushing as “UHD” is about both, yes, but you could totally have HDR on a 8bit display, for example. You’d just get a lot more banding in gradients compared to a higher bit depth.

      It quickly gets a bit complicated from here on out, so I’ll leave it at that.

  14. wykydtronik says:

    I’m holding out on 4k and this HDR tech until GPUs are quantum based.

    • khamul says:

      Well, if it helps, the monitors will be!

      The core tech enabling the colour/brightness revolution is Quantum Dots: nanocrystals that fluoresce at precise colours, depending on their size. Which means you can put blue light from an LED into them, and get the precise red and green you want without filtering away most of the emitted energy, as you do with a phosphor: so more colour accuracy, brighter images!

      Most of the ‘display revolutions’ since the move to HD have been, frankly, a bit pants. This one isn’t: I’ve seen HDR at trade shows, and it is genuinely impressive, a really significant and noticable step change (the frustration is that you can’t show it! You can only show what an HDR display looks like on an HDR display!). And it will work really well in games.

      The good news is that it shouldn’t have any meaningful bandwidth impact on top of the move to 10-bit colour. It’s all about the gamma curve in the video, and how the display *interprets* those 10 bits. I would expect the processing overheads on the PC side to be low as well – the main challenge will be in the processing on the monitor, and heat! Expect monitors to get a lot warmer and more power-hungry.

      All in all, though, that was a bloody good article on a very confusing topic. Well done, Jeremy.

  15. Rutok says:

    The problem is not that “it will take a few years for all the competing standards to shake out.” The problem is that way before that dispute is settled, manufacturers will come up with a new thing that they kind of invented and that everyone must have now.

    This has been the main theme in the last few years. 3d vision and all its competitors, gsync, freesync, different lighting and panels. As soon as one company’s standard gains the lead, the others are changing the field.

    • Agnosticus says:

      You forget that the HDR thing is already built in the new console gen, which means that it will most likely stain the PC market via hardware and software for the next 4-5 years.

  16. Butler says:

    In one of the rare examples of me actually listening at University, our lecturer, a respected professor that specialised in display technology, used to mock all the ’16 million colour’ marketing nonsense: “how many colours do you think the average human can discern?” (i.e. not many).

    Has anything really changed? Does HDR really change much with regards colour and the perception thereof?

    • Don Reba says:

      The colours displayed are not evenly distributed according to our perception, so it still makes a lot of sense to have greater colour depth to make up for the regions in which the coverage is not so good.

      It doesn’t necessarily mean you have to buy 10-bit displays, though, since dithering works very well at higher pixel densities.

    • khamul says:

      *Yes*

      HDR makes a massive difference – in my personal experience, and opinion.

      There are three clips that you use to show off HDR. The first is jewelry: with the compressed dynamic range of normal TV, gold doesn’t glister – with HDR, it does.

      The second is something like fireworks against a dark sky: with HDR, it’s possible for point light sources to really flash without the background becoming washed out. Another good example is glowing jellyfish in the dark ocean in Life Of Pi: with HDR, they really stand out.

      The third is truly dark scenes. These just don’t work with a normal display – it’s either all grey, or all too dark to see. Whereas in the real world, you can be in a really dark environment, and still pick out objects quite clearly. HDR brings back quite a bit of that dark clarity – I have seen that it does, though I’m not entirely clear on how.

      For all of these things to work, the content needs to be mastered to make use of them, you need to care about the detail, and you need to be in a dark environment to appreciate them properly. All of that is tricky for TV: but for games, they’re all almost a given.

      I’ve seen HDR, I’m impressed by it – and I am generally a cynic on these things – and I think it’s going to make a massive difference to gaming, once the tech has stabilised enough that people can invest.

      It’s certainly something I will go for on my PC, just as soon as I can afford it, and think I’m getting something with a reasonable lifetime: and I am *not* generally an early adopter.

      • RainMan says:

        Would you spring for an HDR LCD though? It just seems like a half way point, where OLED can achieve the full benefits of HDR (e.g. like you described in a dark scene, the dark bits can be dark and other bits can still be light)

        • caerphoto says:

          Showing dark and light together is only one aspect of the display requirements though – there’s still the issue of colour gamut and maximum absolute brightness.

          • RainMan says:

            Yeah, true. And I don’t know much about OLEDs but I seem to remember hearing that they don’t get as bright as the brightest LCDs? Currently, at least.

          • Unclepauly says:

            The problem with the brightest LCD’s though is that when they are that bright, the dark parts of the image suffer and turn grey. Even the best LCD’s with the best backlighting can’t do it as good as an OLED. In fact, OLED’s already look like they have HDR enabled next to an LCD, even when they don’t.

        • khamul says:

          Hmm. In 2016, 2017, yeah, probably. It’s a complex tradeoff, and it will be a little while before OLED wins it outright.

          With LED, you are *at best* – i.e. pure white light – turning 2/3rds of your energy into heat. Because you have 1 LED with three filters on top of it, and each filter gets rid of the red/green it doesn’t want. *BUT* blue LEDs are just about as good as it gets at turning electricity into photons, and quantum dots mean all the rest of the process becomes very efficient.

          With OLED, you’re producing only the light you want, *BUT* OLED produces less photons per watt of input power – and even worse, the ‘O’ in OLED stands for ‘Organic’ – as in long carbon-carbon chains. Which have chemical bonds at energies around that of a blue-wavelength photon. The light that blue OLEDs produce actually slowly destroys the molecules creating them.

          Right now, OLED seems to be able to produce around 300-ish cd/m2 (nits) whereas the latest Samsung SUHD TVs (LED, quantum dot based) are claiming something like 1000 cd/m2

          OLED will always have the advantage in pure blacks, however. It’s not an absolute thing – an HDR LED TV should do direct backlighting, and only light the LEDs behind the areas that should be bright – so you’ll get some bleed, but not the whole screen. But OLED will win.

          So if you know you’re going to always be gaming in a dark room, maybe OLED is the way to go?
          But… this is where timing comes in. The big TV companies have spent a lot of money and a long time getting the process of building glass and LEDs very very efficient. Quantum dot is a minor change to this process. OLED is a complete re-architecture, start to finish. LEDs are going to be cheaper than OLED, like-for-like, for at least a couple of years.

          • RainMan says:

            Ah, interesting. On a general level I suppose OLEDs come with the standard issue early adopter tax. But assuming their weak points can be sorted out, they seem like the future.

            But that doesn’t help for those looking to buy a TV today. I bought one recently and just went for something decisively budget, and it’s been reasonably good. It just seems like there’s no all-round good option at the high end.

    • caerphoto says:

      “how many colours do you think the average human can discern?” (i.e. not many).

      Maybe not many, but we can discern plenty of colours that can’t be displayed accurately on regular monitors: think high-vis jackets, emergency vehicle lights (or really any bright and saturated lights). Heck, there are colours you can print which don’t show up well on screens (some shades of orange and blue).

      You don’t need more bit-depth to display a wider gamut, but because the absolute range is larger, it helps to have more bits so you can display the in-between colours more accurately, even if you can’t actually discern the entire 68 billion or whatever colours.

      Also yes, in many cases 16 million colours is marketing nonsense, and not even true. Just because you can send 16M different RGB triplets to the device, doesn’t mean it can actually display them (see: RGB mice, keyboards etc.)

      • DanMan says:

        It’s very easy to create a gradient that will show color banding even with 16M colors available. Just take two colors which are somewhat close to each other and stretch it out.

  17. RainMan says:

    Here I was thinking that I would wait until I can buy a 1440p (possibly 2160p), 144 Hz, IPS monitor with variable refresh rate, and now I’m thinking that I need HDR as well. I’m never going to upgrade from 1080p 60Hz IPS at this rate…

  18. Sandor Clegane says:

    I’ve dumped cumbersome old screens for projectors but anything that makes people wake up to the fact they don’t need a TV (Tell-A-Lie-Vision) anymore is truly a great thing! As well as revealing the bloated salaries of overpaid employees, the BBC is in dire need of ending its “License Fee” scheme and declaring bankruptcy. Never mind the fact the BBC needs to face criminal charges for promoting pedophilia, state sponsored propaganda, terrorism, cocaine addicted employees, and Chris Evans.

  19. Gordon Shock says:

    I’ll gladly post it again, WATCH IT, IT’S WORTH YOUR TIME!!!

    link to ted.com

  20. DanMan says:

    Obduction claims to support HDR on their blog, so that’d have to be the first PC game with HDR support.

    I don’t care about BF1 supporting HDR, because I “only” have a HDR TV, which I won’t be playing shooters on.

    A lot of sciolism in that article. Just sayin’

  21. Pastuch says:

    It blows my mind that the LG B Series oled TV that is 55 inches, 4k, has HDR color (Both Dolby and the internet one), supports chroma 4:4:4 at 60hz with HDR and has an input lag of 33ms costs only $2500 while the Dell OLED computer monitor that is only 30 inches and doesn’t support HDR costs $5000! It’s total nonsense. The PC monitor industry is a joke of overpriced crap. They charged $900 for a TN Rog swift! Gsync or not, that’s ridiculous!

    • Don Reba says:

      It blows my mind that the LG B Series oled TV that is 55 inches, 4k, has HDR color (Both Dolby and the internet one), supports chroma 4:4:4 at 60hz with HDR and has an input lag of 33ms costs only $2500

      2 whole frames of latency is quite a lot… And this minotor has a list price of $4000 on amazon.

      the Dell OLED computer monitor that is only 30 inches and doesn’t support HDR costs $5000!

      As far as I can tell, this is not even a thing that exists. Do you have a link?

      • Don Reba says:

        Uh, “minotor” is “TV” with a typo. :)

      • Llewyn says:

        As far as I can tell, this is not even a thing that exists. Do you have a link?

        It exists in the sense that Dell have shown it off and announced it for sale twice, including the price quoted above, but doesn’t in the sense that Dell have never actually been able to put it on sale. It sounds like they’re having issues manufacturing it reliably.

        The model is UP3017Q.

  22. Raoul Duke says:

    This article skips over the one thing that actually matters here, which is, does this subjectively make any real difference to a normal human viewing one of these screens?

    My current sense is that the improved brightness/blacks will make a big difference, but the colour range will not. The former has been available since the plasma TV era (damn you, idiot consumers) and has long been dangled in front of us via OLED screens, but does not seem to be “HDR” specific.

    Also, it’s frigging annoying of them to pick that particular name for this given that HDR photography is a very well known and widely discussed thing on the internet. There is going to be a great deal of annoying confusion.

    • caerphoto says:

      There could be some interesting overlap, in that an HDR display could be an output target for HDR photographs, which would require different processing etc. than an image targeted for print, in that you’d not need to compress the brightness range as much. An image that looks normal on an HDR display would look flat and dull on a regular sRGB screen.
