TV screen technology explained
What is HDR TV?
By Martin Pratt
Article 3 of 9
HDR is heralded as the latest ‘must have’ TV feature – but with five different formats, from HDR10+ to Dolby Vision, supported by different manufacturers, what exactly is HDR TV, and do you really need it?
HDR, or High Dynamic Range, is a TV standard that allows screens to give you improved contrast, more accurate colours and more vivid pictures than regular sets.
Almost all 4K TVs also support one of the HDR formats, and you won't pay extra for it. 4K HDR sets start from just £400 – but is this technology as important as manufacturers and retailers tell you?
HDR content is largely limited to ultra-HD Blu-rays and games consoles, although the range of shows and movies on streaming services is now quite broad.
The best TVs will plunge to deeper blacks and stretch to brighter whites when showing HDR content, giving you even better picture quality. But 4K HDR picture quality isn’t guaranteed to be better than 4K alone – we’ve seen a few instances of washed-out highlights lacking detail during brighter scenes.
- What's so special about HDR?
- HDR formats and how they differ
- The formats each brand supports
- Where can I find HDR content?
- Should I buy an HDR TV?
Browse all our TV reviews to find the very best HDR sets.
If you’re a keen photographer, you may have heard of HDR before, but it works slightly differently with video. HDR essentially creates a greater dynamic range between the darkest blacks and brightest whites, with more subtle differences in tones in between.
Although 4K TV is great on its own, a 4K HDR picture will seem even brighter and more detailed, particularly with darker scenes in films and TV shows.
HDR doesn’t just improve the TV's brightness. It can also enhance the colours you’ll see, making them appear to pop with more vibrancy and detail, although that does depend on the quality of the TV, too.
There are five different HDR formats supported by different manufacturers. These formats are: HDR10, HDR10+, HLG, Dolby Vision and Technicolor. All five are fundamentally doing the same thing: improving contrast and colour depth, particularly when it comes to very dark or bright scenes. The difference between the formats comes down to how they use metadata, which media players and studios support them, as well as other factors such as how easy they are to broadcast.
What is metadata?
One of the major differences between types of HDR boils down to whether they use static or dynamic metadata. HDR10 uses static metadata, HDR10+ and Dolby Vision use dynamic metadata, Technicolor can use both, and HLG doesn't use either.
Metadata is the information required to make a standard video file into an HDR video file. Dynamic metadata can adjust the HDR on a scene-by-scene basis, based on the brightness of your TV and what's being displayed. Static metadata can't, which means there's more chance of detail being lost when scenes get particularly bright or dark.
Think of it as going outside on a sunny day. You leave the house and the blaring sun has you reaching for your sunglasses to shield your eyes. When you go back inside, you take them off again. This is what dynamic metadata does. Static metadata is stuck with its sunglasses either on, or off: it can't adjust.
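The sunglasses analogy can be made concrete with a toy Python sketch. This is not any real HDR pipeline – the function, the nit values and the display peak are all made up for illustration – but it shows why a per-scene brightness ceiling (dynamic metadata) preserves more detail than a single ceiling for the whole film (static metadata):

```python
# Toy illustration only: how a TV might scale mastered brightness
# into its own range. All numbers are hypothetical.

TV_PEAK_NITS = 1000  # assumed peak brightness of the display


def tone_map(pixel_nits, scene_peak_nits):
    """Scale a pixel's mastered brightness into the TV's range,
    using the supplied peak as the ceiling."""
    return min(pixel_nits, scene_peak_nits) / scene_peak_nits * TV_PEAK_NITS


# Static metadata: one peak value covers the entire film, so a dim
# scene is mapped against the brightest scene's 4,000-nit ceiling
# and gets squashed into the bottom of the TV's range.
static_result = tone_map(200, 4000)   # 200-nit detail becomes 50 nits

# Dynamic metadata: each scene carries its own peak, so the same
# dim scene keeps far more of the display's range.
dynamic_result = tone_map(200, 400)   # 200-nit detail becomes 500 nits

print(static_result, dynamic_result)  # prints: 50.0 500.0
```

The same 200-nit shadow detail ends up ten times brighter under the per-scene mapping, which is the kind of difference dynamic metadata is designed to deliver in dark or very bright scenes.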
The current HDR standard is HDR10. Every HDR-capable TV is compatible with it, as are most of the places HDR content comes from. This includes streaming services, such as Netflix and Amazon Video, and film studios, including Sony, Universal and Warner Bros, which put out HDR10 ultra-HD Blu-rays.
The PS4, Xbox One S and Xbox One X games consoles are also compatible with HDR10.
Unlike with HDR10, TV manufacturers need to pay a fee to use Dolby Vision, and it uses dynamic rather than static metadata. Despite the cost to use Dolby Vision, most leading TV manufacturers are making compatible TVs. Dolby Vision solves several issues that were preventing HDR content being easily broadcast. It works with older versions of HDMI than HDR10, and it can be transmitted at the same time as standard dynamic range (SDR) content, which is what you're seeing if you watch a TV channel at home. This makes Dolby Vision HDR easier to broadcast.
Dolby Vision HDR also has a higher brightness ceiling. Screen brightness is measured in nits; some TVs can achieve up to 2,000 nits, but HDR10 tops out at 1,000. Dolby Vision moves that ceiling to 4,000 nits, so it takes better advantage of brighter screens.
Some of the 2018 TVs from LG, Panasonic and Sony are compatible with Dolby Vision, with Samsung being a notable holdout, but there's a good reason for that.
Samsung, along with 20th Century Fox, Amazon Video and Panasonic, is putting its weight behind HDR10+, an updated version of HDR10 that uses dynamic metadata rather than static.
In doing so, HDR10+ removes the main drawback of HDR10. It looks as though Samsung sees HDR10+ as a viable alternative to Dolby Vision, and there's no fee for manufacturers and content producers to use it.
With relatively few TV brands and studios signed up for HDR10+, it could be some time before it catches up with Dolby Vision, but the lack of a licence fee may prove enough to tempt companies to the new format.
As if TV technology needed more acronyms, here comes HLG, or hybrid log gamma. It was developed by the BBC and Japanese broadcaster NHK in an attempt to solve the problem of broadcasting HDR content.
In Japan, HLG is already in use, but the BBC is yet to launch an HDR channel or even broadcast a show in HDR. The BBC is still at trial stage with its HLG broadcasts, and it's using its streaming platform to try it out. Late last year it made Blue Planet II available in 4K and HDR on iPlayer for a limited time. More recently it made World Cup and Wimbledon matches available as 4K HDR streams.
Despite the fact HLG content is thin on the ground, TV manufacturers are still prepared for its arrival: most of the 2018 TVs from Samsung, Panasonic, LG and Sony support the technology. It also remains to be seen whether UK broadcasters other than the BBC will choose HLG or Dolby Vision for their HDR broadcasts.
Developed in conjunction with Philips, Technicolor HDR is unique in that it can convert between SDR and HDR, upscaling SDR content to HDR and allowing HDR video to be displayed on non-HDR TVs. How much of the benefit survives on a screen without HDR-capable hardware isn't clear, though, so watching HDR on an SDR TV may prove pointless.
Technicolor could solve HDR's compatibility issue, since it can convert one HDR signal to another that's supported by your TV. For example, if a Technicolor HDR broadcast is transmitted to a TV that only supports Dolby Vision, then Technicolor could convert the signal and the TV could display it.
Just because you have an HDR TV, it doesn’t mean that everything you watch will be in HDR. The content must be mastered in HDR in order for you to make use of your TV’s added capabilities. And as with regular 4K viewing, HDR-quality content is only just beginning to trickle out. YouTube, Amazon and Netflix are starting to offer HDR on their video-streaming services, but you’ll need decent broadband (Netflix recommends 25 megabits per second) to stream 4K HDR content over the internet.
The movie studios are distributing new films in HDR quality, as well as re-mastering older titles, and 4K Blu-ray players from the likes of Samsung and Panasonic can play these 4K HDR discs.
Broadcasters such as the BBC have conducted experiments with HDR TV on iPlayer. But with TV infrastructure struggling to cope with even standard 4K broadcasting, we may have to wait a little longer before HDR TV becomes a mainstream reality.
You may not have a choice. With many new 4K TVs also supporting HDR as standard, you'll likely find it in your next set. With prices falling, top-notch 4K HDR TVs are available from around £400, so even though there isn't a lot of content around that supports it right now, the sensible bet is to invest in this for the future.