ARTICLE: What is HDR10+?

Good article, Steve, and some welcome clarity in this dynamic space. In some respects all this innovation is great, but where it falls down is support and implementation by content creators, and then the whole plethora of devices that need to work together for a flawless consumer experience. I feel that Dolby has the advantage in the creation space, but let's see how this plays out - George
 
Regulations, standards... it's all very confusing without anything being cemented in terms of universal validation, but then again you can argue we're in a state of progressive flux and innovation. It's all transitional, but I do agree that the tech is running way ahead of the content currently. The fact that a few weeks ago I had no idea that a TV being labelled 'HDR' did not mean it could produce that output with any degree of quality blew my mind. As more of a casual viewer compared to many on this forum, I found this alone to be misleading - yet the average user would never know.

I'm grabbing the XE900 this month, and already I can list things that I won't have that might be introduced to more TVs over the next year or two (HDMI 2.1, Dolby Vision, etc.). I think we're always going to be in a perpetual state of waiting for the perfect TV when there's no such thing.
 
This is all becoming terribly complicated and fragmented now. TV manufacturers already have to support a range of different HDR formats, and now we have yet another one. It's insanity.

The industry as a whole should have defined a single standard before a single HDR TV was ever sold. HDR 10 would have done the job quite nicely. It may not be quite as flashy as Dolby Vision, but it's certainly good enough. At least HDR 10 is included as a fallback layer some of the time, but it really should be there all of the time.

This fragmentation is already cutting some people out of the loop. Those who have gone out and bought brand spanking new 4K players, such as the Xbox One X, Fire TV 4K, NVidia Shield, or Apple TV 4K can't watch 4K HDR content from the BBC at all.

Format wars are never a good idea.
 
I'm confused mostly by the actual need for this (and DV).

My understanding is that both technologies will grade each scene "on the fly"... surely the film has already been graded in post production? So why the need to alter it at a later stage?

It's already been graded at great expense by a professional.
 
This many different metadata formats only confuses me as a customer. I won't invest my money in a TV till the end of the 4K cycle.
 
I have a few Dolby Vision UHD titles. Admittedly, there is not a lot of content out there yet, but the list is growing - and according to bluray.com, Blade Runner 2049 will have Dolby Vision. I must admit that watching in DV there is a very clear improvement over regular HDR (having watched the same film in both DV and HDR).

I have not seen any examples of HDR10+ so cannot comment on it, but I'll be sticking with regular HDR and Dolby Vision for now.
 
Formats will always evolve. The crux is, you'll always need a high-end TV to get the best out of HDR and DV. I thought it would be easy to upgrade my TV - nothing to it. I've really been out of the loop for too long. These past two weeks on here have been an education.
 
I'm confused mostly by the actual need for this (and DV).

My understanding is that both technologies will grade each scene "on the fly"... surely the film has already been graded in post production? So why the need to alter it at a later stage?

It's already been graded at great expense by a professional.

The 'professional' will have graded that content to a certain standard - like 1,000 or 4,000 nits. However, not every TV can reach 1,000 nits and NO current TV can deliver 4,000 nits. Therefore, the content has to be scaled down to the TV - tone mapped.

Static metadata - in HDR10 - applies a single tone-mapping curve for the whole movie. If the movie only reaches 2,500 nits in one scene, for example, and that scene is the 'max' level in the whole movie, that scene could be displayed at the TV's maximum level and scaled down from there. In the next scene, the maximum brightness is only 1,000 nits - something the TV could display - but because of the static metadata it is still scaled down, as the curve had to fit 2,500 nits in another scene. That's not optimal, and obviously dimmer.

With dynamic metadata, the max peak brightness of that 2,500-nit scene may well look identical to the static-metadata version, but in the scene that only reaches 1,000 nits, the dynamic metadata would allow that scene to be displayed 1:1 and appear much brighter - more optimised.

There are generally two main points in HDR tone mapping: the point at which the TV stops following the curve accurately (the 'knee'), and the maximum peak brightness, i.e. the point at which clipping occurs. With static metadata, these points are fixed from start to finish. If it's set so that at 300 nits the TV starts to fall away from the curve to scale the highlights down, then throughout the movie, regardless of the scene, the TV will stop presenting brightness 1:1 above 300 nits and scale down - even if the scene could be displayed 1:1. With dynamic metadata, those points can change, varying the amount of 1:1 mapping and highlight scaling to deliver an 'optimum' viewing experience.

This is much more important on TVs that, at the moment, are not able to deliver the performance needed to match the 'professional' master. It's more beneficial on under-performing HDR TVs and not important on TVs that, in the future, can deliver 4,000 (or even 10,000) nits. Both dynamic and static metadata would map the content 1:1 if the TV could handle it, as both would follow the PQ curve.
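
For anyone wondering what 'the curve' actually is: HDR10, HDR10+ and Dolby Vision all build on the SMPTE ST 2084 (PQ) transfer function, which maps an encoded signal value to an absolute luminance in nits. Here is a minimal Python sketch of it - the constants come from the published standard, while the function names and the printed example are just for illustration:

```python
# Minimal sketch of the SMPTE ST 2084 (PQ) transfer function.
# Constants are from the standard; names and the example are illustrative only.

M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_to_nits(signal: float) -> float:
    """Decode a PQ-encoded value (0.0-1.0) to absolute luminance in nits."""
    v = signal ** (1 / M2)
    return 10000.0 * (max(v - C1, 0.0) / (C2 - C3 * v)) ** (1 / M1)

def nits_to_pq(nits: float) -> float:
    """Encode absolute luminance in nits (0-10,000) to a PQ signal value (0.0-1.0)."""
    y = min(max(nits, 0.0), 10000.0) / 10000.0
    return ((C1 + C2 * y ** M1) / (1 + C3 * y ** M1)) ** M2

print(round(nits_to_pq(1000), 3))   # ~0.754: 1,000 nits sits about 3/4 of the way up the signal range
```

The key point is that PQ encodes absolute luminance - a given code value always means the same number of nits - which is exactly why a TV that can't reach the mastered peak has to tone-map rather than simply display the signal.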

If you only have 700 nits available, static metadata could make a lot of scenes look much dimmer than necessary, but dynamic metadata would optimise every scene. It's still not as the professional mastered it to be, BUT you couldn't display it at that standard anyway due to hardware limitations.
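
To make the static-versus-dynamic difference concrete, here is a deliberately simplified, hypothetical tone-mapping sketch - not any manufacturer's actual algorithm - using an invented 300-nit knee and the 2,500 / 1,000 / 700-nit figures from the example above:

```python
# Hypothetical, simplified tone mapper to illustrate static vs dynamic metadata.
# Real TVs use proprietary roll-off curves; this only sketches the principle.

def tone_map(nits, metadata_peak, display_peak, knee=300.0):
    """Map a mastered luminance value onto a display with limited peak brightness."""
    if metadata_peak <= display_peak:
        return nits                      # display can show this 1:1, no compression needed
    if nits <= knee:
        return nits                      # track the curve accurately below the knee point
    # Squeeze everything between the knee and the signalled peak into the display's headroom.
    return knee + (nits - knee) * (display_peak - knee) / (metadata_peak - knee)

display_peak = 700.0    # a mid-range HDR TV
title_peak   = 2500.0   # brightest highlight anywhere in the movie (static metadata)
scene_peak   = 1000.0   # brightest highlight in the current scene (dynamic metadata)

static  = tone_map(800.0, title_peak, display_peak)   # scaled as if the 2,500-nit scene were here
dynamic = tone_map(800.0, scene_peak, display_peak)   # scaled only for this scene's own peak
print(round(static), round(dynamic))                  # ~391 vs ~586 nits for the same highlight
```

The same 800-nit highlight comes out at roughly 586 nits when the scene is scaled by its own 1,000-nit peak, but only about 391 nits when the whole title is squeezed to accommodate its single 2,500-nit scene - which is the 'unnecessarily dim' effect described above.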
 
Formats will always evolve. The crux is, you'll always need a high-end TV to get the best out of HDR and DV. I thought it would be easy to upgrade my TV - nothing to it. I've really been out of the loop for too long. These past two weeks on here have been an education.

The crux is that DV and HDR10+ are both designed to 'optimise' HDR content according to the HDR capabilities of the TV - the worse the TV, the more these methods benefit it. If you had a TV that could deliver the standard to which the content is mastered, you wouldn't see any difference, as both dynamic and static metadata would map the content exactly as the colourist intended and follow the curve precisely. The benefit of DV at that point would only be its 12-bit colour depth, but HDR10, HDR10+ and DV would all be displaying the content 1:1. Dynamic metadata optimises the HDR content to that TV's capability, but it will not make a 600-nit HDR TV look better than a 4,000-nit static-metadata TV. It will make HDR look more optimised on that TV compared to any other 600-nit TV - and it may even look better than some 1,000-nit TVs in some scenes too.

Of course, the best way to get the most out of HDR - regardless of static or dynamic metadata - will always be to buy a high-end HDR TV.
 
One of the TVs I was looking seriously at is the Sony XE9305, which will get DV soon. If this HDR10+ rolls along, would it be as easy as a software update, or would it need new hardware?
 
This is all becoming terribly complicated and fragmented now. TV manufacturers already have to support a range of different HDR formats, and now we have yet another one. It's insanity.

The industry as a whole should have defined a single standard before a single HDR TV was ever sold. HDR 10 would have done the job quite nicely. It may not be quite as flashy as Dolby Vision, but it's certainly good enough. At least HDR 10 is included as a fallback layer some of the time, but it really should be there all of the time.

This fragmentation is already cutting some people out of the loop. Those who have gone out and bought brand spanking new 4K players, such as the Xbox One X, Fire TV 4K, NVidia Shield, or Apple TV 4K can't watch 4K HDR content from the BBC at all.

Format wars are never a good idea.

Yeah, there's a lot going on. My brother bought a 75" Sony 4K TV in '15, and was annoyed recently to find it doesn't do HLG and can't be upgraded to do so. It transpired he didn't even know what HLG is (a bit odd that he was so annoyed, then!), and he hadn't even heard of HDR until I explained it to him about two weeks ago. He never asked me about it when he was looking to buy. He thought 4K was it - done, buy a telly. A lot of the public probably think it works like this. I've asked a few people I know who've bought TVs in the last couple of years what HDR is, and none of them have a clue, not even slightly. They know 4K means "a sharper picture", but if I then explain HDR and its potential impact on the picture in even the most basic way, their eyes glaze over. They're not remotely interested.
 
Thank Christ I'm content with my 7-year-old 1080p Samsung, base Xbox One and base PS4... until Jaws comes out on a new format, lol.
 
Yeah, there's a lot going on. My brother bought a 75" Sony 4K TV in '15, and was annoyed recently to find it doesn't do HLG and can't be upgraded to do so. It transpired he didn't even know what HLG is (a bit odd that he was so annoyed, then!), and he hadn't even heard of HDR until I explained it to him about two weeks ago. He never asked me about it when he was looking to buy. He thought 4K was it - done, buy a telly. A lot of the public probably think it works like this. I've asked a few people I know who've bought TVs in the last couple of years what HDR is, and none of them have a clue, not even slightly. They know 4K means "a sharper picture", but if I then explain HDR and its potential impact on the picture in even the most basic way, their eyes glaze over. They're not remotely interested.

It's not helped that, even if they do have a vague idea or are looking for an HDR TV because of console gaming, a lot of so-called HDR TVs only offer 'support' for playing HDR content rather than actually displaying HDR. Some of these TVs are not that different from an SDR 4K TV: they can accept the content, but only with the standard Rec. 709 colour gamut and only brighter because the TV is now at max.

All HDR TVs can display HDR10 content - whether it's downgraded to the point where it's more like SDR+ or actually delivers at least the 'minimum' UHD Premium standard. It seems that DV and HDR10+ could well be 'rival' options, with Sony and LG offering DV while Samsung and Panasonic go down the HDR10+ route. Whether we see a TV or Blu-ray player with 'both' - assuming HDR10+ gets BDA approval - who knows. Both HDR10+ and DV require the source and the display to support the format; if only one device (source or display) supports either of these formats, you will just get HDR10.
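
As a rough sketch of that fallback behaviour - the capability sets and preference order below are invented for illustration; in reality the handshake happens via the display's advertised capabilities (EDID) and HDMI metadata signalling:

```python
# Rough sketch of HDR format fallback: a dynamic-metadata format is only used when
# BOTH the source and the display support it; otherwise you drop to the HDR10 base layer.
# The preference order and device capability sets are made up for illustration.

PREFERENCE = ["Dolby Vision", "HDR10+", "HDR10"]   # dynamic formats first, base layer last

def negotiated_format(source_formats, display_formats):
    """Pick the best format supported at both ends of the chain, else fall back to SDR."""
    for fmt in PREFERENCE:
        if fmt in source_formats and fmt in display_formats:
            return fmt
    return "SDR"

player = {"HDR10", "HDR10+"}        # e.g. a hypothetical HDR10+-only player
tv     = {"HDR10", "Dolby Vision"}  # e.g. a hypothetical DV-only TV

print(negotiated_format(player, tv))   # -> "HDR10": only the base layer is common to both
```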

HLG is the format for broadcasters, as it doesn't require any post-processing to add in the metadata. That means broadcasters can still deliver 'live' HDR content - sports, for example.

The UHD Alliance was set up to try to put standards in place for both content and displays. The UHD Premium logo is supposed to indicate that a device meets a minimum standard. It seems, though, that manufacturers are cashing in on 'HDR' - making sub-UHD-Premium TVs seem as if they are just as good as UHD Premium TVs, and as if you are only paying extra for some 'design' or feature - like Samsung with its Quantum Dot TVs, LG with OLED vs their LCD TVs, or Sony by not putting their TVs up for UHD Premium certification. It's as if this were the 1080p era, where you got the minimum standard for content regardless of model and paid more for better processing or aesthetics. People see that it's HDR and expect it to be as much an HDR TV as any other.

HDR is difficult to explain and for people to understand. An increase in resolution isn't that difficult, as most remember what the jump from SD to HD did. Trying to sell a TV by talking about nits or brightness, colour gamuts, colour volumes, contrast and/or an increase in bit depth can be difficult. Most who haven't seen HDR think their TV is bright enough, has enough colours, etc., and that their pictures already look 'lifelike' and 'real'. Even an HDR TV that is literally only a bit brighter (300-350 nits) and lacks the wider colour gamut can still look like a step up from their 1080p SDR TV - especially if they also jump up in screen size. Trying to describe what they are missing is not easy. It's also not helped by the fact that many can't see HDR content on their PC or current TV either, whereas you can show what an increase in resolution brings.

Personally, I think those TVs that don't reach a certain standard should be called 'HDR compatible' or 'HDR supported' - something to suggest that the buyer isn't getting the full standard. You wouldn't buy a 1080p/1440p monitor if you wanted 4K and a true 4K monitor was within your budget - even if those monitors could accept 4K and super-sample it down - and I bet you would be very disappointed if you bought a monitor listed as 4K that only had a 1440p screen. Virtually all HDR TVs have to tone-map HDR content down (not all UHD Premium TVs need to tone-map all content, as they can deliver over 1,000 nits, but they will need to tone-map 4,000-nit-mastered content) - the equivalent of 'super-sampling' resolution, in essence: compressing something larger to fit smaller parameters.
 
Given the average viewing distance is supposed to be 8-9 ft, and also factoring in the size of the average British living room, I do wonder whether 90% of the public would still be perfectly happy with a 720p 42" plasma. But manufacturers just want to sell TVs. They don't care about providing consumers with what they NEED. Their job is to make them WANT to part with their money to pay for TVs that they don't NEED. If you know your stuff, it's actually a great time to find bargains on eBay and Gumtree. I saw a mint, boxed 42" GT60 sold for £150 on Gumtree recently. £150!!! For everyone else who insists on paying for a brand spanking new TV, except the rich, it's a nightmare.
 
I think the HDR alternatives are being over-stated as a new 'format war'. The standard is HDR10, which is open-source. Being open-source means manufacturers are allowed to develop it further, to innovate and differentiate. The current example is the different approach each manufacturer takes to tone mapping, ensuring that they can take the HDR10 standard and optimise it so it works best with their own displays.

HDR10+ seems to me the natural next step: being able to use dynamic metadata and improve the picture on a scene-by-scene, content-by-content basis. The important thing here is that this development is open-source, which means innovative work and techniques must be published back into the open-source community for everyone's benefit, just as everyone benefits from the original development of HDR10 - at least, I am assuming this all works the same as the software/development world. HDR10+ 'must' be adopted into the UHD standard; innovations like this that can be shared and adopted as a standard are the whole reason everyone signs up to these alliances in the first place, and ultimately the consumer benefits, albeit some innovations do inevitably drive new manufacturing and newer models to upgrade to.

In actual fact, Dolby Vision for me is the spoiler at the party. Much as I support a company spotting a gap in the market or a commercial opportunity and investing its own money and time in R&D to create revenue as a reward, it is a closed system that Dolby now controls, and a standard in which every manufacturer is commercially tied to a single company is a bad situation. In fact, given that Dolby Vision sits on top of the HDR10 base, which is open-source, I am surprised a lot of their work is not required to be disclosed/published rather than being closed to outsiders. Likewise, for modern-day software optimisation techniques to be tied to a hardware implementation that then requires compatible hardware in each device is such a horribly expensive and anti-consumer approach that I am quite surprised it was ratified by the UHD Alliance in the first place.

In the fuss over DV versus HDR10+, nobody seems to be blinking at the fact that there is another standard, HLG+, for broadcast, and at some point I assume we will need another HDR implementation for home projection, which has been somewhat overlooked by comparison.
 
Given the average viewing distance is supposed to be 8-9 ft, and also factoring in the size of the average British living room, I do wonder whether 90% of the public would still be perfectly happy with a 720p 42" plasma. But manufacturers just want to sell TVs. They don't care about providing consumers with what they NEED. Their job is to make them WANT to part with their money to pay for TVs that they don't NEED. If you know your stuff, it's actually a great time to find bargains on eBay and Gumtree. I saw a mint, boxed 42" GT60 sold for £150 on Gumtree recently. £150!!! For everyone else who insists on paying for a brand spanking new TV, except the rich, it's a nightmare.

Well, comparing my decent four-year-old Samsung 46" 1080p F8000 series to my new LG OLED 55C7V 4K HDR while watching Blue Planet II in UHD on iPlayer: yes, we consumers do want the new tech! Even my wife said "Wow", and she rarely notices stuff like this.

I've wanted 4K OLED (and the HDR) since seeing one in Curry's window, and this Christmas, with the price dropping below £1,500, I jumped on it.

From what I'm reading, HDR10+ is only being pushed so the big TV companies such as Samsung don't have to pay the Dolby Vision licensing fee. For me as a general TV consumer, the Dolby name still holds sway, rightly or wrongly, as an indicator of high performance/quality, hence I made sure the TV had Dolby Vision support. So long as UHD Blu-ray discs and streaming sites support DV, or a firmware update can add HDR10+ to my LG, then there's no major issue (for me anyway).

PS. It's rarely a NEED but a WANT! Previously I'd only change TV if the old one died.
 
You can see what will happen here. No one TV will cover all the standards. No one player will handle all the HDR formats as applied to disc. This creates a market opportunity for the likes of HDFury to step in with, yep, yet another conversion box. Personally, I trust Dolby Labs and their implementation of Dolby Vision. It's good enough for the cinema, and that is what we are all trying to emulate at the end of the day - as cinematic an experience as our space, funds and significant others will allow. :) If HDR10+ becomes a thing of any significant force, then I will buy a conversion box to convert it to a Dolby Vision or HLG stream. If it requires HDMI 2.1, then that will be a major barrier to adoption of the standard, imo.
 
One of the TVs I was looking seriously at is the Sony XE9305, which will get DV soon. If this HDR10+ rolls along, would it be as easy as a software update, or would it need new hardware?
Adding it is entirely possible, but at the moment Sony has shown no interest in supporting HDR10+ - though perhaps they might surprise us at CES next week.
 
Are Samsung actually going to update their 2016 SUHD range with HDR10+, though? They said it would be delivered by September 2016, and currently it's a no-show. I'm guessing they will ignore the 2016 sets like they have with the iPlayer HLG update for the 2016 sets.
 
@Steve Withers, the article says "Samsung have already announced that they will include HDR10+ on their 2018 models and that many of their 2017 models and some of their 2016 models will add support via a firmware update"... yet previously, Samsung promised that ALL their 2016 models would get it.

Samsung has seemingly done a U-turn on their promise of HLG support too; they are currently trying to wriggle out of providing BBC iPlayer 4K HLG support for 2016 models with the flimsy, primary-school-style tactic that iPlayer's HLG is not "Broadcast HLG" - despite providing support for "USB HLG" and "YouTube HLG". Surely this is one hair they cannot split?

We customers are NOT impressed with Samsung's dishonesty. Their promises seemingly mean nothing, even when they were made in official press releases cleared by their legal team. [ref: Blue Planet 2 HLG - No Samsung - Samsung Community, with hundreds of FURIOUS customers]. Any pressure you can put on Samsung would be helpful...
 
@BAMozzy

Firstly, thanks for taking the time to explain the practical impact in layman's terms, which is very useful. (The thread would benefit from having your post stickied at the top.)

If the movie only reaches 2,500 nits in one scene, for example, and that scene is the 'max' level in the whole movie, that scene could be displayed at the TV's maximum level and scaled down from there. In the next scene, the maximum brightness is only 1,000 nits - something the TV could display - but because of the static metadata it is still scaled down, as the curve had to fit 2,500 nits in another scene. That's not optimal, and obviously dimmer.

Surely this is necessary in order to keep everything in perspective, so that different scenes are correctly presented relative to one another?

With dynamic metadata, the max peak brightness of that 2,500-nit scene may well look identical to the static-metadata version, but in the scene that only reaches 1,000 nits, the dynamic metadata would allow that scene to be displayed 1:1 and appear much brighter - more optimised.

In this case you could have two scenes that are supposed to be presented differently but end up looking similar in terms of brightness? If so, then you could end up with an overall experience that is wildly skewed from how it was intended.

If you only have 700 nits available, static metadata could make a lot of scenes look much dimmer than necessary, but dynamic metadata would optimise every scene. It's still not as the professional mastered it to be, BUT you couldn't display it at that standard anyway due to hardware limitations.

So does it boil down to having two choices?

Either having some dimmer/brighter footage in order to preserve a consistent overall experience relative to how it is intended.

Or

Having a skewed overall experience where scenes that are supposed to look different from each other in terms of brightness/darkness end up looking similar, and not at all as intended relative to one another.

Unless I'm missing something I think I'd rather go with the former and experience the movie relative to how it is supposed to look. The latter, whilst being more accurate in limited individual scenes, sounds like a far more compromised overall experience.

It appears that dynamic HDR is a compromise with far from perfect results.
 
It appears that dynamic HDR on current technology is a compromise with imperfect results, and can only truly provide its benefits on screens that can achieve peak brightness levels close to those originally intended.

Yes, and the key thing you need to remember is that "screens that can achieve peak brightness levels close to those originally intended" are the mastering monitors, which cost £30,000 and more!

The "two choices" are more along the lines of
1. Wait until we all have 4,000-nit capable TVs which are cheap and plentiful. 2030? 2040? In the meantime do nothing. Don't do HDR, or create the content, or create the market.

2. Use tone mapping to compromise between preserving brightness and preserving highlight detail, safe in the knowledge that as we replace our TVs over the years they will get brighter and make the same content we have bought now look better and better. This is an amazing thing! As you bought better and better TVs in the 1980s and 1990s, your VHS collection didn't look better, did it? :) The limitations of the VHS format were such that the displays were always more capable than the content. What we have with HDR now is the other way around! But these are the early days. Our displays will "catch up" with what is on the disc, and one day display the same content without needing the compromises of tone mapping and with colour gamuts closer to Rec. 2020.

Option 1 is the HDR equivalent of "not making any colour TV programmes until everyone who had a black-and-white TV set has bought a colour TV set". The earlier the content starts to be created, the better. Aren't you glad they made colour TV programmes early, even though hardly anyone had colour TVs, so that we can enjoy Monty Python in colour?

But the crux is that a single bright pixel in a highlight in a bright scene for a fraction of a second could make the rest of an entire movie look very dim with static metadata. With dynamic metadata, the rest of the movie doesn't need to be excessively dim, bringing out more detail that would otherwise have been crushed.

So you've already been experiencing static tone mapping. Dynamic tone mapping (Dolby Vision and HDR10+) just means much more intelligent tone mapping, giving much better results, particularly on low-nit displays like OLEDs, which can't get very bright.
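
To put a rough number on the 'single bright pixel' point above, here is a small hypothetical illustration - invented scene peaks and a crude linear scale rather than a proper tone-mapping curve - contrasting a single title-wide peak (roughly what HDR10's static MaxCLL value describes) with the per-scene peaks that dynamic metadata carries:

```python
# Invented scene peaks (nits) for a mostly dark film with one brief bright flash.
# Static metadata summarises the whole title by its single brightest moment;
# dynamic metadata describes each scene by its own peak. The linear scaling below
# is a crude stand-in for a real tone-mapping curve, just to show the proportions.

scene_peaks  = [350, 420, 280, 2500, 390, 310]   # scene 4 is a one-off 2,500-nit flash
title_peak   = max(scene_peaks)                  # what a single title-wide value reflects
display_peak = 700

for n, peak in enumerate(scene_peaks, start=1):
    static_scale  = min(1.0, display_peak / title_peak)   # same squeeze applied everywhere
    dynamic_scale = min(1.0, display_peak / peak)          # squeeze only where it's needed
    print(f"scene {n}: static x{static_scale:.2f}  dynamic x{dynamic_scale:.2f}")

# Every 280-420 nit scene ends up at 0.28x its mastered level under the title-wide
# scale, purely because of scene 4; with per-scene peaks, only scene 4 is compressed.
```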
 
So the bottom line is that even though some scenes might be slightly off relative to one another the overall benefit very much outweighs that? I'm guessing it makes for interesting debate amongst cinephiles.
 
So the bottom line is that even though some scenes might be slightly off relative to one another the overall benefit very much outweighs that? I'm guessing it makes for interesting debate amongst cinephiles.

Rather than trying to summarise into the bottom line all the time, please watch Vincent's video. :)
It's not just an academic debate for us here. It's about providing much better picture quality for everyone.
 
