ARTICLE: What is HDR10+?

Discussion in 'General TV Discussions Forum' started by Steve Withers, Jan 2, 2018.


    1. Steve Withers (Reviewer)
    2. ggrossen (Active Member)
      Good article, Steve, and some welcome clarity in this dynamic space. In some respects all this innovation is great, but where it falls down is support and implementation by content creators, and then the whole plethora of devices that need to work together for a flawless consumer experience. I feel that Dolby has the advantage in the creation space, but let's see how this plays out - George
       
      • Like x 1
      • Thanks x 1
      • Agree x 1
    3. spookster (Active Member)
      Regulations, standards... it's all very confusing, with nothing cemented in terms of universal validation, but then again you can argue we're in a state of progressive flux and innovation. It's all transitional, but I do agree that the tech is currently running way ahead of the content. The fact that a few weeks ago I had no idea that a TV labelled 'HDR' did not mean it could produce that output with any degree of quality blew my mind. As more of a casual viewer compared to many on this forum, I found this alone misleading - yet the average user would never know.

      I'm grabbing the XE900 this month, and I can already list things I won't have that might be introduced to more TVs over the next year or two (HDMI 2.1, Dolby Vision etc.). I think we're always going to be in a perpetual state of waiting for the perfect TV when there's no such thing.
       
      Last edited: Jan 2, 2018
    4. Kotatsu Neko (Well-known Member)
      This is all becoming terribly complicated and fragmented now. TV manufacturers already have to support a range of different HDR formats, and now we have yet another one. It's insanity.

      The industry as a whole should have defined a single standard before a single HDR TV was ever sold. HDR10 would have done the job quite nicely. It may not be as flashy as Dolby Vision, but it's certainly good enough. At least HDR10 is included as a fallback layer some of the time, but it really should be there all of the time.

      This fragmentation is already cutting some people out of the loop. Those who have gone out and bought brand spanking new 4K players, such as the Xbox One X, Fire TV 4K, Nvidia Shield or Apple TV 4K, can't watch 4K HDR content from the BBC at all.

      Format wars are never a good idea.
       
      • Like x 3
      • Agree x 3
    5. Roohster (Distinguished Member)
      I'm mostly confused by the actual need for this (and DV).

      My understanding is that both technologies will grade each scene "on the fly"... surely the film has already been graded in post-production? So why the need to alter it at a later stage?

      It's already been graded, at great expense, by a professional.
       
    6. boyarin (Active Member)
      All these different types of metadata only confuse me as a customer. I won't invest my money in a TV till the end of the 4K cycle.
       
    7. Winnie1221 (Active Member)
      I have a few Dolby Vision UHD titles. Admittedly there is not a lot of content out there yet, but the list is growing - and according to bluray.com, Blade Runner 2049 will have Dolby Vision. I must admit that watching in DV there is a very clear improvement over regular HDR (having watched the same film in both DV and HDR).

      I have not seen any examples of HDR10+ so cannot comment on it, but I'll be sticking with regular HDR and Dolby Vision for now.
       
    8. spookster (Active Member)
      Formats will always evolve. The crux is, you'll always need a high-end TV to get the best out of HDR and DV. I thought it would be easy to upgrade my TV - nothing to it. I've really been out of the loop for too long. These past two weeks on here have been an education.
       
    9. BAMozzy (Distinguished Member)
      The 'professional' will have graded that content to a certain standard - like 1,000 or 4,000 nits. However, not every TV can reach 1,000 nits, and NO current TV can deliver 4,000 nits. Therefore, the content has to be scaled down to the TV - tone mapped.

      Static metadata - in HDR10 - applies a single tone-mapping algorithm to the whole movie. If the movie only reaches 2,500 nits, for example, in one scene, and that scene is the 'max' level in the whole movie, that scene could be displayed at the TV's maximum level, with everything scaled down from there. In the next scene the maximum brightness is only 1,000 nits - something a TV could display - but because of the static metadata that scene is also scaled down, since the same curve had to fit 2,500 nits in the other scene. It is therefore not optimal, and obviously dimmer.

      With dynamic metadata, the peak of that 2,500-nit scene may well look identical to the static-metadata version, but in the scene that only reaches 1,000 nits, the dynamic metadata would allow that scene to be displayed 1:1 and appear much brighter - more optimised.

      There are generally two main points in HDR tone mapping: the point at which it stops following the curve accurately, and the maximum peak brightness, i.e. the point at which 'clipping' occurs. With static metadata these points are fixed from start to finish. If it's set so that at 300 nits it starts to fall away from the curve to scale the highlights down, then throughout the movie, regardless of the scene, the TV stops presenting brightness 1:1 at 300 nits and scales down - even if the scene could be displayed 1:1. With dynamic metadata those points can change, varying the amount of 1:1 mapping and highlight scaling to deliver an 'optimum' viewing experience.

      This matters much more on TVs that, at the moment, are not able to deliver the performance necessary to match the 'professional' master. It's most beneficial on under-performing HDR TVs, and unimportant on TVs that, in the future, can deliver 4,000 (or even 10,000) nits. Both dynamic and static metadata would map the content 1:1 if the TV could handle it, as both would follow the PQ curve.

      If you only have 700 nits available, static metadata could make a lot of scenes look much dimmer than necessary, but dynamic metadata would optimise every scene. It's still not as the professional mastered it to be, BUT you couldn't display it at that standard anyway due to hardware limitations.
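
      To make those two points concrete, here is a minimal sketch in Python - illustrative numbers and a simplified linear roll-off, not any manufacturer's actual curve - of tone mapping with a 300-nit knee on a 1,000-nit TV. The only difference between 'static' and 'dynamic' below is whether the curve is sized for the movie-wide peak or for each scene's own peak:

```python
# A minimal sketch of two-point tone mapping: a knee below which pixels
# track the PQ curve 1:1, and a linear roll-off that squeezes the rest
# of the mastered range into the display's peak. Illustrative only.

def tone_map(nits, knee, content_peak, display_peak):
    """Map a mastered pixel brightness to what the display shows."""
    if content_peak <= display_peak:
        return nits  # the display can show the whole scene 1:1
    if nits <= knee:
        return nits  # below the knee, follow the curve exactly
    # compress highlights: [knee, content_peak] -> [knee, display_peak]
    frac = (nits - knee) / (content_peak - knee)
    return knee + frac * (display_peak - knee)

DISPLAY_PEAK = 1000.0  # the TV's peak brightness
KNEE = 300.0           # point where 1:1 tracking stops

# Static metadata sizes one curve for the whole movie (2,500-nit max);
# dynamic metadata re-sizes the curve for each scene's own max.
for scene_peak in (2500.0, 1000.0):
    highlight = 1000.0  # a 1,000-nit highlight appearing in both scenes
    static = tone_map(highlight, KNEE, 2500.0, DISPLAY_PEAK)
    dynamic = tone_map(highlight, KNEE, scene_peak, DISPLAY_PEAK)
    print(f"{scene_peak:.0f}-nit scene: static shows the highlight at "
          f"{static:.0f} nits, dynamic at {dynamic:.0f} nits")
```

      On the 1,000-nit scene the dynamic curve collapses to straight 1:1 mapping (the highlight shows at 1,000 nits instead of roughly 523), which is exactly the "displayed 1:1 and appear much brighter" behaviour described above.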
       
      • Useful x 3
      • Like x 2
    10. BAMozzy (Distinguished Member)
      The crux is that DV and HDR10+ are both designed to 'optimise' HDR content according to the HDR capabilities of the TV - the worse the TV, the more these methods benefit it. If you had a TV that could deliver the standards the content is mastered to, you wouldn't see any difference, as both dynamic and static metadata would map the content exactly as the colourist intended, following the curve precisely. The only benefit of DV at that point would be its 12-bit colour depth, but HDR10, HDR10+ and DV would all be displaying the content 1:1. Dynamic metadata optimises HDR content to that TV's capability, but it will not make a 600-nit HDR TV look better than a 4,000-nit static-metadata TV. It will make HDR look more optimised on that TV compared to any other 600-nit TV - and it may even look better than some 1,000-nit TVs in some scenes too.

      Of course, the way to get the best out of HDR - regardless of static/dynamic metadata - will always be to get the high-end HDR TV.
       
    11. john039 (Active Member)
      One of the TVs I was seriously looking at is the Sony XE9305, which will get DV soon. If this HDR10+ comes along, would adding it be as easy as a software update, or would it need new hardware?
       
    12. johngerard (Active Member)
      Yeah, there's a lot going on. My brother bought a 75" Sony 4K TV in '15, and was annoyed recently to find it doesn't do HLG and can't be upgraded to do so. It transpired he didn't even know what HLG is (a bit odd that he was so annoyed, then!), and he hadn't even heard of HDR until I explained it to him about two weeks ago. He never asked me about it when he was looking to buy. He thought 4K was it - done, buy a telly. Probably a lot of the public think it works like this. I've asked a few people I know who've bought TVs in the last couple of years what HDR is, and none of them have a clue, not even slightly. They know 4K means "a sharper picture", but if I then explain HDR and its potential impact on the picture in even the most basic way, their eyes glaze over. They're not remotely interested.
       
      • Agree x 1
      • Useful x 1
    13. Mr Quint (Well-known Member)
      Thank Christ I'm content with my 7-year-old 1080p Samsung, base Xbox One and base PS4... until Jaws comes out on a new format, lol.
       
      • Like x 2
      • Funny x 1
    14. BAMozzy (Distinguished Member)
      It's not helped, even if they do have a vague idea or are looking for an HDR TV because of console gaming, that a lot of so-called HDR TVs offer 'support' for playing HDR content but not necessarily 'HDR' itself. Some TVs are not that different from an SDR 4K TV, but have the ability to play the content with the standard Rec. 709 colour gamut, only brighter because the TV is now at max.

      All HDR TVs can display HDR10 content - whether it's downgraded to the point where it's more 'SDR+', or it actually delivers at least the 'minimum' UHD Premium standard. It seems that DV and HDR10+ could well be 'rival' options: Sony and LG offering DV, with Samsung and Panasonic going down the HDR10+ route. Whether we see a TV or Blu-ray player with both - assuming HDR10+ gets BDA approval - who knows. Both HDR10+ and DV require both the source and the display to support the format; if only one device (source or display) supports either of these formats, you will just get HDR10.
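
      To picture that handshake, here is a minimal sketch, assuming a simple capability negotiation (the function name, format strings and preference order are illustrative, not any real device's firmware):

```python
# A minimal sketch of the fallback behaviour described above: a
# dynamic-metadata format is only used when BOTH the source and the
# display support it; otherwise playback drops to the HDR10 base layer.

def negotiate_hdr(source_formats, display_formats):
    """Pick the best HDR format that both ends support."""
    common = set(source_formats) & set(display_formats)
    for preferred in ("Dolby Vision", "HDR10+"):  # dynamic metadata first
        if preferred in common:
            return preferred
    return "HDR10"  # the base layer every HDR source and display shares

# A DV-capable player into an HDR10+-only TV falls back to plain HDR10:
print(negotiate_hdr({"HDR10", "Dolby Vision"}, {"HDR10", "HDR10+"}))  # HDR10
print(negotiate_hdr({"HDR10", "HDR10+"}, {"HDR10", "HDR10+"}))        # HDR10+
```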

      HLG is the format for broadcasters, as it doesn't require any post-processing to add in the metadata. That means broadcasters can still deliver HDR content 'live' - sports, for example.

      The UHD Alliance was set up to try and put standards in place for both content and displays. The UHD Premium logo is supposed to represent that a device meets a minimum standard. It seems, though, that manufacturers are cashing in on 'HDR' - making sub-UHD-Premium TVs seem as if they are just as good as UHD Premium TVs, as though you were only paying extra for some 'design' or feature - like Samsung with the Quantum Dot TVs, LG with OLED vs their LCD TVs, or Sony by not putting their TVs up for UHD Premium at all. It's like the 1080p era, where you got the minimum standards for content regardless of model but paid more for better processing or aesthetics. People see it's HDR and expect it to be as much an HDR TV as any other.

      HDR is difficult to explain and for people to understand. An increase in resolution isn't that difficult, as most remember what SD to HD did. Trying to sell a TV by talking about nits or brightness, colour gamuts, colour volumes, contrast and/or an increase in bit depth is harder. Most who haven't seen HDR think their TV is bright enough, has enough colours etc., and that their pictures already look 'lifelike' and 'real'. Even an HDR TV that is literally only a bit brighter (300-350 nits), without the wider colour gamut, can still look like a step up from a 1080p SDR TV - especially with a jump up in screen size too. Trying to describe what they are missing is not easy. It's also not helped that many can't see HDR content on their PC or current TV either, whereas you can easily show what an increase in resolution brings.

      Personally, I think those TVs that don't reach a certain standard should be called 'HDR compatible' or 'HDR supported' - something to suggest that the buyer isn't getting a certain standard. You wouldn't buy a 1080p/1440p monitor if you wanted 4K and a true 4K monitor was within your budget - even if those monitors would accept a 4K input and super-sample it down - and I bet you would be very 'disappointed' if you bought a monitor listed as 4K that only had a 1440p screen. Virtually all HDR TVs have to tone-map HDR content down (not all UHD Premium TVs would need to tone-map all content, as they can deliver over 1,000 nits, but they will still need to tone-map 4,000-nit-mastered content) - the equivalent of 'super-sampling' resolution, in essence: compressing something larger to fit smaller parameters.
       
      • Like x 1
      • Agree x 1
    15. Ossie77 (Active Member)
      Given the average viewing distance is supposed to be 8-9 ft, and also factoring in the size of the average British living room, I do wonder whether 90% of the public would still be perfectly happy with a 720p 42" plasma. But manufacturers just want to sell TVs. They don't care about providing consumers with what they NEED; their job is to make them WANT to part with their money to pay for TVs that they don't need. If you know your stuff it's actually a great time to find bargains on eBay and Gumtree. I saw a mint, boxed 42" GT60 sell for £150 on Gumtree recently. £150!!! For everyone else who insists on paying for a brand spanking new TV - except the rich - it's a nightmare.
       
    16. scrowe (Well-known Member)
      I think the HDR alternatives are being overstated as a new 'format war'. The standard is HDR10, which is an open, royalty-free standard. Being open means manufacturers are allowed to develop it further, to innovate and differentiate. The current example is the different approach each manufacturer takes to tone mapping, ensuring that they can take the HDR10 standard and optimise it so it works best with their own displays.

      HDR10+, to me, seems the natural next step: using dynamic metadata to improve the picture on a scene-by-scene, content-by-content basis. The important thing here is that this development is open, which means innovative work and techniques should be published back to the community for everyone's benefit, just as everyone benefits from the original development of HDR10. At least, I am assuming this all works the same as in the software development world. HDR10+ 'must' be adopted into the UHD standard, as innovations like this that can be shared and adopted as a standard are the whole reason everyone signs up to these alliances in the first place, and ultimately the consumer benefits - albeit some innovations inevitably drive new hardware and newer models to upgrade to.

      In actual fact, Dolby Vision, for me, is the spoiler at the party. Much as I support a company spotting a gap in the market or a commercial opportunity, and investing its own money and time in R&D to create revenue as a reward, it is a closed system that Dolby now controls, and a standard where every manufacturer is commercially tied to a single company is a bad situation. In fact, given that Dolby Vision sits on top of the open HDR10 base, I am surprised a lot of their work is not required to be disclosed/published rather than being closed to outsiders. Likewise, for a modern-day software optimisation technique to be tied to a hardware implementation, which then requires compatible hardware in each device, is such a horribly expensive and anti-consumer approach that I am quite surprised it was ratified by the UHD Alliance in the first place.

      In the fuss over DV versus HDR10+, nobody seems to blink at the fact that there is another standard, HLG, for broadcast, and at some point I assume we will need another HDR implementation for home projection, which has been somewhat overlooked by comparison.
       
    17. psikey (Well-known Member)
      Well, comparing my decent 4-year-old Samsung 46" 1080p F8000 to my new LG OLED 55C7V 4K HDR, watching Blue Planet II in UHD on iPlayer - yes, we consumers do want the new tech! Even my wife said "Wow", and she rarely notices stuff like this.

      I've wanted 4K OLED (and the HDR) since seeing one in a Currys window, and this Christmas, with the price dropping below £1,500, I jumped on it.

      From what I'm reading, HDR10+ is only being pushed so the big TV companies such as Samsung don't have to pay the Dolby Vision licensing fee. For me, as a general TV consumer, the Dolby name still holds sway - rightly or wrongly - as indicating high performance/quality, hence I ensured the TV had Dolby Vision support. So long as UHD Blu-ray discs and streaming sites support DV, or a firmware update can add HDR10+ to my LG, then there's no major issue (for me anyway).

      PS: It's rarely a NEED but a WANT! Previously I'd only change TV if the old one died.
       
      Last edited: Jan 3, 2018
    18. Captain Ron (Well-known Member)
      You can see what will happen here. No one TV will cover all the standards, and no one player will handle all the HDR formats as applied to disc. This creates a market opportunity for the likes of HDFury to step in with, yep, yet another conversion box. Personally, I trust Dolby Labs and their implementation of Dolby Vision. It's good enough for the cinema, and that is what we are all trying to emulate at the end of the day - as cinematic an experience as our space, funds and significant others will allow. :) If HDR10+ becomes a thing of any significant force, then I will buy a conversion box to convert it to a Dolby Vision or HLG stream. If it requires HDMI 2.1, then that will be a major barrier to adoption of the standard, imo.
       
    19. Norman (Well-known Member)
      Adding it is entirely possible, but at the moment Sony has shown no interest in supporting HDR10+. Perhaps they might surprise us at CES next week.
       
    20. jep (Active Member)
      Are Samsung actually going to update their 2016 SUHD range with HDR10+, though? They said it would be delivered by September 2017, and it's currently a no-show. I'm guessing they will ignore the 2016 sets, like they have with the iPlayer HLG update for those sets.
       
    21. mrtickleuk (Active Member)
      @Steve Withers, the article says "Samsung have already announced that they will include HDR10+ on their 2018 models and that many of their 2017 models and some of their 2016 models will add support via a firmware update"... yet previously, Samsung promised that ALL their 2016 models would get it.

      Samsung has seemingly done a U-turn on their promise of HLG support too; they are currently trying to wriggle out of providing BBC iPlayer 4K HLG support for 2016 models with the flimsy, primary-school-style excuse that iPlayer's HLG is not "broadcast HLG" - despite providing support for "USB HLG" and "YouTube HLG". Surely this is one hair they cannot split?

      We customers are NOT impressed with Samsung's dishonesty. Their promises seemingly mean nothing, even when they were made in official press releases cleared by their legal team. [ref: Blue Planet 2 HLG - No Samsung - Samsung Community, with hundreds of FURIOUS customers]. Any pressure you can put on Samsung would be helpful...
       
      • Like x 1
      • Agree x 1
      Last edited: Jan 3, 2018
    22. PC1975 (Well-known Member)
      @BAMozzy

      Firstly, thanks for taking the time to explain the practical impact in layman's terms, which is very useful. (The thread would benefit from having your post stickied at the top.)

      Surely this is necessary in order to keep everything in perspective, so that different scenes are correctly presented relative to one another?

      In this case you could have two scenes that are supposed to be presented differently but end up looking similar in terms of brightness. If so, you could end up with an overall experience that is wildly skewed from how it was intended.

      So does it boil down to having two choices?

      Either having some dimmer/brighter footage in order to preserve a consistent overall experience, relative to how it is intended;

      or

      having a skewed overall experience, where scenes that are supposed to look different to each other in terms of brightness/darkness end up looking similar, and not at all as intended relative to one another.

      Unless I'm missing something, I think I'd rather go with the former and experience the movie relative to how it is supposed to look. The latter, whilst being more accurate in limited individual scenes, sounds like a far more compromised overall experience.

      It appears that dynamic HDR is a compromise with far from perfect results.
       
      Last edited: Jan 4, 2018
    23. mrtickleuk (Active Member)
      Yes, and the key thing you need to remember is that the "screens that can achieve peak brightness levels close to as originally intended" are the mastering monitors, which cost £30,000 and more!

      The "two choices" are more along the lines of:

      1. Wait until we all have 4,000-nit-capable TVs which are cheap and plentiful. 2030? 2040? In the meantime do nothing: don't do HDR, or create the content, or create the market.

      2. Use tone mapping to compromise between preserving brightness and preserving highlight detail (see the toy sketch just below), safe in the knowledge that as we replace our TVs over the years they will get brighter and make the same content we have bought now look better and better. This is an amazing thing! As you bought better and better TVs in the 1980s and 1990s, your VHS collection didn't look better, did it? :) The limitations of the VHS format were such that the displays were always more capable than the content. What we have with HDR now is the other way around! But these are the early days. Our displays will "catch up" with what is on the disc, and one day they will display the same content without needing the compromises of tone mapping, and with colour gamuts closer to Rec. 2020.
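
      As a toy illustration of that compromise (simplified maths, not any real tone-mapping curve), compare the two extremes a 1,000-nit display could use on a 4,000-nit master:

```python
# Two extreme (toy) ways to fit a 4,000-nit master onto a 1,000-nit
# display: "clip" preserves brightness but crushes highlight detail;
# "scale" preserves highlight detail but dims everything.

def clip(nits, display_peak=1000.0):
    return min(nits, display_peak)  # mid-tones stay bright, highlights flatten

def scale(nits, master_peak=4000.0, display_peak=1000.0):
    return nits * display_peak / master_peak  # detail kept, whole image dimmer

for pixel in (500.0, 2000.0, 3000.0):
    print(f"{pixel:.0f} nits -> clip {clip(pixel):.0f}, scale {scale(pixel):.0f}")
# Note how 2,000 and 3,000 nits both clip to 1,000 (detail lost), while
# scaling drops 500 nits to a dim 125. Real tone mapping sits between
# these extremes, which is what the knee point and roll-off trade off.
```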

      Option 1 is the HDR equivalent of "not making any colour TV programmes until everyone who had a black-and-white TV set has bought a colour TV set". The earlier the content starts to be created, the better. Aren't you glad they made colour TV programmes early, even though hardly anyone had colour TVs, so that we can enjoy Monty Python in colour?

      But the crux is that a single bright pixel in a highlight in a bright scene, shown for a fraction of a second, could make the rest of an entire movie look very dim with static metadata. With dynamic metadata, the rest of the movie doesn't need to be excessively dim, bringing out detail which would otherwise have been crushed.

      So you've already been experiencing static tone mapping. Dynamic tone mapping (Dolby Vision and HDR10+) just means much more intelligent tone mapping, giving much better results, particularly on low-nit displays like OLEDs which can't get very bright.
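
      To put toy numbers on that single-bright-pixel point (all values below are made up purely for illustration): static metadata has to describe the movie-wide peak, so one brief flash reserves headroom that every other scene pays for:

```python
# Toy illustration: one brief 3,000-nit flash sets the movie-wide peak
# that static metadata must encode, so every other scene is tone-mapped
# against headroom it never actually uses.

scene_peaks = [500, 650, 3000, 550, 480]  # per-scene maximum, in nits
movie_peak = max(scene_peaks)             # 3000: what static metadata encodes

for peak in scene_peaks:
    unused = movie_peak - peak  # brightness range reserved but never used
    print(f"scene peaking at {peak:4d} nits carries {unused:4d} nits of "
          f"unused headroom that dynamic metadata could reclaim")
```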
       
      Last edited by a moderator: Mar 13, 2018
    24. PC1975 (Well-known Member)
      So the bottom line is that even though some scenes might be slightly off relative to one another, the overall benefit very much outweighs that? I'm guessing it makes for interesting debate amongst cinephiles.
       
    25. mrtickleuk (Active Member)
      Rather than trying to summarise everything into the bottom line all the time, please watch Vincent's video. :)
      It's not just an academic debate for us here. It's about providing much better picture quality for everyone.
       
    26. PC1975 (Well-known Member)
      You do not get to dictate what others should and shouldn't discuss on here.

      If you have something of value to add then do so, rather than trying to belittle and stifle others' input. And please refrain from trying to speak on behalf of everyone on AVF, i.e. "us here".

      I've been one of "us here" for considerably longer than yourself and don't need to be told how to think or conduct myself, thank you. And even though you seem to fail to grasp the situation, I am not merely trying to provoke "academic debate", as you choose to call it.
       
      Last edited: Jan 4, 2018
    27. MikeTVMikeTV (Well-known Member)
      Won't be changing my 2016 DX750 anytime soon; HDR standards are a right mess.

      8K OLED in 2020 I think.
       
    28. mrtickleuk (Active Member)
      Wow, where did all that come from? :-( I obviously meant no offence; I was just trying to help, having spent a lot of time earlier writing you a very long reply explaining things as best I can - but Vincent can explain it much better. I think your invective was completely unnecessary and very unwelcome.
       
      Last edited: Jan 4, 2018
    29. BAMozzy (Distinguished Member)
      It's all about average picture level (APL). The scene that's 2,500 nits, for example, could well be a bright daytime scene, with the sun at 2,500 nits and the clouds, the sun's reflection off the water etc. all needing to be 'bright' - over 1,000 nits, say. The tone-mapping algorithm scales all that down so that the sun is 1,000 nits and all those bright areas are now 700-1,000 nits, making the scene look very bright overall. The next scene only has a candle glowing in the dark - that candle can be displayed at, say, 650 nits with static metadata, but the glow around it now looks much dimmer, and the candle doesn't appear to be giving off much light at all, as the areas that should be brighter are dimmed down. With dynamic metadata, that scene can now be displayed 1:1, but it still has a lower APL because much more of the screen is dark.

      As explained above, you can still have two very different scenes, both with exactly the same peak brightness. SDR content is mastered to 100-120 nits, and yet you can display a 'bright' daytime scene where a lot of the screen is bright, and you can display a very dark scene with just a single light source - a candle, torch, moon etc. - that is just as bright as the sun was in the daytime scene, yet overall the scene looks completely different.

      It's not just about the peak brightness; it's also about the APL and the scene itself - what the rest of the pixels are doing. A single candle can seem intensely bright in a very dark room but barely noticeable if lit on a sunny day. The sun may have had to be 2,500 nits to stand out against the bright clouds and the sunlight reflecting on the water - to stand out in a 'bright' scene. The bright scene still has a much higher APL than the dark scene, even when they end up with the same peak brightness.

      It doesn't boil down like that at all. What actually happens is that the dark scenes in HDR are better optimised with dynamic HDR, rather than it having much impact on the bright scenes. The problem with static metadata is that it applies a single tone-mapping algorithm based on the whole movie, so the dark scenes tend to look less impactful because of the tone-mapping down required for the bright scenes. The warm highlights from a flickering candle or a burning fire are reduced, so dark scenes have an even lower APL.

      Dynamic metadata would still tone-map those bright scenes down - probably with a very similar algorithm to the static metadata, so you'd see very little difference between the two. However, when you compare dark scenes, the dynamic metadata optimises the picture much more, giving a higher APL and more impact than static does.

      HDR10 itself still looks stunning, but dynamic metadata works to improve the overall content. It's not 'skewing' the overall experience but 'enhancing' it. As I said, it mostly helps with the darker scenes, giving them the impact the professional colourist intended. With static metadata, some dark scenes may not look much better than SDR, because their limited highlights are dimmed down to make the bright scenes' highlights fit; dynamic metadata allows those scenes to have more impact.

      It's maybe difficult to explain. If you look at demonstrations of what dynamic metadata brings, they tend to focus on the dark scenes - the glow of a fire and the highlights it casts, for example. It's not just the peak brightness that can be optimised, but also the point up to which the content follows the curve 1:1. With static metadata, they may decide to fix that point at 250 nits because 90% of the image is 250 nits or less (the other 10% just highlights) 90% of the time. A few scenes may have 90% of the image at 350 nits, and dynamic metadata can adjust those scenes to follow the curve 1:1 further up, to optimise them better - making some of the mid-range stand out a little more compared to static metadata. It doesn't 'skew' the overall experience.

      The point is, dynamic metadata is designed to optimise every scene, or even every frame, to the capability of the TV - to enhance the overall experience. It's not designed to skew or compromise the experience at all. It's all about enhancing the overall experience within the limitations of the current hardware. The professional masters the content to a much higher standard than the TVs can display, but they also have control over the dynamic metadata. I am sure that if a 1,000-nit scene followed directly after a 2,500-nit scene, and displaying it 1:1 would skew the image, the dynamic metadata would display it at 800 nits or under - still better than the static metadata's 600 nits, for example.
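
      To make the APL idea concrete with toy numbers (a "frame" here is just a flat list of pixel luminances; all values are invented for illustration):

```python
# A toy sketch of the APL point: two "frames" with the same 1,000-nit
# peak but wildly different average picture levels.

bright_day = [700.0] * 90 + [1000.0] * 10  # mostly bright sky and water
dark_candle = [5.0] * 99 + [1000.0] * 1    # one candle flame in darkness

for name, frame in (("bright day", bright_day), ("dark candle", dark_candle)):
    apl = sum(frame) / len(frame)  # average picture level
    print(f"{name}: peak {max(frame):.0f} nits, APL {apl:.0f} nits")
# Same peak, but APLs of ~730 vs ~15 nits: judging a scene by its peak
# alone, as movie-wide static metadata must, misses this difference.
```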
       
      • Like x 1
      • Thanks x 1