I'm confused mostly by the actual need for this (and DV).
My understanding is that both technologies will grade each scene "on the fly"... surely the film has already been graded in post-production? So why the need to alter it at a later stage?
It's already been graded at great expense by a professional.
Formats will always evolve. The crux is, you'll always need a high-end TV to get the best out of HDR and DV. I thought it would be easy to upgrade my TV, nothing to it. I've really been out of the loop for too long. These past two weeks on here have been an education.
This is all becoming terribly complicated and fragmented now. TV manufacturers already have to support a range of different HDR formats, and now we have yet another one. It's insanity.
The industry as a whole should have defined a single standard before a single HDR TV was ever sold. HDR10 would have done the job quite nicely. It may not be quite as flashy as Dolby Vision, but it's certainly good enough. At least HDR10 is included as a fallback layer some of the time, but it really should be there all of the time.
This fragmentation is already cutting some people out of the loop. Those who have gone out and bought brand spanking new 4K players, such as the Xbox One X, Fire TV 4K, NVidia Shield, or Apple TV 4K can't watch 4K HDR content from the BBC at all.
Format wars are never a good idea.
Yeah, there's a lot going on. My brother bought a 75" Sony 4K TV in '15, and was annoyed recently to find it doesn't do HLG and can't be upgraded to do so. It transpired he didn't even know what HLG is (bit odd he was so annoyed, then!), and hadn't even heard of HDR until I explained it to him about 2 weeks ago. Never asked me about it when he was looking to buy. He thought 4K was it - done, buy a telly. Probably a lot of the public think it works like this. I've asked a few people I know who've bought TVs in the last couple of years what HDR is, and none of them have a clue, not even slightly. They know 4K means "a sharper picture", but if I then explain HDR and its potential impact on the picture in even the most basic way, their eyes glaze over. They're not remotely interested.
Given the average viewing distance is supposed to be 8-9ft, and also factoring in the size of the average British living room, I do wonder whether 90% of the public would still be perfectly happy with a 720p 42" plasma. But manufacturers just want to sell TVs. They don't care about providing consumers with what they NEED. Their job is to make them WANT to part with their money to pay for TVs that they don't NEED. If you know your stuff it's actually a great time to find bargains on eBay and Gumtree. I saw a mint, boxed 42" GT60 sold for £150 on Gumtree recently. £150!!! For everyone else who insists on buying a brand-spanking-new TV, it's a nightmare unless you're rich.
One of the TVs I was looking seriously at is the Sony XE9305, which will get DV soon. If this HDR10+ rolls along, would it be as easy as a software update, or would it need new hardware?

Adding it is entirely possible, but at the moment Sony has shown no interest in supporting HDR10+. Perhaps they'll surprise us at CES next week.
If the movie only reaches 2500 nits in one scene, for example, and that scene is the 'max' level in the whole movie, that scene could be displayed at the TV's maximum level, with everything scaling down from there. In the next scene the maximum brightness is only 1000 nits - something the TV could display - but because of 'static metadata' it is scaled down too, since the same curve had to fit the 2500-nit scene. Not optimal, and obviously dimmer.
With dynamic metadata, the peak of that 2500-nit scene may well look identical to static metadata, but the scene that only reaches 1000 nits could be displayed 1:1 and appear much brighter - more optimised.
If you only have 700 nits available, static metadata could make a lot of scenes look much dimmer than necessary, whereas dynamic metadata would optimise every scene. It's still not as the professional mastered it to be, BUT you couldn't display it at that standard anyway due to hardware limitations.
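To put rough numbers on it, here's a toy sketch in Python. Real tone mapping works in the PQ (SMPTE ST 2084) domain with roll-off curves rather than the straight linear scaling used here, and all the names and figures below are made up for illustration - it's only meant to show why a per-scene (dynamic) scale factor beats one whole-film (static) factor on a dim panel:

```python
# Toy illustration of static vs dynamic metadata tone mapping.
# Linear scaling stands in for the real PQ roll-off curves; the
# point is only the difference between one whole-film factor and
# a per-scene factor. Function names and numbers are hypothetical.

DISPLAY_PEAK = 700.0  # what the TV panel can actually output, in nits

def static_scale(pixel_nits, movie_max):
    """Static metadata: one scale factor for the whole film,
    derived from its single brightest scene (e.g. 2500 nits)."""
    factor = min(1.0, DISPLAY_PEAK / movie_max)
    return pixel_nits * factor

def dynamic_scale(pixel_nits, scene_max):
    """Dynamic metadata: the factor is recomputed per scene, so a
    1000-nit scene on a 700-nit panel is only mildly compressed,
    and any scene already under 700 nits passes through 1:1."""
    factor = min(1.0, DISPLAY_PEAK / scene_max)
    return pixel_nits * factor

# A 900-nit highlight, in a scene whose own peak is 1000 nits,
# inside a film whose overall peak is 2500 nits:
print(static_scale(900, movie_max=2500))   # ~252 nits: dimmed to fit 2500
print(dynamic_scale(900, scene_max=1000))  # 630 nits: far closer to intent
```

In practice the shipping implementations (HDR10+, Dolby Vision) use knee/roll-off curves so highlights compress gracefully instead of scaling or clipping linearly, but the metadata principle is the same.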
It appears that dynamic HDR on current technology is a compromise with imperfect results, and can only truly deliver its benefits on screens whose peak brightness comes close to what was originally intended.
So the bottom line is that even though some scenes might be slightly off relative to one another, the overall benefit very much outweighs that? I'm guessing it makes for interesting debate amongst cinephiles.