BAMozzy
Distinguished Member
HDR10+ is basically Dolby Vision tech in an HDR10 set isn't it?
In simple terms - I guess you could say that it's similar.
HDR10+ is HDR10 with dynamic metadata. Dolby Vision also has dynamic metadata, but it additionally uses 12-bit colour depth and is (currently) mastered to 4,000 nits, though it could be mastered up to 10,000 nits. HDR10 uses 10-bit colour depth and is typically mastered at 1,000 or 4,000 nits. HDR10+ is also an open, royalty-free standard, whereas Dolby Vision is licensed and requires a fee to be paid by studios and manufacturers to use it.
If/when HDR10+ comes to Blu-ray, it will be handled in the same way: the base layer will be HDR10, so anyone with a 4K HDR TV and player will get HDR10, while those with a Dolby Vision or HDR10+ TV and player will get access to a layer on top of that. The layer is different for Dolby Vision (SMPTE ST 2094-10) and HDR10+ (SMPTE ST 2094-40). Dolby uses parametric tone mapping whilst HDR10+ uses scene-based colour volume mapping - both are different types of dynamic metadata, so essentially they are different.
For the average consumer, the two formats are very similar and should be delivered by similar methods, but the metadata itself is different. The end result for us, though, is that we get a 'better' HDR experience, as each scene will be optimised for our TVs.
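To see why per-scene metadata helps, here's a toy sketch (deliberately simplified linear maths, nothing like the real SMPTE ST 2094 algorithms, and the nit values are made up for illustration). With static HDR10 metadata a TV has to tone-map every scene against the film-wide mastering peak; with dynamic metadata it can map each scene against that scene's own peak:

```python
DISPLAY_PEAK = 400.0  # hypothetical TV peak brightness, in nits


def tone_map(pixel_nits, scene_peak, display_peak=DISPLAY_PEAK):
    """Compress a scene's brightness range into the display's range.

    Real tone mappers use perceptual curves (and the two formats'
    metadata differs in how they describe the scene); a linear scale
    keeps the idea clear.
    """
    if scene_peak <= display_peak:
        # Scene already fits on this display: pass it through untouched.
        return pixel_nits
    # Scene exceeds the display: scale everything down proportionally.
    return pixel_nits * display_peak / scene_peak


# A dim 350-nit-peak scene inside a film mastered to 4,000 nits:
static = tone_map(300.0, scene_peak=4000.0)  # static metadata: film-wide peak
dynamic = tone_map(300.0, scene_peak=350.0)  # dynamic metadata: this scene's peak
print(static, dynamic)  # 30.0 300.0 - the dim scene keeps its intended brightness
```

With only the static film-wide peak, the 300-nit highlight gets crushed down to 30 nits; the per-scene value lets the TV show it as mastered. That's the whole point of the dynamic layer in both formats.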