
What is Dolby Vision?

Is Dolby’s proprietary version of HDR the future of television?

by Steve Withers Sep 1, 2016


    Everyone is talking about High Dynamic Range (HDR), but did you know that there's more than one version?
    At the moment the version of HDR primarily being used by manufacturers and studios is HDR10, a generic open-source format. There is another version called Hybrid Log-Gamma, developed jointly by NHK and the BBC and intended for broadcast HDR. However there is also a third version called Dolby Vision. Why is that important? Well, for one thing, HDR was Dolby's idea in the first place, and even HDR10 uses a lot of Dolby's proprietary technology as the basis for its implementation.

    Dolby have been developing Dolby Vision for over a decade, and it was their research into how the human eye actually sees dynamic range – that is, the difference between the deepest blacks and the brightest whites – that resulted in HDR. Dolby were years ahead of the curve in their thinking on higher dynamic range, and it's only recently that TV capabilities have reached a point where the technology can actually be implemented. Now that it's here, however, the impact of HDR has been nothing short of a television revolution.

    The idea behind HDR in general, and Dolby Vision in particular, is to deliver a dramatically improved visual experience. This is achieved through a combination of a higher dynamic range and a wider colour gamut, resulting in brighter whites, deeper blacks and a fuller colour palette, and ultimately in images that are more realistic than ever. However, before we can look at Dolby Vision in more detail, we'll need to explain some terminology that gets used a lot when discussing HDR.
    What is an EOTF?

    An Electro-Optical Transfer Function (EOTF) describes how to turn a digital signal into visible light and is commonly referred to as gamma. The non-linear gamma curve has remained much the same since its inception in the 1930s and is based on the characteristics of the cathode ray tube (CRT). Despite being used for nearly 80 years, it wasn't until 2011 that the International Telecommunication Union (ITU) finally standardised gamma as ITU-R Recommendation BT.1886. However, this EOTF was still derived from the limitations of a technology that is now redundant, and the capabilities of modern displays have far exceeded those of CRT, especially in terms of luminosity (overall brightness) and colour gamut (the colour palette). What was needed was a new EOTF, one that wasn't defined by the limitations of an outdated technology but was instead based on how humans actually see. So Dolby studied how the eye works and conducted extensive viewer preference studies to discover the ideal dynamic range.

    After establishing that this range should be able to represent brightness from 0 to 10,000 nits (a measure of brightness), scientists at Dolby looked for an efficient way to deliver this higher dynamic range. In 100-nit Standard Dynamic Range (SDR) content, video is coded using 8-bit precision and a gamma curve. Using the same approach would require 14 bits when the range is increased from 100 to 10,000 nits. Fortunately, the human visual system is much less sensitive to changes in highlights than it is to changes in dark areas. So Dolby developed a new EOTF that took advantage of the way our eyes work to deliver the entire 10,000-nit range with 12 bits instead of 14, and managed to do so without introducing artefacts like banding or contouring. Dolby called this new EOTF the Perceptual Quantiser (PQ); it was standardised as SMPTE ST 2084 and forms the basis of both HDR10 and Dolby Vision.
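
    To make the PQ curve a little more concrete, here is a minimal sketch of the ST 2084 EOTF and its inverse, using the constants published in the standard. The function names and the 12-bit example value are ours, purely for illustration.

        def pq_eotf(signal: float) -> float:
            # Convert a non-linear PQ code value (0.0-1.0) into absolute
            # luminance in nits, per SMPTE ST 2084.
            m1 = 2610 / 16384        # 0.1593017578125
            m2 = 2523 / 4096 * 128   # 78.84375
            c1 = 3424 / 4096         # 0.8359375
            c2 = 2413 / 4096 * 32    # 18.8515625
            c3 = 2392 / 4096 * 32    # 18.6875
            p = signal ** (1 / m2)
            return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

        def pq_inverse(nits: float) -> float:
            # Convert absolute luminance (0-10,000 nits) back to a PQ code value.
            m1, m2 = 2610 / 16384, 2523 / 4096 * 128
            c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
            y = (nits / 10000) ** m1
            return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

        # A full-range 12-bit code value of 2081 (about 0.508) lands near
        # 100 nits, showing how PQ spends roughly half its code values at
        # or below SDR peak brightness, where the eye is most sensitive.
        print(round(pq_eotf(2081 / 4095)))   # ~100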

    What is Colour Volume?

    Although HDR is most often associated with increases in the luminosity or brightness of images, and specifically the peak brightness of specular highlights, there is another aspect that is equally important – colour. A colour gamut is usually plotted in two dimensions, but if you combine chromaticity (colour) and luminance (brightness) you create a three-dimensional colour volume, where the luminance axis is absolute and based on the SMPTE ST 2084 EOTF. As a result, all the colours benefit from the increased brightness. If you take the blue primary as an example, it is the same in a 2D chromaticity diagram regardless of the brightness. However, once you add in the luminance things are very different: at 100 nits of overall luminosity blue has a brightness of around 8 nits, at 800 nits it is around 64 nits and at 4,000 nits it is around 317 nits. So a larger colour volume provides a larger palette of available colours and, as a result, a more realistic visual experience. It's this combination of a wider colour gamut and an increased dynamic range that makes HDR so exciting.
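
    Those blue-primary figures can be sanity-checked with a back-of-envelope calculation. They are consistent with a blue primary contributing roughly 7.9% of overall luminance (close to the DCI-P3 blue coefficient) – an assumption on our part rather than a figure Dolby publishes in this form.

        # Assumed luminance contribution of the blue primary (~P3 blue).
        BLUE_LUMA_COEFF = 0.0793

        for peak_nits in (100, 800, 4000):
            blue_nits = BLUE_LUMA_COEFF * peak_nits
            print(f"{peak_nits:>5}-nit display -> blue primary at ~{blue_nits:.0f} nits")

        # 100-nit display -> blue primary at ~8 nits
        # 800-nit display -> blue primary at ~63 nits
        # 4000-nit display -> blue primary at ~317 nits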

    You can read more about wide colour gamuts here.
    What is HDR?

    Cameras, be they film or digital, are capable of capturing higher dynamic ranges and wider colour gamuts but the current TV standards limit the dynamic range and colour gamut that can be reproduced via high definition broadcast or Blu-ray discs. Due to these limitations, which are defined by the Rec. 709 standard, a certain amount of brightness and colour has to be discarded in the production process in order to “fit” the delivery method. These TV and Blu-ray standards limit the maximum brightness to approximately 100 nits and the minimum brightness to approximately 0.1 nits, whilst also limiting the colour gamut to Rec. 709. As with the BT.1886 EOTF, these standards are based on the limitations of cathode ray tubes and date back to when CRT monitors were used for grading.

    High Dynamic Range (HDR) allows content creators to maintain a higher dynamic range and a wider colour gamut by providing technology that not only lets them see in post-production what the camera has captured, but also lets them deliver that content without losing any of that additional dynamic range or colour gamut. In this way, more of what was originally captured is retained through the post-production, grading, delivery and display processes. There are currently two main versions of HDR – the open-source HDR10 and Dolby's proprietary Dolby Vision.

    HDR10 is a loosely defined term used to describe a generic HDR experience that is sometimes referred to as SMPTE HDR, alluding to the SMPTE standards that are used to define it. An example of HDR10 is the generic or base-layer HDR that is specified for Ultra HD Blu-ray by the Blu-ray Disc Association (BDA). HDR10 uses 10-bit video (which is the 10 in HDR10) and the PQ EOTF defined by SMPTE ST 2084, uses the Rec. 2020 colour gamut and is coded with static metadata as defined by SMPTE ST 2086. That means that the information relating to the minimum, maximum and average brightness is encoded once for the entire feature, rather than on a scene-by-scene basis.
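
    As a rough illustration of what 'static' means here, the sketch below derives a single pair of per-title brightness figures of the kind usually carried alongside the mastering metadata (often as MaxCLL, the brightest pixel in the feature, and MaxFALL, the brightest frame average). The function name and the dummy frames are ours; real encoders derive these values as part of the mastering workflow.

        import numpy as np

        def static_light_levels(frames):
            # One maximum pixel value and one maximum frame-average for the
            # whole feature - a single set of numbers, however many scenes.
            max_cll = max(float(f.max()) for f in frames)
            max_fall = max(float(f.mean()) for f in frames)
            return max_cll, max_fall

        # Two dummy 'frames' of per-pixel luminance in nits: a dim scene and
        # one with a single bright specular highlight.
        dim = np.full((1080, 1920), 5.0)
        bright = np.full((1080, 1920), 50.0)
        bright[0, 0] = 1200.0
        print(static_light_levels([dim, bright]))   # (1200.0, ~50.0)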

    Dolby Vision is a proprietary end-to-end ecosystem solution that goes from production to grading and post-production, to mastering and encoding, and finally to distribution and playback. It shares much of the same technology as HDR10 but also includes features that are exclusive to Dolby. So it uses the PQ EOTF defined by SMPTE ST 2084, the Rec. 2020 colour gamut and static metadata as defined by SMPTE ST 2086. However, it also uses 12-bit video, dynamic metadata as defined by SMPTE ST 2094 and an Intelligent Display Mapping Engine. The addition of dynamic metadata means that information about the minimum, maximum and average brightness can be encoded on a scene-by-scene basis.
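
    By contrast with the static sketch above, a dynamic-metadata workflow records those statistics per scene. The following is only a schematic of the idea – the actual ST 2094 / Dolby Vision payloads are considerably more elaborate than a min/max/average triple.

        import numpy as np

        def scene_stats(scenes):
            # One metadata record per scene, rather than one per title;
            # each scene is a list of per-pixel luminance frames in nits.
            metadata = []
            for scene in scenes:
                pixels = np.concatenate([f.ravel() for f in scene])
                metadata.append({
                    "min_nits": float(pixels.min()),
                    "max_nits": float(pixels.max()),
                    "avg_nits": float(pixels.mean()),
                })
            return metadata   # consumed downstream by the display mapper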

    You can read more about HDR here.
    What differentiates Dolby Vision from HDR10?

    Although both HDR10 and Dolby Vision share a similar foundation, there are significant differences. First of all, HDR10 is a generic open-source form of HDR, which means that its implementation can vary depending on the manufacturer, studio or video streaming service. There is some standardisation thanks to SMPTE ST 2084 and ST 2086, but the lack of overall standardisation is one of HDR10's biggest drawbacks. There have been attempts to create a more cohesive approach to HDR10: in the Ultra HD Blu-ray Disc specifications, the Blu-ray Disc Association (BDA) has defined HDR10 as BDMV HDR, and the Ultra HD Alliance (UHDA) has announced its minimum requirements for HDR10 as part of its Ultra HD Premium certification program. However, there is still a lot of work to be done in terms of creating a single HDR10 standard.

    Conversely, Dolby Vision is an end-to-end closed ecosystem with everything defined by Dolby, from post-production and grading all the way through to delivery and display. Dolby Vision is also included in the specifications for 4K Ultra HD Blu-ray, but uses a 12-bit master with a colour space that can go up to Rec. 2020. It also uses the SMPTE ST 2084 EOTF and a peak brightness that could theoretically go as high as 10,000 nits. However, current Dolby professional monitors can 'only' go as high as 4,000 nits and use the DCI colour space. Dolby have taken a different approach when it comes to delivery, using a 10-bit base layer plus a 2-bit enhancement layer that together carry the full 12-bit signal and colour volumes up to 10,000 nits. Dolby also have a single-layer 10-bit solution aimed at broadcast and streaming applications.
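
    The arithmetic of that bit budget can be illustrated with a toy example: a 10-bit base value and 2 extra bits of precision recombine into a 12-bit code value. Be aware that this is only a schematic of the numbers involved – the real Dolby Vision enhancement layer carries a coded residual signal rather than raw extra bits.

        def combine_layers(base_10bit: int, extra_2bit: int) -> int:
            # 10 bits (0-1023) plus 2 bits (0-3) gives a 12-bit value (0-4095).
            assert 0 <= base_10bit < 1024 and 0 <= extra_2bit < 4
            return (base_10bit << 2) | extra_2bit

        print(combine_layers(1023, 3))   # 4095, full-scale 12-bit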

    As mentioned previously, Dolby Vision uses very large colour volumes, with luminance levels ranging from 0 (absolute black) up to 10,000 nits, and wider colour gamuts such as Rec. 2020. Since current display devices are unable to support the full extent of this range, each Dolby Vision display maps in real time from this large input colour volume to the colour volume of the actual display device. This process preserves as much of the fidelity of the original source's colour volume as possible, and it's this approach that gives Dolby Vision its advantage over HDR10.
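
    The sketch below shows the general shape of such a mapping with a toy knee-and-shoulder tone curve that squeezes, say, a 4,000-nit master onto a 1,000-nit panel. Dolby's actual mapping algorithm is proprietary and far more sophisticated; every name and number here is our own illustrative choice.

        def tone_map(nits: float, src_peak: float = 4000.0,
                     dst_peak: float = 1000.0, knee: float = 0.75) -> float:
            # Pass shadows and mid-tones through unchanged, then compress the
            # highlights above the knee into the display's remaining headroom.
            knee_nits = knee * dst_peak
            if nits <= knee_nits:
                return nits
            t = (nits - knee_nits) / (src_peak - knee_nits)   # 0..1 above knee
            return knee_nits + (dst_peak - knee_nits) * 2 * t / (1 + t)

        print(tone_map(100.0))    # 100.0 - mid-tones untouched
        print(tone_map(4000.0))   # 1000.0 - source peak lands at display peak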

    Dolby Vision offers improved optimisation by using dynamic metadata to adjust the contrast on a scene-by-scene basis. It also provides a scalable solution, because the combination of dynamic metadata and an intelligent mapping engine improves the colour accuracy, dynamic contrast and detail retention even on mainstream panels. The combination of a 12-bit signal, dynamic metadata and display management results in images with the highest fidelity and minimal distortion, replicating the creative intent. Thanks to its use of 12-bit video, 10,000 nits and a Rec. 2020 container, Dolby Vision also offers a degree of future-proofing, and a Dolby Vision display offers universal playback by being able to handle both Dolby Vision and HDR10.

    What are the real visual differences between HDR10 and Dolby Vision?

    Since Dolby Vision and HDR10 share much of the same technology, there would probably be little difference between the two on a well-specified flagship display. Both formats use the PQ EOTF, both use ST 2086 static metadata and could theoretically go up to 10,000 nits of peak luminance, and both use a Rec. 2020 container. The differences between HDR10 and Dolby Vision become more noticeable when the peak brightness of the TV differs from that of the HDR10 master, which is nominally 1,000 nits but can go up to 4,000 nits. In this situation the Dolby Vision version may have more detail and accuracy than the HDR10 version of the same content, because the Dolby Vision signal contains dynamic metadata that enables accurate reproduction, mapping and detail preservation.

    In the case of HDR10, the TV's system on chip (SoC) has to rely on extrapolation, since only static metadata is available. Dolby Vision's ability to reliably recreate HDR at various peak-brightness levels results in a scalable solution that can be implemented across performance tiers for the best-quality image on a range of TVs. However, there are plans to add dynamic metadata to HDR10 via SMPTE ST 2094, so this particular advantage of Dolby Vision may not last much longer. Although, as we mentioned earlier, there is still a lot of work to be done when it comes to HDR10 standardisation, and there are still unanswered questions relating to whether dynamic HDR can be delivered over HDMI 2.0a.
    How do you calibrate for Dolby Vision?

    One of the big advantages of Dolby Vision is how a display is calibrated. The calibration process uses absolute colour volumes as a precise way of describing colours, and thanks to display mapping, the absolute colour volumes of both the source content and the target display device are known. This allows for accurate mapping from the source content to the actual display's capabilities. Instead of a generic reference, a Dolby Vision display uses a 'golden reference' tailored specifically to that display. With SDR content you calibrate to known standards like BT.1886 and BT.709; the content was created to these standards, so by doing this you retain the director's intent. With Dolby Vision content you calibrate the colour volume of the display, so if the content has been produced beyond the capabilities of the display, it will be mapped to the display's capabilities. This ensures that the image is both accurate and makes full use of the display's native capabilities. It is achieved by comparing the EOTF and the colour primaries against the expected colour volume mapping using 'golden reference' values.
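
    In spirit, the verification step boils down to measuring the panel at a handful of stimulus levels and comparing the readings against the absolute ST 2084 target. The sketch below shows that idea only; the patch set, dummy meter readings and error report are ours, not Dolby's published procedure.

        def pq_target(signal: float) -> float:
            # Absolute SMPTE ST 2084 target luminance for a PQ code value.
            m1, m2 = 2610 / 16384, 2523 / 4096 * 128
            c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
            p = signal ** (1 / m2)
            return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

        # (PQ code value, luminance measured off the panel in nits) - dummy data.
        readings = [(0.200, 2.5), (0.400, 31.5), (0.508, 101.5), (0.650, 398.0)]

        for code, measured in readings:
            target = pq_target(code)
            error = 100 * (measured - target) / target
            print(f"code {code:.3f}: target {target:6.1f} nits, "
                  f"measured {measured:6.1f} nits, error {error:+.1f}%")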

    What Dolby Vision displays are there?

    At present there are only a limited number of Dolby Vision displays available, primarily from LG globally and Vizio in the US, although a number of Chinese manufacturers have also been looking at using Dolby Vision on their TVs. Dolby Vision's greatest strength, its end-to-end closed ecosystem, is also one of the reasons that many manufacturers are hesitant to embrace the format. Any manufacturer that uses Dolby Vision is essentially handing over development of their HDR platform to a third party. This can be appealing for a smaller manufacturer, but larger TV manufacturers like Samsung would rather develop their own HDR10 platform and work towards greater standardisation through the Ultra HD Alliance.

    There is also, of course, the issue of licence fees, which any manufacturer using Dolby Vision will have to pay; since HDR10 is open-source, it is free. It would seem that most TV manufacturers feel they can deliver an excellent HDR experience on their own, although only Samsung appear able to correctly map 4,000-nit content to their 1,000-nit panels. Where Dolby Vision does currently have an advantage is its dynamic metadata, which correctly tone-maps content to TVs with a less optimal peak brightness. This might well be why LG are using Dolby Vision on their OLED TVs, which can't reach a peak brightness of 1,000 nits. It remains to be seen whether more manufacturers will embrace Dolby Vision, but if the format is to succeed it needs to be available on more displays.

    Where can I find Dolby Vision content?

    The biggest issue facing Dolby Vision at the moment is in terms of content. The format has made headway with many of the video streaming services and Netflix use Dolby Vision globally. Amazon and VUDU are both using Dolby Vision in the States and Amazon will probably roll that out globally over time. The reason that Dolby Vision is proving so popular with video streaming services is the same reason that it is so effective with less optimal HDR TVs and that’s its dynamic metadata. Dolby Vision can help deliver a better and more consistent HDR experience than just static metadata on its own. Warner Bros, Sony Pictures, Universal, Lionsgate and MGM have all adopted Dolby Vision, at least for their streaming and download services.

    However, no studio is currently using Dolby Vision for Ultra HD Blu-ray, despite it being included in the format's specifications. One reason might well be that there are currently no Ultra HD Blu-ray players that support Dolby Vision. Another is that the highly specified disc format, with its larger bandwidth and greater consistency, doesn't really need Dolby Vision to deliver an excellent HDR experience. As with displays, Dolby Vision will need to begin making inroads into Ultra HD Blu-ray if it is to become the dominant HDR format. There have been rumours of Dolby Vision capable Ultra HD Blu-ray players being announced at CES 2017, but nothing has been officially confirmed yet.

    So is Dolby Vision the future of television?

    That really remains to be seen. There's no doubt that Dolby Vision is technologically impressive, and its end-to-end closed ecosystem means that a superior performance can be guaranteed. There is something very appealing about being able to calibrate a display so that content is perfectly mapped to that display's capabilities, and the use of dynamic metadata means that Dolby Vision can deliver a better experience in less ideal conditions. It appears to be fast becoming the format of choice for HDR streaming and there is plenty of studio support, although that support doesn't currently extend to Ultra HD Blu-ray. However, Dolby clearly has a close relationship with the majority of studios, all of whom have embraced the cinema version of Dolby Vision, and Dolby Vision enabled Ultra HD Blu-ray players are expected in early 2017. The big question is whether Dolby Vision can achieve better traction with the major TV manufacturers, because so far only LG are using the format in the UK. This will ultimately decide how successful Dolby Vision is as a domestic HDR format. Whatever happens, at least this time we can avoid the horrors of a format war, because Dolby Vision displays will also support HDR10 and Dolby Vision content will still deliver an HDR10 base layer for displays that don't support it.
