Qu: Will Dolby Vision transmit through a v2.0a amp in future?

dw89

Hi All,

Would like to update to, say, Yammy RXA-3050. I know this has HDMI v 2.0a and will, via update, transmit HDR10.

What I need to know is, when DV UHD players come out in future (e.g. maybe Oppo), will an amp that is not 'DV certified', like the 3050, pass the DV through to a DV TV (such as an LG OLED)?

What I don't want to do is buy now and then not be able to use DV through the amp in future. As I don't know the technical side of this, and as HDR10 needed an update, I'm not sure whether I'd be locked out or not.

Any advice appreciated.

Dave;)
 
Yes, only the player and the TV or PJ need to be Dolby Vision enabled. The receiver plays no part in the signal processing and simply passes the associated metadata through to the display.
 
Hi Dante,

Ta v much for the reply. Hear what you are saying (no reason to disbelieve you). My only question is, if that's the case, why does the latest update for the 3050 (which includes a DTS:X update) also state 'enables HDR', if it's only the TV/player that matters?

Many thanks in advance

Dave:)
 
The receiver you have didn't ship with HDMI 2.0a. HDMI 2.0a is required for the passage of the HDR metadata. The update your receiver got basically gives it the same capabilities as those associated with HDMI 2.0a; it couldn't pass any form of HDR metadata through prior to the update.

Dolby Vision is a proprietary variant of HDR. The metadata will be no different, but the source needs to be able to recognise the presence of Dolby Vision encoded content using a Dolby-authorised chipset, and the display needs the additional hardware ability to process it. Your receiver simply needs to be able to pass HDR metadata through and doesn't need Dolby Vision chipsets or certification.
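
As a rough illustration of that division of labour, here's a minimal Python sketch (the class names and fields are made up purely for illustration): the source recognises the Dolby Vision content, the receiver only relays the signal, and the display does the actual processing.

    # Hypothetical sketch of the chain described above. The AVR does no HDR
    # processing at all; it just hands the signal and its metadata onwards.

    class Source:                      # UHD player with a Dolby-authorised chipset
        def play(self, disc):
            return {"video": disc["video"], "hdr_metadata": disc.get("hdr_metadata")}

    class Receiver:                    # AVR: passthrough only, no processing
        def passthrough(self, signal):
            return signal              # hands the signal on untouched

    class Display:                     # Dolby Vision capable TV or projector
        def show(self, signal):
            if signal["hdr_metadata"] and signal["hdr_metadata"]["type"] == "dolby_vision":
                return "rendering Dolby Vision"
            return "rendering SDR/HDR10"

    disc = {"video": "UHD", "hdr_metadata": {"type": "dolby_vision"}}
    print(Display().show(Receiver().passthrough(Source().play(disc))))

In other words, the only thing the amp contributes is an unmodified handoff of whatever metadata the source sends.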
 
Ah right, thanks. My mistake (I don't have it... yet); I did actually think it was v2.0a from new! That clears it up anyway and I will buy it now;)

Thanks again

Dave:)
 
HDMI 2.0a is only required for HDR10. Dolby Vision should work fine with HDMI 2.0 devices. The only requirement, beyond hardware support for Dolby Vision in both the source and the display, is that all the devices in the chain accept a BT2020 signal, which usually means HDMI 2.0 minimum. Again, HDMI 2.0a compliance is not required for HDR with Dolby Vision. Dolby didn't want to wait for the new chipsets.
 
Nope, you still need at least HDMI 2.0a, or rather an HDMI chipset that is able to pass HDR metadata through, in order to pass the Dolby Vision metadata through. The HDMI passthrough requirements are the same for both HDR10 and Dolby Vision. Dolby Vision has to comply with the standards that allow for HDR10, and this is how it was accepted as an optional aspect of UHD. Older HDMI 2.0-equipped devices that haven't got a chipset that can be updated for HDR will not pass it through or be compliant with Dolby Vision's requirements.
 
You are mistaken and are confusing the content with the hardware.
Dolby Vision is an optional layer on UHD Blu-ray on top of HDR10, but the Dolby Vision source and the Dolby Vision display do NOT need HDMI 2.0a to be able to communicate and display Dolby Vision HDR properly. This is why, for example, we can already calibrate for Dolby Vision using pattern generators on current GPUs which do not have HDMI 2.0a. This isn't possible for HDR10, which needs HDMI 2.0a-compliant hardware through the whole chain.

However, if your AVR is only HDMI 2.0 compliant and isn't HDMI 2.0a compliant, you will not be able to pass HDR10 through, only Dolby Vision.

I don't have the time to explain further but please check your sources, you are misinformed. Feel free to post a link from Dolby proving that HDMI 2.0a is required if you believe I am incorrect.

Dolby didn't want to wait for the HDMI 2.0a hardware to be available, this is why they bypassed the specification and require specific hardware in both the source and the display. There is zero requirement for HDMI 2.0a for Dolby Vision itself.

It is unlikely you'll find a Dolby Vision source or display without HDMI 2.0a, but an HDMI 2.0a AVR is NOT required to play back Dolby Vision content. HDMI 2.0 should be fine as long as BT2020 content is accepted, as I said initially.
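
To summarise the requirements being described here as a minimal sketch (my reading of this post, not an official Dolby rule set; the capability flags are hypothetical): HDR10 needs HDMI 2.0a-style metadata support on every device in the chain, while Dolby Vision needs Dolby Vision hardware in the source and display plus BT2020 acceptance everywhere in between.

    # Hypothetical capability flags, purely to illustrate the two rules above.

    def chain_supports_hdr10(devices):
        return all(d.get("hdmi_2_0a") for d in devices)

    def chain_supports_dolby_vision(source, middle_devices, display):
        return (source.get("dolby_vision")
                and display.get("dolby_vision")
                and all(d.get("bt2020") for d in [source, *middle_devices, display]))

    source  = {"dolby_vision": True,  "bt2020": True, "hdmi_2_0a": True}
    avr     = {"dolby_vision": False, "bt2020": True, "hdmi_2_0a": False}  # plain HDMI 2.0 AVR
    display = {"dolby_vision": True,  "bt2020": True, "hdmi_2_0a": True}

    print(chain_supports_hdr10([source, avr, display]))         # False: the AVR blocks HDR10
    print(chain_supports_dolby_vision(source, [avr], display))  # True: DV still gets through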
 
My apologies if I am wrong. I simply assumed that because Dolby Vision is still reliant upon metadata, it would require the same provision for this additional metadata as that made for HDR10?

If that's not the case, does this mean that Dolby Vision content can't convey the HDR10 data it can apparently include for setups that aren't Dolby Vision enabled but can deal with HDR10? Would Dolby Vision be able to convey this via a non-HDR-compliant AV receiver to a non-Dolby Vision, HDR10-enabled UHD TV, for example? Dolby have said that studios can include the HDR10 data within Dolby Vision encoded content, but is it embedded within the Dolby Vision metadata or would it be the same as conventional HDR10 metadata?

By the way, I never implied HDMI 2.0a had anything to do with processing of either HDR10 or Dolby Vision metadata. I'm perfectly aware that it is the source and the display that require a specific chipset in order to process the Dolby Vision data. The receiver is simply passing that data through and not processing it. The HDMI chipset needed to do this is what I was referring to.
 
No problem :)

I'll try to clarify. It works like this:

All UHD Blu-ray titles have a mandatory HDR10 layer; that's part of the specs. So if your equipment can play HDR10 (i.e. all devices in the chain are HDMI 2.0a compliant), all UHD Blu-ray titles will play in HDR using the HDR10 layer and its associated static metadata.

If a title supports Dolby Vision, it has the mandatory HDR10 layer and its associated static metadata, plus an optional Dolby Vision layer (1080p resolution only) which, when played in Dolby Vision, brings the bit depth to 12 bits instead of the 10 bits of HDR10. It also uses dynamic HDR metadata, which doesn't use HDMI 2.0a as it's not the same HDR metadata as HDR10 (which is static).
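
As a purely illustrative comparison (these field names are made up and are not the actual bitstream formats), the static-versus-dynamic difference looks roughly like this:

    # HDR10: one static block of metadata that applies to the whole title.
    hdr10_static_metadata = {
        "mastering_display": {"max_nits": 1000, "min_nits": 0.005},
        "max_cll": 1000,    # brightest pixel anywhere in the title
        "max_fall": 400,    # brightest average frame in the title
    }

    # Dolby Vision: dynamic metadata that can change per scene (or per frame).
    dolby_vision_dynamic_metadata = [
        {"scene": 1, "target_nits": 350, "note": "dark interior"},
        {"scene": 2, "target_nits": 900, "note": "bright exterior"},
        # ... one entry per scene for the rest of the film
    ]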

If you want to play this Dolby Vision title in HDR10, you need all the devices in the chain to support HDMI 2.0a. There is no need to support Dolby Vision at all in any device.

If you want to play this title in Dolby Vision, you need Dolby Vision-specific hardware support in both the source and the display, but because it uses its own metadata, and uses that specific hardware in the source and the display to negotiate a handshake and switch the display into HDR when HDR content is detected, it doesn't need HDMI 2.0a anywhere. However, all the devices have to support BT2020 content, otherwise it won't work, HDR or not. This likely means HDMI 2.0 at least, but there might be some exceptions (for example, if a Radiance Pro in the chain converts the HDR BT2020 content to SDR P3 or Rec709 depending on the capabilities of the display, once that feature is added, and we don't know yet if it will support Dolby Vision or not).

In practice, most recent and future devices supporting Dolby Vision will also support HDMI 2.0a, therefore HDR10, but HDMI 2.0a is not a requirement to pass Dolby Vision content through because it doesn't use the same HDR metadata as HDR10 (dynamic vs static). It really doesn't care about HDMI 2.0a as it doesn't use it at all. For example, I think some of the early Vizio Dolby Vision displays support Dolby Vision but not HDR10 because they don't have HDMI 2.0a support (again, Dolby didn't want to wait for these to launch Dolby Vision). These Dolby Vision displays won't be able to play HDR10 titles in HDR, but they will play Dolby Vision titles in HDR. Does your head hurt by now? :)
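
Putting those cases together, here is a rough decision sketch based on the explanation above (again, the capability flags are made up for illustration, not an official compatibility test):

    # Rough decision tree for how a Dolby Vision UHD Blu-ray title plays back,
    # based on the explanation in this post. Flags are hypothetical.

    def playback_mode(disc_has_dv_layer, source, avr, display):
        bt2020_ok = all(d["bt2020"] for d in (source, avr, display))
        if disc_has_dv_layer and source["dolby_vision"] and display["dolby_vision"] and bt2020_ok:
            return "Dolby Vision (12-bit, dynamic metadata)"
        if all(d["hdmi_2_0a"] for d in (source, avr, display)):
            return "HDR10 (10-bit, static metadata)"
        return "SDR (on-the-fly conversion)"

    source  = {"dolby_vision": True,  "bt2020": True, "hdmi_2_0a": True}
    avr     = {"dolby_vision": False, "bt2020": True, "hdmi_2_0a": False}  # HDMI 2.0-only AVR
    display = {"dolby_vision": False, "bt2020": True, "hdmi_2_0a": True}   # HDR10-only TV

    print(playback_mode(True, source, avr, display))

In this example the HDMI 2.0-only AVR blocks HDR10 and the display can't do Dolby Vision, so playback would fall back to an on-the-fly SDR conversion.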

We'll have to wait for HDMI 2.1 for HDR10 to support dynamic metadata, and we don't know yet if this will mean new silicon (hence new hardware) or a simple f/w upgrade (which might be possible on high end consumer equipment like Radiance Pro or D&M X7200WA or 8802, but not on Sony or JVC projectors for example).

One important thing to note as well is that, unlike streaming, which can offer a compatibility layer with SDR, there is no SDR layer on UHD Blu-ray titles with HDR (not enough space on the discs, especially on discs with Dolby Vision, which need room for the optional DV 1080p layer), so if you play the title in SDR BT2020 it's an on-the-fly conversion. If you can't get the wider gamut (at least P3 or close to it) and the higher bit depth, it's probably a better idea to upscale the Blu-ray instead, unless you need the immersive audio track.

Hope this helps :)
 
