The tv accepts 12 bit 444 and the player defaults to that, so nothing should be exceeded.
We've discussed this: the tv has a 500nit max luminance. Everything above that is already tone mapped.
I was saying that when tested the tv does not clip - i.e. it tone maps - until it hits just over 1400nit.
What I was saying was that with those settings there was a lot of over-contrasty clipping going on.
Set to auto - still doing full fat hdr and 12 bit 444 output - but this time the tone mapping was much better. No more over-contrasty image.
In the handshake there are now two aspects being negotiated that weren't before: resolution and chroma. If I set to 444 instead of auto and 4k instead of auto, these are not negotiated, just output.
The output is 4k and 444 either way. All at 12 bit.
Letting the handshake negotiate it automatically seems to be the best way.
If I set it to output 4k and 444 it tells me it's doing a dynamic range conversion. It still outputs 4k at 12 bit 444, and enables hdr and rec2020.
If I recall correctly the MaxCLL in the container for 2012 is 4000nit, but the MaxFALL was something in the region of 1600nit. And the tv doesn't tone map at that level.
However, the player set to tone map to 500 nits fits the tv perfectly and always does a good job.
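For what it's worth, here's a rough Python sketch of the difference between clipping and tone mapping down to a 500 nit panel. The knee point and curve shape are my own illustrative guesses, not the player's actual maths:

```python
# Minimal sketch: hard clipping vs tone mapping to a 500 nit display.
# The knee point and curve are illustrative assumptions only.

def hard_clip(nits, display_peak=500.0):
    """Anything above the display peak is lost - a detail-free white blob."""
    return min(nits, display_peak)

def tone_map(nits, display_peak=500.0, knee=0.75):
    """Pass through below the knee, then roll highlights off smoothly so
    detail above the display peak is compressed rather than discarded."""
    knee_nits = knee * display_peak            # e.g. 375 nits: below this, untouched
    if nits <= knee_nits:
        return nits
    # squeeze everything from the knee up to the content peak (assume the
    # 4000nit MaxCLL of the 2012 disc) into the remaining headroom
    content_peak = 4000.0
    excess = (nits - knee_nits) / (content_peak - knee_nits)
    return knee_nits + (display_peak - knee_nits) * excess ** 0.5

for n in (300, 500, 1000, 4000):
    print(n, hard_clip(n), round(tone_map(n), 1))
```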
2012 isn't HDR 10+.
I had made a case for the HDR10+ on this tv being naff - but now I'm reviewing that opinion because it looks great.
Just to clarify the initial point I was making on the previous page:
When I assumed the tv was struggling to tone map 1000nit content, and thought the optimizer did a better job with anything over 500nit, I had noticed that the HDR10+ was assuming a 1000nit output. That, I thought, was then being further tone mapped by the tv, which is why I thought using the optimizer on the hdr10 layer was a better option than hdr10+.
Now I find the 10+ is fine because I've set the output to auto in chroma and resolution.
I can't logically explain why outputting 12 bit 444 at 4k should make any difference to outputting the same spec in 'auto' mode.
It has to be in the handshake. But why, I have no idea.
Ah yes, you're right. I didn't factor in the frame rate being 24fps. That would bring the data rate to around the 13.5Gb/s mark, well within the bandwidth threshold of HDMI v2.0b.
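Rough sums, for anyone who wants to check (this assumes the standard CTA-861 total timing for 3840x2160@24, which gives the familiar 297MHz pixel clock):

```python
# Back-of-envelope HDMI bandwidth check for 4K24, 12 bit, 4:4:4.
h_total, v_total, fps = 5500, 2250, 24
pixel_clock = h_total * v_total * fps      # 297,000,000 Hz

# At 12 bit the TMDS clock runs at 1.5x the pixel clock (12/8), and each
# of the 3 TMDS channels carries 10 bits per clock (8b/10b coding).
tmds_clock = pixel_clock * 12 // 8         # 445,500,000 Hz
data_rate = tmds_clock * 3 * 10            # bits per second

print(f"{data_rate / 1e9:.3f} Gb/s")       # 13.365 Gb/s - under HDMI 2.0b's 18 Gb/s
```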
I think I'm confused by your terms. Contrasty for example, used in the same context as clipping. To me, contrast means the difference between light and dark, so in a cloud for example, if it was contrasty, I'd be able to see all the detail in that cloud. By contrast (pun not intended), if that cloud was clipped, I'd just see a white blob, with no detail at all.
Out of interest, what test patterns are you using to judge when the display clips?
With regards to the 'handshake': the player will interrogate the TV's EDID and gain an understanding of its capabilities. In this example, it will see that the TV can accept a YCC 4:4:4 video signal with 12 bit colour depth. If you try and force a format out of the player that the TV can't accept, you typically won't get a picture on the TV. That's what 'Auto' is for. In essence, the source will interrogate the TV's EDID and, if it sees the TV can't accept a full chroma feed at 12 bit, output something it can accept - YCC 4:2:0 @ 10 bit, for example.
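In pseudo-code, the Auto logic boils down to something like this. To be clear, this is just a sketch - the format list is made up for illustration, it's not a real EDID parser:

```python
# Hypothetical sketch of 'Auto' output selection after reading the sink's EDID.
# The format strings are illustrative, not real EDID fields.
def pick_output(sink_formats):
    """Walk down from the richest format to the first one the sink advertises."""
    preferred = ["YCC 4:4:4 @ 12 bit", "YCC 4:2:2 @ 12 bit",
                 "YCC 4:2:0 @ 10 bit", "YCC 4:2:0 @ 8 bit"]
    for fmt in preferred:
        if fmt in sink_formats:
            return fmt
    raise RuntimeError("no common format - forcing one anyway means no picture")

# A TV that can't take a full chroma feed at 12 bit:
print(pick_output({"YCC 4:2:0 @ 10 bit", "YCC 4:2:0 @ 8 bit"}))
# -> YCC 4:2:0 @ 10 bit
```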
When you state even with a Dynamic Range Conversion you're still getting the full YCC 4:4:4 @ 12 bit, how can you be sure? Does your AVR or TV tell you this is what it's receiving?
If you're seeing a difference when you're forcing YCC 4:4:4 @ 12 bit compared to using the Auto setting, that's probably because it's not doing what you think it's doing.
Just to add, there's no benefit to having the player output at 12 bit colour depth. The content is graded at 10 bit, so the player is adding information that isn't there, only for your TV to then convert it back to 10 bit to display - possibly losing genuine information in the process.
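To put numbers on it (zero-padding here is an assumption - some players dither instead - but either way no new picture information appears):

```python
# Toy round trip: a 10 bit code value padded to 12 bit and truncated back.
code_10bit = 523                  # a 10 bit luma value (0-1023)
code_12bit = code_10bit << 2      # padded to 12 bit (0-4095): 2092
back_to_10 = code_12bit >> 2      # the TV's conversion back down: 523 again
assert back_to_10 == code_10bit   # best case a no-op; worst case rounding loss
```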
The same could be said for using YCC 4:4:4 when the content is encoded as YCC 4:2:0, but it does have to get from YCC 4:2:0, through YCC 4:2:2, to YCC 4:4:4 for the TV to convert to RGB to display. The TV could do this entire process but the Panasonics are supposed to have first class YCC processing, so it's conceivable the player will do a better job of this than your TV.
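In code terms, the chain looks roughly like this. Nearest-neighbour repetition is used purely for clarity - real players and TVs use proper interpolation filters, which is exactly where the quality difference comes in:

```python
import numpy as np

# Naive sketch of the 4:2:0 -> 4:2:2 -> 4:4:4 chroma upsampling chain.
def chroma_420_to_422(c):
    return np.repeat(c, 2, axis=0)      # restore vertical chroma resolution

def chroma_422_to_444(c):
    return np.repeat(c, 2, axis=1)      # restore horizontal chroma resolution

cb_420 = np.arange(4).reshape(2, 2)     # one chroma sample per 2x2 luma block
cb_444 = chroma_422_to_444(chroma_420_to_422(cb_420))
print(cb_444.shape)                     # (4, 4): full resolution, ready for RGB
```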
It would be interesting to see what the HDR Optimiser sets the MaxCLL to in your example, as reported to the TV. From what I've seen when set for an OLED TV, it sets it to 1000 nits, meaning for your 2012 disc it'll pre-tone map any part of the content above that. The TV will then tone map what it receives from the player as though it was a 1000 nit disc.
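In other words, something like this two-stage process. Both curves below are stand-ins - the Optimiser's actual transfer function isn't public - but the structure is the point:

```python
# Sketch of the two-stage tone mapping described above. The exponent is an
# illustrative stand-in, not the HDR Optimiser's real curve.
def roll_off(nits, src_peak, dst_peak):
    """Generic highlight compression from one peak level down to another."""
    return dst_peak * min(nits / src_peak, 1.0) ** 0.7

disc_highlight = 2000.0                        # a bright highlight on the 2012 disc
stage1 = roll_off(disc_highlight, 4000, 1000)  # player pre-maps for a 1000 nit display
stage2 = roll_off(stage1, 1000, 500)           # TV then maps its '1000 nit disc'
print(round(stage1), round(stage2))            # 616 then 356
```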
Paul