Panasonic DP-UB820EB, DP-UB420EB and DP-UB320EB Owners Thread

Awesome. Thank you. Got that and will check the modes out. I must download the manual; there wasn't one in the box, just a setup guide. Going back to the high clarity option, does it provide any benefit for CD playback? Thanks again
Not that I've ever noticed. I just use it to turn the display off when playing Blu-rays.
 
Making what you have work is most of the fun. 🤓

As we've seen with the Tenet disc metadata, what's being delivered to the consumer is far from what it should be, and that's been the case since HDR10's inception (pun intended....). So it doesn't look to be getting any better any time soon. The TV, any TV, really does deserve better to work with.

Paul
Hah. I had forgotten that a while ago I was playing with settings and set the player to output 444 rather than automatic and the resolution to 4k rather than automatic.
I noticed that I was getting 'dynamic range conversion' message on startup rather than HDR.
Netflix and Amazon were not enabling HDR either, just UHD.
I put them back to defaults at auto and Dawn of the Dead looked spot on in HDR10+.
Netflix and Amazon now at HDR.
Logically, my settings should have just meant a reduced number of handshakes in the chain, but something's obviously needed in the auto aspect to fully enable whatever it needs.
So now the 1000 nit discs don't seem to be clipping too bright after all :0)
So I then put in my new 2012 disc, which is rated at 4000 nits with a max luminance of over 1000 nits, and it actually looked really good with the Optimizer off, except for some clipping in the very bright highlights. The Optimizer tamed those really well.
So that does seem to tally with how the TV starts to clip at around 1400 nits when using test patterns. Maybe the TV does a fair job at tone mapping up to 1400 nits after all.
Makes me wonder how many times I might have judged the performance of the TV and adjusted accordingly after a bad handshake.
 
If the player was forcing YCC 4:4:4 with HDR content, the bandwidth available out of the HDMI port would have been exceeded. I'm not sure exactly what the player would do with the video in this scenario, but I suspect it would convert it to SDR, and then the 'Dynamic Range Conversion' message would make sense too. I had that message when attempting to play back HLG content, but there didn't seem to be any rhyme or reason to it. I had to reset the player to defaults to get HLG playback again, but after setting the player up again in what was ostensibly the same way, HLG continued to play back just fine.

I'm not sure what you mean about the number of handshakes being reduced, but the only real handshake is at the commencement of the disc being played, where HDCP is negotiated. If you mean the chroma processing and resolution upscaling, these would have happened anyway, no matter whether the player or the TV is set to perform them.

With regards to 2012, is the MaxCLL of that disc 4000 nits or just over 1000 nits? If it's the latter and your TV is capable of a maximum output of 1400 nits, there won't be any tone mapping going on, as none's required. That doesn't mean the TV won't clip the specular highlights on that particular disc though; its picture processor may just clip anything over a specific video level.
If it's the former, any content above 1400 nits will need to be tone mapped to fit into the TV's capabilities, as sketched below.
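
To picture the difference between the two behaviours, here's a toy sketch in Python (illustrative curves only, not any manufacturer's actual algorithm): hard clipping renders everything above the display's limit as the same flat white, while tone mapping compresses the range above a knee so highlight detail survives.

def clip(nits, display_max=1400.0):
    # hard clip: every value above display_max renders as the same white
    return min(nits, display_max)

def tone_map(nits, content_max=4000.0, display_max=1400.0):
    # linear compression of everything above a knee, for illustration only
    knee = display_max * 0.5
    if nits <= knee:
        return nits
    return knee + (nits - knee) * (display_max - knee) / (content_max - knee)

print(clip(2000), clip(4000))                        # 1400.0 1400.0 - detail lost
print(round(tone_map(2000)), round(tone_map(4000)))  # 976 1400 - detail kept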

Paul
 
The TV accepts 12 bit 444 and the player defaults to that, so nothing should be exceeded.
We've discussed this: the TV has a 500 nit max luminance. Everything above that is already tone mapped.
I was saying that when tested, the TV does not clip - i.e. it tone maps - until it hits just over 1400 nits.
What I was saying was that with those settings there was a lot of over-contrasty clipping going on.
Set to auto - still doing full fat hdr and 12 bit 444 output - but this time the tone mapping was much better. No more over-contrasty image.
In the handshake there are now 2 aspects being negotiated that were not before. Resolution and chroma. If I set to 444 instead of auto and 4k instead of auto, these are not negotiated, just output.
The output is 4k and 444 either way. All at 12 bit.
Letting the handshake negotiate it automatically seems to be the best way.
If I set it to output 4K and 444 it tells me it's doing a dynamic range conversion. It still outputs 4K 12 bit 444, and enables HDR and Rec.2020.
If I recall correctly the MaxCLL container for 2012 is 4000 nits, but the MaxFALL was something in the region of 1600. And the TV doesn't tone map at that level.
However, the player set to tone map to 500 nits fits the TV perfectly and always does a good job.
2012 isn't HDR10+.
I had made a case for the HDR10+ on this TV being naff, but now I'm revising that opinion because it looks great.

Just to clarify the initial point I was making on the previous page:
When I assumed the TV was struggling to tone map 1000 nits and thought that the Optimizer did a better job with over 500 nit output, I had noticed that the HDR10+ was assuming a 1000 nit output. That, I thought, was being further tone mapped by the TV. Hence why I thought using the Optimizer on the HDR10 layer was a better option than HDR10+.
Now I find the 10+ is fine because I've set the output to auto in chroma and resolution.
I can't logically explain why outputting 12 bit 444 at 4K should make any difference to outputting the same spec in 'auto' mode.
It has to be in the handshake. But why, I have no idea.
 
Ah yes, you're right. I didn't factor in the frame rate being 24fps. That would bring the data rate to around the 13.5 Gb/s mark, well within the bandwidth threshold of HDMI v2.0b.
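
For anyone wanting to check the arithmetic, here's a rough back-of-envelope sketch in Python, using the standard CTA-861 base pixel clock for 3840x2160p24 (297 MHz, which already includes blanking):

# TMDS rate for 4K24 YCbCr 4:4:4 at 12-bit over HDMI 2.0b
pixel_clock_8bit = 297e6                   # Hz, base pixel clock for 3840x2160p24
tmds_clock = pixel_clock_8bit * (12 / 8)   # TMDS clock scales with colour depth
# three TMDS channels, 10 bits on the wire per 8 bits of data (8b/10b coding)
total_rate = tmds_clock * 3 * 10
print(f"{total_rate / 1e9:.3f} Gb/s")      # 13.365 Gb/s, inside the 18 Gb/s limit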

I think I'm confused by your terms. Contrasty for example, used in the same context as clipping. To me, contrast means the difference between light and dark, so in a cloud for example, if it was contrasty, I'd be able to see all the detail in that cloud. By contrast (pun not intended), if that cloud was clipped, I'd just see a white blob, with no detail at all.

Out of interest, what test patterns are you using to judge when the display clips?

With regards to the 'handshake': the player will interrogate the TV's EDID and gain an understanding of its capabilities. In this example, it will see that the TV can accept a YCC 4:4:4 video signal with 12 bit colour depth. If you try to force that out of the player and the TV can't accept it, you typically won't get a picture on the TV. That's what 'Auto' is for. In essence, the source will interrogate the TV's EDID, see it can't accept a full chroma feed at 12 bit, and then output something it can accept; YCC 4:2:0 @ 10 bit, for example.
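
As a minimal sketch of that 'Auto' fallback logic (hypothetical function and format names; the player's real EDID handling is obviously more involved):

# pick the best output the sink advertises, falling back in order
def pick_output_format(edid_formats, preferred):
    fallbacks = [preferred, ("4:2:2", 12), ("4:2:0", 10), ("4:2:0", 8)]
    for fmt in fallbacks:
        if fmt in edid_formats:
            return fmt
    return ("4:2:0", 8)   # last-resort format every sink must accept

tv_caps = {("4:4:4", 12), ("4:2:2", 12), ("4:2:0", 10), ("4:2:0", 8)}
print(pick_output_format(tv_caps, ("4:4:4", 12)))   # -> ('4:4:4', 12)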

When you state that even with a Dynamic Range Conversion you're still getting the full YCC 4:4:4 @ 12 bit, how can you be sure? Does your AVR or TV tell you this is what it's receiving?
If you're seeing a difference when you're forcing YCC 4:4:4 @ 12 bit compared to using the Auto setting, that's probably because it's not doing what you think it's doing.

Just to add, there's no benefit to having the player output at 12 bit colour depth. The content is graded at 10 bit, so the player is adding information that's not there, for your TV to then convert back to 10 bit to display, possibly losing genuine information in the process.
The same could be said for using YCC 4:4:4 when the content is encoded as YCC 4:2:0, but it does have to get from YCC 4:2:0, through YCC 4:2:2, to YCC 4:4:4 for the TV to convert to RGB to display. The TV could do this entire process but the Panasonics are supposed to have first class YCC processing, so it's conceivable the player will do a better job of this than your TV. There's a toy illustration of the two upsampling stages below.
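
Here's that two-stage chain as a toy NumPy sketch (nearest-neighbour repetition for simplicity; real players use multi-tap interpolation filters, which is where the quality differences live):

import numpy as np

def upsample_420_to_422(chroma):    # double the vertical chroma resolution
    return np.repeat(chroma, 2, axis=0)

def upsample_422_to_444(chroma):    # double the horizontal chroma resolution
    return np.repeat(chroma, 2, axis=1)

chroma_420 = np.arange(4).reshape(2, 2)   # 2x2 chroma plane for a 4x4 image
chroma_444 = upsample_422_to_444(upsample_420_to_422(chroma_420))
print(chroma_444.shape)   # (4, 4): full chroma resolution, ready for RGB conversion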

It would be interesting to see what the HDR Optimiser sets the MaxCLL to, as reported to the TV in your example. From what I've seen when set for an OLED TV, it sets it at 1000 nits, meaning for your 2012 disc, it'll pre-tone map any part of the content above that. The TV will then tone map what it receives from the player as though it was a 1000 nit disc.
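
The shape of that pre-tone-mapping would be something like this sketch (a generic soft-knee curve purely for illustration; Panasonic's actual Optimiser curve isn't published):

# roll highlights off so everything fits inside a 1000 nit container
def pre_tone_map(nits, target=1000.0, knee=0.75):
    knee_point = target * knee
    if nits <= knee_point:
        return nits                    # below the knee: pass through untouched
    excess = nits - knee_point
    headroom = target - knee_point
    return knee_point + headroom * (excess / (excess + headroom))

for n in (500, 1000, 2000, 4000):
    print(n, "->", round(pre_tone_map(n)))   # 500, 875, 958, 982: never exceeds 1000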

Paul
 
Gosh Paul, you're putting in a lot of effort in your replies; turning it into a bit of an investigation.
I hope you're not seeing my comments as a problem put out for help? I'm not, so don't feel obliged to try and solve a 'problem' - it's not what I'm doing.
You should recall I'm using the basic setting for the Optimizer - which sets the MaxFALL to 500 nits.
In HDR10+ Dawn of the Dead, MaxFALL is at 1000.
The player informs me what is being output and what is on the disc. In both setups it stated 12 bit 444 24Hz. The TV is rated at this spec and has never had a problem with it.
I disagree with you about the benefits of the chroma upscaling. After several years of tinkering I definitely find the 12 bit 444 24p output from the Panny players to look superior to any 'lower' setting. Believe me, I've used them all back and forth.
I've got Ray Masciola's test patterns and the S&M UHD discs.
BTW I still prefer the picture for HDR10 with the player tone mapping to 500 nits rather than the TV doing it.
By contrasty I mean too much contrast - too much light causing clipping.
 
No worries. In 'real' life, I tend to have a conversation with myself, when I'm attempting to understand something. In these forums, I just put it all down on the page. :D

But I think you're agreeing with me re Panasonic's superior chroma processing, as that's what I wrote. Of course as I have a Panasonic TV, it makes no odds to me whether I let the player or TV do that part of the video processing. I can't agree regarding the player outputting 10 bit content as 12 bit though. The player has to add something (quite a lot of something in fact) that's not there in the source content, for the TV to then take (quite a lot of) something away again. That can't ever be a good thing for the purist.

Thanks for the heads up on the test patterns; I have both those you've listed, so should probably make use of them.

Paul
 
Cool.
FYI I've just popped the 2012 disc in and here's the metadata:

MaxCLL 4451
MaxFALL 514
Max Luminance 4000
(I think it's the first time I've seen a disc with that luminance and a greater MaxCLL)

Optimizer on:
500
400

I do agree with your logical take on the mechanics of chroma upscaling. Makes absolute sense.
It's just that the image does look better to me, and I trust that Panasonic knows a lot more about what makes a good picture in their player-TV tech than me.
I've always gone back to the 12 bit 444 after tweaking to 10 bit or 422 or 420.
I see a difference more in the SDR content. At 444 something happens - a bit of extra magic. Dunno why, because as you say, logic states there's extra processing going on. And we purists tend to have the sense that more processing is less 'pure' and shouldn't look better.
 
Paul,

It's interesting that you are discussing this 4:4:4 12-Bit aspect, because it's something I haven't been able to get my head around nor get concrete feedback on. Perhaps you can help...

I am running a UB9000, not 820, but for the purposes of this discussion they are basically the same. The unit is connected to a Samsung NU8000 LCD panel via HDMI, and everything in the Panny is basically on automatic. This includes:

Resolution: Auto
4K 60p Output: 4K (60p) 4:4:4
Color Mode: YCbCr Automatic
Deep Color Output: Auto (12-Bit Priority)

HDR/Color Gamut Option: HDR/BT.2020 (Auto)

When using these settings, the player outputs the following when watching 1080p Blu-rays and UHD 4K Blu-rays:

4K/24P YCbCr 4:4:4/12-Bit

With DVDs, the player is outputting:

4K/60P YCbCr 4:2:2/12-Bit

Now, I understand why DVDs are being output the way they are (because the signal is in 60P/4K so a full 12-Bit/4:4:4 transfer isn't possible), but with the Blu-rays, it seems weird that my TV is telling the player, via the automatic selections, that it can accept 12-Bit 4:4:4 when all reports say that this model Samsung can only support 12-Bit 4:4:4 at 24P if the panel is in PC picture mode -- which mine is not.

Is this because the tele can accept this signal, but it can't display it? If this is so, what is the tele doing with the 12 bits and 4:4:4 color space...is it "reducing" it down to 10 bits (or 8 bits with dithering)?

I also understand that the Panasonic's chroma upsampling technology/feature, which you were also discussing above, is heavily marketed and preferred by many videophiles, and that with these automatic settings, the player's processor is implementing these algorithms to bring the color space up to 4:4:4 from 4:2:0 and the color depth up, too...do I have this right?

Here's what someone who has great knowledge in these things from another forum said about this, but it was a little lost on me:

4K/24 YCbCr 4:4:4 12-bit is not anything exotic for a modern, HDR TV. The TV may not actually use the extra bits to best advantage, but it shouldn’t screw up.

Digital video is, necessarily, only an approximation of what’s actually an Analog process. Any digital representation will have inherent errors built in, because it can not record a value between any two adjacent digital values — what’s known as “quantization error”.

Digital video processing exacerbates such inherent errors unless you are really careful how you do it. This is probably deeper than you want to go, but if interested, do searches for “digital signal processing” and “information theory” (which is the field of mathematics which dives into this stuff).

There are known techniques for doing this video processing -- this math -- “more carefully”, but they are complicated. Much simpler, if you can afford the hardware processing power to do it, is to do the math at a higher bit depth. This helps because you can now preserve the rounding that shows up in your digital results. That is, you can record intermediate results in 12-bit that are BETWEEN the closest results you could get in 8-bit.

Better players and TVs will do their video processing at higher than 8-bit for just this reason. (Of course HDR-10 video is 10-bit and Dolby Vision is 12-bit, so you’d have to do higher than 8-bit for any video processing on those.)

Right now, the upper limit you can send on the HDMI cable is 12-bit, so it is normal for a better player to produce 12-bit during its video processing, and pass that on to the TV — which will hopefully use those extra rounding bits correctly. (Dolby Vision is handled weirdly, so just ignore it for this discussion.)

In the TV, the final step of converting the resulting, now fully processed, digital video into light actually output by the pixels is a conversion from Digital to Analog — a DAC for video. The fact the pixels can’t render adjacent brightness/color levels as finely as 12-bit has nothing to do with whether or not it is a good idea to feed 12-bit Digital video into that video DAC.

As I said in my blog post, the reason 4:2:0 is an exception on the HDMI cable is that it would require the TV to buffer multiple lines of video to do the vertical part of the color upsampling. That complicates the processing in the TV (likely making it more expensive).

Any thoughts on this?
 
Now, I understand why DVDs are being output the way they are (because the signal is in 60P/4K so a full 12-Bit/4:4:4 transfer isn't possible), but with the Blu-rays, it seems weird that my TV is telling the player, via the automatic selections, that it can accept 12-Bit 4:4:4 when all reports say that this model Samsung can only support 12-Bit 4:4:4 at 24P if the panel is in PC picture mode -- which mine is not.

Is this because the tele can accept this signal, but it can't display it? If this is so, what is the tele doing with the 12 bits and 4:4:4 color space...is it "reducing" it down to 10 bits (or 8 bits with dithering)?
The TV won't display YCC 4:4:4; it displays in RGB. Hence the whole chain of processing and conversion from what's stored on the disc (YCC 4:2:0, no matter what format) to what the TV actually displays (RGB).

From what you've written, it seems that the TV will indeed accept 12 bit YCC4:4:4 at 24fps, without being in PC mode. How official are the reports that it can't do this? I'm always sceptical of such reports until I can trace them back to an official statement or document.
Also, how are you verifying the TV does actually accept that format?
I also understand that the Panasonic's chroma upsampling technology/feature, which you were also discussing above, is heavily marketed and preferred by many videophiles, and that with these automatic settings, the player's processor is implementing these algorithms to bring the color space up to 4:4:4 from 4:2:0 and the color depth up, too...do I have this right?
Yes that's right.
As alluded to above, the video has to get from YCC 4:2:0 as it's stored on the disc to RGB as it's displayed by the TV. For this process it has to go from YCC 4:2:0 to YCC 4:2:2, then to YCC 4:4:4, before it's converted to RGB. That process can take place in the source device or the TV, but it has to take place somewhere. If you don't have a Panasonic TV, you can use a Panasonic disc player to do it instead. Panasonic just seem to be very good at that particular process, and this is probably down to the resolution at which they process it.
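
For what it's worth, that final YCC to RGB step is just a matrix multiply. A hedged sketch with the standard BT.709 full-range matrix, purely to show the conversion the TV (or player, if set to RGB output) performs:

import numpy as np

# BT.709 YCbCr -> RGB matrix (full-range, normalised signals)
M = np.array([[1.0,  0.0,     1.5748],
              [1.0, -0.1873, -0.4681],
              [1.0,  1.8556,  0.0   ]])

ycbcr = np.array([0.5, 0.0, 0.0])   # mid-grey: Y=0.5, Cb=Cr=0
print(M @ ycbcr)                    # -> [0.5 0.5 0.5], equal RGB as expected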
Here's what someone that has great knowledge in these things from another forum said about this, but it was a little lost on me:

4K/24 YCbCr 4:4:4 12-bit is not anything exotic for a modern, HDR TV. The TV may not actually use the extra bits to best advantage, but it shouldn’t screw up.

Any thoughts on this?
Some of that is quite existential and probably not necessary to discuss, but the point about more bits for processing is valid. I had a Radiance Pro picture processor and that has a 12 bit processing pipeline for video, as the greater number of bits allows greater precision. Bear in mind that this would apply to 8 bit as well as 10 bit source content. They'd output the video in the native colour depth though.
You can't argue with that logic, as it's sound. However, if as with the Panasonic players the output remains at the 12 bit colour depth the content is processed in, there's a reliance on the display device to then process that back down to what it can display. It's here where I think there's a cause for concern that the display will screw it up, as it will likely not process it in the same way as a Panasonic display would.
I freely admit that I'm assuming that and possibly not giving the TV manufacturers the credit they're due.
As your linked quotes stated, this is all just maths, with calculations made to expand 8 and 10 bit colour into 12 bit colour and then back again. And the maths is standard, obviously, so you'd hope it would be applied equally by any and all manufacturers. Whether it is, is another question though.
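
To make the rounding point concrete, here's a toy round trip (made-up gain values, nothing to do with a real video pipeline): darken an 8-bit ramp then brighten it back, once rounding to 8 bits between steps and once carrying a 12-bit intermediate.

def scale_8bit(v, f):
    return min(255, round(v * f))            # round back to 8 bits after each step

def scale_via_12bit(v, f1, f2):
    inter = round(v * 16 * f1)               # 8-bit -> 12-bit intermediate (x16)
    return min(255, round(inter * f2 / 16))  # only round to 8 bits at the end

err_8 = sum(abs(scale_8bit(scale_8bit(v, 0.4), 2.5) - v) for v in range(256))
err_12 = sum(abs(scale_via_12bit(v, 0.4, 2.5) - v) for v in range(256))
print(err_8, err_12)   # the 12-bit intermediate accumulates far less error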

Paul
 
Thanks, Paul; appreciate it. Will get back to you as soon as I have some free time.
 
The TV won't display YCC 4:4:4; it displays in RGB. Hence the whole chain of processing and conversion from what's stored on the disc (YCC 4:2:0, no matter what format) to what the TV actually displays (RGB).
So then is it better to set the PLAYER to RGB output?
From what you've written, it seems that the TV will indeed accept 12 bit YCC4:4:4 at 24fps, without being in PC mode. How official are the reports that it can't do this? I'm always sceptical of such reports until I can trace them back to an official statement or document.
Also, how are you verifying the TV does actually accept that format?
This was based on a conversation I was having with someone on this forum, in the Samsung TV section, in the dedicated thread for the model I own... a gentleman who I trust in terms of his experience and knowledge with video sources. He also owned an NU8000, and he advised me that it was a known bug of some kind that this model cannot process 4K video at 24FPS and 4:4:4 color space unless it's running in PC mode.

As for verifying that the TV actually does accept this format, I'm only going by the fact that the Panasonic is outputting 12-Bit 4:4:4 when I check the player's video output onscreen information while discs are playing.

According to that data, here's what the unit is outputting:

For DVDs:

Source: 720x480 SDR/BT.601 YCbCr 4:2:0/8-Bit
Output From Player: 4K/60P SDR/BT.709 YCbCr 4:2:2/12-Bit

For Blu-rays:


Source: 1920x1080 SDR/BT.709 YCbCr 4:2:0/8-Bit
Output From Player: 4K/24P SDR/BT.709 YCbCr 4:4:4 12-Bit

Now apparently, the television is TELLING the player it can accept 4:4:4 12-Bit with Blu-rays (and Ultra HD Blu-rays)...
Yes that's right.
As alluded to above, the video has to get from YCC 4:2:0 as it's stored on the disc to RGB as it's displayed by the TV. For this process it has to go from YCC 4:2:0 to YCC 4:2:2, then to YCC 4:4:4, before it's converted to RGB. That process can take place in the source device or the TV, but it has to take place somewhere. If you don't have a Panasonic TV, you can use a Panasonic disc player to do it instead. Panasonic just seem to be very good at that particular process, and this is probably down to the resolution at which they process it.
Understood; yeah, I don't, unfortunately, have a Panasonic TV because they're not available in the States any longer...

So the 4:4:4 is okay for Blu-rays and UHD Blu-rays?
Some of that is quite existential and probably not necessary to discuss, but the point about more bits for processing is valid.
So it's kind of like what I described with my discussions with the Oppo engineers...that the player is kind of processing the color at an "enhanced" level by doing it internally (bringing it to, say, 12-Bits)?
I had a Radiance Pro picture processor and that has a 12 bit processing pipeline for video, as the greater number of bits allows greater precision. Bear in mind that this would apply to 8 bit as well as 10 bit source content. They'd output the video in the native colour depth though.
You can't argue with that logic, as it's sound. However, if as with the Panasonic players the output remains at the 12 bit colour depth the content is processed in, there's a reliance on the display device to then process that back down to what it can display. It's here where I think there's a cause for concern that the display will screw it up, as it will likely not process it in the same way as a Panasonic display would.
I freely admit that I'm assuming that and possibly not giving the TV manufacturers the credit they're due.
As your linked quotes stated, this is all just maths, with calculations made to expand 8 and 10 bit colour into 12 bit colour and then back again. And the maths is standard, obviously, so you'd hope it would be applied equally by any and all manufacturers. Whether it is, is another question though.

Paul
Thanks very much, Paul; I suppose the overarching question is, then, is it normal that my Samsung 4K display is telling the UB9000 that it can accept 12-Bit 4:4:4 video in 4K/24P?
 
So then is it better to set the PLAYER to RGB output?
That depends on which device does an accurate job of that final conversion from YCC 4:4:4 to RGB; experimentation will give you that answer. Don't forget that RGB takes up a lot more bandwidth than what's stored on the disc, but if you're already sending YCC 4:4:4 over the HDMI interface, you shouldn't have an issue.
As for verifying that the TV actually does accept this format, I'm only going by the fact that the Panasonic is outputting 12-Bit 4:4:4 when I check the player's video output onscreen information while discs are playing.

According to that data, here's what the unit is outputting:

For DVDs:

Source: 720x480 SDR/BT.601 YCbCr 4:2:0/8-Bit
Output From Player: 4K/60P SDR/BT.709 YCbCr 4:2:2/12-Bit

For Blu-rays:


Source: 1920x1080 SDR/BT.709 YCbCr 4:2:0/8-Bit
Output From Player: 4K/24P SDR/BT.709 YCbCr 4:4:4 12-Bit

Now apparently, the television is TELLING the player it can accept 4:4:4 12-Bit with Blu-rays (and Ultra HD Blu-rays)...
I'd like to understand how reliable this output information is from the Panasonic and I can do that with my TV, as I can interrogate it to show what it's seeing at the input.

As you've stated though, your TV clearly can accept 2160p24 SDR with YCC 4:4:4 12 bit encoding, if that information is accurate.
Understood; yeah, I don't, unfortunately, have a Panasonic TV because they're not available in the States any longer...
That really is a shame, and you guys are missing out.
So the 4:4:4 is okay for Blu-rays and UHD Blu-rays?
Yes, if your TV can accept the data at that level, and it seems it can.
So it's kind of like what I described with my discussions with the Oppo engineers...that the player is kind of processing the color at an "enhanced" level by doing it internally (bringing it to, say, 12-Bits)?
Yes exactly. By using the greater bit depth, the processing gains precision as it can work at a lower level.
Thanks very much, Paul; I suppose the overarching question is, then, is it normal that my Samsung 4K display is telling the UB9000 that it can accept 12-Bit 4:4:4 video in 4K/24P?
I think what defines normal is a question for the individual manufacturer. The HDMI hardware installed certainly permits the content to be sent from the source to the display at that level. Whether the TV manufacturer then decides to allow the TV to process the video at that level is another thing altogether.

Paul
 
And here with the 2 year warranty, but whether it's actually a UB820 that arrives.... :rotfl:

Paul
JL is where I got mine from less than 2 weeks ago (got a few % off via my company benefits scheme), and the correct item turned up...
 
Yes I suspect they've got their act together now.

My rather tongue in cheek comment was in reference to the beginning of this thread. I and another couple of members ordered more than one each of these from JL, at launch, for a UB320 to arrive each time. I think I got my UB820 from Panasonic Direct in the end.

Paul
 
Thanks, as always, Paul...will respond as soon as I can.
 
Just ordered @ £299 with a 2 year guarantee from Peter Tyson.
 
That depends on which device does an accurate job of that final conversion from YCC 4:4:4 to RGB; experimentation will give you that answer. Don't forget that RGB takes up a lot more bandwidth than what's stored on the disc, but if you're already sending YCC 4:4:4 over the HDMI interface, you shouldn't have an issue.
Okay Mate,

Getting back to replying to this; it's weird that there's no "RGB Automatic" selection in the player then (unless there is). I suppose the Auto selections and what they're choosing for my display are the best I'm gonna get...

I swear, though...I simply cannot get my head around this Color Space and Color Depth stuff...:confused::confused::confused:
I'd like to understand how reliable this output information is from the Panasonic and I can do that with my TV, as I can interrogate it to show what it's seeing at the input.
Well, I don't know of any way to test if the player is actually sending out what it's saying; I think we have to rely on just trusting it.

But I have heard of specialized equipment that can be put in-between a display and source unit to analyze what is actually going on in a setup; that would be beyond my knowledge, budget and interest reach, though.

With regard to your statement about confirming it at the display -- yes, I think there is a menu that allows for this, but I never accessed it, I don't believe, on my own Samsung...
As you've stated though, your TV clearly can accept 2160p24 SDR with YCC 4:4:4 12 bit encoding, if that information is accurate.
IF it's accurate...and that's what we don't know.

But as I included from that gentleman who sent me the info about the space and depth, it seems like it's something of a consensus that all decent modern-day 4K flat panels (unsure about the projector side) can handle 12-Bit 4:4:4 video, even though the display is "bringing it down" to something along the lines of 10-Bit or 8-Bit with dithering.
That really is a shame, and you guys are missing out.
I've heard great things about their OLEDs, and I know their plasmas were well-regarded in the 1080p era; regardless, I don't think I'd be able to afford one if they were sold here.
Yes, if your TV can accept the data at that level, and it seems it can.
OK.
Yes exactly. By using the greater bit depth, the processing gains precision as it can work at a lower level.
What do you mean by "it can work at a lower level"?
 
Hi,

I have the UB820 (recent) and happen to have the Alita 4K Blu-ray, which supports Dolby Vision and HDR10+. The Dolby Vision setting in the player is on, as is the HDR10+. When playing Alita, it starts off showing as Dolby Vision on the title screen, but when the movie starts it switches to HDR10+. The only way I can force the player to show Dolby Vision is to disable HDR10+?

The HDR10+ showing is perfectly acceptable, so no complaints there, but as Dolby Vision supports more colours, I prefer the look of DV Dark on my OLED to HDR10+ Filmmaker.

Am I missing something? Is there a setting where you can set the default format if the disc supports it?

Thanks

Martin
 
No. On the UB820, to get DV you have to turn off HDR10+, which is the default between those two.
Similarly, to get HDR10 you have to turn off both HDR10+ and DV.
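
A minimal sketch of that priority order, inferred purely from the behaviour described here (hypothetical function, not Panasonic's actual firmware logic):

def playback_format(disc_formats, hdr10plus_on, dv_on):
    # HDR10+ wins when enabled, then Dolby Vision, then the HDR10 base layer
    if "HDR10+" in disc_formats and hdr10plus_on:
        return "HDR10+"
    if "Dolby Vision" in disc_formats and dv_on:
        return "Dolby Vision"
    return "HDR10"   # base layer is always present on UHD discs

print(playback_format({"HDR10+", "Dolby Vision"}, True, True))    # HDR10+
print(playback_format({"HDR10+", "Dolby Vision"}, False, True))   # Dolby Vision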
 
