GadgetObsessed
Distinguished Member
This is definitely not an OLED vs LCD thread. I am just an LCD owner who noticed something about HDR that may be relevant to the OLED discussions.
I am a new owner of a Sony ZD9 - which is one of the brightest LCDs available. I think it can produce a maximum of around 1,700-1,800 nits of peak brightness.
This weekend I was watching Planet Earth 2 UHD and there were a couple of scenes shooting into the sun where the sun was simply too bright for me - and I was watching the TV in quite a bright room during the day. If I had been watching in the dark then I really would have been squinting. It made me consider turning down the brightness (it defaults to max during HDR) to make such highlights easier on the eye.
Note that this only happened in one or two scenes while watching two hour-long episodes of PE2 - which has quite a lot of bright sun shots. All the other very bright shots simply looked very vibrant. (Does anyone know if PE2 is mastered to 1,000 nits or 4,000 nits?)
This dazzling effect made me wonder three things:
(1) If this year's OLEDs are getting to about half the brightness of the ZD9, are they actually bright enough already? Half the brightness is one "stop" less in camera terms, and one stop isn't a huge amount in perceived brightness for really bright highlights (see the quick calculation after these questions). If something is starting to get uncomfortably bright then one stop dimmer is probably about right.
So I wonder if all this discussion about OLEDs not being bright enough for HDR is just people getting too focussed on the numbers rather than what it looks like in reality?
(2) If we already have sets that can produce dazzling highlights during the day, then why are manufacturers seemingly in a race to 4,000 and 10,000 nits? Is it just marketing driven, as a bigger number is always an easier sell to consumers? Or is it simply that LCDs can go brighter than OLEDs, so that is something LCD manufacturers will push as important?
(3) Is the real issue for OLED not its peak brightness but ABL kicking in to reduce power consumption for bright scenes?
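For anyone who wants to sanity-check the "one stop" arithmetic in (1), here is a minimal Python sketch using the 2% window peaks from the RTINGS table below. It is just the maths applied to the figures quoted in this post, not new measurements:

```python
import math

# Peak sustained HDR luminance (2% window, cd/m²) from the RTINGS table below.
a1_peak = 653.0    # Sony A1 (OLED)
zd9_peak = 1294.0  # Sony ZD9 (LCD)

# One camera "stop" is a doubling of light, so the gap in stops is the
# base-2 logarithm of the luminance ratio.
stops = math.log2(zd9_peak / a1_peak)
print(f"ZD9 peak is {stops:.2f} stops above the A1 peak")  # ~0.99 stops
```

So on these numbers the ZD9's advantage really is almost exactly one stop.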
Here are the HDR brightness figures for sustained windows on the Sony A1 and ZD9 (from RTINGS; A1 OLED figure first, ZD9 LCD figure second):
2% window: 653 / 1,294 cd/m²
10% window: 649 / 1,607 cd/m²
25% window: 429 / 1,332 cd/m²
50% window: 237 / 899 cd/m²
100% window: 145 / 673 cd/m²
As can be seen, for both the LCD and the OLED the luminance falls as the white window gets larger, so ABL is kicking in on both. The effect is more pronounced on the OLED, though: its 100% window is only 22% as bright as its 2% window, whereas for the LCD the figure is 52%.
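If anyone wants to reproduce those percentages (and see the full ABL rolloff curve for both sets), here is a quick sketch over the HDR table above. It just divides each window figure by the 2% window figure; nothing here is new data:

```python
# Sustained HDR luminance in cd/m² from RTINGS: window size % -> (A1 OLED, ZD9 LCD)
hdr = {2: (653, 1294), 10: (649, 1607), 25: (429, 1332),
       50: (237, 899), 100: (145, 673)}

for name, idx in (("A1 (OLED)", 0), ("ZD9 (LCD)", 1)):
    base = hdr[2][idx]
    # Each window size expressed as a percentage of that set's own 2% window figure.
    rolloff = {w: round(100 * v[idx] / base) for w, v in hdr.items()}
    print(name, rolloff)

# A1 (OLED) {2: 100, 10: 99, 25: 66, 50: 36, 100: 22}
# ZD9 (LCD) {2: 100, 10: 124, 25: 103, 50: 69, 100: 52}
```

(Interestingly, the ZD9 actually measures brighter on the 10% window than the 2% window, which is presumably down to how its local dimming behaves rather than ABL.)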
And below are the SDR figures (again A1 first, ZD9 second). Note that calibrators often aim for a peak brightness of 120 cd/m² for SDR, so while ABL reduces the maximum possible brightness, the A1 should always reach its target brightness for SDR (a quick check follows the table).
2% window: 370 / 1,235 cd/m²
10% window: 377 / 1,583 cd/m²
25% window: 380 / 1,320 cd/m²
50% window: 227 / 895 cd/m²
100% window: 129 / 672 cd/m²
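And to make the SDR point concrete, a tiny check over the A1's figures above: even on a full-screen white field, the measured 129 cd/m² still clears a 120 cd/m² calibration target:

```python
# A1 (OLED) sustained SDR luminance (cd/m²) from RTINGS, keyed by window size %.
a1_sdr = {2: 370, 10: 377, 25: 380, 50: 227, 100: 129}
TARGET = 120  # common SDR calibration target in cd/m²

# ABL is at its most aggressive on a full-screen white field (100% window),
# yet even that worst case stays above the calibration target.
worst = min(a1_sdr.values())
print(f"Worst case {worst} cd/m² -> headroom of {worst - TARGET} cd/m²")
assert all(v >= TARGET for v in a1_sdr.values())
```

So for SDR the ABL behaviour should be academic; it only really bites in HDR.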
Is there anything that manufacturers can do to reduce the impact of ABL (e.g. better power supplies, more heat-resistant components, cooling fans, etc.), or are they constrained by EU/UK/worldwide regulations on how much power a TV can use?