Do OLEDs really need more nits?

GadgetObsessed

Distinguished Member
This is definitely not an OLED vs LCD thread. I am just an LCD owner who noticed something about HDR that may be relevant to the OLED discussions.

I am a new owner of a Sony ZD9 - which is one of the brightest LCDs available. I believe it can produce a maximum of around 1,700-1,800 nits of peak brightness.

This weekend I was watching Planet Earth 2 UHD and there were a couple of scenes shooting into the sun where the sun was simply too bright for me - and I was watching the TV in quite a bright room during the day. If I had been watching in the dark then I really would have been squinting. It made me consider turning down the brightness (it defaults to max during HDR) to make such highlights easier on the eye.

Note that this only happened in one or two scenes while watching two hour-long episodes of PE2 - which has quite a lot of bright sun shots. All the other very bright shots simply looked very vibrant. (Does anyone know if PE2 is mastered to 1,000 nits or 4,000 nits?)

This dazzling effect made me wonder a few things:
(1) If this year's OLEDs are getting to about half the brightness of the ZD9, are they actually bright enough already? Half the brightness is one "stop" less in camera terms (see the quick sum after question (3)) and one "stop" isn't a huge amount in perceived brightness for really bright highlights. If something is starting to get uncomfortably bright then one stop dimmer is probably about right.

So I wonder if all this discussion about OLEDs not being bright enough for HDR is just people getting too focussed on the numbers rather than what it looks like in reality?

(2) If we already have sets that can generate dazzling highlights during the day, then why are manufacturers seemingly in a race to 4,000 and 10,000 nits? Is it just marketing driven, as a bigger number is always an easier sell to consumers? Or is it simply because LCDs can go brighter than OLEDs, so that is something LCD manufacturers will push as important?

(3) Is the real issue for OLED not its peak brightness but ABL kicking in to reduce power consumption for bright scenes?
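
To put a number on the "stop" point in (1): the difference in stops between two peak luminances is just log2 of their ratio. A quick Python sketch (the 1800 and 750 nit figures are illustrative round numbers, not measurements):

Code:
import math

def stops_between(nits_a, nits_b):
    # Difference in photographic stops = log2 of the luminance ratio.
    return math.log2(nits_a / nits_b)

# Illustrative round numbers only: ~1800 nits for the ZD9 peak vs ~750 nits for a 2017 OLED peak.
print(stops_between(1800, 750))  # ~1.26 stops, i.e. a little over one stop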

Here are the HDR brightness figures for sustained windows on the Sony A1 (OLED) and ZD9 (LCD), listed as A1 - ZD9 (from RTings):
2% window: 653 cd/m2 - 1294 cd/m2
10% window: 649 cd/m2 - 1607 cd/m2
25% window: 429 cd/m2 - 1332 cd/m2
50% window: 237 cd/m2 - 899 cd/m2
100% window: 145 cd/m2 - 673 cd/m2

As can be seen, for both LCD and OLED the luminance falls as the white window gets larger, so ABL is kicking in on both. For OLED the effect is more pronounced, though: the 100% window is only 22% as bright as the 2% window, whereas for LCD the 100% window is 52% as bright as the 2% window.
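
For anyone who wants to check those percentages, they are simply the 100% window figure divided by the 2% window figure. A small sketch using the RTings numbers quoted above:

Code:
# HDR sustained-window figures quoted above, in cd/m2, as (A1 OLED, ZD9 LCD).
hdr = {2: (653, 1294), 10: (649, 1607), 25: (429, 1332), 50: (237, 899), 100: (145, 673)}

a1_drop = hdr[100][0] / hdr[2][0]   # 145 / 653  -> ~0.22
zd9_drop = hdr[100][1] / hdr[2][1]  # 673 / 1294 -> ~0.52
print(f"A1: {a1_drop:.0%} of its 2% window brightness, ZD9: {zd9_drop:.0%}")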

And below are the SDR figures (again A1 - ZD9). Note that calibrators often aim for a peak brightness of 120 cd/m2 for SDR, so while ABL is reducing the maximum possible brightness, the A1 should always reach its target brightness for SDR.
2% window: 370 cd/m2 - 1235 cd/m2
10% window: 377 cd/m2 - 1583 cd/m2
25% window: 380 cd/m2 - 1320 cd/m2
50% window: 227 cd/m2 - 895 cd/m2
100% window: 129 cd/m2 - 672 cd/m2

Is there anything that manufacturers can do to reduce the impact of ABL (e.g. better power supplies, more heat-resistant components, cooling fans, etc.) or are they constrained by EU/UK/worldwide regulations on how much power a TV can use?
 
Yes, OLED needs more nits :smoke:

More nits means more detail in the light parts of the picture.
 
Your Planet Earth 2 is only 600 nits btw :rolleyes: so you're blinded by 600 now?
 
If content is dazzling then that's a content or interpretation issue, not a hardware capability one. It may be that exactly the same level of brightness is being employed in a more comfortable way in other scenes.

You know peak brightness is high enough when you can't find any scenes where a brighter TV looks better than a dimmer one.
 
Yes, much more is required. I returned my E6 and got the ZD9, which can output 1800 nits, and let's just say high dynamic range video is in another league on this beast.
 
More nits means more detail in the light parts of the picture.
Is that really the case though? I would have thought that is down to the tone mapping curve employed. If content mastered on a 4,000-nit pro monitor has detail at the top end, then no TV can currently display that *if* it's using the reference tone mapping.

It's more a question of what's the best tone mapping curve for my TV?

Maybe this is where Dolby can provide a more consistent experience?

Also, how bright does a TV need to go to allow a tone mapping curve that preserves the artistic intent for most light levels, while providing enough top-end nits to make the highlights deliver the intended effect without going OTT?
 
Your Planet Earth 2 is only 600 nits btw :rolleyes: so you're blinded by 600 now?
My TV is not calibrated and with HDR it automatically sets its brightness to maximum. So does that mean that the TV is simply pumping out its maximum 1800 nits for 100% white?

Also, when considering the total luminance produced and whether a TV is too bright to view comfortably you have to consider the area of brightness as well as the nits. A 100% window at 1000 nits will be producing 10 times as much total illumination as a 10% window at 1000 nits.
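
To put a rough number on that, treating total light output as simply lit area times luminance (a simplification, but it makes the point):

Code:
def total_light(window_fraction, nits):
    # Total light output is proportional to the lit area times its luminance (arbitrary units).
    return window_fraction * nits

print(total_light(1.0, 1000) / total_light(0.1, 1000))  # 10.0 - ten times the light at the same nit figure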
 
When I look at mine with Planet Earth 2, I don't think it will be possible to get a better picture for years, so I am sticking with it - I can't see the point of changing.
 
Absolute nit numbers do not tell the whole story of HDR and yet many seem obsessed by them. A lot depends on:
- your own eyes and their sensitivity to bright light
- the ambient light level
- the technology used.

Light eyes (blue, green or grey) are more sensitive to bright light than brown eyes. I have blue eyes and I know I'm more sensitive to bright lights. I'm certainly not craving a 4,000-nit TV.

In a light controlled room with a low level of bias lighting, you don't need huge maximum nit levels to have an amazing HDR experience. It's about dynamic range (from black) and not absolute maximum brightness. That's why it's called High Dynamic Range.

The display technology makes a difference. While OLEDs do not offer the same peak brightness as LCD, they provide the ability to have a very bright pixel right next to a completely black pixel, so they do very well with specular highlights.
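
To illustrate the dynamic range point with some made-up black levels (hypothetical figures for illustration only, not measurements): dynamic range in stops is just log2(peak / black).

Code:
import math

def dynamic_range_stops(peak_nits, black_nits):
    # Dynamic range in stops = log2(peak / black); a lower black level means more stops.
    return math.log2(peak_nits / black_nits)

# Hypothetical figures, for illustration only:
print(round(dynamic_range_stops(1500, 0.05), 1))   # bright LCD with a 0.05 nit black: ~14.9 stops
print(round(dynamic_range_stops(650, 0.0005), 1))  # dimmer OLED with a near-zero black: ~20.3 stops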
 
Probably it is. What are the nit values for plasma, which for years was seen as the best TV tech you could get? Now it seems it's all about peak brightness that can outshine a TV showroom.
 
OLEDs have the old plasma problem with nits.

Full QLED panels might be the future of HDR TVs: the best of both worlds.
 
Well, like the article said, it's likely to take more than 5 years to get mass production going. But this is the first step - the next would be some stupidly expensive non-demo monitor you could actually buy, something like the 11" Sony OLED or the 15" LG.

I don't think it is really the top-end brightness and the ability to blind the viewer or not that is the issue, but the required tone mapping. HDR10 with its absolute luminance targets and no defined tone mapping is a rather weird standard - didn't they understand the mess they were likely to create?

There's no free lunch with the tone mapping choice:
- Tone map the whole range (LG without the Active HDR setting on) and your 100 nit scenes might only be rendered at 90 nits, so the picture overall can look a bit dark, especially in a lit room.
- Don't tone map at all and instead clip (Sony with the current Euro firmware and Cinema Pro) and you lose highlight detail.
- Compress just the upper end of the brightness range (Panasonic, Sony with other presets, LG with the Active HDR setting) and apparently this can lead to colour banding.
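
Just to make those three behaviours concrete, here is a toy Python sketch of them for a hypothetical 750-nit display and 4,000-nit content. These are not the manufacturers' actual curves, just simple stand-ins for "compress everything", "clip" and "knee then roll off":

Code:
import math

def tonemap(nits_in, peak=750.0, content_max=4000.0, mode="knee"):
    # Toy tone-mapping curves for a display with the given peak luminance (cd/m2).
    if mode == "scale":
        # Compress the whole range: content_max lands on the display peak, but
        # mid-tones get dimmed too (exponent chosen so 100 nits in -> ~90 nits out).
        g = math.log(90.0 / peak) / math.log(100.0 / content_max)
        return peak * (nits_in / content_max) ** g
    if mode == "clip":
        # Track the source 1:1, then clip: accurate mid-tones, lost highlight detail.
        return min(nits_in, peak)
    # "knee": track 1:1 up to a knee point, then roll the highlights off to the peak.
    knee = 0.75 * peak
    if nits_in <= knee:
        return nits_in
    t = (nits_in - knee) / (content_max - knee)
    return knee + (peak - knee) * (1.0 - (1.0 - t) ** 2)

for nits in (100, 600, 1000, 4000):
    print(nits, {m: round(tonemap(nits, mode=m)) for m in ("scale", "clip", "knee")})

The printout shows the trade-off: the whole-range curve dims the 100 nit scene, the clip curve throws away everything above the display peak, and the knee keeps mid-tones accurate at the cost of squashing the top of the range into a narrow band.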

However, I think hitting only OLEDs with the "not bright enough for HDR" label is more than a little disingenuous. The recent OLEDs have more real-world brightness (600-700 nits, as measured by Rtings) than almost all LCDs. The LCDs with more real-world HDR brightness than the current OLEDs? Sony's Z9D, XD93, XE93, XD94, XE94, Panasonic's DX902, Samsung's Q9 (maybe Q8) and KS9500, perhaps some Chinese models. The prices on these are quite similar to, or even higher than, OLEDs.

If you look at for example the KS9000 (same as EU KS8000) Rtings review, real world HDR scene brightness is 471 cd/m2. Sony XE85? 392 cd/m2. Q7F? 393 cd/m2. Even the praised XE90 "only" gets to 546 cd/m2. These are still midrange or upper midrange models, entry level LCDs are HDR-capable in name only. Meanwhile the blacks are nowhere near OLED level so the actual dynamic range is pretty compressed.
 
My TV is not calibrated and with HDR it automatically sets its brightness to maximum. So does that mean that the TV is simply pumping out its maximum 1800 nits for 100% white?
If your TV is doing what it is supposed to (calibrated or not), it should output exactly 600 nits for the highlights that are specified as 600 nits in the source. The fact that it sets the brightness to maximum for HDR should not enter into this, especially as the Z9D is FALD and can dim the backlight zones independently of the global brightness setting.
 
I don't think it is really the top-end brightness and the ability to blind the viewer or not that is the issue, but the required tone mapping. HDR10 with its absolute luminance targets and no defined tone mapping is a rather weird standard - didn't they understand the mess they were likely to create?
That will have been glossed over in the rush to save on licensing fees and grab more profit on the back of HDR :) It will be interesting to see how the Dolby approach compares once some discs are available.

While the concept of HDR seems good in general, the HDR10 implementation so far is underwhelming purely because of all the inconsistency: you're not sure what you're supposed to be seeing.

If Dolby can provide a more consistent experience then I think they'll walk it, and deservedly so. If they can't, well, I suspect HDR will have a slow take-up, as it's too fiddly and frustrating so far IMO.

I don't buy the argument that Dolby will only be useful on low end sets, I think it will allow us all to do a fairer comparison between sets, which is probably what some manufacturers don't want.
 
While the concept of HDR seems good in general, the HDR10 implementation so far is underwhelming purely because of all the inconsistency: you're not sure what you're supposed to be seeing.
I thought that it was clear what we are supposed to be seeing in HDR10. We are supposed to see the EOTF tracking as defined by the ST.2084 standard. The issue is that none of the current generation of OLEDs can follow this as they don't have enough peak brightness.

Below is an example. The yellow line is what we should be seeing and the black line is what we actually see - in this case with a Philips OLED.

[Chart: pq-eotf.png - PQ EOTF tracking on a Philips OLED; yellow = ST.2084 target, black = measured]

And here is an example with the Sony ZD9, which can go above 1,000 nits.

[Chart: hdr-pq.png - PQ EOTF tracking on the Sony ZD9]

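For reference, the yellow line in those charts is the ST.2084 "PQ" EOTF, which maps a normalised signal value straight to an absolute luminance. A minimal sketch using the constants published in the standard:

Code:
# ST.2084 (PQ) EOTF: decode a normalised signal value (0.0-1.0) to luminance in cd/m2.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal):
    e = signal ** (1.0 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1.0 / M1)

print(round(pq_eotf(0.75)))  # ~983 cd/m2 - a signal of 0.75 sits just under the 1,000 nit mark
print(round(pq_eotf(1.0)))   # 10000 cd/m2 - the ceiling of the format
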
With HDR10 each TV manufacturer has the freedom to choose their own curve.

With DV, Dolby will choose the curve. However, there is no right answer. So whatever Dolby choose will simply be a compromise between dimming the picture (black line below the yellow line) and avoiding white crush (black line going flat) in the same way that all the manufacturer defined curves are.

So while DV may be more consistent across sets, it won't necessarily be any better than any of the manufacturer-defined curves. Indeed, some people may prefer a manufacturer-defined curve to the DV one. For example, if Dolby follow LG's lead and try to maintain a smooth, ever-increasing curve throughout the range, then some people will find that DV may be too dark for daytime viewing. (As they do with LG.) Alternatively, if Dolby follow Sony's example of sticking to the standard for as long as possible and then going flat, then some people will complain about the crushed highlights - but their viewing experience will be better in the day.

Hopefully Dolby will give people the option of choosing their own tone mapping curve.
 
On my B6 I have problems with bright skies/clouds where I lose a lot of highlight detail. I have turned the contrast down in HDR mode to 80 which helps a bit but is not ideal. Also I find colors are a bit odd. Some tones seem garish or have a greeny blue tone to them. I also find greens have too much blue in them. HDR is a lot harder to calibrate by eye than SDR though.
 
@GadgetObsessed the problem is that HDR10 is a very forward-looking standard in that there is no consumer display that is able to track the EOTF in ST.2084 up to 4000 nits (let alone 10000 nits if that was the ultimate goal), nor is there any display that is anywhere near to being able to fully show Rec.2020 colour space - even the DCI-P3 used in UHD Blu-rays is a hard target! This was well known, and it was also known that it would be many years before displays capable of these targets would appear. And still they did not specify a standard tonemapping to use in case the display was not a $30,000 mastering monitor.

The whole HDR thing stinks of forced upgrade cycles, from its beginning with the "surprise" arrival of high dynamic range as the new thing - something that really crapped on the purchasers of the expensive, supposedly future-proof, HDCP 2.2 4k TVs from 2014.

There hasn't been a year since without a new standard or three to aspire to, and even if the standards themselves settle in a couple of years, there's still a profitable decade of incremental brightness and colour gamut improvements to look forward to. AVForums can just write a template with "still not 100% Rec.2020, can't give reference award" for their future manufacturer flagship reviews.

Rant over... but manufacturers, just because we are obsessed with new tech does not mean we are not onto your game :mad:
 
The peak brightness at 0-25% windows is OK; the problem is ABL and the brightness on larger parts of the screen. This gives a weird effect where something like car headlights will be very bright, but something like a sunny sky (which takes up more than 25-50% of the screen) will look dull. So yes, overall, peak brightness on small bits of the screen is OK, although 1000 nits would be better; larger parts of the screen and ABL are the problem on OLED. I think 4000 or 10,000 nits is just silly though - it is too bright. 1000-1500 nits is good enough.
 
Look at a full peak white on an OLED........brings a grown man to tears I tell ya.....
 
Look at a full peak white on an OLED........brings a grown man to tears I tell ya.....

Look at LCD black levels :cool: it can make anyone regret spending money on old tech
 
Hahaha please........the tech in the right hands kicks butt to the highest degree!!

Most of us are not interested in poor viewing angles and the light bleed from a few "zones"
 
