Do OLEDs really need more nits?

Most aren't interested in the poor viewing angles, nor the light bleed from a few "zones".

[Attached photos: 20170530_224540.jpg, 20170530_225252.jpg]

That's what my bad boy's zones can do, haha. Note the same detail in the dimming-zones-off picture as in the on one... hopefully you're not viewing this on an iPhone.

OLED tech just isn't there yet; when it is, I'll be there. As of now, I get the best of both worlds.
 
Not much shadow detail there :cool: so is black crush needed to try to make it look black?
 

The camera has crushed it slightly, but my point is that the grey, washed-out one shows the same detail as the local-dimming shot. I understand you probably have little to no experience with FALD sets from the last 12-18 months. Things have moved on a bit.

The only reason to own an OLED is the viewing angles.
 
I have manually created the following chart which shows the tone curve for the Philips OLED (in green) together with the curve for the Sony ZD9 LCD (in red). Also shown is the standard target (in blue) for something mastered to 1,000 nits. (Note that the 2017 LG and Sony OLEDs seem to go a bit brighter than the Philips with a maximum around 800 nits.)

[Attachment: Tone Curve.jpg]


One thing I hadn't realised before is that sets that can go above the reference curve will do so.

It looks like the Sony maintains a curve all the way up, i.e. it doesn't just follow the standard and clip once it reaches 1,000 nits at around 77% signal. Instead it shows the brightest highlights as bright as it possibly can, at around 1,800/1,900 nits.
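As a rough illustration of where that clip point comes from, here is a minimal Python sketch of the SMPTE ST 2084 (PQ) EOTF that these curves are plotted against (the constants are the published ones; the sample signal levels are just examples):

```python
# SMPTE ST 2084 (PQ) EOTF constants (the published values)
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(signal):
    """Convert a normalised PQ signal level (0.0-1.0) to absolute luminance in nits."""
    p = signal ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

for pct in (50, 60, 70, 75, 77, 80, 90, 100):
    print(f"{pct:3d}% signal -> {pq_to_nits(pct / 100):7.1f} nits")
# Roughly 75% of the signal range corresponds to ~1,000 nits, which is why a set
# tracking a 1,000-nit master clips everything above that point, while the ZD9
# can keep following the curve up to its own ~1,800-nit peak.
```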

I assume that this will have been similar while I was watching Planet Earth 2, which apparently is mastered to 600 nits, i.e. the bright scenes that I thought were too bright could well have been at the TV's maximum of 1,800 nits.

It is also worth noting that in the really bright scenes that I felt were a bit too bright, a large part of the screen was very bright. I doubt I would have felt the same if the very bright areas had been the same brightness but covered a smaller portion of the screen.

To really answer my initial questions of whether OLED is bright enough for HDR I would have to be able to watch one of the 2017 OLEDs at home. However, I cannot really justify spending a few thousand on a set just to satisfy my curiosity. (However, if there is a generous manufacturer out there wanting to loan me a set then I am willing to test it out for them and report back here.)

Now I don't think it would necessarily be useful to compare the ZD9 and an OLED side by side. One thing I found with plasma (which is much dimmer than OLED) is that although I knew that brightness was limited and that ABL was kicking in, it was rarely an issue in practice.

I assume that that is partly because of the way the eye reacts to light, i.e. when watching a dim scene our pupil opens to let in more light, and when we watch a bright scene our pupil contracts to reduce the light hitting our retina. So even if one TV generates twice as much light as another, that does not mean we will perceive much of a difference when watching each set in isolation. However, if you put the two together (as our pupil has only one brightness "setting") then the difference will be obvious.
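As a back-of-the-envelope illustration of that point (assuming Stevens' power law for brightness with the commonly quoted exponent of about 0.33, which varies with viewing conditions), doubling a TV's light output looks nowhere near twice as bright when viewed in isolation:

```python
# Stevens' power law sketch: brightness sensation grows roughly with the cube
# root of luminance. The exponent and example nit levels are illustrative only.
def perceived_brightness(luminance_nits, exponent=0.33):
    """Relative brightness sensation, in arbitrary units."""
    return luminance_nits ** exponent

oled_peak = perceived_brightness(750)    # e.g. an OLED-class peak
lcd_peak = perceived_brightness(1500)    # a set with twice the light output
print(f"Twice the nits looks only ~{(lcd_peak / oled_peak - 1) * 100:.0f}% brighter")  # ~26%
```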

When I put my 65" Sony next to my Samsung last-gen plasma (which was brighter than my last-gen Panasonic plasma), I was shocked at how dim the plasma looked in comparison on high-brightness scenes. However, it never looked that way when viewing the plasma by itself.
 
I seem to be in a minority of one here, but I am not in the least bit interested in how bright a TV goes, nor am I interested in numbers.

For me, it's all about the contrast between the brightest whites and the darkest blacks, i.e. the dynamic range, and about producing a natural, accurate picture across all sources, from DVDs to Blu-rays, gaming to broadcasts.

I do love this site and the information and support it provides, but it also seems to produce performance anxiety in some with its emphasis on graphs and curves. In everyday life, you just close the curtains!

I have noticed that non-techie types, i.e. the majority of the viewing public, tend to have both their TVs and any other screen-related devices far too bright. It's a standard tactic in showrooms too: a bright picture will always have more initial impact than a balanced one, but I don't know how easy it would be to actually live with one every day; it would be too fatiguing for me.

Personally, I have chosen OLED. It's not perfect, nor do I expect it to be, but it seemed the obvious and most logical change after leaving plasma. And it's bright enough for me, despite the numbers. Also, for me, content matters more than hardware - there's no point having the "best TV in the world" if all you watch are soaps, shopping channels and "Mrs Brown's Boys"! ;)
 
Too many folk are too obsessed with HDR, and especially Dolby Vision, which some people seem to think is a must-have without ever having seen any Dolby Vision content, but they still NEED IT. Everything to do with the HDR rollout is a mess. OK, somebody can produce graphs and quote standards, but 99.9% of the buying public still aren't sure what HDR is supposed to look like, and that is quite a major faux pas. Truer to life, or just a gimmick like 3D? It seems more of a gimmick so far...

Personally I will be concentrating on SDR performance, as that will make up 95%+ of my viewing over the next 12 months. Of course, we then come down to SD and HD performance, and that is why I have thus far avoided OLED... low bit-rate or poorly mastered material just looks gruesome.
 
There's loads of DV content on Netflix and it looks fantastic.
 
Yes, I suppose it will look pretty decent if you have a DV-capable TV, but perhaps less impressive if you don't.

:rolleyes: Hmm, I think you need to see it...
 
After a lot of research I recently bought an LG OLED B6, which I am over the moon with (using settings that appear on AVForums); I simply love the deep blacks and colours. Coming from owning several Panasonic plasmas, it felt like the right direction to go. I recently watched UHD HDR Deadpool on an LG UP970, and there is a scene with a dark(ish) room and a large window at the back facing the viewer. The difference between the bright and dark parts of the screen was incredible; my fiancée and I almost had to look away from the brightness of the light coming through the window. Why oh why I would need it any brighter I don't know!
I also bought and watched a couple of the Planet Earth 2 UHD discs and they are unbelievable. I cannot imagine it looking any better!
I'm completely happy with my OLED TV and would never swap it for anything. It's future-proof (I love Dolby Vision on Netflix - can't wait for the UHD discs) and the picture is amazing. I suppose it's each to their own, but it's OLED all the way for me.
 
:rolleyes: Hmm, I think you need to see it...

No, my point is: how many people who are obsessing about Dolby Vision have actually seen any content? It is not something that can be easily demoed until we get something more than streaming...
 
And the 'whites' on my TV are 'white', I can assure you - not grey!
 
A great Blu-ray, Planet Earth 2 - a reference HDR source with which to launch the nits debate.
I actually mentioned this very disc on another thread, in response to someone who said I exaggerated in claiming that 700 nits on OLED was not sufficient for bright scenes in daylight viewing.
However, the Samsung Q9 has recently been demoed with this 4K disc by some UK journalists who know the ZD9, and it is brighter with HDR highlights than the Sony.
I think the Samsung is now, on balance, the reference set for testing out HDR to the max, until the other reference, the 4,000-nit ZD9, is released.
As far as I am concerned, a fab HDR-capable TV needs to match the human eye's operating limits.
My optometrist tells me that the human eye likes to operate at around 2,000 nits, so it makes sense for your TV to get into that range. Samsung's own analysis talks a lot about colour volume; they released a paper on their site on how LCD trumps WRGB OLED. In principle I agreed with the second point and qualified the first.
To display, say, bright vibrant colours perfectly, a TV needs to pump out high brightness - but a more contentious point levelled is that the W sub-pixel in OLED means they aren't true RGB displays and cannot show the superior image of bright objects. I agree with that, and on my Samsung I witnessed this over my C6 OLED in daylight.
When viewing the glowing scenes of the Planet Earth 2 Blu-ray, it should really be in daylight, because that is how the cameraman captured them. Viewing the scene in a dark room introduces a false viewing condition; in real life you don't see the scene from a dark tunnel and feel uncomfortable.
In conclusion, OLED needs to be brighter to allow the eyes to operate optimally and to see colour and brightness as from the lens, 1:1.
OLED does excel with its infinite contrast on less bright scenes and delights with SDR - it really pops - as the eyes can see those steps (the equivalent of 15 SLR f-stops, compared to a mere 8,000:1 static with dimming on, measured on the best LCD, the Sony XE940). SDR or HDR concerts with 70/30 bright-to-dark sources are perfect on OLED viewed in the dark, as the lens sees it.
We need a two-TV solution, and I have adopted this idea, as some others have.
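As a quick sanity check on that f-stop comparison (each photographic stop being a doubling of light, so stops = log2 of the contrast ratio), a short Python snippet:

```python
import math

# Each photographic stop is a doubling of light, so stops = log2(contrast ratio).
for contrast in (8000, 2 ** 15):
    print(f"{contrast:>6}:1 is about {math.log2(contrast):.1f} stops")
# 8,000:1 is only ~13 stops; 15 stops would need roughly 32,768:1, which is why a
# per-pixel emissive display pulls so far ahead on dark, high-contrast scenes.
```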
 
I couldn't disagree with your post more. At the end of the day, video content is mastered in controlled lighting conditions just like audio is mixed in controlled studio conditions. It is those conditions that we are trying to match. The problem with daylight viewing is that you generally lose a substantial amount of shadow detail and can be affected by screen reflections. Dark conditions allow the visual distractions and unwanted ambient lighting to be removed. HDR is about dynamic range and trying to replicate what the director intended when watching his or her studio monitor in controlled lighting conditions.
 
Peak brightness levels are a part of HDR, but only a part. Would I rather have a 1,000-nit full Rec. 2020 TV or a 4,000-nit 75% Rec. 2020 TV? In truth I would rather have a 4,000-nit full Rec. 2020 TV, so it accurately maps colour and luminance for all (current) HDR content, but if I had to choose between the two cases above, I think it would very much depend on the content.

As for finding content painfully bright, I have had that experience with SDR too, whilst watching some movie or TV show, and that's with the TV set at recommended SDR levels. Yet I have watched some HDR that is significantly brighter with less pain.

The difference, though, was often what was on screen before. In the SDR example that I found painful, the scene was dark - a guy wandering through a forest with a torch illuminating his path. The diffused light of the torch bouncing off the trees wasn't that intense and the overall APL was very low, then all of a sudden the guy spun round and shone the torch at the camera, so the overall APL suddenly jumped from low to 'high' - well, high for SDR - which I found quite painful and caused me to squint. Dark scenes cause the pupil to open up to let more light in, so a sudden brightness increase doesn't give the pupil time to contract.

It's like sitting in a dark room and then someone suddenly turning the light on - even a 40W bulb can be painful, but if you are in a room with a 60W bulb and someone turns on a 100W bulb, the extra brightness is not an issue. It's this principle that HDR content makers need to be aware of. You don't want to go from 0 to 1,000 nits (or more) instantly.

I still find it odd that anything 'white' has to be at the maximum brightness. Since when was a white shirt as bright as an arc welder's torch, or as bright as the sun? They can still be 'bright', but not necessarily 1,000 or 4,000 nits bright!
 
I don't think HDR is a gimmick at all; it will definitely be used a lot in the future, and it looks very good if you have good-quality HDR content and a TV that can actually do HDR properly. I'm not sure how 10-bit colour and high dynamic range are a gimmick. The main problem is that 90% of TVs cannot do HDR properly, so it ends up looking almost worse than SDR; if you have an OLED or a FALD LCD it is good.
 

You should use a bias light; it makes the blacks look better and reduces eye strain.
 
A pointless 'war' to my mind...like contrast used to be.

All I know is that the one time I really noticed high nits on the OLED was on Marco Polo, during a cave scene where lightning was happening outside. It was 'sooooo' bright it made me squint! If an OLED has 'low' nits and can make me squint, I'm really not interested in anything higher! What's the honest point of it other than manufacturer point-scoring?

I've 'never' thought anything other than 'wow' watching my OLED, and the thought that it 'could be brighter' has never even entered my mind!

Dave:)
 
Ultimately, only a real-life public viewing test of a ZD9, Q9 and E7 under controlled conditions, with 4K HDR discs like Planet Earth 2 and San Andreas, would show what the average person thinks about the nits issue.
I would accept the outcome.

Would the public prefer the LCD over the OLED in daytime?
Which is best for colour reproduction?
Which is better for dark scenes?
What would the order of preference be as a buying choice?

It is the only way to answer the nits question and arrive at a fair democratic answer.
Would movie enthusiasts who are not TV experts agree with the TV critics?
Dw89's post - the last paragraph, about just judging the image as 'you' see it - is the best yardstick.
Don't think about it.
 
I don't think that's the right test. The problem with HDR right now is that we have no reference. No TV can do a 4,000-nit curve, and as the graph above shows, the Sony has a curve that goes brighter than the 1,000-nit reference curve.

Unlike SDR, which all TVs could manage in terms of brightness, we can't rely 100% on calibration to give us a like-for-like comparison either.

So any test would IMO need to use a studio reference monitor of the kind used for HDR grading, one that *does* show all the nit levels for the films being compared. This then gives the audience a reference point against which to judge the consumer TVs. We already know that in general people prefer brighter images when there is no reference, else shops wouldn't have embarked on displaying TVs in vivid modes :)

We also need to know which modes on the TVs get them closest to the reference tone mapping, etc.

You can then judge the TVs in two ways: which gets closest to the reference, and then perhaps which people just prefer in general.

The BIG problem for me at the moment with HDR is that there is no reference we can see against which to judge what it should look like; the tone mapping is too open to interpretation, as is the standard in terms of nits that content is mastered to, and this is my biggest beef with HDR10. The curves posted above by gadget obsessed just emphasise that point to me, as a) they're only for 1,000 nits, so what about other content, and b) the Sony goes brighter in its interpretation to hit 1,800 nits, so that's no good either in my view.

If we need different reference curves for content mastered to different levels, then what are they (I think we've all seen the 1,000-nit curve now), and how does a TV know which one to use and switch between them?
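For what it's worth, HDR10 does carry static metadata that a TV could base this on: the SMPTE ST 2086 mastering display peak, plus (optionally) MaxCLL, the brightest pixel in the content. How each set actually uses it is entirely up to the manufacturer, though. A hypothetical sketch of that decision (the function and behaviour are illustrative, not any real chipset's API):

```python
# Hypothetical sketch: pick a tone-mapping target from HDR10 static metadata.
def choose_tone_map(mastering_peak_nits, max_cll_nits, display_peak_nits):
    """Return the source peak to build the tone curve for, and whether roll-off is needed."""
    source_peak = max_cll_nits or mastering_peak_nits  # prefer MaxCLL when present
    needs_rolloff = source_peak > display_peak_nits
    return source_peak, needs_rolloff

# 1,000-nit master on an ~800-nit OLED: highlights above the knee need compressing.
print(choose_tone_map(1000, None, 800))    # -> (1000, True)
# 4,000-nit master whose content never exceeds 1,100 nits, on an 1,800-nit LCD:
# the set could track the PQ curve 1:1 with no compression at all.
print(choose_tone_map(4000, 1100, 1800))   # -> (1100, False)
```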
 
At the moment there is some experimentation with HDR, and that extends to content makers too. As Star Trek: Beyond demonstrates, it's not necessarily all about APL or hitting 1,000 nits (if that's the peak brightness standard it was mastered to). All peak brightness means is that IF (for example) looking directly at the sun is the brightest thing possible, that would be at 1,000 nits (or 4,000 nits if that's the mastering standard), BUT if you don't have the sun directly on screen, everything is scaled accordingly, which may mean the brightest element only peaks at 600 nits.

The problem, though, is especially with tone-mapping algorithms. ALL UHD Premium TVs - including OLEDs - should have no problem hitting 600 nits, but most use some 'algorithm' that scales that 600 nits down to, for example, 450 nits, because it needs the room to fit 1,000 nits in IF that cropped up. Some OLED TVs may map brightness perfectly and display that 600 nits as 600 nits, but then the next film that hits 1,000 nits would have clipping of all highlights above, say, 700 nits.
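To make that trade-off concrete, here is an illustrative sketch of the two approaches described above - a 1:1 track with a hard clip at the panel's peak versus a roll-off 'knee' that reserves headroom - with made-up parameter values:

```python
def hard_clip(nits, display_peak=700.0):
    """Track the source 1:1 and clip anything above the panel's peak."""
    return min(nits, display_peak)

def knee_rolloff(nits, display_peak=700.0, knee=450.0, source_peak=1000.0):
    """Track accurately up to the knee, then compress the rest into the remaining headroom."""
    if nits <= knee:
        return nits
    t = (nits - knee) / (source_peak - knee)  # squeeze knee..source_peak into knee..display_peak
    return knee + t * (display_peak - knee)

for level in (300, 600, 800, 1000):
    print(level, hard_clip(level), round(knee_rolloff(level)))
# 600 nits stays at 600 with the hard clip but drops to ~518 with the roll-off,
# while highlights the clip would flatten at 700 stay differentiated with the
# knee (800 -> ~609, 1,000 -> 700).
```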

The thing is, though, that just because something is mastered to a certain level doesn't and shouldn't mean that the actual peak brightness will hit that point. You also want 'realistic', consistent, natural-looking people - you don't want one film where the people look dimmer overall just because it's mastered to 1,000 nits, next to a film mastered to 4,000 nits where the people look as if they have been irradiated because of the amount of 'glow' they have.

It's the 'white shirt' issue. Just because it's white shouldn't mean it's at maximum peak brightness. Not all white objects need to look like they are under UV light. You don't walk down the street and see incredibly bright shirts illuminating the world, glowing so brightly, or walk around a business district with all these very bright, illuminated shirt collars and cuffs sticking out from black pinstripe suits... This is where I think mastering gets it wrong. I don't see why white text has to be so bright - like 'end credits' - just because it is white. At the moment it seems 'colour' almost determines the brightest points, rather than the object. A white shirt can be as bright as the sun...
 
This is where I think mastering gets it wrong. I don't see why white text has to be so bright - like 'end credits' - just because it is white.

Absolutely agree. I hate the glaring white text, although it's nice to see it without halos on an OLED. It's early days with HDR mastering. To me, it's very much like 3D in the early days when the effects were overdone.

Having said all that, listening to some comments throughout this forum would suggest that HDR is a disaster. However, I have been pleasantly surprised by just how good HDR looks in general across Ultra HD Blu-ray discs, streamed services and PS4 games. And it's only going to get better as everyone in 'the chain' improves their techniques and approaches. I think the limited light output of OLED is not the dominant issue with HDR.
 
