Forum Topic: Do OLED TVs really need more brightness for HDR?
Is it really all about the highest nit count?
Home AV Article
AVForums member GadgetObsessed (a perfectly apt name, as it turns out) posed an excellent question in the OLED TV forum recently (30/05/17): just how much brightness is actually necessary to create an effective High Dynamic Range image?
Gadget (we’ll call him that for short) is actually an LED LCD TV owner, with the very fine Sony ZD9 no less, but something struck him while watching the Ultra HD Blu-ray of Planet Earth 2 over the weekend. In his own words, there were a couple of scenes shooting into the sun where the sun was simply too bright, and this was in a bright viewing room. Had he been watching in the dark, Gadget felt he really would have been squinting.
This dazzling effect made GadgetObsessed wonder a few things:
(1) If this year's OLEDs are reaching roughly half the brightness of the Sony ZD9, are they actually bright enough already? Is all the discussion about OLED TVs not being bright enough for HDR just people getting too focused on the numbers, rather than on what the picture looks like in reality?
(2) If we already have sets that can generate dazzling specular highlights during the day, why are manufacturers seemingly racing towards 4,000 and even 10,000 nits? Is it just marketing-driven, since a bigger number is always an easier sell to consumers? Or is it simply that LCDs can go brighter than OLEDs, so brightness is the figure LCD manufacturers will push as important?
(3) Is the real issue for OLED not its peak brightness at all, but ABL (Automatic Brightness Limiting) kicking in to cap power consumption during bright scenes?
So, over to you: there are at least three questions for you to chew over, and the more contributions speaking from personal experience the better…
To comment on what you've read here, click the Discussion tab and post a reply.