GadgetObsessed
Distinguished Member
Joined: Jul 6, 2001
Messages: 3,226
Reaction score: 1,849
Points: 1,050
"I am very impressed with the current HDR10 implementation. HDR really does seem the way forward and is the most significant improvement in PQ since HD came along. Yes there are questions about things like tone mapping, mastering at different nit levels, etc. but overall this doesn't distract (much) from the generally large benefit of HDR."

Absolutely agree. I hate the glaring white text, although it's nice to see it without halos on an OLED. It's early days with HDR mastering. To me, it's very much like 3D in the early days, when the effects were overdone.
Having said all that, listening to some comments throughout this forum would suggest that HDR is a disaster. However, I have been pleasantly surprised by just how good HDR looks in general across Ultra HD Blu-ray discs, streamed services and PS4 games. And it's only going to get better as everyone in 'the chain' improves their techniques and approaches. I think the limited light output of OLED is not the dominant issue with HDR.
My interest in starting this thread was more about the focus some people have on claims that OLED cannot do proper HDR because its peak brightness is not as high as that of the brightest LCDs. As an owner of one such LCD, I can see that in some cases 1,800 nits is simply too bright, which to me indicates that you don't need anywhere near 1,800 nits for a very good HDR effect - so the claims about HDR on OLEDs are not well founded.
Also, OLED will always have the advantage of individually controllable pixel-level brightness - the ideal for intra-frame contrast, especially when you have bright specular highlights against dark backgrounds.
This OLED praise may sound strange coming from someone who has just bought a top-of-the-range LCD, but I am trying to remain objective despite my own purchasing decision. I had other reasons for choosing an LCD, such as price - I really like Sony's processing but cannot justify buying a 65A1 - and potential concerns about image retention/screen burn, as the TV is used for gaming (including as a monitor when using Vive VR) and is sometimes left showing things like Sky News/BBC News for long periods.
Ironically, I have probably gone overboard with the use of numbers and graphs myself just to show that focussing on a single number - peak brightness - may not be that important.
The graph showing the Sony was included because I was surprised that sets that could follow the standard tone curve completely didn't, and instead tone-mapped towards their own peak brightness. I can understand that this may be better than simply flatlining at 1,000 nits like the standard curve, which would crush bright details - but then why does the standard do that?
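The difference between the two behaviours can be shown with a toy sketch. Hard clipping is what a set that strictly follows the standard curve up to its peak would do; the roll-off is a hypothetical manufacturer-style curve (the knee point, the linear compression and the assumed 4,000-nit mastering peak are all illustrative choices, not any real set's actual algorithm):

```python
def hard_clip(scene_nits, display_peak):
    """Clip everything above the display's peak, as a strict
    standard-following set would: all detail above peak is crushed
    to a single brightness level."""
    return min(scene_nits, display_peak)

def soft_roll_off(scene_nits, display_peak, knee=0.75, mastering_peak=4000.0):
    """Toy roll-off: track the signal 1:1 up to a knee point, then
    linearly compress the rest of the mastered range into the
    remaining headroom, so bright highlights keep some gradation.
    The knee fraction and mastering peak are illustrative assumptions."""
    knee_nits = knee * display_peak
    if scene_nits <= knee_nits:
        return scene_nits
    headroom = display_peak - knee_nits
    excess = (scene_nits - knee_nits) / (mastering_peak - knee_nits)
    return knee_nits + headroom * min(excess, 1.0)
```

With a 1,000-nit display, hard clipping maps a 1,500-nit and a 4,000-nit highlight to the same 1,000 nits (crushed detail), while the roll-off keeps them distinct at the cost of darkening the 1,500-nit one.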