Silicon Warrior
Novice Member
I'm considering buying an LG C9 for my living room. Since the room receives a considerable amount of natural light, I'm worried that the TV won't be bright enough for comfortable daytime viewing. HOWEVER. For nearly 10 years, I've had a 54" Panasonic VIERA G10 Series plasma TV in that room, and I've found it bright enough. So if the C9 is at least as bright as the G10, I think I'd be good to go!
However, I'm having some trouble comparing the two TVs. A decade ago, reviewers mostly used foot-lamberts to measure brightness; today, of course, the preferred unit of measurement is nits. On HDGuru.com, I found an article saying that my Panasonic puts out 31 foot-lamberts in THX mode, with a maximum of 92.2 foot-lamberts when the contrast control is set to 100% and the picture mode to Vivid.
Using a conversion calculator at KylesConverter.com, I converted those two figures to 106.21 nits and 315.9 nits, respectively.
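For anyone who wants to double-check that conversion: a foot-lambert is defined as 1/π candela per square foot, which works out to about 3.4262591 cd/m², and 1 nit is just 1 cd/m². Here's a minimal Python sketch of the arithmetic, assuming that standard factor:

```python
# Convert foot-lamberts (fL) to nits.
# 1 foot-lambert = 1/pi candela per square foot, roughly 3.4262591 cd/m^2,
# and 1 nit is defined as 1 cd/m^2.
FL_TO_NITS = 3.4262591

for fl in (31.0, 92.2):
    print(f"{fl} fL = {fl * FL_TO_NITS:.1f} nits")

# Output:
# 31.0 fL = 106.2 nits
# 92.2 fL = 315.9 nits
```

That reproduces the figures I got from KylesConverter, so at least the arithmetic checks out.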
Here are my three questions:
[1] Am I performing this conversion properly? Can foot-lamberts be compared to nits simply by applying this conversion factor?
[2] How do the G10's figures of 106.21 and 315.9 nits compare to the C9's? Of course, I can see nits figures spelled out in dozens of C9 reviews, but I don't fully understand the conditions under which those nits were measured. I see terms like "Peak 2% Window," "Sustained 50% Window," "Real Scene Brightness," "SDR Brightness," "HDR Brightness," "Calibrated vs. Uncalibrated Brightness," and on and on. Aieee. Which of those measurement conditions correspond to the two conditions under which HDGuru measured the G10? I'm trying to make as close to an apples-to-apples comparison as possible.
[3] Finally, would there be any reason why the C9's anti-reflective coating would be any WORSE than the G10's? (Naturally I'm concerned about this because of the amount of ambient light I'm dealing with.) I'm assuming anti-reflective coatings have only improved over time, but maybe plasma TVs had particularly good anti-reflective properties not matched by more modern televisions?
THANK YOU for any guidance you could offer here!