A90J and G1 peak brightness updated - 3 second boost

Michael7877

This thread's been updated a couple of times as reviews have come out. I'm done updating it now; my final opinion on the A90J is below:

Yesterday I went to my local big box store and explored the settings of an A90J. It was mounted beside an A9G.
After putting both sets in the most accurate "Custom" picture mode and adjusting everything to be the same (gamma, motion, etc.), I could not tell them apart. That's a lie: the Expert 1 colour temperature is slightly cooler (50-75 K) on the A90J, and probably closer to the correct 6500 K. I tested both SDR and HDR, and Vivid mode was brighter on the A90J. But since nobody who wants an accurate picture uses that mode, you might as well disregard that finding.

So, if you have an A9F/A9G/A8H (which are all basically the same), and possibly even an A8G, A8F, or A1E (which are also all basically the same), there is no point in upgrading. Unless you're a gamer - and maybe not even then, since VRR doesn't work on the A90J yet, and last year's 900H still doesn't have the VRR it was promised (it was supposed to launch with it, and now the next series is launching without it). In summary, everything performed pretty much the same - I didn't see fewer or more interpolation errors, better colour, anything. Except the brighter peaks in Vivid mode.

So, I think, if you need a good OLED TV and don't require HDMI 2.1 features (4K120/VRR), get an A8H or A9G right now for a good price while they're on sale. Unless you use and appreciate Vivid mode, or like spending more for the same thing, or waiting longer for no reason.
Edit: Also, unless you need 120Hz BFI. The flicker on the lowest BFI setting is tolerable (in the store, anyway) and noticeably improves motion resolution - an additional ~40% on top of what is gained by using frame interpolation to turn 24fps material into 120fps.


Previous update: I just came across a post showing that the A90J does 1350ish nits for 3 seconds, then drops down to 850-900. If this is true, and in a normal calibrated mode that comes down to 1050-1100 nits for 3 seconds, dropping to 750 nits afterward, that'd be great news. Most scenes in new movies are only 1-10 seconds, so as long as the dimming from the very high peak to the sustained peak is gradual (happens over 5-10 seconds and isn't visually obvious) the TVs should be great. Reviews will have to describe this - so far they don't. I'll leave the original post below as it is for now because it has the sustained brightness measurements, and no reviews have the details yet.

The LG G1 measures between 735 and 740 nits on 1, 2, 5, and 10% windows,
and 736 nits peak on a 20% window (reviewed.com).

The Sony A90J measures 775 nits on a 10% window (PCMag),
and 723.3 nits peak on a 20% window (reviewed.com).


After being as forgiving as possible with the math:
The 2020 A9G to the 2021 A90J is just ~13% brighter (641 to 723 nits).
The 2020 GX to the 2021 G1 is only ~5% brighter (704 to 736 nits).

The numbers used for comparison come from averaging rtings' measurements for real-scene peak brightness and the 2% window. If just the 2% windows are used, the A90J is only 5% brighter than the A9G, and the G1 is actually dimmer than the GX - 8 percent dimmer, to be exact. And if the 768-nit 2% window of the A9G's cheaper sibling, the A8G, is used, the A8G comes out 6% brighter than the A90J. It's worth mentioning that a 5% difference in peak brightness is pretty much imperceptible, even side by side, so it's very safe to say there's no brightness improvement with this year's sets. 2016, 2017, 2018, 2019, and 2020 OLED TVs basically have the same peak brightness.
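If anyone wants to check the arithmetic, here's a quick sketch using only the nit figures quoted above (641 and 704 are the averaged rtings numbers for the A9G and GX; 723, 736, and 768 are the other figures discussed in this post). It's just my own illustration of the comparison, not rtings' or anyone else's methodology.

Code:
# Year-over-year brightness change, using only the figures quoted in this post.
def pct_brighter(old_nits, new_nits):
    """Percentage increase in peak brightness going from the old set to the new one."""
    return (new_nits / old_nits - 1) * 100

print(f"A9G (641) -> A90J (723): {pct_brighter(641, 723):+.1f}%")  # ~ +12.8%
print(f"GX  (704) -> G1   (736): {pct_brighter(704, 736):+.1f}%")  # ~ +4.5%
# Against the A8G's 768-nit 2% window, the older set actually comes out ahead:
print(f"A90J (723) vs A8G (768): the A8G is {pct_brighter(723, 768):.1f}% brighter")  # ~6.2%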

The G1 has a panel that's 20% more efficient and the A90J drives all four subpixels at once - the manufacturers' supposed reasons for the non-existent "improved peak brightness!"...

My guess is they're using the efficiency improvement invisibly: to lengthen the lifespan of the panels and make burn-in a bit harder to get.

Has anyone come across other reviews of either of these TVs?

LG G1

Sony A90J


Update:

pcmag.com has the A90J's full-field white at 180 nits. Rtings has the A9G at 147 nits in its calibrated SDR mode.

180/147 = 1.22 - 22% brighter. First decent improvement.

But the A9G does 174 nits in game mode, which puts the A90J only ~3% ahead.

PCMag also has the A90J's 10% window at 775 nits, 50 nits more than reviewed.com's 20% measurement.
For those who don't know, a 50-nit difference between units of the same model is within normal tolerance. For now, it looks like if you buy an A90J you can reasonably expect a peak brightness of ~750 nits.
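The same kind of sanity check for the full-field numbers above (180 nits from PCMag for the A90J; 147 and 174 nits from rtings for the A9G) - again just re-running the quoted figures:

Code:
# Full-field (100% window) white comparison, A90J vs A9G, using the quoted figures.
a90j_pcmag = 180   # nits, PCMag, A90J
a9g_sdr    = 147   # nits, rtings, A9G calibrated SDR
a9g_game   = 174   # nits, rtings, A9G game mode

print(f"vs A9G calibrated SDR: {a90j_pcmag / a9g_sdr:.2f}x ({(a90j_pcmag / a9g_sdr - 1) * 100:.0f}% brighter)")    # 1.22x, 22%
print(f"vs A9G game mode:      {a90j_pcmag / a9g_game:.2f}x ({(a90j_pcmag / a9g_game - 1) * 100:.0f}% brighter)")  # 1.03x, 3%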

Another disappointment is the gamut measurement done by PCMag. It shows the green primary in the same spot as the 2016/17/18/19/20 TVs: a green that is not quite saturated enough and leans slightly toward yellow. In their review I think I saw them mention that the green subpixel had been improved - contradicted by their own chart. Lol.
It's not too detrimental to viewing that the most saturated green of DCI-P3 can't be represented by OLEDs; the difference it would make with real-world content is negligible. If the green were a very narrow 532nm - as called for by Rec2020 - that'd be amazing. I think that's a big ask of the technology right now, but halfway between DCI-P3 and Rec2020 wouldn't be unreasonable. Content generated with cameras capable of fully capturing the Rec2020 colour space (available soon) would be noticeably improved by a TV with a green subpixel halfway to Rec2020 from DCI-P3, with outdoor scenes full of trees and plants benefitting the most.
Edit: After carefully looking at the chart, although green isn't improved as claimed, the red may be. I saw somewhere that the pre-2021 panels had red at 617nm - the A90J might be closer to 630nm if the picture is right. The difference this will make in picture quality is much smaller than what a 532nm green would bring, but it's there (if it's there). (With Rec2020 content, obviously - DCI-P3 red is 615nm, so it's already covered by previous OLEDs.)
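To put "halfway between DCI-P3 and Rec2020" in concrete terms, here's a small sketch using the standard CIE 1931 xy coordinates of the two green primaries (Rec2020's green is the monochromatic ~532nm primary) and simply taking the midpoint. The midpoint target is my own illustration of the idea, not anything from the reviews.

Code:
# Midpoint between the DCI-P3 and Rec2020 green primaries (CIE 1931 xy coordinates).
dci_p3_green  = (0.265, 0.690)
rec2020_green = (0.170, 0.797)   # corresponds to a monochromatic ~532 nm source

halfway_green = tuple((p + r) / 2 for p, r in zip(dci_p3_green, rec2020_green))
print(f"halfway green target: x = {halfway_green[0]:.3f}, y = {halfway_green[1]:.3f}")   # ~ (0.218, 0.744)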

I thought this was going to be the year. All I see is HDMI 2.1 - 2 years late
 
Thanks for the links - hadn't seen those before. I only scan-read Reviewed.com's piece but their HDR peak for the A90J is on a 20% window and most of the other sources I've seen have quoted 1,300 nits on a 10% window in Vivid mode (lower when calibrated).

Stop the FOMO did an A90J post-calibration interview with a US retailer called Value Electronics. IIRC their calibrator thought that the A90J could be touching 1,000 nits on small specular highlights - for some reason he had difficulty measuring those.

While any progress is always welcome, I agree it's disappointing that we haven't yet seen a step-change in WOLED brightness and colour gamut. Perhaps we have to wait for Samsung's QD-OLED (and whatever response LGD can make to it!) for a real change of gear in this technology.
 

Edit: PCMag did a 10% window on the A90J and measured it at 775 nits.
TVs are supposed to be able to hit their peak brightness over at least 18% of the screen. It looks like the A90J does, and starts dropping shortly after. A 20% measurement is unusual; it seems they rounded the 18% up to 20 for some reason. Lots of reviewers do 2, 5, 10, and 25% windows, skipping 18 - so we get it this way lol.

Disappointment is why I made this thread. QD-OLED, with dedicated RGB (sans white) subpixels, is what I plan to upgrade to now, so long as it doesn't have horrible ABL and goes over 1000 nits. Colour volume would be solved. Unless microLED does the same thing sooner.

I wouldn't be surprised if the 20% increase in efficiency is comparing 2016 panels to 2021 panels, and nothing but the normal incremental changes were made this year.

I want a display which can be used in both a bright room and a pitch black room, and has a quality image (high contrast required) in both.
 
"One TV technology to rule them all" - that would be the holy grail.

What are you using at the moment? I upgraded from a plasma to an OLED last year so the 'low brightness' argument against OLED doesn't cut a lot of ice with me. IIRC the brightest plasma ever made only achieved 200 nits peak on a 10% window - and my old set certainly wasn't the brightest plasma ever made.

From what I understand, reading Display Daily and other sources, a quantum leap in OLED brightness and colour gamut really depends on the development of a better blue emitter. If we could get a better blue, we could dispense with the colour filters in WOLED - which block a lot of light - and have a true RGB subpixel structure.
 
"One TV technology to rule them all" - that would be the holy grail.

What are you using at the moment? I upgraded from a plasma to an OLED last year so the 'low brightness' argument against OLED doesn't cut a lot of ice with me. IIRC the brightest plasma ever made only achieved 200 nits peak on a 10% window - and my old set certainly wasn't the brightest plasma ever made.

From what I understand, reading Display Daily and other sources, a quantum leap in OLED brightness and colour gamut really depends on the development of a better blue emitter. If we could get a better blue, we could dispense with the colour filters in WOLED - which block a lot of light - and have a true RGB subpixel structure.

I have an A8G, bought it almost a year and a half ago.

Removing the filters would improve the efficiency of an RGB-only display by a factor of 3-5 compared to one with filtered RGB elements (and lifespan along with it). The white subpixel in LG's panels does help efficiency when displaying normal images, though: most pixels are not very saturated in colour, and with most of the brightness coming from the unfiltered white element and just some colour being added by the filtered elements, power consumption and wear are minimized.
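A toy illustration of that point: for a mostly desaturated (near-white) pixel, a WRGB panel can deliver most of the luminance through the unfiltered white subpixel, while an RGB-with-filters design has to push everything through the filters. The 1/3 filter transmission below is an assumed round number purely for illustration, not a measured value.

Code:
# Rough relative OLED drive needed to reach the same luminance, with and without
# the unfiltered white subpixel. FILTER_TRANSMISSION is an assumed illustrative value.
FILTER_TRANSMISSION = 1 / 3   # assumed fraction of light that survives a colour filter

def relative_drive(target_luminance, fraction_from_white):
    """Relative emitter output needed when `fraction_from_white` of the luminance
    comes from the unfiltered white subpixel and the rest goes through filters."""
    unfiltered = target_luminance * fraction_from_white
    filtered = target_luminance * (1 - fraction_from_white) / FILTER_TRANSMISSION
    return unfiltered + filtered

print(f"WRGB, desaturated pixel (80% via white): {relative_drive(100, 0.8):.0f}")  # 140
print(f"Everything through colour filters:       {relative_drive(100, 0.0):.0f}")  # 300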

I don't know much about recent OLED developments, but I do know blue is a problem. Being the shortest wavelength, it is the highest-energy light and degrades fastest.

As for the Rec2020 green at 532nm: if you tried to filter it out of the white emitter LG is using, its output would be too low. The colour would lack peak brightness and would probably degrade even faster than their red currently does. They could do it and add it as a second green subpixel; it'd widen the gamut through at least the lower brightness range. Probably not worth the trouble.

They'll figure out both problems, or implement what they've already figured out - as soon as they've milked what they're selling now for as much as they can lol
 
Sony A90J

LG G1

The Sony A90J measures 723.3 nits peak at 20% window (reviewed.com)
And 775 nits 10% window (pcmag linked below)

And the LG G1 measures 736.7 nits peak

Neither of these numbers is impressive when compared to 2016, 2017, 2018, 2019, or 2020 OLED TVs. They're actually disappointing, because the G1 has a panel that's 20% more efficient and the A90J drives all four subpixels at once - the supposed reasons for an improved peak brightness!

The 2020 A9G to 2021 A90J is just 12% brighter. (641 to 723 nits)
The 2020 GX to 2021 G1 is only 5% brighter. (704 to 736 nits)

And that's after being as forgiving as possible with the math. It's worth mentioning that a 10% difference in peak brightness is almost imperceptible, even side by side.

The 641 and 704 used for comparison come from averaging rtings measurements for real scene peak and 2% window.

It could be said that reviewed.com doesn't do the most thorough job with their measurements, but they are pretty accurate. As far as I can tell, they're the only site that has taken measurements in their reviews of either of these TVs. They clocked my A8G at 667.8 nits, which I think is about right. Rtings has it at 616 nits "real scene" brightness, and this site has it at 599 nits after calibration to D65. If anything, reviewed.com is generous.

Has anyone come across other reviews of either of these TVs with different measurements? Full field white over 200 nits (pref 250), and 10% window over 1000?

Update:

pcmag.com has full-field white at 180 nits. Rtings has the A9G at 174 nits full-field white in game mode. Let's be generous and assume that game mode isn't calibrated; the SDR mode probably is at D65, and it measures 147.

180/147 = 1.22 - 22% brighter. First decent improvement

Still, 2020's GX does 168 nits in SDR, so the new Sony is only 7% brighter than it.

Also, they have the 10% window on the A90J at 775 nits - 50 nits brighter than reviewed.com's measurement.
For those who don't know, a 50-nit difference between units of the same model is within normal tolerance. For now, it looks like if you buy an A90J you can reasonably expect a peak brightness of ~750 nits.

Another disappointment is the gamut measurement done by PCMag. It shows the green primary in the same spot as the 2016/17/18/19/20 TVs: a green that is not quite saturated enough and leans slightly toward yellow. In their review I think I saw them mention that the green subpixel had been improved - contradicted by their own chart. Lol.
It's not too detrimental to viewing that the most saturated green of DCI-P3 can't be represented by OLEDs; the difference it would make with real-world content is negligible. If the green were a very narrow 532nm - as called for by Rec2020 - that'd be amazing. I think that's a big ask of the technology right now, but halfway between DCI-P3 and Rec2020 wouldn't be unreasonable. Content generated with cameras capable of fully capturing the Rec2020 colour space (available soon) would be noticeably improved by a TV with a green subpixel halfway to Rec2020 from DCI-P3, with outdoor scenes full of trees and plants benefitting the most.
Edit: After carefully looking at the chart, although green isn't improved as claimed, the red may be. I saw somewhere that the pre-2021 panels had red at 617nm - the A90J might be closer to 630nm if the picture is right. The difference this will make in picture quality is much smaller than what a 532nm green would bring, but it's there (if it's there). (With Rec2020 content, obviously - DCI-P3 red is 615nm, so it's already covered by previous OLEDs.)

Is anyone else as disappointed with this year's TVs as me? I thought this was going to be the year. All I see is HDMI 2.1 - 2 years late
Here is another full review for you:
Also, if you have a look on Twitter, he has done calibrated brightness comparisons similar to what you have already posted.
https://pbs.twimg.com/media/EwdcFb2WYAYZ_wM?format=jpg&name=medium -A90J
https://pbs.twimg.com/media/Ews-UscWgAMjgd3?format=jpg&name=large -G1
 
Thanks, I'll add the links and information they contain in the first post
 
Looking at rtings' HDR brightness measurements for the LG CX, is it just me, or does there not seem to be as big a jump as expected for the G1's Evo panel in comparison?

Currently I can bag the 65" CX for £1,619; overhauled webOS aside, I'd expect the C1 to be otherwise borderline identical.

My main areas of concern with LG are chrominance overshoot on HD streaming content and the VRR raised-gamma and flickering issues I have seen reported. I understand the VRR issues may have been mitigated with a very recent patch, though.

I'm keeping my ear to the ground in the meantime for info on Panasonic's JZ1000. If it's the same as last year but with all the gaming features on 2x HDMI 2.1 ports, I may just wait for that to get some cuts later in the year.
 

It seems there's not much difference. It won't be long now before the reviewers who are commenting that there are improvements publish their reviews with measurements.
If the sale isn't ending imminently, you could wait. Still, I doubt the performance the C1 ends up having will make it worth the price premium you'll pay for it over the CX.
 
Ordered a 55" G1 for 1,800 euros in Holland. Current TV is a Sammy MU9000 so I can't wait :)
 
A 50-100 nit increase on an OLED should be easily noticeable, especially in a dark room. It won't be night and day, but every little helps. Also, 2021 TVs with the new panel are supposed to draw less current, which means lower energy bills (???).

It's been proven many times already that the LG G1 and Sony A90J are indeed 10-20% brighter than the 2020 models and a few notches above the 2018/2019 ones.
 
Could you provide the article about the lower current draw? Sounds interesting. Thanks.
 
I can't find a browser on the Sony, but there is a speed test hidden in the tools. The Wi-Fi reaches 200 Mbps while the Ethernet connection seems stuck at 50 Mbps.
 

Attachments: two screenshots (JPEG).
A 50-100 nit increase on an OLED should be easily noticeable, especially in a dark room. It won't be night and day, but every little helps. Also, 2021 TVs with the new panel are supposed to draw less current, which means lower energy bills (???).

It's been proven many times already that the LG G1 and Sony A90J are indeed 10-20% brighter than the 2020 models and a few notches above the 2018/2019 ones.

50 nits is negligible. One A90J can be 750 nits while the next one off the line is 810. You don't see people returning TVs over it, do you?
Now extend that thinking to upgrading: why would someone upgrade for 50 nits? It's not much of a leap - in some measurements it's even a step back.

About efficiency: supposedly they're 20% more efficient, but they are no brighter - 725-840 nits in multiple reviews for the A90J (not counting Vivid mode, which wrecks the picture and is unusable). I'd bet a lot of money no reviewers will do a proper power-consumption comparison. Edit: there is no standard for measuring it - if Sony or LG wants the number slightly lower for marketing purposes, it's easily doable without being "illegal".

My A8G (a 2019 set by the way) does 768 nits in its most accurate picture mode, which is smack dab in the middle of the A90J's range of tested peak brightnesses.

This isn't a discussion - there hasn't been an improvement in brightness. The huge improvements we were led to believe were coming have not arrived. It sucks, but that's how it goes. Better luck next year. Maybe HDMI 2.1 will work out of the box. I won't buy a new TV until it does
 
I can't find a browser on the Sony, but there is a speed test hidden in the tools. The Wi-Fi reaches 200 Mbps while the Ethernet connection seems stuck at 50 Mbps.
That speed test tool is terrible too - it's very inconsistent, so take what it says with a pinch of salt. The TV's SoC isn't powerful enough to drive a much higher speed either.
 
50 nits is negligible. One A90J can be 750 nits while the next one off the line is 810. You don't see people returning TVs over it, do you?
Now extend that thinking to upgrading: why would someone upgrade for 50 nits? It's not much of a leap - in some measurements it's even a step back.

About efficiency: supposedly they're 20% more efficient, but they are no brighter - 725-840 nits in multiple reviews for the A90J (not counting Vivid mode, which wrecks the picture and is unusable). I'd bet a lot of money no reviewers will do a proper power-consumption comparison. Edit: there is no standard for measuring it - if Sony or LG wants the number slightly lower for marketing purposes, it's easily doable without being "illegal".

My A8G (a 2019 set by the way) does 768 nits in its most accurate picture mode, which is smack dab in the middle of the A90J's range of tested peak brightnesses.

This isn't a discussion - there hasn't been an improvement in brightness. The huge improvements we were led to believe were coming have not arrived. It sucks, but that's how it goes. Better luck next year. Maybe HDMI 2.1 will work out of the box. I won't buy a new TV until it does

50 nits on an OLED is not negligible IMO, especially at the lower end, but side by side is the best way to figure out whether the added brightness helps.
 
Agree. When you're talking about the picture quality/HDR experience of 2020 OLEDs, you aren't going to see huge improvements. The picture quality/HDR experience of these TVs is pretty close to perfect already; any little improvement is good.
 
50 nits on an OLED is not negligible IMO, especially at the lower end, but side by side is the best way to figure out whether the added brightness helps.
You're entitled to an opinion, but not an uninformed one. 1000 nits only looks about twice as bright as 100 nits - brightness perception isn't linear. 50 nits between 700 and 800 nits is negligible; perceptually it's much less than a 5% difference.

Edit: and what do you mean by "on the lower end"?
All of the improvement of peak brightness is on the "peak", by definition...
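For anyone curious how "much less than 5%" squares with the raw nit numbers, here's a rough sketch using a Stevens-style power-law approximation for perceived brightness (roughly the cube root of luminance). The model and exponent are my own illustration, not something measured in the reviews.

Code:
# Approximate perceived-brightness ratios using a power-law model.
EXPONENT = 0.33   # assumed Stevens-style exponent for brightness perception

def perceived_ratio(nits_a, nits_b):
    """Approximate ratio of perceived brightness between two luminance levels."""
    return (nits_b / nits_a) ** EXPONENT

print(f"100 -> 1000 nits looks ~{perceived_ratio(100, 1000):.1f}x as bright")  # ~2.1x
print(f"700 ->  750 nits looks ~{perceived_ratio(700, 750):.2f}x as bright")   # ~1.02x, i.e. about 2%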
 
Sorry, but he IS entitled to an opinion, just as you are. Whether you judge it to be uninformed or otherwise is irrelevant; it's an OPINION and he is fully entitled to share it on this forum as others are.
 

Btw, I do fully understand what @Michael7877 is trying to say.

I know it doesn't sound like a big deal when we look at the numbers, but the way I see it, every little helps with an OLED.

With the lights off and doing a side-by-side, these differences should become noticeable.

A 50-nit increase at the lower end of the scale and 100+ nits at the higher end, combined with new panel properties that let certain sub-pixels be driven harder, plus improved picture processing, would all help.
 
For any casual readers of this topic/thread, the link below may be useful (though it's not a technical write-up, it helps explain some of the issues being debated).

(I hope it's accurate - I haven't fully read it as I'm supposed to be working!)
 
