Michael7877
Established Member
This thread's been updated a couple of times as reviews have come out. I'm done updating this thread; my final opinion on the A90J is below:
Yesterday I went to my local big box store and explored the settings of an A90J. It was mounted beside an A9G.
After putting both sets in the most accurate "Custom" picture mode and adjusting everything to be the same (gamma, motion, etc.), I could not tell them apart. That's a lie: the Expert 1 colour temperature is slightly cooler (by 50-75 K) on the A90J - and probably closer to the correct 6500 K. I tested both SDR and HDR, and Vivid mode was brighter on the A90J. But since nobody who wants an accurate picture uses that mode, you might as well disregard the finding.
So, if you have an A9F/A9G/A8H (which are all basically the same), and possibly even an A8G, A8F, or A1E (which are also all basically the same), there is no point in upgrading. Unless you're a gamer - and maybe not even then: VRR doesn't work on the A90J yet, and last year's 900H still doesn't have it as promised (it was supposed to launch with it, and now the next series is launching without it). In summary, everything performed pretty much the same - I didn't see fewer or more interpolation errors, better colour, anything - except the brighter peaks in Vivid mode.
So, I think, if you need a good OLED TV and don't require HDMI 2.1 features (4K120/VRR), get an A8H or A9G right now for a good price while they're on sale - unless you use and appreciate Vivid mode, or you like spending more for the same thing, or waiting longer for no reason.
Edit: Also, unless you need 120 Hz BFI. The flicker on the lowest setting is tolerable (in the store, anyway) and noticeably improves motion resolution - the lowest BFI setting adds a further gain of roughly 40% of what is already achieved by using frame interpolation to turn 24 fps material into 120.
Previous update: I just came across a post showing that the A90J does ~1350 nits for 3 seconds, then drops down to 850-900. If that's true, and a normal calibrated mode works out to 1050-1100 nits for 3 seconds before dropping to 750 nits, that'd be great news. Most scenes in new movies are only 1-10 seconds, so as long as the dimming from the very high peak to the sustained peak is gradual (happens over 5-10 seconds and isn't visually obvious), the TVs should be great. Reviews will have to describe this - so far they don't. I'll leave the original post below as-is for now, because it has the sustained brightness measurements and no reviews have those details yet.
The LG G1 measures between 735 and 740 nits for 1, 2, 5, and 10% windows
And 736 nits peak on a 20% window (reviewed.com)
The Sony A90J measures 775 nits on a 10% window (pcmag)
And 723.3 nits peak with a 20% window (reviewed.com)
After being as forgiving as possible with the math,
Going from the 2020 A9G to the 2021 A90J is just a 12% increase (641 to 723 nits).
Going from the 2020 GX to the 2021 G1 is only a 5% increase (704 to 736 nits).
The numbers used for the comparison come from averaging Rtings' measurements for real-scene peak brightness and the 2% window. If just the 2% windows are used, the A90J is only 5% brighter than the A9G, and the G1 is actually dimmer than the GX - 8 percent dimmer, to be exact. And if the 768-nit 2% window of the A9G's cheaper sibling, the A8G, is used, the older set actually comes out about 6% brighter. It's worth mentioning that a 5% difference in peak brightness is pretty much imperceptible, even side by side, so it's very safe to say there's no brightness improvement with this year's sets. The 2016, 2017, 2018, 2019, and 2020 OLED TVs basically all have the same peak brightness.
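For anyone who wants to redo the arithmetic, here's a minimal sketch in plain Python using only the nits figures quoted above (taken from the reviews cited in this post) - swap in other measurements if you have them:

```python
# Quick percentage-difference check using the nits figures quoted above.
# These numbers come from the reviews cited in this post; swap in your own
# measurements if you have them.

def pct_brighter(old_nits: float, new_nits: float) -> float:
    """How much brighter (in %) the newer set measures versus the older one."""
    return (new_nits / old_nits - 1) * 100

comparisons = {
    "A9G (2020) -> A90J (2021)": (641, 723),
    "GX (2020)  -> G1 (2021)":   (704, 736),
}

for label, (old, new) in comparisons.items():
    print(f"{label}: {pct_brighter(old, new):+.1f}%")

# Prints roughly:
# A9G (2020) -> A90J (2021): +12.8%
# GX (2020)  -> G1 (2021):   +4.5%
```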
The G1 has a panel that's 20% more efficient, and the A90J drives all 4 subpixels at once - that's about all we get from the manufacturers regarding the non-existent "improved peak brightness!"...
My guess is they are using the efficiency improvement invisibly: to lengthen the lifespan of the panels and make it a bit harder to get burn-in.
Has anyone come across other reviews of either of these TVs?
LG G1
LG's top 2021 OLED has been (subtly) improved in almost every way
LG's subtle "2nd evolution" of OLED manages to justify the prices of these excellent TVs.
www.reviewed.com
Sony A90J
Sony’s bright and beautiful new OLED puts LG (and everyone else) on notice
Sony's new OLED TV is putting LG on notice.
www.reviewed.com
Update:
Sony Master Series XR-55A90J Review
The Sony Master Series A90J line of TVs has an impressively bright OLED panel with one of the widest color ranges we've seen, along with powerful audio performance on par with dedicated speakers, making it worth the high cost of entry.
www.pcmag.com
pcmag.com has full-field white at 180 nits; Rtings has the A9G at 147 nits in SDR.
180/147 = 1.22 - 22% brighter. The first decent improvement.
But it does 174 nits in game mode.
PCMag has the peak on a 10% window at 775 nits, 50 nits more than reviewed.com's figure.
For those who don't know, a 50-nit difference between units of the same model is normal tolerance. For now, it looks like if you buy an A90J you can reasonably expect a peak brightness of ~750 nits.
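As a quick sanity check on those two paragraphs - the 22% SDR gain and the ~750-nit expectation (which is roughly the midpoint of the two reviewers' peak figures) - here's the same arithmetic spelled out, using only the numbers reported above:

```python
# Rough sanity check of the SDR and peak numbers above (all nits figures as
# reported by pcmag.com, reviewed.com, and Rtings in the reviews cited here).

a90j_full_field_sdr = 180    # pcmag, full-field white
a9g_full_field_sdr = 147     # Rtings
print(f"Full-field SDR gain: {a90j_full_field_sdr / a9g_full_field_sdr:.2f}x")  # ~1.22x

# The two reviewers' peak-window numbers differ by ~50 nits, which is within
# normal unit-to-unit tolerance, so averaging them is a fair expectation.
a90j_peaks = [775, 723.3]    # pcmag 10% window, reviewed.com 20% window
print(f"Expected A90J peak: ~{sum(a90j_peaks) / len(a90j_peaks):.0f} nits")     # ~749 nits
```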
Another disappointment is the gamut measurement done by PCMag. It shows the green primary in the same spot as the 2016/17/18/19/20 TVs: a green that is not quite saturated enough and leans slightly toward yellow. In their review I think I saw them mention that the green subpixel had been improved - contradicted by their own chart. Lol.
It's not too detrimental to viewing that the most saturated green of DCI-P3 can't be reproduced by OLEDs; the difference it would make with real-world content is negligible. If the green were a very narrow 532 nm - as called for by Rec.2020 - that'd be amazing. I think that's a big ask of the technology right now, but halfway between DCI-P3 and Rec.2020 wouldn't be unreasonable. Content generated with cameras capable of fully capturing the Rec.2020 colour space (available soon) would be noticeably improved by a TV with a green subpixel halfway to Rec.2020 from DCI-P3, with outdoor scenes full of trees and plants benefiting the most.
Edit: After looking at the chart carefully, although green isn't improved as claimed, the red may be. I saw somewhere that the pre-2021 panels had red at 617 nm - the A90J might be closer to 630 nm if the picture is right. The difference this will make in picture quality is much smaller than the improvement a 532 nm green would bring, but it's there (if it's there) - with Rec.2020 content, obviously, since DCI-P3 red is 615 nm and already covered by previous OLEDs.
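To put the "halfway between DCI-P3 and Rec.2020" idea in concrete terms, here's a small sketch using the standard CIE 1931 xy coordinates of each gamut's primaries. Taking the straight xy midpoint is my own simplification - "halfway" could just as well be defined in u'v' or by dominant wavelength - but it shows roughly where such green and red primaries would land:

```python
# Illustration of the "halfway between DCI-P3 and Rec.2020" idea, using the
# standard CIE 1931 xy chromaticity coordinates of each gamut's primaries.
# Taking the straight xy midpoint is a simplification ("halfway" could also be
# defined in u'v' or by dominant wavelength), but it shows where such a
# primary would roughly land.

DCI_P3 = {"red": (0.680, 0.320), "green": (0.265, 0.690), "blue": (0.150, 0.060)}
REC2020 = {"red": (0.708, 0.292), "green": (0.170, 0.797), "blue": (0.131, 0.046)}

def midpoint(p, q):
    """xy chromaticity halfway between two primaries."""
    return tuple(round((a + b) / 2, 4) for a, b in zip(p, q))

for primary in ("green", "red"):
    print(primary, midpoint(DCI_P3[primary], REC2020[primary]))

# green (0.2175, 0.7435)   <- the hypothetical "halfway" green discussed above
# red (0.694, 0.306)
```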
I thought this was going to be the year. All I see is HDMI 2.1 - two years late.