Q80A vs X94J - 65 inch

sc1lcq

Hi all

I'm close to buying a new TV. I've looked at lots of models and I think the best for my needs is either the Samsung Q80A or the Sony X94J, both in 65 inch.

There seems to be some confusion in the UK as to whether the Samsung's panel is VA or not; does anyone know for sure? The user reviews seem very good. The Sony seems to get good write-ups all round.

I could get the Samsung slightly cheaper, which is appealing as it's just within budget, whilst the Sony is slightly over. Also, I have an old Samsung which I like, but I'm not loyal and am happy to change brand. I like it when the image almost pops and looks 3D, nice and bright; I constantly have HDR+ mode on on my current TV. Does the Sony have a similar setting? I like the option of having Q Symphony as I have a Samsung soundbar, but the picture is the most important thing to me, and that seems to be Sony's strength?

Any thoughts welcomed, please help me decide! It’s hard to tell in the shops! Thanks
 
Both are VA. Samsung offers higher brightness and more vivid colors. Sony aims for more detail in highlights and more accurate colors. Both fall into the entry-level HDR TV bracket.
 
Thanks. The Samsung is £190 cheaper than the Sony. If they're much of a muchness I'd be tempted to save the money, but if the Sony is considerably better I'd look to get that. I've seen some Sonys which looked dark compared to my current Samsung, which I don't like, but it could well be just the settings they were on!
 
Sony stays true to the source, especially in HDR. Samsung employs dynamic tone mapping to make HDR look brighter; see the example below. Settings can close the gap a bit, but the difference remains. Based on your current TV, your findings and their prices, I would get the Q80A.

[Image: Samsung QN95A side-by-side comparison - dusk time image]
 
Thanks. That does look vastly different; I much prefer the one on the right. I have an MU7000, so I think either of these will be a decent upgrade.
 
Let us know how you like it.
 
The image on the right is honestly horrible, and you are losing so much of the detail. Why people think blowing everything out with horrible dynamic tone mapping is a good thing, or even remotely attractive, I've no idea. You are basically losing all the detail in the image and turning it into a cartoon.

I've only recently had a TV with dynamic tone mapping and realised how truly awful it can be. Some implementations are better than others, but I turned it off on my LG as soon as I saw how much it blew the detail out. Same with this Samsung.

If you want a screen that bright, why not just buy a massive whacking blue light and be done with it?
 
Looking at my current Samsung, I don't use dynamic tone mapping. I agree some settings can completely ruin the image, and in all honesty I didn't look that closely at the images above, but I do much prefer a brighter image, and in my experience Sony TVs have been on the dark side. It might just be the settings of those particular Sonys, but I like my current Samsung setup, with good brightness but still decent detail. With the £400 difference it made sense to go with that one.
 
Samsungs, I think, have undefeatable tone mapping. If you look at those two images and don't immediately conclude the one on the left is orders of magnitude better, then I don't know what to tell you.

Why would anyone want an artificially brightened, overblown image? I know people do, I just do not understand it. You buy a 4K TV only to blow all the detail out and make it look like a much lower-resolution feed, just so your TV can be retina-searingly bright? I just don't get it. Even for daytime viewing I have an OLED with dynamic tone mapping off, and HDR is still more than bright enough.
 
Each to their own, mate. I wouldn't let it bother you so much; people like what they like :) All the best
 
Haven't you missed the point with HDR? It's supposed to be bright. If it's not bright, then you lose detail, not gain it. The problem with the X90J is that it's too dim to be considered a proper HDR TV. The Q80A isn't that much brighter, but having just a little extra really helps push it over the line.

For the record, Samsung do over-brighten their HDR images a little, but not by much compared to some brands. You can argue that on TVs that don't have high peak brightness it's more of a necessity than a negative.

Sony choose accuracy above everything else, but the X90J is very dim, so you have no detail whatsoever above its rated nits.

If you compared the same photo on a brighter TV like the Samsung QN95A or Sony X95J, the Q80A would look a lot closer to both than the X90J would.
 
Haven't you missed the point with HDR? It's supposed to be bright.

I don't know if you're being sarcastic here - but simply being bright is not how it is supposed to be.

The HDR picture is supposed to be bright, but only in certain areas of the picture.
In the sample picture above, only the inside of the lamp has the potential to be above 300 nits.

If you compared the same photo on a brighter TV like the Samsung QN95A or Sony X95J, the Q80A would look a lot closer to both than the X90J would.

If the sets are calibrated, the scene should look almost identical on any TV these days. This isn't an overly bright frame and shouldn't be a challenge to any set, to be fair. Between low-end and high-end you should see differences only in the highlights: the brightest parts of the picture.

Samsung is the VW of the TV world. When it detects it is being measured with a 10% window, it tracks the brightness perfectly. Change the window size to anything non-standard and it starts to massively over-brighten the picture.
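
For anyone who wants to put rough numbers on "only certain areas should be bright": HDR10 content encodes absolute luminance with the SMPTE ST 2084 (PQ) curve, so the signal itself dictates the nits. Here's a quick Python sketch using the published PQ constants (the code is just my illustration, not anything from a TV's firmware):

```python
# Published SMPTE ST 2084 (PQ) constants.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_to_nits(signal: float) -> float:
    """Map a normalised PQ signal (0..1) to absolute luminance in nits."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(round(pq_to_nits(0.5), 1))   # ~92.2 nits - a "half-way" signal is dim
print(round(pq_to_nits(0.75), 1))  # ~983 nits
print(round(pq_to_nits(1.0)))      # 10000 nits - reserved for tiny highlights
```

Most of a typical frame is mastered in the low hundreds of nits; only small highlights, like the inside of that lamp, sit anywhere near the top of the curve.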
 
I still think you are missing the point: being darker does not equate to more detail. More dynamic range is what brings more detail with HDR.
Provided the display tracks the EOTF correctly and has the potential to put out high brightness with more colour volume, HDR will be a lot better on brighter displays.

On displays that are too dim you get exactly what you describe: a loss of detail. Bright parts of the image are cut off, or even worse, the TV tries to tone map, meaning there's a lot of crushed detail throughout the entire picture.

I think the Sony X90J does a good job considering, but it shouldn't be compared to higher-end LCD models that have higher brightness, or to OLEDs with better detail in darker areas of the picture.

Remember, a display capable of being bright can still decide to preserve detail. But a display that can't get bright enough cannot physically create detail that's beyond its specifications.
 

Dynamic range is directly related to the contrast of a particular set. A contrast of 1000:1 results in around 10 stops of dynamic range, while 6000:1 will give about 12.5 stops.

Now, if you have two sets with the same contrast, both are capable of similar dynamic range. The brighter one will simply crush detail in the darks.
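
Those stop counts are just the base-2 logarithm of the contrast ratio, since each stop is a doubling of luminance. A quick Python check:

```python
import math

def stops(contrast_ratio: float) -> float:
    """Dynamic range in photographic stops; one stop = a doubling of luminance."""
    return math.log2(contrast_ratio)

print(round(stops(1000), 2))  # 9.97  -> "around 10 stops"
print(round(stops(6000), 2))  # 12.55 -> "about 12.5 stops"
```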
 
No, because if set up correctly the brighter TV will not over-brighten content. It will map the content exactly as it is meant to.

Some TVs will artificially over-brighten content, but this doesn't have anything to do with the peak brightness of the TV, only the algorithm used.

TVs follow an EOTF curve with HDR. The content 'tells' the TV it wants it to display x number of nits. If the content tells a TV that can display 2000 nits that it wants it to display just 600, it will.

The other way around: if the content 'tells' a TV that can't get bright enough to display 1000 nits, it won't, and detail beyond its nit count will be lost.

The contrast figures you see TVs tested at today are based on SDR ANSI test patterns and really have no relevance at all for HDR. HDR pushes the contrast of TVs a lot higher if they can get brighter.
 
No, because if set up correctly the brighter TV will not over-brighten content. It will map the content exactly as it is meant to.

That is a complete contradiction to what you said before. Before it was "HDR is supposed to be bright"...

Some TVs will artificially over-brighten content, but this doesn't have anything to do with the peak brightness of the TV, only the algorithm used.

Yes. That's what the sample from @ab12 was supposed to show - Samsung applies massive over-brightness.

TVs follow an EOTF curve with HDR. The content 'tells' the TV it wants it to display x number of nits. If the content tells a TV that can display 2000 nits that it wants it to display just 600, it will.

The other way around: if the content 'tells' a TV that can't get bright enough to display 1000 nits, it won't, and detail beyond its nit count will be lost.

Agreed. I never disputed that. The TV has two options: clip specular highlights or roll off earlier. Reference displays clip (a 1000-nit reference display will be 100% accurate up to 1000 nits, but will show nothing above that); consumer displays roll off earlier.
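
To make the two options concrete, here's a toy Python sketch. The 1000-nit peak, the 750-nit knee and the exponential curve are my own assumptions for illustration; real TVs use their own proprietary roll-off curves:

```python
import math

PEAK = 1000.0   # assumed display peak brightness in nits
KNEE = 750.0    # assumed point where the roll-off starts

def clip(nits: float) -> float:
    """Reference-display behaviour: track the EOTF exactly, then hard-clip."""
    return min(nits, PEAK)

def roll_off(nits: float) -> float:
    """Consumer behaviour: track up to the knee, then compress the rest
    toward PEAK so highlight gradations survive, just dimmer."""
    if nits <= KNEE:
        return nits
    headroom = PEAK - KNEE
    return KNEE + headroom * (1 - math.exp(-(nits - KNEE) / headroom))

for target in (500, 900, 1500, 4000):
    print(target, round(clip(target)), round(roll_off(target)))
# 1500 and 4000 both clip to 1000 (highlight detail gone); the roll-off keeps
# them distinct (988 vs 1000), at the cost of tracking accurately only to 750.
```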

The contrast figures you see TVs tested at today are based on SDR ANSI test patterns and really have no relevance at all for HDR. HDR pushes the contrast of TVs a lot higher if they can get brighter.

No, they can't. Contrast is a physical limitation of the panel, and it doesn't matter in which mode you measure it or how bright the backlight is. Contrast is the ability to differentiate between completely bright and completely dark content; it is the measure of the difference between how much light the panel can filter out and how much it can transmit.

With a contrast of 5000:1, the panel can filter out 99.98% of the light coming from the back*. So, if the backlight is producing 2500 nits, the panel is able to output anything between 2500 and 0.5 nits**. This TV cannot differentiate below 0.5 nits. A backlight of 250 nits gives results between 250 and 0.05 nits respectively.
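
The arithmetic behind those figures, in a few lines of Python (same numbers as above; just my illustration):

```python
def black_floor(backlight_nits: float, contrast: float) -> float:
    """The dimmest level a panel can show is its backlight output divided
    by its native contrast ratio (local dimming ignored, as above)."""
    return backlight_nits / contrast

print(black_floor(2500, 5000))  # 0.5 nits
print(black_floor(250, 5000))   # 0.05 nits
print(1 - 1 / 5000)             # 0.9998 -> the "99.98% filtered out" figure
```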

Similarly to the specular highlights, the TV has a decision to make about what to do with the tones that fall below the lowest possible value - in other words, which tone mapping to apply. The TV can accurately track brightness for as long as possible and clip shadows (dark details will be lost in the elevated blacks) or roll off earlier (dark details will be visible, but dark areas of the screen will be brighter than the content demands).

So, with all other things being equal:
  • a TV capable of very high brightness will lose some detail in dark areas, unless very sophisticated local dimming is applied;
  • a TV that over-brightens the picture not only ruins the creator's intent, but also risks losing detail in bright areas of the picture (because it simply has less room to work with).
* This is a slight simplification that assumes the LCD can be set up to transmit the full 100% of the backlight. Physical devices can't - but it doesn't change much in this explanation, as we can always imagine a slightly brighter backlight that would give the desired highlight brightness.

** All of that is while we're ignoring local dimming, of course.
 
That is a complete contradiction to what you said before. Before it was "HDR is supposed to be bright"...
If we are strictly talking about the sample image, you'd need a shot taken next to a reference monitor to know which is closer to how the image should be displayed. My argument was simply that the darker image does not necessarily have more detail, and that a TV capable of getting brighter can display darker areas of the image just as well.

The problem with the HDR implementation on some TVs, and the part I think you are trying to get to, is that some TVs artificially brighten parts of the image that shouldn't be as bright. This is something I hate, and something that should never be the case. It's related to the EOTF I've mentioned before, and I think it's also what you're getting at when you mention the 'decision making' TVs have to do.

Nonetheless, a TV that can display high nits and doesn't do this is better than a TV that cannot display high nits at all.
No, they can't. Contrast is a physical limitation of the panel, and it doesn't matter in which mode you measure it or how bright the backlight is. Contrast is the ability to differentiate between completely bright and completely dark content; it is the measure of the difference between how much light the panel can filter out and how much it can transmit.

With a contrast of 5000:1, the panel can filter out 99.98% of the light coming from the back*. So, if the backlight is producing 2500 nits, the panel is able to output anything between 2500 and 0.5 nits**. This TV cannot differentiate below 0.5 nits. A backlight of 250 nits gives results between 250 and 0.05 nits respectively.

Similarly to the specular highlights, the TV has a decision to make about what to do with the tones that fall below the lowest possible value - in other words, which tone mapping to apply. The TV can accurately track brightness for as long as possible and clip shadows (dark details will be lost in the elevated blacks) or roll off earlier (dark details will be visible, but dark areas of the screen will be brighter than the content demands).

So, with all other things being equal:
  • a TV capable of very high brightness will lose some detail in dark areas, unless very sophisticated local dimming is applied;
  • a TV that over-brightens the picture not only ruins the creator's intent, but also risks losing detail in bright areas of the picture (because it simply has less room to work with).
No, the test patterns used to measure contrast ratio are SDR only, and use the default brightness level of whatever preset they are measured in. Contrast would be through the roof on bright displays in HDR mode, especially with local dimming engaged. The widely used ANSI pattern is also measured with no local dimming at all.

With LCD TVs, native contrast measurements are only useful for SDR, and measurements without local dimming are even less useful unless you're only considering TVs without FALD systems.

To use an example of two TVs that both have an identically accurate EOTF: the Sony X90J and X95J. The X95J is much, much more impressive with HDR due to its increased contrast in HDR mode, despite it seemingly measuring quite low in its native contrast tests.

I'm not trying to argue, but I don't think I'm contradicting myself. Just trying to explain, so if others ever happen to read this, they get a bit more of the gist of what to look out for when buying a TV for its HDR merits.
 
