Samsung Q80T, Sony X900H or other

Spectator

Member
It depends how careful you are using it as a PC monitor, and what steps you take to prevent burn-in. If you use a screensaver and don't keep apps up all the time with the same static elements, your TV will last a lot longer than if you do.

We will have to wait and see if the X90J meets the minimum 1000 nits that's recommended for HDR. The XH90 didn't, but the XG90/XF90 before it did. Either way it's not going to be as good as the XH95 for HDR picture quality, and even if it does hit 1000 nits it won't be 1500 like the higher-tier Sony TVs.

MicroLED is a long way off at smaller screen sizes, but it will certainly be a game changer if it gets there. Not to be confused with mini-LED, which doesn't really make LCD TVs that much better.

Power consumption will be low on both TVs for SDR use, where the backlight (or OLED brightness, on OLED) runs low compared to HDR, where it's on full.
Sony is making this hard for me, as the X95J won't come in a 55" size. Are you pretty sure the X90J won't be bright enough to be future-proof for HDR? When can we expect reviews with nits measurements etc.? I'm pretty sure you have enough knowledge from previous generations to make a good guess.

Your remark about OLED made me more wary, which might be a good thing. I could buy a 100 € five-year warranty against burn-in (and it's possible the warranty is not good), but that would make the CX cost 1300 €, 300 € more than the XH95. I don't game very much anymore or need the absolute best quality (I have a 10-year-old TV now ***), so such an investment is not so sensible, especially with the burn-in risk for my usage.

So what I'm now thinking, as there won't be a good enough 55" TV in the next Sony generation (I do want the HDR capabilities to be future-proof), and I don't want to risk Samsung either (the 55" Q90T costs 1300 € here), is whether I should buy the XH95.

I'm not a gamer, but I like to have the option as I do game sometimes. I won't buy the best graphics cards either. I also sit 2.5 metres away from the 55" TV, so 4K is not that important.

I read that I might be able to force 1440p/120 Hz from the computer on this Sony XH95, even though it doesn't officially support that resolution? Is that so? That 1440p/120 Hz should be good enough for my distance and gaming needs. Also, HDMI 2.1 graphics cards are very expensive currently, especially if one plans to have the best graphics at 4K/120 Hz.

Are those other HDMI 2.1 features very important for gaming, or just nice things to have? Are they important for anything else? I think I will never buy a console, only play from a PC.

Also, now I'm thinking how big of an upgrade the new TV would be compared to my 10-year-old Full HD TV, a 52" Sony W5500. You have said elsewhere that it's not a big upgrade if the content is at most Full HD, and it seems all the broadcasts are around 720p/1080p.

So in short, my dilemma now is whether I should wait for X90J reviews so I can compare it to the XH95. It sucks that the X95J isn't coming in a 55" size, which makes the XH95 more likely for me. That XH95 offer of 1000 € is valid until 18.4 and they have about 100 units to sell.

And I'm still not sure that I won't need some of the HDMI 2.1 capabilities, like Dynamic HDR, in the future.

Another issue I have with the lack of HDMI 2.1 is that I can't get 4K/120 Hz out of it in normal desktop usage, even though the TV is capable of both. Will this matter if I watch sports from my computer through a browser?
 

Dodgexander

Moderator
We won't know about the X90J until reviews hit. It's hard to predict how it will perform, because last year's model was a regression in picture quality compared to the year before. This year could be similar, with the TV only having around 750 nits peak brightness, which isn't very bright for HDR.

If you aren't a big gamer, HDMI 2.1 and high refresh rates don't matter at all, and even if you were a gamer you'd need to buy a top-end graphics card to push out more than 60 fps at 4K resolution. Frame rate maps one-to-one onto refresh rate (one frame per Hz), so it's really pointless to refresh at 120 Hz if you can't push over 60 fps.

Since you view from 2.5 m, using 120 Hz at 1440p is certainly doable on HDMI 2.0, but again I'm not sure I see any reason for it if you don't game very much.

People have been gaming without problems at 4K/60 Hz up until now, so I don't really see the issue.
Sony is making this hard for me, as the X95J won't come in a 55" size. Are you pretty sure the X90J won't be bright enough to be future-proof for HDR? When can we expect reviews with nits measurements etc.? I'm pretty sure you have enough knowledge from previous generations to make a good guess.
It's an ongoing trend unfortunately: interest is moving away from smaller screen sizes, so the best-specced TVs are usually larger. It's harder to make money from smaller TVs than from larger ones.
Also, now I'm thinking how big of an upgrade the new TV would be compared to my 10-year-old Full HD TV, a 52" Sony W5500. You have said elsewhere that it's not a big upgrade if the content is at most Full HD, and it seems all the broadcasts are around 720p/1080p.
Depends how you'll use the TV. Since you are viewing from quite far on a small TV, you will probably find the biggest upgrade comes from HDR, which makes it even more important to focus on a TV that displays HDR well rather than on other factors. If you are mainly going to use the TV for SDR, don't bother upgrading; it's not worth it.
So in short, my dilemma now is whether I should wait for X90J reviews so I can compare it to the XH95. It sucks that the X95J isn't coming in a 55" size, which makes the XH95 more likely for me. That XH95 offer of 1000 € is valid until 18.4 and they have about 100 units to sell.
Indeed, if I were you I'd buy the XH95 now. The X90J will be overpriced on release anyway, so you'd be waiting a while to pick up that TV at a reasonable cost even if it were suitable. At best the X90J will offer 1000 nits of HDR performance and won't exceed the performance of the XH95.
And I'm still not sure that I won't need some of the HDMI 2.1 capabilities, like Dynamic HDR, in the future.
I don't think that's new to HDMI 2.1; it was introduced with HDMI 2.0. All the dynamic HDR formats such as HDR10+ and Dolby Vision are supported on HDMI 2.0 (Dolby Vision on HDMI 1.4, actually). They don't use extra bandwidth.

Future-proofing takes many different forms. To be honest, I think you are worrying too much about future-proofing connectivity and neglecting how important future-proofing HDR is. I have a 2016 TV with good HDR performance and it's up there with the best LCD TVs sold today. It runs HDMI 2.0 and doesn't use any dynamic-metadata HDR formats. What makes it hold up today isn't its connectivity or its smart TV features, but the fact that it has good HDR performance.
 

Spectator

Member
Thanks again!

I don't have a problem waiting another year if I get something more future-proof or better, but the X95J not coming in a 55" size makes this more difficult for me.

Another option is to risk the Samsung Q90T's stutter issue and wait to see if its price drops to 1000 €. It's currently 1300 € here, which is 100 € more than the CX, which is ridiculous.

One thing I'm still not certain about is 4K/120 Hz for sports, which I do watch. Can I get 4K/120 Hz from the internal apps of the TV? Because if I watch from a computer browser, it will be 4K/60 Hz.

Or would it be easy to change the resolution to 1440p/120 Hz for sports? Isn't 120 Hz good for watching sports? Can I get 4K/120 Hz sports from the XH95?

Are the broadcasts even 120 Hz? I'm not sure if I'm mixing things up here or asking stupid questions.
 

Dodgexander

Moderator
I've known people who've bought Samsung TVs and don't notice the stutter. It's all down to your own perception of motion. It's certainly not something you'd notice during gaming or PC use, but rather when watching video from the internal apps or the tuner.

To explain a little: 120 Hz panels have been used on TVs for some years now. Their purpose originally had nothing to do with accepting a 120 Hz input signal; it was so the TVs could use software to interpolate (create fake frames) to boost the motion clarity of video content. There are two methods of inserting frames nowadays. The first and oldest is called motion interpolation. On TVs these controls will be called something like Motion Plus (Samsung) or Motionflow (Sony), and they insert predicted frames to boost lower frame rate content up to 120 Hz. The unfortunate side effects are motion artefacts (caused by the inserted frames being predicted rather than real) and the soap opera effect. It's for this reason that many people have chosen in the past to keep this setting permanently disabled.

The other method of frame insertion is more modern and is called black frame insertion (or sometimes red frame insertion). This makes your TV flicker at a slower rate and inserts blank black or red frames in between each flicker, which tricks your eyes into perceiving less motion blur. The side effects are a darkened image and noticeable flicker (to some people more than others).

Each of these motion settings has its downsides, and many people don't use them at all. If they do, they use them on low settings to mitigate the side effects.
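To make the two ideas concrete, here's a rough toy sketch (my own simplification with made-up numbers; real TVs use motion-estimated frames rather than simple blending):

```python
# Toy illustration of the two frame-insertion ideas described above.
# A "frame" here is just a single brightness number, not a real image.

def interpolate_frames(frames, factor):
    """Motion interpolation (simplified): insert blended in-between frames
    so that, e.g., 24 fps content fills a 120 Hz panel (factor = 5)."""
    out = []
    for a, b in zip(frames, frames[1:]):
        for step in range(factor):
            t = step / factor
            out.append(a * (1 - t) + b * t)  # predicted/blended frame
    out.append(frames[-1])
    return out

def black_frame_insertion(frames, factor):
    """BFI (simplified): show each real frame once, then dark frames,
    trading brightness for less perceived motion blur."""
    out = []
    for f in frames:
        out.append(f)
        out.extend([0.0] * (factor - 1))  # dark frames in between
    return out

source_24fps = [10, 20, 30]                    # three source frames
print(interpolate_frames(source_24fps, 5))     # smooth ramp of in-between values
print(black_frame_insertion(source_24fps, 5))  # real frames separated by black
```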

It's only in recent times that TVs have also included the option to accept a 120 Hz input for gaming purposes... a by-product of their original purpose.

So TVs like the Sony XH95 or Samsung Q90T still have 120 Hz panels and will still refresh at 120 Hz even when you send them a 60 Hz signal. The difference is that the source device limits its output to 60 Hz rather than 120 Hz.

This stays true for all different frame rates. For example:

Watch a film or TV series on Netflix or similar: the frame rate will be 24 fps. The TV refreshes at 120 Hz, which is a multiple of 24, so it matches the content. No conversion is required.
The same is true for older TV series. They run at 25 fps (Europe) or 30 fps (America). 120 is again a multiple of 30, and for 25 the TV drops down to refresh at 100 Hz.
Again for sport: 50 fps (Europe) or 60 fps (America), and the TV switches between 100/120 Hz so the refresh rate is a multiple of the frame rate.

So a 120 Hz panel is definitely nice to have, and much better at avoiding motion problems than a 60 Hz TV... but it's not about being able to accept a 120 Hz signal, it's about being able to refresh at a multiple of the content's frame rate, and so that you can use the motion settings if you wish.
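To make the "multiple of" logic concrete, a small sketch (assuming, as above, a panel that can run at either 100 Hz or 120 Hz):

```python
# Pick the panel refresh rate as a whole-number multiple of the content
# frame rate, as described above (assumes the panel can run at 100 or 120 Hz).

PANEL_RATES_HZ = (120, 100)

def panel_refresh_for(content_fps):
    for rate in PANEL_RATES_HZ:
        if rate % content_fps == 0:
            return rate, rate // content_fps
    return None  # no clean multiple; the TV would need pulldown instead

for fps in (24, 25, 30, 50, 60):
    rate, multiple = panel_refresh_for(fps)
    print(f"{fps} fps content -> panel refreshes at {rate} Hz ({multiple}x, no judder)")
```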
 

Russa

Well-known Member
My LG C9 is far superior to my Samsung Q80 in every respect.

Maybe the Q90 is significantly better.... I doubt it though.
 

Spectator

Member
Okay, wow, you're a great help. Very interesting.

I think even my ancient W5500 has this fake 100hz.

So XH95 or Q90T it is. Do you have a final recommendation for my purposes between these two, if they would both cost 1000 €?

The difference is basically Dolby Vision (which I think is better for the future) vs. the HDMI 2.1 ports, plus the risk of Samsung stutter.

Samsung also has that mode where I can display a photo as a "frame" and it doesn't use as much power, which I would deem important. But I think something similar is possible with Sony.

Also, I have the impression that Sony leads in motion handling? How do these two models compare in that regard? Sports is an important factor for me.

Game mode on both has pretty good HDR, unlike the Q80T for example.
 

Spectator

Member
My LG C9 is far superior to my Samsung Q80 in every respect.

Maybe the Q90 is significantly better.... I doubt it though.
Well, for example, the Rtings HDR rating for the Q80T is 7.5, and Game Mode HDR is 6.7. For the Q90T it's 8.9 and 8.7.

The Sony XH95 is 8.7 and Game Mode 8.3; the XH90 is 6.9 and Game Mode 7.4.
 

Dodgexander

Moderator
Fake 100 Hz will just be a made-up figure from Sony. It probably means the TV is actually 50/60 Hz.

The Samsung Q90T is better for gaming, smart TV and future-proofing.
The Sony XH95 is better for sports.

Motion is very much in the eye of the beholder though; I've known people happy and unhappy with both TVs' motion.
 

Spectator

Member
I just don't like the risk of buying a Samsung, seeing how they treat this stutter issue. They don't seem to take it seriously at all. I don't have trust in the brand; I do trust Sony TVs from experience and from a few sources. Also, the sports benefit and Dolby Vision are nice.

I will keep an eye on the shop selling the XH95 at that sale price to see if it gets close to selling out in the next week, contemplate these matters a bit more, and see if the Q90T gets a price drop or a review comes out for the X90J.

Thanks again for the help; hope this helps others in deciding too. I would likely buy the CX if my use case were different, but it's too big of a risk IMO. For others it would be a clear choice.

I would love to see OLED in action, as it gets so much praise. But coming from my 10-year-old TV, I can only imagine the upgrade even with LCD, or maybe it isn't as big as I think.
 

Spectator

Member
Motion is very much in the eye of the beholder though; I've known people happy and unhappy with both TVs' motion.
One more thought. I'm not really in a hurry to upgrade my old TV, given my viewing distance and how I use it. If I have got this far with this 52" 1080p TV, image quality clearly isn't that big of a deal to me.

I'm still wondering whether I will see much of an improvement beyond motion and blacks for normal use, because the content usually isn't 4K yet, so in some sense it might even be worse?

Do you think I would be better off waiting a year or two for the brightness to come up in cheaper models, or buying that XH95 now that it is 1000 € and the next model won't come in a 55" size?

Wouldn't the next ~1000-nit model like the XH95 at the X90J's price level (~1000 €) be at least two years away, rather than this X90J?

I'm not totally sure I want to wait two years for that, because as we saw in Finland, the XH90 only got down to ~900 € at its lowest, and the XH95 is now 1000 €, so it seems like a bargain even without the HDMI 2.1 port. Maybe the lack of an HDMI 2.1 port is what pushes this model's price so low, because the Q90T is still 300 € more?

It's very weird that the XH90 only got down to 900 € at its lowest, and the XH95 is at 1000 € so soon after. The launch prices for 55" seem to have been about 1550 € vs 1950 €, so the 100 € gap now is much smaller than the roughly 400 € gap at the start.

Wouldn't you say the XH95 is a much better TV? 100 € is almost nothing. Isn't the XH95 more than 10% better in performance, which is the price difference, quite apart from HDMI 2.1?

Did they get most users over-hyped on HDMI 2.1, so that the XH95 is now a bargain while the XH90 sold out?
 

Dodgexander

Moderator
Pricing fluctuates a lot and can differ from region to region. The XH9505 and Q90T perform about the same overall and should be priced the same. In the UK the Q90T was always overpriced and never good value for money until very recently, when it saw a reduction. At some sizes it's now cheaper than the XH9505.

I don't think you can argue with paying 1000 € for the XH95; it will still be a good TV in many years' time even without HDMI 2.1.

The XH90 is not as good as the XH95, and I'd rate the difference at a lot more than 10%. The XH90 has an unfixable blur issue in its 120 Hz mode that makes 120 Hz next to redundant, and its peak brightness is too low for HDR. No dedicated picture processor either.

As for whether it's worth upgrading at all: only upgrade if you have the plans and means to enjoy 4K HDR material. Don't upgrade thinking a new TV is going to polish everything and make it all look better.
 

Spectator

Member
Oh, that's interesting about the 120 Hz mode on the XH90! It seems a lot of people can easily get fooled, especially here in Finland, looking at those prices and seeing the XH90 at 900 €. Of course HDMI 2.1 has some other possible benefits too, but I'm not so sure how useful those are in real life, especially for people who don't game a lot.

You don't think this 10-year-old LCD is a lot worse than the newest XH95 in, for example, blacks, brightness and motion? Even in SDR, Rtings rates the XH95 at 8.7 while the XH90 is 8.3. So I wonder what my W5500 would score; it must be ridiculous.

I would think it's still a huge upgrade, but maybe not? Doesn't even an LCD lose quality over time? This is the odd thing I've noticed in your (great) advice: you don't seem to think the technology has got that much better in all these years?
 

Dodgexander

Moderator
No, I think a new TV will be significantly better in terms of blacks. The problem is that things like blacks and contrast are only part of the puzzle. A 4K TV has to upscale more content than a 1080p TV (a 1080p source has to be stretched to four times as many pixels to fill a 4K panel), which can make the picture softer with older, poorer-quality material.

So that's why it's a good general rule to only upgrade if you have 4K HDR content in mind.
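To put rough numbers on the upscaling point (just arithmetic, nothing model-specific):

```python
# How much a 3840x2160 (4K) panel has to stretch common source resolutions.
PANEL_W, PANEL_H = 3840, 2160

sources = {"720p": (1280, 720), "1080p": (1920, 1080), "4K": (3840, 2160)}

for name, (w, h) in sources.items():
    linear = PANEL_W / w                  # scale factor per axis
    area = (PANEL_W * PANEL_H) / (w * h)  # panel pixels drawn per source pixel
    print(f"{name}: {linear:.1f}x per axis, {area:.0f} panel pixels per source pixel")
```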
 

Spectator

Member
That's really weird. I would have thought processing power etc. would have got so good that poor-quality old content would look at least as good, but apparently not.

Also, that general rule sounds odd. So you are saying I shouldn't upgrade this 10-year-old 52" 1080p TV for my usage, because my 4K HDR needs are not big enough?

I don't even have Netflix 4K currently, only the basic version. With that 52" TV and ~2.5 m viewing distance, I don't really need it yet, but I could upgrade. How much would I benefit from HDR otherwise? HDR is not available on the 1080p tier of a Netflix subscription?
 

Spectator

Member
No, I'm not saying anything about your usage. That's up to you to decide.
Just saying it's a good rule to only upgrade with 4K HDR content in mind, and not to expect much from a new TV otherwise.
Does that "4K HDR content" always have to be 4K, or can it be Full HD HDR? How much does one benefit from a 10-bit panel compared to 8-bit, especially in computer use?

I still don't understand how the Rtings review also gives the XH95 a better rating than the XH90 for SDR content, yet you say I shouldn't upgrade from my ancient TV even for that? Isn't the HDR stuff more like a bonus?
 

Dodgexander

Moderator
HDR usually comes hand in hand with 4K. With Netflix, for instance, if you only watch in HD there's no HDR. On Prime Video you have two choices: 4K with HDR, or HD with SDR.

Bit depth relates to how smooth the transition from one colour to another is. It doesn't have any significance for HDR. Yes, HDR is usually mastered at 10-bit, but it doesn't matter whether the TV can display 10-bit or not.

What matters for HDR on an LCD TV is colour gamut coverage, colour volume, colour accuracy, how bright the TV can get in small windows (2%, 10%, 20% window nits) and how good the local dimming is.

Not sure what you mean by the SDR vs HDR rating; they usually use a combination of measurements to create each score, so if you look at the finer details you should see in which areas the XH95 scores better. I'd take a wild guess and say it's something like viewing angles.
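To put rough numbers on the bit-depth point (just arithmetic, not tied to any particular panel):

```python
# Rough illustration of bit depth: how finely each panel can step a ramp
# from black to white on one colour channel.
def levels(bits: int) -> int:
    """Number of distinct code values available per channel."""
    return 2 ** bits

def step_size_percent(bits: int) -> float:
    """One step as a percentage of the full black-to-white range."""
    return 100.0 / (levels(bits) - 1)

for bits in (8, 10):
    print(f"{bits}-bit: {levels(bits)} levels per channel, "
          f"each step ~{step_size_percent(bits):.3f}% of the range")

# 8-bit:  256 levels, each step ~0.392% of the range
# 10-bit: 1024 levels, each step ~0.098% of the range
# A 10-bit ramp has 4x finer steps, which is what reduces visible banding
# in smooth gradients (skies, fades), independently of HDR itself.
```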
 

Spectator

Member
Ah, okay, good to know about 4K & HDR. Can a 10-bit panel be better for photo editing and viewing photos from a computer?

SDR brightness is rated here:

"Our tests

Our testing for SDR and HDR brightness is fairly straightforward. We use the most accurate picture settings because this is how most people will watch TV, while setting the backlight/brightness setting to max and using the recommended local dimming setting. We use a Konica Minolta LS-100 Luminance Meter to measure the brightness with different content.

We use a video and five test pictures to measure the brightness in both SDR and HDR. The real scene video is supposed to represent content in most shows or movies with bright scenes. Also, the test windows, especially the smaller ones, are meant to test for small highlights.

We use the same videos and pictures for testing the SDR Brightness, HDR Brightness, and HDR Brightness in Game Mode."

"SDR REAL SCENE BRIGHTNESS

The SDR Real Scene Peak Brightness test is most representative of real-world use. Before playing the video, we 'warm up' the TV so that the pixels aren't 'cold' for this test; almost like an athlete stretching their muscles before physical activity, it's important to get the pixels going before the test. We use the luminance meter and focus on the lamp in the upper-left side of the video for 30 seconds to get our final measurement. Anything above 365 cd/m2 should be good enough to combat glare in well-lit rooms. Also, keep in mind that the final luminance measurement can vary up to 20 cd/m2 between measurements."

 

Spectator

Member
I'm also not sure what to make of this statement from MOFO.

He says the XH95 is not as good in low light as the XH90, because it has some different processing or something.

Actually, I think he might mean this? So the XH95 has better viewing angles, but is not as good in low light. The Q90T seems to be on another level.

"Native Contrast
3170 : 1

Contrast with local dimming
3819 : 1

The Sony X950H has a great native contrast ratio, and it gets slightly better when local dimming is enabled. However, it's lower than what we would expect of a VA panel due to Sony's 'X-Wide Angle' layer, which improves viewing angles at the expense of lower contrast. That said, blacks still look deep, making it a good choice for dark room viewing. Note that the contrast ratio can vary between units."


Also the X90J seems to get some criticism in some places.

I'm now wondering if I'm better off waiting for a Q90T price drop. It's just so silly to pay the same or more for an LCD than for the CX, though.

 

Dodgexander

Moderator
SDR brightness doesn't really matter. Both TVs are more than bright enough for SDR. To give you an idea, most TVs before HDR were calibrated at around 110-120 nits full screen. HDR brightness is what counts, as HDR is mastered with as much as 10,000 nits in mind. In HDR, every TV puts its backlight on full to achieve maximum impact and colour volume.

On cheaper TVs with lower peak brightness these high peaks are lost and the image looks flat in HDR. It can vary from title to title, with some releases using an average light output that's a lot higher or lower than others.

Contrast ratio is better on the XH90 because it has no wide-viewing-angle filter; local dimming performs similarly to the XH95, but the key difference is HDR, where the brightness of the XH95 is a lot better. Rtings.com measure contrast in SDR mode; in HDR mode contrast will be a lot higher on the XH95 as it can get brighter.

The XH95 also has a dedicated picture processor, whilst the XH90 does everything on its built-in smart TV chip.

Dark-room performance is fine on the XH95, but you have to bear in mind the different strategies each manufacturer takes. Sony choose to preserve picture accuracy in HDR mode, so they don't control the black level quite as well as Samsung TVs do. That means that in a dark environment a Samsung TV like the Q90T will have better blacks and less blooming.

It's a trade-off though: Samsung do this by deviating from the recommended picture quality standards, so they also blow out bright highlights and crush some shadow detail compared to Sony.
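As a rough illustration of why a higher peak lifts effective contrast in HDR (the nit figures below are made-up examples, not measurements of the XH90 or XH95):

```python
# Illustrative only: contrast ratio is simply peak white divided by black level.
# The numbers are invented to show the effect of a brighter peak with the
# same black floor; they are not measurements of any particular TV.

def contrast_ratio(peak_white_nits, black_level_nits):
    return peak_white_nits / black_level_nits

black = 0.05  # hypothetical black floor, identical in both examples
for peak in (700, 1300):
    print(f"{peak} nits peak / {black} nits black -> {contrast_ratio(peak, black):,.0f}:1")
```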
 

Spectator

Member
What I meant about SDR brightness is that if current TVs can have different ratings for it, then surely my 10-year-old TV is much worse for SDR brightness. But maybe it doesn't matter at all, especially in a darker environment.

The new QN90A seems to be an even brighter TV, but is that development so important if one is not striving for the ultimate quality, just replacing a very old TV?

I'm still wondering if I should wait a couple of years with my upgrade, but Sony is making it harder because the X95J isn't available in a 55" size, so getting enough LCD quality at this size might be harder.

I also saw some complaints about the XR processor in the new Sonys, but I'm not sure if they can be trusted.

The developments don't seem to be huge at this point, and if the XH95 & Q90T can reach that Dolby Vision ~1000 nits recommendation, then this would seem like a good time to get a TV.

It just seems that the Q90T won't get a price drop here, because the big shop in question has fewer than 30 units of that model left, so I would have to go with the XH95 for 1000 €.

The next upgrade can come many years later, when MicroLED is affordable enough or OLED has got rid of most of the burn-in issues for desktop computer & sports use.

But what I'm getting from you is that I shouldn't upgrade my 10-year-old Full HD 52" at all if my use is photo editing, web browsing, sports and HD series/movies. I would only get benefits with Netflix 4K, and if I feel like playing games at some point?

It sounds unbelievable that the technology has not moved forward at all for basic uses in 10 years, or actually 11, since that's how old this model is.
 

Spectator

Member
Does anyone think this would be important? Getting 4K/120 Hz from the computer would be a benefit of HDMI 2.1.

Or would it make sense to use 1440p/120 Hz for desktop use from a 2.5 m distance with the XH95?

Edit: Actually, HDMI 2.0 doesn't seem to have the bandwidth for 1440p/120 Hz with 10-bit 4:4:4, and 4K/60 Hz 4:4:4 only fits at 8-bit; for 10-bit HDR it has to drop to 4:2:2 or 4:2:0.
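A rough back-of-the-envelope check (my assumptions: HDMI 2.0's 18 Gbps link leaves ~14.4 Gbps for video after 8b/10b encoding, 4K60 uses the standard 594 MHz pixel clock, and 1440p/120 a reduced-blanking timing of roughly 497 MHz; what a given TV actually accepts also depends on its EDID):

```python
# Back-of-the-envelope HDMI 2.0 bandwidth check. Pixel clocks and the
# usable-bandwidth figure are my assumptions, not official spec tables.

HDMI20_USABLE_GBPS = 18 * 8 / 10  # 18 Gbps link, 8b/10b encoding -> ~14.4 Gbps

modes = {
    "4K60 4:4:4 8-bit":          594e6 * 24,  # 24 bits per pixel
    "4K60 4:4:4 10-bit":         594e6 * 30,
    "4K60 4:2:2 (up to 12-bit)": 594e6 * 24,  # HDMI carries 4:2:2 at 24 bpp
    "1440p120 4:4:4 8-bit":      497e6 * 24,
    "1440p120 4:4:4 10-bit":     497e6 * 30,
}

for name, bits_per_second in modes.items():
    gbps = bits_per_second / 1e9
    verdict = "fits" if gbps <= HDMI20_USABLE_GBPS else "too much"
    print(f"{name:27s} ~{gbps:5.2f} Gbps -> {verdict} for HDMI 2.0")
```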


Of course I would need a new graphics card for 4K/120 Hz to work, and I won't buy one for a couple of years.

"Refresh Your Monitor​

When you look at a computer monitor, you’re not seeing a single, static image. You’re actually seeing the same image refreshed many times per second—creating the illusion that your screen is smooth. If you’re using an older monitor, you may notice it flickering slightly. This means the refresh rate is so low your eyes can see it. This is typical at the frequencies 59 Hz and 60 Hz, with the latter being the most commonly used refresh rate in laptops and LCD (flat-screen) monitors.

A higher refresh rate means a smoother-looking screen that’s easier on the eyes. So, if you’re trying to ease your eyestrain, a refresh rate of 120 Hz is optimal. There’s no need to pursue those high-end 144 Hz or 240 Hz monitors from Amazon or Best Buy. Unless you’re doing heavy gaming or video watching and editing, you most likely won’t see the difference between 120 Hz and anything higher. Instead, try using a 120 Hz screen for a few hours, then switch back to a 60 Hz monitor. The difference between the two is so palpable your eyes will start to strain almost immediately."

 

Dodgexander

Moderator
SDR brightness isn't important. Your current TV will be well below any new TV you buy now.

I tried to explain this in my last reply, but all the TVs you are considering have a 120 Hz refresh rate; you are only talking about the refresh rate the TV accepts. Eye strain will be affected by things such as PWM flicker (Samsung models tend to be worse for it than Sony) but not by the input refresh rate, as the TVs will always refresh at 120 Hz even with a 60 Hz input.

This is not the same behaviour as computer monitors, which refresh at exactly the rate of the signal you send.
It sounds unbelievable that the technology has not moved forward at all for basic uses in 10 years, or actually 11, since that's how old this model is.
You are comparing an LCD TV to other LCD TVs; of course it hasn't moved on that much. LCD is practically at the end of the road now, which is why OLED is becoming more popular and companies are investing in researching new technology. Samsung themselves are currently working on QD-OLED TVs, and companies like Panasonic and Philips don't even sell higher-end LCD TVs any more.

But the key factor is that a TV is only as good as its source quality, and you shouldn't expect a new model to polish everything and make it look better. It will look best when fed a native 4K signal, and be most impressive with HDR. If you send a less-than-native signal, upscaling is required, so whilst you'll benefit from the areas of the picture that have moved on a bit (blacks, contrast, colour), you'll take a step back in sharpness due to the upscaling.

To put it in the simplest way possible: only buy a new TV with a mind to use 4K HDR sources. If you aren't buying a TV to benefit from those, you'll be disappointed with your purchase.
 

Spectator

Member
Okay, thanks again for the patience and the answers. It seems like I could easily wait a couple more years, but I wonder if LCD TVs have already developed far enough that there won't be big upgrades in the next few years, even for HDR. I don't dare buy an OLED for my use, although it is a bit tempting to risk it.

I just don't like the blacks on my TV anymore, now that I have seen that they're not very good. It's hard to unsee. In certain TV shows, even in SDR, I can mostly only see the highlights when a scene is darker, and the blacks are either pretty grey or crushed.

I'm wondering how this Samsung JU7100 can do 4K/60 Hz 4:4:4, when the Wikipedia page says that HDMI 2.0 can only do 4K/60 Hz 4:2:2. I'm pretty sure it doesn't have HDMI 2.1?

I would think it is important to have 4:4:4 for desktop computer use, so text isn't blurred? So can I get this with the XH95, or do I need HDMI 2.1 for it? I'd rather not use 1440p/60 Hz for desktop use.

"Samsung UN55JU7100 4k @ 60Hz under PC mode
Chroma 4:4:4"

 

Spectator

Member
This TV arena seems like a total minefield. One more reputable tester said that the X90J has some serious problems, like very bad judder for gaming and a green cast in Dolby Vision. So I have no option among these new LCDs from Sony. Also, it seems the brightness is not as good as on the XH95, as you guessed.

With Samsung the issue is that the Q90T is almost sold out here and likely won't drop in price to 1000-1100 €. And the stutter issue actually isn't the only risk: many people say this year's Samsung LCDs very likely have dirty screen effect unless you happen to get a good panel. This is very bad for sports and games, I think. I don't fancy having to return a huge TV several times to get a good one. And as there are so few units left, who knows if the last ones have already been returned by other buyers.

The CX would otherwise be a perfect solution, but I just don't have the guts to risk burn-in with my desktop use. I tried switching to dark mode, and I would need to remove the favicons from the browser, use full-screen (F11) mode and maybe turn the brightness down quite low. But some pages still have bright areas, as does my sports channel.

So if I want to upgrade during the next year, it basically has to be the 55" XH95. Sony might never make a good LCD at this size again.

The only other option would be the Samsung QN90A next year for maybe 1300 € again, if that has less DSE, but is this a persistent problem with Samsung? Hmmh... and ARGH!
 
