The great upscaling battle: Sony Bravia XR vs NVIDIA Shield


In this thread I'd like us to compare the upscaling performance of the Bravia XR engine with that of the NVIDIA Shield. This is an important topic because the Shield offers lossless passthrough of various audio formats that, to my understanding, no TV on the market today supports. It therefore has value beyond Sony's Android TV platform, particularly for those of us with high-quality digital media collections. That said, the new XR engine is being described by many as best in class. There are ways to force the Shield to output content at native resolution, so Shield owners can pick between the two, at least if they're willing to jump through a few hoops.

So, I'll begin with my initial subjective impressions of my 77A80J vs the Shield, using a 1080p remux for comparison.
Round 1: NVIDIA AI-Enhanced, Detail=High vs XR
To test the XR engine, I changed the settings within Plex to override the Shield's refresh rate and resolution ("Resolution Switching") so that all upscaling would be handled by the A80J. The A80J was the clear winner: the film grain was preserved and the finer details were much smoother. The Shield looked comparatively "digital" and noisy.

Round 2: NVIDIA AI-Enhanced, Detail=Medium vs XR
Setting Detail to Medium made the picture seem more organic, but it still looked slightly too sharp and noisy. The A80J wins again.

Round 3: NVIDIA AI-Enhanced, Detail=Low vs XR
This was the best the Shield looked. The picture felt slightly less sharp than with Detail set to Medium, but much more natural. Digital noise was only visible when looking closely at the film grain. The A80J preserved the details of objects and people more sharply while still making the film grain look natural.

Winner: A80J
Overall, the A80J preserved a more cinematic and organic feel while being sharper and less noisy than the Shield. That's not to say the Shield fared poorly; AI-Enhanced with Detail set to Low looked really good.

Bonus round: AI-Enhanced, Detail=Low vs Shield Basic
Surprisingly, both Basic and AI-Enhanced with Detail set to Low were tolerable to me, but the biggest difference was sharpness. AI-Enhanced was noticeably sharper without being much noisier; Basic looked softer. Both had a natural, organic feel and did not appear digitized the way the higher levels of detail enhancement did.

I was really only looking at sharpness, and wasn't able to accurately evaluate how the different engines handle other things like color and motion. From what I saw, both were good in those regards.

Of course, the best test is your own eyes, but hopefully this thread can help those fine-tuning their setups. I also think we need to be clear about what content is being watched - the NVIDIA may handle YouTube or streaming better than the Sony, but XR might be better for cinematic content.

What do y'all think?
