Disney+ 4K, really?

dan~

Established Member
Joined: May 7, 2009
I’ve got an LG C1 and recently got Disney+, and everything so far is 4K, Dolby Vision & Atmos.

From what I’ve seen so far, though, I’d hardly say it’s 4K. If I watch something on Netflix like Blue Planet, I can tell that’s 4K easily.

Does anyone else have the same? I’ve pretty much only watched Marvel stuff so far, plus Mulan earlier, and I’m just not convinced it’s 4K.

I’ve not watched anything on National Geographic to find something similar to Blue Planet. Is there anything on there you'd recommend to show off 4K and to check it’s actually working?

I use the built-in apps on the TV, with VM 220Mbps internet, so ample for 4K.

The TV is connected to my 5GHz WiFi, and the Netflix app's built-in speed checker confirms 220Mbps. Like I said, using the Netflix app I can tell it’s 4K.

Thanks
 
No issues on my Disney+, and I've compared it to the 4K Marvel discs I own too.

I would use Ethernet rather than WiFi, as WiFi is prone to dropouts and interference, which will affect signal quality.
 
Which Marvel films have you tried it on?

I’ve only watched the new series so far, like Hawkeye and WandaVision.

I’m about to fire up Endgame for a test and will see what that’s like. If that’s good, then it’s obviously just the other programmes that aren't great in 4K.
 
I've read over on the TV section that some native apps built into the TV aren't as good as when they're on, say, an Apple TV 4K box.

Maybe have a look in the LG TV section, where there is a thread for LG C1 owners.

 
Which Marvel films have you tried it on?

I’ve only watched the new series so far, like Hawkeye and WandaVision.

I’m about to fire up Endgame for a test and will see what that’s like. If that’s good, then it’s obviously just the other programmes that aren't great in 4K.
Guardians of the Galaxy and Endgame.

On Disney+ the Atmos is not as good as disc, but the picture should be very similar.

In what way did you think the picture was not 4K?

Blue Planet is reference picture quality, so it will 'pop' more than other 4K content. I have it on disc and it outshines other 4K discs in picture quality.
 
Just played some of Endgame and it looked perfect, so I guess it’s just the series I’ve been watching. It says 4K but looks more like 1080p, which is odd.
 
So the plot thickens a bit. Just tried to watch Iron Man and it kept dropping to bad quality, then stopping altogether.

Fired up the Netflix app on the LG, went to the help section and ran a network test; it came back as 3.4Mbps, which was wrong.

Looked at the LG network settings and it had connected to my 2.4GHz band and not my 5GHz, so I guess when I set it up months ago I set it incorrectly. Weird how Netflix has been fine.

Connected on 5GHz, re-ran the Netflix help test, and it confirmed 230Mbps. Iron Man is perfect now.

So all along, the series I’ve been watching on Disney+ could have looked a lot better 😂
 
My Netflix speed test shows 53Mbps via the Google TV Chromecast thingy.
Go into Disney+ and check the app settings, and it's set to the highest quality, which maxes out at 4GB/hr.
I'm not sure how to work out the maths between the two, but I'm sure that 4GB/hr is a LOT less data than 53Mbps...

#edit

Just thought I'd check the same data via the TV apps...
Disney+ on the TV app is shown as 4K UHD 7.7GB/hr, almost double the Google TV.
Now the Google TV app looks absolutely stunning and much better than the TV app...
Then I remembered: the Google TV is set to Dolby Vision, which maxes out at 1080p and 60Hz...
I switched it back to 4K 60Hz and went back to Disney+, and now that is 4K UHD 7.7GB/hr...
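As a rough sanity check, a GB-per-hour figure and a bitrate in Mbps convert into each other directly. This sketch assumes decimal units (1 GB = 8,000 megabits), which is how streaming apps usually quote data:

```python
def gb_per_hour_to_mbps(gb_per_hour: float) -> float:
    # 1 GB = 8,000 megabits (decimal units); 3,600 seconds per hour
    return gb_per_hour * 8000 / 3600

def mbps_to_gb_per_hour(mbps: float) -> float:
    # Inverse: sustained bitrate in Mbps -> data used per hour in GB
    return mbps * 3600 / 8000

# Disney+'s quoted 7.7GB/hr for 4K UHD works out at roughly 17Mbps sustained
print(round(gb_per_hour_to_mbps(7.7), 1))  # 17.1
```

By that maths, 7.7GB/hr is only about a 17Mbps sustained stream, so it fits comfortably inside a 53Mbps connection.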
 
There is a difference between Ethernet and Wi-Fi.
If you want the best streaming picture quality, you should go wired if possible.
 

That's simply not true. Provided you have a stable connection, wireless or wired, the picture quality is the same.
 
It moves more data; even if you can't see the difference, it's still there!
 
I've tested it myself, so I know it's true.
It held no matter which device, which is why I now have Ethernet for all my devices.

It will probably be a while before a wireless connection can move as much data as a wired one.
 

But let's say a streaming service, Netflix for example, pulls in 15Mbps for 4K. I get a stable 200Mbps over WiFi between my Apple TV 4K and router, so why would a wired connection improve picture quality?
I'm sorry, but I don't buy that at all.
 
If you don't believe me, test it for yourself.
Watch how much more data moves when it's wired.

You'll probably need a UHD source to see the difference.
 
That's simply not true. Provided you have a stable connection, wireless or wired, the picture quality is the same.
That is not true, and it's why businesses will use Ethernet over WiFi where possible.

WiFi is subject to interference and dropouts in the data it carries, from everything from neighbouring WiFi signals to microwaves. Optimal signal strength is around -30dBm, and down to -70dBm at a push you will still get a good signal, though speeds will be slower at -60dBm to -70dBm. Once it drops below that, quality is significantly affected with lag, dropouts etc., and spikes of interference can see signals of -90dBm or lower occur.

You can get lucky and be in a house which has no interference, and then suddenly something changes at a neighbouring property or another external interference source, and you have real difficulty identifying the cause.

You can try what businesses do and use multiple access points, or use a mesh system like Orbi, which operates in a similar way to business access-point WiFi systems. But that is more expensive than cabling, or than using powerline to get Ethernet to an area, and it is not guaranteed to fix the issue.

Ethernet is stable and not subject to that interference. Even in hospitals, where interference is very high in some areas from X-rays, MRIs and other sources, Ethernet will always work. It may need shielded cables, but it will not have the issues WiFi has with stability, signal, and data loss in the network traffic.

Interference on WiFi means signal loss, and signal loss means picture and sound are affected when streaming video, with pixelation in extreme cases.

I would go with Cat5e/Cat6 over WiFi if I had the choice, and cable every room where possible. WiFi on a TV offers no benefit unless you require speeds over 100Mbps on the TV; the only things that require that are streaming 4K ripped discs from a NAS or Sony's Bravia Core service. If manufacturers ever put gigabit NICs in TVs, then speeds faster than 100Mbps will be possible over Ethernet too.

WiFi is good for tablets and other handheld devices, but that is it.
And for businesses, that is predominantly what their WiFi is for: tablets, laptops etc., or areas where cabling is problematic, for example in older buildings with asbestos in the walls and roof space.
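The signal-strength bands described above can be sketched as a rough classifier. The thresholds are the approximate dBm figures from this post, not hard limits; real adapters and environments vary:

```python
def wifi_quality(rssi_dbm: int) -> str:
    """Classify a Wi-Fi RSSI reading (in dBm) using the rough bands above."""
    if rssi_dbm >= -30:
        return "excellent"  # about as strong as it gets
    if rssi_dbm >= -60:
        return "good"       # plenty for 4K streaming
    if rssi_dbm >= -70:
        return "usable"     # slower, but still workable at a push
    return "poor"           # lag, dropouts and interference spikes likely

print(wifi_quality(-45))  # good
```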
 
I guess it’s just that I’ve not seen any posts about “my picture quality looks worse over Wi-Fi”.

Apart from the stability side of things, of course.
 
I've tested it myself, so I know it's true.
It held no matter which device, which is why I now have Ethernet for all my devices.

It will probably be a while before a wireless connection can move as much data as a wired one.
Depends on the speed of the Ethernet port and the WiFi. The Ethernet port on an LG G1 TV, for example, is only 100Mbit; I get far higher data rates over WiFi, almost 200Mbit.

As long as your WiFi isn't fluctuating and you can maintain the data rate for max quality, around 16Mbit on D+, there will not be any difference between WiFi and wired.

Only if your WiFi connection has trouble keeping a stable 20Mbit would you be better off wired.
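Put another way, the only question is whether the link sustains the stream's bitrate with some margin for fluctuation. This sketch uses an arbitrary 1.5x headroom factor purely for illustration; it is not a figure from the thread:

```python
def can_sustain_stream(link_mbps: float, stream_mbps: float,
                       headroom: float = 1.5) -> bool:
    # True if the measured link speed comfortably exceeds the stream's
    # bitrate. The 1.5x headroom is an arbitrary safety margin.
    return link_mbps >= stream_mbps * headroom

print(can_sustain_stream(200, 16))  # True: ~200Mbps WiFi vs a ~16Mbps stream
print(can_sustain_stream(20, 16))   # False: too little margin for fluctuation
```

Once the check passes, extra bandwidth buys nothing: the stream's bitrate, not the link speed, caps the picture quality.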
 
I guess you're doing something wrong then, because I can get connections up to 480 Mbps wired.

I guess it’s just that I’ve not seen any posts about “my picture quality looks worse over Wi-Fi”.
I noticed you agreed with the previous post, so I take it you haven't actually tested it yet.
 
I guess you're doing something wrong then, because I can get connections up to 480 Mbps wired.
Not on an Ethernet cable plugged into a 100Mbit Ethernet port you won't.
Like I said, it depends on the port and the WiFi.

For me, WiFi is faster. Data is data; the TV's decoder doesn't care if it arrived over WiFi or a wire, and as long as the bandwidth is sufficient there is no difference.
 
I guess you're doing something wrong then, because I can get connections up to 480 Mbps wired.

Which model TV do you have, and have they changed it to a gigabit port recently? If it's just a 100Mbps port, then your software testing is telling porkies.

If you don't believe me, test it for yourself.
Watch how much more data moves when it's wired.

You'll probably need a UHD source to see the difference.

For Netflix the UHD bitrate is 15Mbps (a Covid "benefit" :( ), and I've forgotten which button it is, but on my CX you can see the bitrate and clearly see when it drops below that.

It doesn't matter if it's on WiFi or Ethernet: so long as you can maintain that 15Mbps, there isn't going to be any difference in PQ. Wired does not move more data for the same size/bitrate, as that would be inefficient.

Having said that, I'm an advocate for hard wiring too, and have played around with the TVs' WiFi and Ethernet (due to the 100Mbps port). They are currently connected via USB-to-Ethernet adapters, and my smaller TV gives north of 200Mbps when it feels like it. The larger one is a bit more variable though, and I think, like others, some models have an issue with their network stack. (They are both CXs.)

When it comes to moving other types of data around, then yes, wired is generally better and not subject to contention (which will impact speed) or interference etc.
 
I guess you're doing something wrong then, because I can get connections up to 480 Mbps wired.


I noticed you agreed with the previous post, so I take it you haven't actually tested it yet.

Nope, because I have no easy means to do the test. Not that I believe I'll notice any difference whatsoever.
 
I would say for a lot of people wired is probably the better option. It gives a consistent stable connection.
But to say wired is always better than wifi is pure nonsense. It just comes down to whether you can achieve the bandwidth necessary for the particular stream quality you want.
 
For Netflix the UHD bitrate is 15Mbps (a Covid "benefit" :( ), and I've forgotten which button it is, but on my CX you can see the bitrate and clearly see when it drops below that.

It doesn't matter if it's on WiFi or Ethernet: so long as you can maintain that 15Mbps, there isn't going to be any difference in PQ. Wired does not move more data for the same size/bitrate, as that would be inefficient.
That's what I was trying to say, but he seemed to suggest wired gives better picture quality regardless.
 
