Disney+ 4K, really?

One problem with Disney+ and the MCU that I haven't solved yet is that the credits shrink down to a PIP at the end so it can suggest what to watch next, which means you miss the post-credit scenes. Anyone know how to stop that from happening?
I don't have Disney+, but I have Hulu, which is owned by Disney.
On there you just back up a bit and rewind, and then it plays through again without the suggested-next overlay.
 
I was going to use your rude turn of phrase in return when you originally confronted me on this thread, but it seems to have been edited out. Anyway, your comment is patently incorrect.
Remember, we are commenting about UHD content.
I know we're talking about UHD.

From this link you can see bitrates.

Disney+ UHD is around 16.5Mbps, so I wasn't far off.
If you have at least 40Mbps over WiFi, then you're easily capable of getting the 16.5Mbps stream, and it will be identical to pulling it over Ethernet, provided no one on a tablet/phone starts hammering the WiFi.

Though even on Ethernet you could still suffer if little Johnny upstairs starts downloading big files from fast servers and maxes out your total home connection.
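For a rough sense of the margins, here's a quick back-of-the-envelope check in Python. The 16.5Mbps and 40Mbps figures are just the assumptions above, not measurements:

# Rough headroom check, assuming a steady 16.5 Mbps stream on a 40 Mbps WiFi link.
STREAM_MBPS = 16.5   # approximate Disney+ UHD average bitrate (see link above)
LINK_MBPS = 40.0     # assumed usable WiFi throughput

headroom = LINK_MBPS - STREAM_MBPS
print(f"Headroom: {headroom:.1f} Mbps ({headroom / LINK_MBPS:.0%} of the link spare)")

# Data pulled by a two-hour film at that bitrate (megabits -> decimal gigabytes).
runtime_s = 2 * 60 * 60
gigabytes = STREAM_MBPS * runtime_s / 8 / 1000
print(f"A two-hour film at {STREAM_MBPS} Mbps is roughly {gigabytes:.1f} GB")

So the stream only needs a bit over 40% of that 40Mbps link, which is why a stable WiFi connection is enough.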

Offices use Ethernet because no one there cares about cables everywhere, and from an interference point of view it takes WiFi out of the equation. That's not a valid comparison to a living room with a much smaller amount of equipment.

And no one is saying never use Ethernet; we're saying there will be no difference IF you can get as good as or better speeds over WiFi.
 
No wonder people think Disney+ isn't UHD.
16 Mbps is just good HD.
Maybe equivalent to HD + Dolby Vision, like Paramount's Picard or Discovery.
 
Done already; now the onus is on you to prove me wrong, which you haven't & can't.
Get the weekend started!

OK, so you have shown us two speed tests with no mention of the actual connection speed to your ISP. Additionally, you have shown us no bitrates or anything from the actual devices in question; it's all a bit of guesswork.

Here are some figures from an Apple TV 4K to compare both WiFi and Ethernet, including Mulan for @dan~, which demonstrate that, certainly on an Apple TV 4K, Disney+ runs at full 4K at around 16.5Mbps.

The Apple TV in question is normally wired to a Gigabit Ethernet switch running at full speed. A speed test run from the native app gives this speed, which is about as fast as Gigabit LAN will get.

[Screenshot: Apple TV speed test over Gigabit Ethernet]



OK, so now I have connected over WiFi. The Apple TV is located in the middle of a metal rack, so it doesn't get a great WiFi connection; the actual reported connection speed is a mere 72Mbps, so pretty poor really.

Results are pretty poor

[Screenshot: Apple TV speed test over WiFi]


As you can see, the download speed is pretty rubbish. I thought I would even have to move it, but decided to try it and see what the results looked like anyway. You can see that the upload was a bit quicker, and I have a reasonably busy WiFi network with 20-odd devices currently connected.
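As an aside, if anyone wants to sanity-check a link from a laptop on the same wired or wireless segment without a native speed test app, a minimal Python sketch along these lines would do it. The test URL is just a placeholder; point it at any large file on a fast server:

import time
import urllib.request

TEST_URL = "https://example.com/large-test-file.bin"   # placeholder, not a real test file
CHUNK = 1024 * 1024            # read 1 MiB at a time
MAX_BYTES = 100 * 1024 * 1024  # stop after ~100 MiB

def measure_mbps(url: str) -> float:
    start = time.monotonic()
    received = 0
    with urllib.request.urlopen(url) as resp:
        while received < MAX_BYTES:
            data = resp.read(CHUNK)
            if not data:
                break
            received += len(data)
    elapsed = time.monotonic() - start
    return received * 8 / 1_000_000 / elapsed   # bytes read -> megabits per second

print(f"Download throughput: {measure_mbps(TEST_URL):.1f} Mbps")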

Since the OP asked about Mulan (not sure if the cartoon or the newer one), both are available on Disney+ in full 4K, and here are the bitrates as reported by the Apple TV.

Wired:

[Screenshot: Mulan playback stats, wired]


So as you can see, the network bandwidth (which is the peak throughput) is higher, but the average bitrate is around 16.4Mbps and it is running at 4K.

WiFi:

[Screenshot: Mulan playback stats, WiFi]


Network bandwidth is lower as expected because it's a rubbish connection. Average bitrate was slightly higher at 16.77Mbps.

The indicated bitrate is identical in both, as it's playing the exact same stream, i.e. the max quality available, and so picture quality is identical.
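For anyone unclear on that distinction, here's a small sketch of peak network bandwidth vs. average bitrate. The segment sizes below are invented purely to illustrate it, not taken from the screenshots:

# "Network bandwidth" = peak throughput while a segment is downloading;
# "average bitrate" = video data per second of playback. Figures are invented.
segments = [
    # (megabits of video, seconds of playback, seconds spent downloading)
    (100, 6.0, 1.1),
    (95,  6.0, 1.0),
    (105, 6.0, 1.2),
]

avg_bitrate = sum(mb for mb, _, _ in segments) / sum(sec for _, sec, _ in segments)
peak_bandwidth = max(mb / dl for mb, _, dl in segments)

print(f"Average bitrate:   {avg_bitrate:.1f} Mbps")    # ~16.7 Mbps
print(f"Network bandwidth: {peak_bandwidth:.1f} Mbps") # ~95 Mbps burst

The player grabs each segment in a short burst at whatever speed the connection allows, which is why the wired connection shows a higher network bandwidth figure while the average bitrate stays the same.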

Long Way Up Apple TV Original

Wired:

[Screenshot: Long Way Up playback stats, wired]


WiFi:

[Screenshot: Long Way Up playback stats, WiFi]


Same indicated bitrate, identical PQ.

Netflix "Is it cake?"

Wired:

[Screenshot: Is it cake? playback stats, wired]


WiFi:

[Screenshot: Is it cake? playback stats, WiFi]



So in summary, the indicated peak and average bitrates are identical even on this pretty poor WiFi connection. The average bitrate, which is measured in real time and constantly changing, is pretty much the same in both cases. Typically on Netflix, when the quality does drop down, it drops to around 11Mbps, for example.

To be clear, no one is arguing that wired is not a better connection that ultimately allows more speed. However, for streaming, so long as you can sustain a reasonable WiFi connection there will be no difference in the bitrate of the stream, and since it is playing an identical stream, no difference in PQ.
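That behaviour is just adaptive-bitrate rung selection: the player keeps picking the highest rung its measured throughput can sustain. A toy sketch of the idea, with an invented ladder and safety margin rather than Disney's or Netflix's real encoding ladders:

# Toy adaptive-bitrate rung selection. Ladder values and safety margin are
# illustrative, not the real Disney+/Netflix encoding ladders.
LADDER_MBPS = [4.0, 8.0, 11.0, 16.5]   # available stream bitrates, low to high
SAFETY = 0.8                            # only count on ~80% of measured throughput

def pick_rung(measured_mbps: float) -> float:
    usable = measured_mbps * SAFETY
    fitting = [rung for rung in LADDER_MBPS if rung <= usable]
    return max(fitting) if fitting else LADDER_MBPS[0]

print(pick_rung(940))   # gigabit Ethernet  -> 16.5 (top rung)
print(pick_rung(72))    # "rubbish" WiFi    -> 16.5 (still the top rung)
print(pick_rung(18))    # struggling link   -> 11.0 (first visible quality step-down)

Both gigabit Ethernet and that 72Mbps WiFi link land on the same top rung, which is why the screenshots show identical indicated bitrates.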

If you are seeing otherwise, I suggest you look at actually getting some decent stats off your equipment to check you don't have a fault somewhere.

And to @dan~: yes, it would appear Disney+ is indeed in 4K.
 

No wonder people think Disney+ isn't UHD.
16 Mbps is just good HD.
Maybe equivalent to HD + Dolby Vision, like Paramount's Picard or Discovery.

No one is happy about it, but thanks to Covid a number of the big providers dropped from around 30Mbps or so to half that :(
 
How do you get that dialogue box on the CX?
I'm sure it's an Easter-Egg button combo, but which?
The only one I currently know is pressing the green remote button 6 or 7 times quickly for VRR info.

I will test it that way, but just doing the maths of Wi-Fi vs. Ethernet, for me it's lower with Wi-Fi every time, even with the device right next to the router/modem.
Every time!
With no other devices running on the line.

I've been asking about these Easter-Egg button combos in the CX thread, but no one responds.
 
I have a reasonably busy WiFi network with 20-odd devices currently connected.

Since the OP asked about Mulan (not sure if the cartoon or the newer one), both are available on Disney+ in full 4K, and here are the bitrates as reported by the Apple TV.

Wired:

[Screenshot: Mulan playback stats, wired]
I want to make a joke about ODD devices, but it would make more sense if your Wi-Fi was slower, like mine.
Thanks for putting your money where your mouth was: factual evidence.
Now could you please tell me where this graphical information is coming from? Is that from the ATV or the TV?
I've been fooling around with my LG CX for over 33 minutes & can't figure out how to get that dialogue.
 
I want to make a joke about ODD devices, but it would make more sense if your Wi-Fi was slower, like mine.
Thanks for putting your money where your mouth was: factual evidence.
Now could you please tell me where this graphical information is coming from? Is that from the ATV or the TV?
I've been fooling around with my LG CX for over 33 minutes & can't figure out how to get that dialogue.

I would like to know how to get that info as well.

The evidence he shows makes it clear that picture quality will be the same, provided WiFi maintains a stable connection, which in most cases it does, and which is what most people have been trying to point out. If you notice differences in picture quality, maybe the issue lies elsewhere, or your WiFi connection is just not very stable, causing the picture to drop in quality.
 
I figured it out, but I'm not sure I want to get the MacBook Pro out to modify my UHD-ATV.
Maybe I can find something similar on the Roku-Ultra, as ALL my devices show they're using more data when on Ethernet.
 
If you are using the assumption that your bitrate/bandwidth is the difference between your normal speed test and one run while streaming a source, then the logic may be flawed. Routers don't always treat Wi-Fi and wired the same. When you run a speed test you are demonstrating that, in your environment, your Wi-Fi has a lower priority than wired. Unless you can see the bitrate from the Apple TV in dev mode, you may be interpreting the results wrongly.
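There's a second wrinkle with the subtraction method on top of prioritisation: players pull video in bursts and then sit idle on a full buffer, so the result depends on when the test happens to run. A toy illustration, with all the numbers invented:

import random

LINK_MBPS = 100.0        # total internet connection (invented)
AVG_BITRATE = 16.5       # what the stream actually averages
BURST_FRACTION = 0.2     # assume the player is only downloading ~20% of the time
BURST_MBPS = AVG_BITRATE / BURST_FRACTION   # ~82 Mbps while a burst is active

random.seed(1)
for _ in range(5):
    bursting = random.random() < BURST_FRACTION
    test_while_streaming = LINK_MBPS - (BURST_MBPS if bursting else 0.0)
    estimate = LINK_MBPS - test_while_streaming
    print(f"estimated stream bitrate: {estimate:5.1f} Mbps")
# Prints 0.0 or 82.5 depending on timing - neither is the real 16.5 Mbps average.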
 
I spent part of my morning figuring out how to get a Developer's display so I can see the bitrate reported by the device itself.
Roku is out, they patched the previous Easter-Egg where you could throttle the bandwidth or display the speed on-screen.

FireStick has a Developer's Menu & will show the current bitrate right on-screen:

Jack Ryan S01E01, the scene where the boys are playing on the roof, 50 Mbps wired
15 Mbps on Wi-Fi, but I thought that might be from the cache helping out the 2nd time playing the same scene, so I rebooted the FireStick & watched the same scene via Wi-Fi at 50 Mbps.

So then I checked using my previous method of subtraction & came up with only a 2 Mbps difference, starting the speedtest at the same spot of the scene each time. That could just be down to ISP fluctuation, especially on a Friday arvo.

If Wired is a higher priority to the router then my devices will stay wired.

Holy Hell I am hard-headed, wish I had looked up that Developer's screen sooner!
 
I spent part of my morning figuring out how to get a Developer's display so I can see the bitrate reported by the device itself.
Roku is out, they patched the previous Easter-Egg where you could throttle the bandwidth or display the speed on-screen.

FireStick has a Developer's Menu & will show the current bitrate right on-screen:

Jack Ryan S01E01, the scene where the boys are playing on the roof, 50 Mbps wired
15 Mbps on Wi-Fi, but I thought that might be from the cache helping out the 2nd time playing the same scene, so I rebooted the FireStick & watched the same scene via Wi-Fi at 50 Mbps.

So then I checked using my previous method of subtraction & came up with only a 2 Mbps difference, starting the speedtest at the same spot of the scene each time. That could just be down to ISP fluctuation, especially on a Friday arvo.

If Wired is a higher priority to the router then my devices will stay wired.

Holy Hell I am hard-headed, wish I had looked up that Developer's screen sooner!

Well, we were trying to tell you ;)

But on a serious note, 2Mbps of network variance can be down to other devices on the network, as well as time of day, ISP, etc. My internal network is never very quiet regardless of time of day.

My wireless access points are completely separate from my router, so by the time the IP traffic gets to the router it doesn't know the transmission method.

My father-in-law's Sky router only allows wireless to occupy half the available bandwidth and doesn't seem to be tuneable. We thought it was underperforming till we did two runs at the same time and probably maxed the connection. The wired connection always has full bandwidth.
 
If Ethernet is not an option, it might be worth spending some time checking which channel gives the best performance.
Some access points have an integral interference detector & will automatically select the best channel, but if not...
 
You can get WiFi scanner apps for your phone; they show all the signals and which channels they're on, so you can pick a channel that has the least contention.
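The same idea in a minimal Python sketch, for anyone curious how the picking works. The scan results here are hard-coded examples, since an actual scan needs an OS-specific tool or a phone app:

from collections import Counter

# Invented neighbouring networks: (network name, 2.4GHz channel)
scan_results = [
    ("NextDoor", 1), ("CoffeeShop", 1), ("BT-Hub", 6),
    ("Sky12345", 6), ("VM98765", 6), ("Upstairs", 11),
]

networks_per_channel = Counter(channel for _, channel in scan_results)
# Stick to the usual non-overlapping 2.4GHz channels and pick the quietest.
best = min([1, 6, 11], key=lambda ch: networks_per_channel.get(ch, 0))
print(f"Least contended channel: {best}")   # 11 in this example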
 
The thing with channels from neighbouring routers and access points is that they can and will move channels. What might be clear one day might be congested another. At least with 5GHz there are a lot more channels to choose from.

Having said that, wire whenever you can so devices are not all contending for the same airtime.
 
Not read the whole thread (sorry, on a quick break), but I do remember that early doors in the pandemic, during lockdown, most of the big streamers said they would reduce bandwidth in the UK. I bet that hasn't gone back up again; if they can get away with less, then why pay more to boost it back up?
 
The majority of, if not all, the Marvel films are struck from 2K sources and upscaled, so they are not native 4K.
 
The new Star Trek movies are also upscaled, so I make do with my old HD Blu-rays & let my Blu-ray player upscale them & convert to Dolby Vision.
 
Star Wars movies are made at 2K then upscaled to 4K, as others mentioned, but no surprise, as all movies are made at a lower resolution than 4K.
Phil Spencer did say somewhere in one of his interviews that the only true 4K source is games rendered at 4K, pure full-fat 4K pixels.
 
Star Wars movies are made at 2K then upscaled to 4K, as others mentioned, but no surprise, as all movies are made at a lower resolution than 4K.
Phil Spencer did say somewhere in one of his interviews that the only true 4K source is games rendered at 4K, pure full-fat 4K pixels.

Star Trek 4K films are remastered from the original 35mm negative with 4K scans.

Star Trek: First Contact was shot on 35mm film using Panavision Panaflex Platinum cameras, and for this release Paramount has completed a new 4K scan of the original camera negative and master interpositive elements, producing a new 4K DI from which this UHD is sourced.


If films are remastered from 35mm stock, native 4K is achievable, probably higher than 4K too.


The original Star Wars trilogy was shot on 35mm film.

IMAX film stock is far higher resolution than 35mm.
 
There is a difference: Ethernet vs. Wi-Fi
If you want the best streaming picture quality, you should get it wired if possible.
I agree. Once I upgraded my TV I noticed a lot of fluctuation in picture quality when streaming in 4K. I connected my Apple TV via Ethernet and picture quality has been rock solid.
 
Most pro cinema cameras are at least 4K these days, so maybe that was true some time ago, but it's not an accurate statement now.
The cameras are 6/8K for the Marvel films, but they are mastered in 2K, meaning all the CGI/colour grading/editing/footage is processed in 2K.

The vast majority of those 4K Blu-ray films you're watching are actually sourced from the 2K material; very few films are mastered in 4K due to time and cost.

If you watch the film 'Lucy', that's a great example of a 4K-mastered film.
 
The cameras are 6/8K for the Marvel films, but they are mastered in 2K, meaning all the CGI/colour grading/editing/footage is processed in 2K.

The vast majority of those 4K Blu-ray films you're watching are actually sourced from the 2K material; very few films are mastered in 4K due to time and cost.

If you watch the film 'Lucy', that's a great example of a 4K-mastered film.

Yeah, for Marvel stuff, though newer stuff might be coming through. What about other studios?
 
