I'm trying to educate myself on current TV technologies and this got me confused. Even most LED LCDs don't come close to 4000 nits, and OLEDs have a hard time reaching 1000 nits. Why are so many movies mastered at 4000 nits when no TV can benefit from that? I don't think it's future-proofing, considering that most 4K Blu-ray movies are still mastered at 2K resolution.

I understand that most TVs apply a tone-mapping algorithm to map the content and its metadata to the capabilities of the panel, but since there is no standard for how to map the PQ EOTF curve onto a lower peak-brightness display, manufacturers have taken very different approaches: some try to preserve specular highlight detail by sacrificing APL (average picture level), while others preserve APL and clip away the specular highlight detail. To make that difference concrete, I've put a rough sketch of the two approaches below.

I've read that Blade Runner 2049 goes even further and is being mastered to a peak brightness of 10,000 nits. Can someone explain why Hollywood studios are choosing such unrealistic figures?
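Here's a minimal sketch (my own illustration, not any manufacturer's actual algorithm) of how a PQ-mastered value might be decoded to nits and then tone-mapped to a lower-peak panel. The 700-nit peak, 0.65 knee point, and Reinhard-style roll-off are assumptions just to show the trade-off between "clip" (preserve APL, lose highlights) and "roll-off" (keep highlight detail, dimmer top end):

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(signal):
    """Decode a PQ-encoded signal (0..1) to absolute luminance in nits (PQ EOTF)."""
    signal = np.asarray(signal, dtype=np.float64)
    p = np.power(signal, 1.0 / M2)
    num = np.maximum(p - C1, 0.0)
    den = C2 - C3 * p
    return 10000.0 * np.power(num / den, 1.0 / M1)

def nits_to_pq(nits):
    """Encode absolute luminance in nits back to a PQ signal (inverse EOTF)."""
    y = np.asarray(nits, dtype=np.float64) / 10000.0
    yp = np.power(y, M1)
    return np.power((C1 + C2 * yp) / (1.0 + C3 * yp), M2)

def tonemap(nits, display_peak=700.0, mode="rolloff", knee=0.65):
    """Map mastered luminance to a display's peak (illustrative only).

    mode="clip":    preserve average picture level, discard detail above peak.
    mode="rolloff": pass luminance through up to knee*peak, then compress
                    everything above it (Reinhard-style) so highlight detail
                    survives at the cost of a dimmer top end.
    """
    nits = np.asarray(nits, dtype=np.float64)
    if mode == "clip":
        return np.minimum(nits, display_peak)
    k = knee * display_peak          # luminance where the roll-off starts
    headroom = display_peak - k      # output range left for all highlights
    excess = np.maximum(nits - k, 0.0)
    compressed = k + headroom * excess / (excess + headroom)
    return np.where(nits <= k, nits, compressed)

# Example: how a 4000-nit specular highlight lands on a 700-nit panel
for mode in ("clip", "rolloff"):
    print(mode, tonemap([100, 600, 1000, 4000], display_peak=700.0, mode=mode))
```

With these made-up numbers, "clip" keeps 100 and 600 nits intact but flattens 1000 and 4000 nits to the same 700, while "roll-off" dims everything above ~455 nits but still keeps 1000 and 4000 nits distinguishable. Real TVs presumably use far more sophisticated (and proprietary) curves, which is exactly why results vary so much between brands.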