Novice Member
Many of the 4K HDR videos that we can buy in stores today were not shot in 4K. They are upscaled from digital scans of analogue film, whose optical resolving power is often put at roughly 2K. (Everyone has an opinion on this, and some say 35mm film can be scanned at up to 80K. That, however, amounts to enlarging the smallest discernible grain to the size of a melon while expecting the final image to keep its sharpness, the loss of which is noticeable even when printing a photo at the most popular album size. The size and type of the film stock matter a great deal, but they are not the key factor: the spread of the light reaching the emulsion is technically more limited than in digital cameras, which have a larger photosensitive sensor with a high density of "light receptors". Since a print is much larger than the negative, you can scan a copy at an incomparably higher resolution, but it will not keep its sharpness, even when displayed on an 80K screen.)

Probably, therefore, better results are achieved by recording with an HDR digital camera at a technically defined quality above 2K than by filming with the traditional, "economical" analogue media used so far. Most digitally scanned film prints also require HDR post-processing, including but not limited to restoring sharpness of the image.

When we put a DVD in a player, we no longer have to fool ourselves today. It can safely be said that even on a 15" screen with a higher resolution, the movie looks merely "interesting".
With a movie in the 2K standard, i.e. 1080p, the amount of visible detail increases enormously and the movie actually looks more interesting, even on larger monitors. 4K movies are probably something everyone has been waiting for, and although most people have not yet had time to equip themselves with 4K HDR10+/Dolby Vision displays, there is already talk of 8K quality entering the market, or even 16K.

So here is the question. Looking for commercially available Blu-ray discs in 4K HDR quality, we will find many movie titles. However, not all of them were originally shot in 4K; most were only digitally upscaled from 2K (1080p) to the 4K standard. Movies previously released on Blu-ray from 2K masters additionally require HDR processing, which for digitally shot 2K material is a cosmetic procedure, but for 2K scans of analogue film it is plastic surgery. So how do you know that a purchased 4K movie was actually shot in 4K quality, and is not just an upscaled copy of a 2K release like the other titles originally shot in 2K? If we compare the commercially available movies originally shot in 4K with those originally shot in 2K and sold in the 4K standard, we will not notice any price difference.
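One rough way to probe this question yourself, assuming you can grab an uncompressed frame: a native 4K frame carries detail right up to the sensor's Nyquist limit, while a frame upscaled from a 2K master has little or no spectral energy above roughly half of 4K Nyquist. The sketch below (my own illustrative heuristic, not an official tool; the 0.5 cutoff and the synthetic "frames" are assumptions) compares the high-frequency energy ratio of a detail-rich frame against an idealised 2K-to-4K upscale of the same frame.

```python
import numpy as np

def highfreq_ratio(img, cutoff=0.5):
    """Fraction of spectral energy above `cutoff` * Nyquist (radially)."""
    power = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    h, w = img.shape
    fy = np.fft.fftshift(np.fft.fftfreq(h)) * 2  # normalised to [-1, 1)
    fx = np.fft.fftshift(np.fft.fftfreq(w)) * 2
    r = np.sqrt(fy[:, None] ** 2 + fx[None, :] ** 2)
    return power[r > cutoff].sum() / power.sum()

def fake_upscale(img, frac=0.5):
    """Model an ideal 2K->4K upscale: remove everything above half Nyquist."""
    spec = np.fft.fftshift(np.fft.fft2(img))
    h, w = img.shape
    fy = np.fft.fftshift(np.fft.fftfreq(h)) * 2
    fx = np.fft.fftshift(np.fft.fftfreq(w)) * 2
    r = np.sqrt(fy[:, None] ** 2 + fx[None, :] ** 2)
    spec[r > frac] = 0
    return np.fft.ifft2(np.fft.ifftshift(spec)).real

rng = np.random.default_rng(0)
native = rng.standard_normal((256, 256))  # stand-in for a detail-rich native-4K frame
upscaled = fake_upscale(native)           # stand-in for a 2K master upscaled to 4K

print(highfreq_ratio(native))    # substantial: real detail up to Nyquist
print(highfreq_ratio(upscaled))  # near zero: nothing above half Nyquist
```

On real discs this is muddier, because film grain, compression, and sharpening all add high-frequency energy back in, so treat it as a hint rather than proof.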

Clock'd 0ne

Active Member
That’s a very long-winded way of saying ‘fake 4K’ - you kind of answered your own question too - if you can’t tell the difference it doesn’t matter.

The reality is that some 2K upscales look absolutely stunning, and resolution only matters when you can actually resolve it; even at 4K, most people sit too far away, or have screens too small, to really tell the difference. No, the real benefits of the UHD format come from HDR/WCG, better encoding quality, etc.


Distinguished Member
No paragraphs = don't read.


Distinguished Member
Gave it a go.

Gave up after two sentences.

Muddy Funker

Active Member
One big block of text.

That could be the most important information ever written on the forum, but I gave up after a few lines.


Distinguished Member
My brain hurts. And that’s after reading half. Legendary post
