
The HDR debate thread

Jackass

Prominent Member
Before we get actual Blu-ray 4K HDR movie content, I've been trying out the limited HDR content that will work with my Sony X10 media player and 520ES projector.

A grand total of 1 clip so far :D

I'm posting a series of pics I've taken with an iPad Air 2 (not the best camera...) to throw it out there and ask which image is preferred.
(You can see the projector's HDR setting, on or off, in the bottom left corner.)

Whether it's the correct matching version of HDR for the source and projector, I'm not sure, but with it on I think it doesn't look too bad - some loss of detail in the bright parts. Thoughts?

[10 attached images: image.jpeg]
 

stevos

Distinguished Member
Which version of HDR is it, and how is the projector 'turning it off'? Is it through the metadata, or just cropping the dynamic range?

I can't see whether the media player has HDMI 2.0a.

HDR seems to currently be a bit of a mess on how it works, depending on the standard used.
 

Peter Parker

Distinguished Member
It's sometimes hard to tell from pics what is actually happening on screen, but the very first pic seems to have more detail, due to more brightness in the ends of the logs, and looks more three-dimensional compared to the same image with HDR on.

The last two pics seem to show that HDR has lost a lot of the image detail that's visible in the glass behind the fire.

Is that what you are really seeing, or is the camera compensating for the extra brightness HDR should be giving in the highlights, and closing down so it doesn't show the detail that we can see with HDR off in the other picture?

If the HDR versions were the ones that looked better, I'd say it's delivering on the hype, but currently it seems like it needs work.

This is what I was asking for in another thread - has anyone actually seen it? Rather than relying on glowing online appraisals, it's nice to know what people are actually seeing in real terms, on real machines, with real content - assuming everything is working as it should be.

As Stevos says, it does seem a bit of a mess right now and it's possible that it really is very good when working as it should be, but currently I'd wait until things have settled down and it's actually delivering results before buying into the tech just yet.

Gary
 

Gilbers

Established Member
Given that these images are in an 8-bit image format, can we be seeing anything other than relative differences in exposure? It seems a bit like the old 'auditioning speakers by telephone' scenario. :)
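To put some numbers on that: an 8-bit file has only 256 levels per channel, so everything above the camera's clip point collapses to the same code value. A toy sketch (the clip level is a hypothetical value, not anything from the actual photos):

```python
# 8-bit quantisation: any scene luminance at or above the exposure's
# clip point maps to the same code value (255), so HDR highlight
# detail cannot survive into a JPEG screenshot.
def to_8bit(luminance_nits, clip_nits):
    return round(255 * min(luminance_nits / clip_nits, 1.0))

print(to_8bit(100, 100))   # 255
print(to_8bit(1000, 100))  # 255 - indistinguishable from 100 nits
```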
 

Peter Parker

Distinguished Member
That was my point and why I was asking if it was representative of what was actually seen on screen. Screen shots are fun, but usually not a good indicator for overall image quality. Obvious differences like those above can give you a clue about what's going on, provided they are showing what is actually visible on screen.

Gary
 

Normal Bias

Established Member
I don't think it is easy, or maybe even possible, to compare HDR vs non-HDR with still shots. This is because of the conflicting arguments that a) the two photos should use the same settings (exposure time, ISO, aperture etc) to represent each image under the same conditions, so as to be comparable and b) the human eye has a variable aperture controlled by the iris.

Simply put, what you show with two still images cannot possibly represent what you will see watching the screen. Whilst this might apply more generally, I think it is particularly true for HDR. Everything we see in the real world is super HDR and ultra wide gamut. When we see that on a screen, it is a subset of what we are able to see. A photograph of that clips the gamut and dynamic range down yet again.
 
To me those pics look like two different gamma settings.
HDR off is clipping at the high end, and HDR on is clipping at the low end, but I find your HDR-on pictures the most natural, whereas HDR off looks overexposed.

Why is it that the media is the key to HDR - is it a brand new gamma curve? I set my max light output and gamma on my display for the room/environment I'm in, so that I find it comfortable to watch the movie and don't need to bring my sunglasses as if I were walking around town or driving my car.

If I want higher DR I just put more light on my screen and adjust my gamma up.
I don't ever want my eyes to be unable to adapt to the light on screen, so that I lose information going from a bright to a dark scene, or the other way around.

For me it all sounds like another sales trick to sell another medium and new displays.
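On the gamma question: HDR10 does in fact use a brand-new transfer curve - the SMPTE ST 2084 'PQ' EOTF - which maps the signal to absolute luminance up to 10,000 nits, rather than a relative power-law gamma. A minimal sketch of that curve:

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalised signal value in [0, 1]
# to absolute display luminance in nits (cd/m^2).
M1 = 2610 / 16384          # ~0.1593
M2 = 2523 / 4096 * 128     # ~78.84
C1 = 3424 / 4096           # ~0.8359
C2 = 2413 / 4096 * 32      # ~18.85
C3 = 2392 / 4096 * 32      # ~18.69

def pq_eotf(signal):
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

print(pq_eotf(1.0))   # 10,000 nits at full signal
print(pq_eotf(0.5))   # ~92 nits - half signal is nowhere near half luminance
```

The extreme non-linearity is the point: it packs most of the code values into the low and mid luminances where the eye is most sensitive.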
 

Trollslayer

Distinguished Member
Given that these images are in an 8-bit image format, can we be seeing anything other than relative differences in exposure? It seems a bit like the old 'auditioning speakers by telephone' scenario. :)
And they have been compressed.
Maybe RAW files could be hosted somewhere.
 

Peter Parker

Distinguished Member
Just seen this comment from Phil Hinton in his review of the new Sony 520ES:

Sony VPL-VW520ES 4K Projector Review

One other item we got to play was a Fox TV clip of a fire burning with logs and flames. Again we had to manually switch on the HDR button but this time it robbed the image of detail and crushed the blacks. It actually looked better and had more depth when HDR was switched off.

So the image of the logs is representative of what people are actually seeing.

Gary
 

Pecker

Distinguished Member
As most (all?) of us will be viewing these stills on a non-HDR display...

You get the idea.

I really want to see this, and will keep my mind open until I do, but that doesn't mean we're not going to have educated guesses and ideas in advance.

The old SDR specs were apparently designed for 100 nit displays. 100 nits is almost 30 fl, and I'm not convinced too many projectors can do that at the moment.

HDR has apparently become a possibility as displays start to edge towards 1,000 nits. That'd be almost 300fl, and I'm not convinced a projector kicking out that sort of brightness would even be allowed to be sold legally as a home cinema model.
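Those nit-to-foot-lambert figures are easy to check (assuming the standard conversion of 1 fL = 3.426 cd/m², i.e. nits):

```python
# Foot-lambert sanity check: 1 fL = 3.426 cd/m^2 (nits).
NITS_PER_FL = 3.426

def nits_to_fl(nits):
    return nits / NITS_PER_FL

print(nits_to_fl(100))   # ~29.2 fL - the old SDR reference level
print(nits_to_fl(1000))  # ~291.9 fL - the HDR flat-panel target
```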

Meanwhile, flat screen TVs are edging down towards zero light output for blacks. Projectors can't get close. As black levels get higher, you need to get brighter at the top end to maintain the higher contrast, which, as we've seen, isn't going to be possible.

In short, with (relatively) duller whites and greyer blacks, the contrast range of current projectors appears to be struggling to match that of SDR flat screens, let alone HDR.

This doesn't mean there'll be no benefits (I'm pretty sure banding can be eliminated - though it's not much of an issue on well-mastered content anyway), but it strongly suggests many (most?) of the benefits will be largely negated.

It'd be interesting to see how this fire demo looks on a HDR flatscreen. It would appear to be very well suited to the concept of HDR, with parts of the image pitch black (or close) and other parts very bright.

But who knows? Maybe someone will find a way of pulling something out of the hat.

Ultimately, it may well just mean the projector will accept a HDR input, and you'll at least be squeezing the last few drops of improved quality possible, regardless of how little that is. There doesn't appear to be much reason that the HDR chips would need to cost a lot (or any?) more than SDR versions, so why not.

Steve W
 
Anyone know what kind of contrast the human eye can handle within, let's say, half a second to adapt?
30 fL in a bat cave would make me totally blind for quite some time when changing to a very low-level scene. I'm very comfortable with total black to 10 fL, and as the black level goes up, the light output can follow.

If 1,000 nits is the goal, I'm thinking they must have a very elevated black level, or be aiming at bright rooms. If not, it might be a good idea to master the movies with 5-10 seconds of nothing and a warning like: for realism, we advise you to put your sunglasses on, as we are about to show a scene driving with the sun in your eyes.
 

Peter Parker

Distinguished Member
Darinp2 has lots of data on the HVS, including stuff from NASA, and he's done some experimenting, so he can probably tell you what you want to know.

I have seen the Epson laser PJ (the LS10000?) and that does a very good fade to black, which I initially found a little uncomfortable (complete blackout in the room), but it can certainly add something to a movie if they want a claustrophobic element (and it would be great with 3D audio), and I think that would be more beneficial to movie immersion than brighter images. If HDR can give us a more 3D-looking image (without the need for glasses), then that would be a good thing too, but I'm not sure we need very bright, almost blinding images that might be physically rather than psychologically uncomfortable. That's just an opinion, btw, but I would like to see those things to find out whether it does add something - maybe more realism, like you say, when we drive into direct sunlight, which would make us wince and look away - a kind of visual tactile element.

At the moment, with those images and the comments from those who have seen it, I get the impression that HDR may be compressing the dynamic range of a PJ into two thirds of what it can achieve for normal viewing, saving the upper third and higher brightness for the brighter elements to make HDR work. It also looks like it's not adapting the gamma, so it's dulling the image down and crushing detail - a negative that needs fixing.

I'm sure it'll look great once it's been fully understood by the mastering people, as well as how best to present it by the display manufacturers, but at the moment it just seems a bit of a mess, doing more harm than good. The content that is out there can't even be played by displays that are supposedly up to spec, so perhaps it's the content that is at fault at the moment. Some of the early BDs were a bit crap until they got the hang of things, so I guess this should be no different - but somehow it feels different this time.

Because of that, I've not really been following the current technical aspects of the new standards. So far there seems to be a lot of confusion about what we should be getting and who and what can achieve these specs, or even partial specs, and it just seems like a lot of 'noise' at the moment - especially with some people claiming things as fact about how it works, and then a day later saying they were wrong and now this is how it works. So there's a lot of confusion out there right now that also appears to be visible in the content. Admittedly it's early days, but I can't say I remember anything like this in the last 15 years. It's nothing unusual on forums, I guess, with people being excited about the next big thing, but this, along with the rest of the new specs for UHD, seems to be the biggest change, and the biggest mess, of them all since I got into this hobby.

Not a rant, just a disappointed perception.

AOOTTFITR...
 

Pecker

Distinguished Member
If HDR can give us a more 3D-looking image (without the need for glasses), then that would be a good thing too, but I'm not sure we need very bright, almost blinding images that might be physically rather than psychologically uncomfortable. That's just an opinion, btw, but I would like to see those things to find out whether it does add something - maybe more realism, like you say, when we drive into direct sunlight, which would make us wince and look away - a kind of visual tactile element.

Yup, I'm of the same opinion. Looking at raw figures and stats, cinema presentations have never been all that bright, but they've often been very impressive.

I'm sure it'll look great once it's been fully understood by the mastering people, as well as how best to present it by the display manufacturers, but at the moment it just seems a bit of a mess, doing more harm than good. The content that is out there can't even be played by displays that are supposedly up to spec, so perhaps it's the content that is at fault at the moment. Some of the early BDs were a bit crap until they got the hang of things, so I guess this should be no different - but somehow it feels different this time.

Because of that, I've not really been following the current technical aspects of the new standards. So far there seems to be a lot of confusion about what we should be getting and who and what can achieve these specs, or even partial specs, and it just seems like a lot of 'noise' at the moment - especially with some people claiming things as fact about how it works, and then a day later saying they were wrong and now this is how it works. So there's a lot of confusion out there right now that also appears to be visible in the content. Admittedly it's early days, but I can't say I remember anything like this in the last 15 years. It's nothing unusual on forums, I guess, with people being excited about the next big thing, but this, along with the rest of the new specs for UHD, seems to be the biggest change, and the biggest mess, of them all since I got into this hobby.

Not a rant, just a disappointed perception.

I couldn't agree more.

There is a bit of a hint here - I don't know if you've picked this up in the TV forums at all, but for a while it looked like OLED had come along and trounced LCD TVs with its ability to show zero light at the bottom end, and was ready to launch UHD OLED, the thought of which had many having heart palpitations. Then all of a sudden the LCD manufacturers came up with SUHD - basically "our tellies can go brighter than yours" - dressed up as HDR.

Now I'm not sure if this perception of HDR as an 'OLED-spoiler' is accurate, and much of what I've heard since suggests it was on its way anyway. But I suspect the concept was rolled out far earlier than anyone wanted. It reminds me of the launch of HD DVD, which forced Blu-ray Disc to come out before it was ready with 1.0 players, and so on.

Perhaps OLED was part of it. Perhaps they were just finding that the step up to 4k from 1080p wasn't as impressive as they'd hoped at current TV sizes/average home seating distances. There's certainly been a lot that's been said about 4K + expanded colour + HDR being needed together to impress people enough to upgrade.

Time will tell.

Interesting times.

But what an almighty cock up of a launch.

Steve W
 

Peter Parker

Distinguished Member
That's interesting - it's almost as if they're in competition with each other, rushing something out to try and take market share from whatever is proving popular. I can understand that drive, but it does seem the wrong way of doing things.

By all means introduce new tech, but only when it is fit for purpose and adds to the experience.

Pecker said:
But what an almighty cock up of a launch.

Glad to see I'm not the only one who feels that way.

Gary
 

Trollslayer

Distinguished Member
They are driven by new product cycles to get the sales to keep the balancing act going.
One year without new models all over and they would be in trouble.
 

Peter Parker

Distinguished Member
Yes, I know that - or they'd go out of business once everyone had bought something - but this feels like a completely different animal. You can at least make sure it works in the first place, but this seems like a very rushed and not fully understood bunch of improvements that don't deliver. I guess I'm new to this, having only been in this hobby for 15 years, but this is the biggest mess I can remember.

I wonder how Gordon feels about it as he's been doing this for a lot longer.

Gary
 

Trollslayer

Distinguished Member
I don't think you realise the pressure engineers are under.
I have known places in the US where the whole team was in over Christmas, and another where, at the end of the year, most of the team had 80% of their holidays still to take.
Marketing says it has to happen because management have the schedule arranged a couple of years ahead, regardless of what happens to the business in the meantime. This is what led to the 3G phone fiasco in 2001 - everyone going for it when there was no way the network could deliver enough calls or the phone batteries could last.
 

Peter Parker

Distinguished Member
I don't think you realise the pressure engineers are under.
I have known places in the US where the whole team was in over Christmas, and another where, at the end of the year, most of the team had 80% of their holidays still to take.
Marketing says it has to happen because management have the schedule arranged a couple of years ahead, regardless of what happens to the business in the meantime. This is what led to the 3G phone fiasco in 2001 - everyone going for it when there was no way the network could deliver enough calls or the phone batteries could last.

I'm an engineer (I use the term loosely) but in a different industry, and I understand the situation completely, possibly more than most.

As usual, management want something with no idea of what's involved, because they're 12 years old and not engineers, and if the engineers don't deliver in the timescales given, it's their fault.

That certainly explains the mess. :)

Gary
 

Pecker

Distinguished Member
I'll open up a little on my concerns for HDR in general.

We can all argue the toss as to which is the greatest film ever made, but most would accept (if not agree) that Citizen Kane is right up there, having won the Sight & Sound poll for decades, as is Hitchcock's Vertigo, which recently replaced it.

The use of silhouette in both films is well documented and a quick Google will show up any number of links for the uninitiated.

To take a couple of examples, first Vertigo:

vertigo silhouette.jpg


And secondly CK:

CK Silhouette.jpg


The intention is clear in both of these shots, and in many others throughout both films and many other films besides. Equally clear is that, whilst certain parts of those images are evidently supposed to be full silhouette, with no detail visible at all, there will almost certainly be some detail deep down on the OCN.

Any attempt to bring out the full level of detail will destroy the director's intent.

Perhaps the only DoP to whose films you could give free rein would be the Prince of Darkness himself, Gordon Willis. GW lit and shot his films exactly as they were supposed to be seen - often with heavy use of silhouette and great darkness - and as Robert Harris has said, "There's only one way to print Gordon Willis: dark!"

GW himself said:

"On every movie I shot, I maintained strict developing and printing control — everything was printed on one light. In fact, much of the negative on the Godfather films will only work when printed that way. I lit and exposed things at the level I wanted to be perceived on the screen; if you don’t do that, anyone can decide what your work is supposed to look like, and I never believed in giving the studios that kind of flexibility. So when making exposures, I based my exposures on the full curve of the film, shoulder to toe. The exposures are right where they should be to achieve a given look on the screen as long as they’re printed as designed. There’s no room to move things around on the printer.”

The important point here is that Willis is rare if not unique in this regard.

I fear that the way the vast majority of films (other than GW's) were shot will give studios exactly the sort of flexibility Willis refused to give them, and the availability of HDR will potentially allow any number of abominations.

Steve W
 

camelot1971

Prominent Member
Just a quick observation on my experience with HDR so far.

I've seen episode 1 of Amazon's The Man in the High Castle in HDR on my Samsung JS9500, and there's a scene where a car is driving towards you with its headlights on - it felt like an actual car had its lights on, and I momentarily needed to look away slightly!

There were also a few scenes where someone was welding in a garage and again, the brightness of the light was very realistic.

I was sceptical of the value of HDR but have come away impressed - certainly more so than by the transition from 1080p to 4K.
 

Pecker

Distinguished Member
The specs for HDR are finally up:

Ultra HD Alliance announce new 4K specs and logo

...a certified TV can either offer a combination of at least 1,000 nits peak brightness and less than 0.05 nits black level (aimed at LCD TVs) or, alternatively, more than 540 nits peak brightness and 0.0005 nits black level (aimed at OLED TVs).

Even cherry-picking from the two specs, can any projector on a modest 100" screen get whites as high as 540 nits (that's 157 fL), or blacks as low as 0.05 nits (0.015 fL)?

What's the brightest home cinema projector, and what brightness can it reach on a 100" screen?

And what's the darkest a JVC will go on a 140" screen?
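For reference, the two certification tiers can be sanity-checked with a quick script (assuming the standard conversion of 1 fL = 3.426 cd/m², i.e. nits):

```python
# Convert the Ultra HD Premium spec figures to foot-lamberts and
# work out the implied on/off contrast ratios. 1 fL = 3.426 nits.
NITS_PER_FL = 3.426

def nits_to_fl(nits):
    return nits / NITS_PER_FL

def contrast_ratio(peak_nits, black_nits):
    return peak_nits / black_nits

# LCD tier: 1,000 nits peak / 0.05 nits black
print(nits_to_fl(1000), contrast_ratio(1000, 0.05))    # ~292 fL, 20,000:1
# OLED tier: 540 nits peak / 0.0005 nits black
print(nits_to_fl(540), contrast_ratio(540, 0.0005))    # ~158 fL, 1,080,000:1
```

Note the OLED tier's implied contrast ratio is over 50 times that of the LCD tier, despite the lower peak brightness.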

Steve W
 
