Question Might sound like a silly question, but...

andypandy

How can one tell if the video I am watching via my projector (Optoma UHD300X) is HDR?

After messing around (and doing lots of web surfing), I have found that my Win 10 PC (GeForce GTX 1050 Ti, PH-GTX1050TI-4G) needs to be configured with Windows' display HDR switched off, plus in the Nvidia Control Panel I've set:
Resolution: 4K 60 Hz
Colour depth: highest (32-bit)
Output colour depth: 8 bpc
Output colour format: RGB
Output dynamic range: Full
Then I simply use VLC as my player; I haven't changed any settings apart from turning subtitles off from the start.

The result is that 4K images and colours certainly look in a league of their own compared to 1080p or lower. That said, the odd 1080p film is exceptionally good quality, e.g. Master.Z.Ip.Man.Legacy.2018.BluRay and The Raid.

The issue is: can I improve on this? Basically, how can I really tell the movie is being displayed in super-stunning HDR and not just simply stunning 4K?
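One objective check, separate from what the projector reports, is to look at the file's own metadata: HDR10 streams are tagged with the SMPTE ST 2084 (PQ) transfer function and BT.2020 primaries, which tools like `ffprobe -show_streams -print_format json` will report. A minimal sketch of that classification (`looks_like_hdr10` is a made-up helper name, and the sample dict below is illustrative, not from a real file):

```python
import json

def looks_like_hdr10(stream: dict) -> bool:
    """Classify a video-stream dict (shaped like ffprobe's JSON output)
    as HDR10-style HDR: PQ transfer plus BT.2020 primaries."""
    return (stream.get("color_transfer") == "smpte2084"
            and stream.get("color_primaries") == "bt2020")

# Illustrative ffprobe-style output for a UHD HDR title (made-up sample);
# an SDR Blu-ray rip would typically report "bt709" in both fields.
sample = json.loads("""
{"codec_type": "video", "width": 3840, "height": 2160,
 "color_transfer": "smpte2084", "color_primaries": "bt2020",
 "color_space": "bt2020nc"}
""")
print(looks_like_hdr10(sample))
```

If both fields say bt709, the file is plain SDR no matter how good the 4K image looks.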

Also, I am outputting from the PC to the projector (on port 2, the 4K input) from a PC DisplayPort, using a DisplayPort-to-HDMI adapter and then an HDMI-to-HDMI fibre cable, i.e.:

https://www.amazon.co.uk/gp/product/B07LCNWLPM/ref=ppx_yo_dt_b_search_asin_title?ie=UTF8&psc=1

https://www.amazon.co.uk/gp/product/B07MBCLHYC/ref=ppx_yo_dt_b_search_asin_title?ie=UTF8&psc=1

One reason I did the connection as above was that I could not actually find a suitable 5 to 10 m DisplayPort (not mini) to HDMI cable capable of handling 4K @ 60 Hz, let alone one at a reasonable cost (under £50).
 
I doubt you can get HDR, and certainly not with RGB 8-bit. Try YCbCr 4:2:2 10-bit, and find out if the output on your PC supports HDMI 2.0. I'm not sure how that works with DisplayPort, or if you can use DisplayPort for HDR at all.
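The reason 4:2:2 comes up here is bandwidth. A simplified model of the HDMI 2.0 limit, assuming the standard 594 MHz CTA-861 pixel clock for 4K60 and the 600 MHz TMDS character-rate ceiling: RGB/4:4:4 deep colour scales the TMDS clock by bits-per-component over 8, while 4:2:2 is carried in a fixed-width container regardless of 10/12-bit depth. A rough sketch of the arithmetic:

```python
# Why 4K60 10-bit RGB exceeds HDMI 2.0 while 10-bit YCbCr 4:2:2 fits.
# Simplified model: HDMI 2.0 tops out at a 600 MHz TMDS character rate,
# and 4K60 uses a 594 MHz pixel clock (CTA-861 timing with blanking).

PIXEL_CLOCK_4K60_MHZ = 594.0
HDMI20_TMDS_LIMIT_MHZ = 600.0

def tmds_clock_mhz(pixel_clock: float, bpc: int, fmt: str) -> float:
    if fmt in ("RGB", "444"):
        return pixel_clock * bpc / 8.0   # deep colour scales the clock
    if fmt == "422":
        return pixel_clock               # fixed container width for 4:2:2
    raise ValueError(fmt)

for bpc, fmt in [(8, "RGB"), (10, "RGB"), (10, "422")]:
    clk = tmds_clock_mhz(PIXEL_CLOCK_4K60_MHZ, bpc, fmt)
    verdict = "fits" if clk <= HDMI20_TMDS_LIMIT_MHZ else "exceeds HDMI 2.0"
    print(f"4K60 {fmt} {bpc}-bit: {clk:.1f} MHz -> {verdict}")
```

So 4K60 RGB 10-bit (742.5 MHz) is over the limit, which is why drivers only offer 8-bit RGB or 10/12-bit 4:2:2 at that resolution over HDMI 2.0.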

If you actually do get HDR, you also need to figure out how to tone-map, as no projector can properly display HDR, and whether you can handle Rec 2020 or are limited to Rec 709.
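Tone mapping starts from decoding the PQ signal back to absolute light levels. A sketch of the SMPTE ST 2084 EOTF (the standard constants, nothing projector-specific), which shows why the range has to be compressed for a projector: the signal scale runs all the way to 10,000 nits, with roughly the lower half of the signal covering only 0-100 nits.

```python
def pq_eotf(e: float) -> float:
    """SMPTE ST 2084 (PQ) EOTF: non-linear signal (0..1) -> luminance in nits."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    ep = e ** (1 / m2)
    y = (max(ep - c1, 0.0) / (c2 - c3 * ep)) ** (1 / m1)
    return 10000.0 * y

# Signal 1.0 decodes to the 10,000-nit ceiling; ~0.508 is about 100 nits
# (roughly SDR reference white), so a low-contrast projector has to
# tone-map the whole upper half of the signal into very little headroom.
print(round(pq_eotf(1.0)))    # 10000
print(round(pq_eotf(0.508)))  # ~100
```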

Your Optoma is an ultra-low-contrast projector; I'd guess below 1000:1, which is the nature of DLP, so there is a limit to how dynamic it can actually be.

The first thing you should do is calibrate your greyscale and gamma and measure contrast, to get some idea of how well it can track D65 Rec 709 with a 2.2 gamma, before wandering into HDR.
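For the gamma part of that check, the targets are just arithmetic: a display tracking gamma 2.2 should show luminance proportional to (stimulus)^2.2, so each measured greyscale step divided by the 100% white reading should land near these fractions (`target_fraction` is an illustrative helper, not from any calibration tool):

```python
# Target stimulus -> relative-luminance table for a 2.2 gamma check.
GAMMA = 2.2

def target_fraction(stimulus_percent: float) -> float:
    """Relative luminance a display tracking gamma 2.2 should show."""
    return (stimulus_percent / 100.0) ** GAMMA

for step in range(0, 101, 10):
    print(f"{step:3d}% stimulus -> {target_fraction(step) * 100:6.2f}% of peak white")
```

For example, the 50% stimulus pattern should measure at only about 22% of peak white; if it reads much brighter, the gamma is too low and the image will look flat and washed out.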
 
Ah, yep: I set Windows to HDR on, then used YCbCr 4:2:2 10-bit in the Nvidia settings, and voilà, the popup message "HDR" appears at the top.
With those settings, everything I play now, even if it's not an HDR film, is treated by my projector as HDR.
I've noticed the colours are not as strong/vibrant, plus a bit too bright, with the above settings.

When I change it down to 8-bit, the colours just seem to pop into a different league, but the projector then shows no HDR; its display mode moves from HDR to Cinema mode.

Non-HDR movies also look way better with 8-bit, as in 10-bit they too look a bit on the bright side with colours slightly washed out.
If I try to lower the brightness, it does improve the HDR films, but the others are then a bit on the dark side and the colours just don't pop as much.

The projector is Rec 709 colours (220 ANSI lumens); I'm not sure how to calibrate, bearing in mind I only connect my PC.
 
As with most projectors, 1080p 24 fps 4:2:2 8-bit SDR Rec 709 works and looks best; most of them follow that standard fairly well, which is not the case with UHD WCG HDR material.

It sounds like your source forces HDR out, and your projector doesn't know how to handle HDR properly.

Not being able to measure your colours and gamma tracking makes it quite hard to adjust.
I'd recommend you start with some clipping test patterns for both SDR and HDR. You should never have to adjust your brightness unless your black clipping is off, and be sure your video levels are set right in both your PC and projector; I'd guess your PC should be set to 0-255 and the projector to 16-235 / video level.
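If you don't have a calibration disc to hand, a crude near-black clipping pattern can be generated yourself. A sketch that writes one as an 8-bit PGM image (the filename and bar layout are made up; the proper AVS HD 709 patterns are better): with video levels set correctly, the bars below code 16 should be invisible and the bars from 17 up should each be faintly distinguishable.

```python
# Homemade near-black clipping pattern, written as an 8-bit binary PGM.
# Vertical bars step through 8-bit code values around video black (16).

WIDTH, HEIGHT = 1280, 720
CODES = list(range(10, 26))          # 8-bit code values 10..25
BAR_W = WIDTH // len(CODES)

def write_black_clip_pgm(path: str) -> None:
    rows = []
    for _y in range(HEIGHT):
        row = bytearray()
        for x in range(WIDTH):
            row.append(CODES[min(x // BAR_W, len(CODES) - 1)])
        rows.append(bytes(row))
    with open(path, "wb") as f:
        f.write(b"P5\n%d %d\n255\n" % (WIDTH, HEIGHT))
        f.writelines(rows)

write_black_clip_pgm("black_clip.pgm")
```

Display it full-screen on the projector and adjust brightness until bar 16 just disappears while 17 stays visible; note this only tests the path as a video-level (16-235) signal if your player flags the image that way.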
 
The projector appears to understand the HDR and switches its display mode to HDR; I can see that when I go into its menu options. When I am feeding non-HDR material it automatically changes to whatever mode I have preset, in my case Cinema mode, which does look the best.

I cannot select 0-255 (which I guess is output dynamic range = Full) on my PC when connected with colour format YCbCr 4:2:2, regardless of 10- or 8-bit; it greys out the dynamic range option at Limited. Perhaps Nvidia actually sets it automatically based on the colour format I chose, hence no option to choose?

I will check what the projector sets colour-wise this evening, although I thought the whole point of automatically detecting HDR is that all colours, brightness and sharpness go to whatever level HDR sets?
 
Try testing the black clipping with RGB 0-255 and then 4:2:2, and see if it applies the same level. Normally a PC is RGB 0-255, which is the same as a player sending out RGB at 16-235 / video level.
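The two 8-bit level conventions being juggled here are just linear remappings of each other: "full/PC" range uses codes 0-255, while "limited/video" range puts black at 16 and white at 235. If one side of the chain assumes the wrong convention, blacks get crushed or greys get washed out. A sketch of the conversions (helper names are made up):

```python
def full_to_limited(code: int) -> int:
    """Map a full-range 8-bit code (0-255) to limited range (16-235)."""
    return round(16 + code * 219 / 255)

def limited_to_full(code: int) -> int:
    """Map a limited-range 8-bit code (16-235) back to full range (0-255)."""
    return round((code - 16) * 255 / 219)

print(full_to_limited(0), full_to_limited(255))    # video black/white: 16 235
print(limited_to_full(16), limited_to_full(235))   # back to full: 0 255
```

A double conversion in the same direction is the classic failure: full-range black (0) interpreted as limited range displays as a raised grey, while limited black (16) interpreted as full range crushes everything below it.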
 
I am going to be away for a short while... will have to continue this when I get back.
 
I think I will simply go by what I see with my eyes using movies, rather than those tests; it's the simpler route for me, as I don't get how to use those tests and then how to make modifications. There are quite a few picture options everywhere on this projector.

One quick question: is there a possibility of the on-screen menu options being slightly out of focus while the film being viewed is in focus?
Or should it be the norm to focus perfectly on the menu text first?
 
Strange thing, but I just read that my GTX 1050 Ti card has an HDMI 2.0 port, which from what I understand does not pass HDR content; only the 2.0b ports do, which can be found on 1060-onward cards.
I say strange because I can pass HDR content to my projector without an issue!
 
