How to output 12 bit from a Radeon?

paulfoley

Prominent Member
Joined
Sep 17, 2003
Messages
3,890
Reaction score
140
Points
925
Location
Hants
Just connected my HTPC (via DVI to HDMI lead) to my new JVC HD350 projector and it tells me that the signal is 1920 x 1080 (8 bit). It's going through my Onkyo AV amp, but I know it's not that to blame, as the PS3 also goes through it and is reported as "12 bit".

Played with the Catalyst settings but couldn't see anything obvious - any ideas, guys?

:please:
 
Anyone?
 
Just get it sorted before I come around later on :rotfl:
 
Doesn't really matter, as the JVC converts everything back to 8-bit before it displays it. The ATI card's conversion down to 8-bit is as good as the JVC's processing, so you won't see any difference.
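To illustrate the point, here's a rough sketch (not the JVC's or ATI's actual processing) of what dropping a 12-bit code value down to 8 bits looks like - whole runs of 12-bit codes collapse onto a single 8-bit code, so the extra precision only helps if the panel can actually resolve it:

```python
def twelve_to_eight(v12: int) -> int:
    """Drop the 4 least-significant bits of a 12-bit code (0-4095)."""
    assert 0 <= v12 <= 4095
    return v12 >> 4

# Sixteen adjacent 12-bit codes all land on the same 8-bit code.
print({twelve_to_eight(v) for v in range(2048, 2064)})  # {128}
```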
 
Thanks for the reply, but the 8-bit colours look far more saturated than those from the PS3 (which is displayed as 12 bit when pressing the info button).

I was wondering whether the EDID exchange information was awry and the PC was thinking the PJ couldn't handle 12 bit. Any way I can force it?
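One way to check the EDID side of it, if you can grab a raw EDID dump (e.g. with a tool like Monitor Asset Manager on Windows, or from /sys/class/drm/.../edid on Linux - the "edid.bin" filename below is just a placeholder), is to look for the deep-colour flags in the HDMI vendor-specific data block. A rough sketch, only checking the first CEA extension block:

```python
def deep_colour_flags(edid: bytes) -> dict:
    """Find the HDMI vendor-specific data block in the first CEA-861
    extension and return its deep-colour capability bits."""
    if len(edid) < 256 or edid[126] == 0:
        return {}                    # no extension block present
    ext = edid[128:256]
    if ext[0] != 0x02:               # 0x02 = CEA-861 extension tag
        return {}
    dtd_offset = ext[2]              # data blocks run from byte 4 to here
    i = 4
    while i < dtd_offset:
        tag, length = ext[i] >> 5, ext[i] & 0x1F
        # Tag 3 = vendor-specific block; HDMI's IEEE OUI is 00-0C-03 (LSB first).
        if tag == 3 and length >= 6 and ext[i + 1:i + 4] == bytes([0x03, 0x0C, 0x00]):
            byte6 = ext[i + 6]       # deep-colour flags live in payload byte 6
            return {
                "30-bit (10bpc)": bool(byte6 & 0x10),
                "36-bit (12bpc)": bool(byte6 & 0x20),
                "48-bit (16bpc)": bool(byte6 & 0x40),
            }
        i += 1 + length
    return {}

with open("edid.bin", "rb") as f:    # placeholder filename
    print(deep_colour_flags(f.read()))
```

If the 36-bit flag isn't set in what the PC is actually receiving (the amp can re-write the EDID it passes through), that would explain the card sticking to 8 bit.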
 
Ah, that won't be anything to do with bit depth; that will be because the ATI card is outputting RGB Full and the PS3 is outputting YCbCr. You can change the ATI card in the CCC to output YCbCr too.
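For what it's worth, the "more saturated" look is the classic symptom of a level mismatch rather than bit depth: full-range RGB uses 0-255, while video-level YCbCr puts black at 16 and white at 235, so if the display expects one and gets the other, blacks crush and whites clip. A minimal sketch of the BT.709 conversion (not the exact CCC pipeline) showing where full-range black and white land in limited-range YCbCr:

```python
def rgb_full_to_ycbcr_limited(r, g, b):
    """Convert 8-bit full-range RGB (0-255) to limited-range BT.709 YCbCr."""
    r, g, b = r / 255.0, g / 255.0, b / 255.0
    y  = 0.2126 * r + 0.7152 * g + 0.0722 * b   # BT.709 luma weights
    cb = (b - y) / 1.8556
    cr = (r - y) / 1.5748
    # Limited range: luma occupies 16-235, chroma is centred on 128.
    return round(16 + 219 * y), round(128 + 224 * cb), round(128 + 224 * cr)

print(rgb_full_to_ycbcr_limited(0, 0, 0))        # (16, 128, 128) - video black
print(rgb_full_to_ycbcr_limited(255, 255, 255))  # (235, 128, 128) - video white
```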
 
Cheers, I'll give it a go.
 
Turned out that the card was only giving me DVI settings, i.e. there was no YPbPr option or video colours. After lots of research and hunting, I found I needed to buy an ATI HDMI dongle and an HDMI cable rather than use my DVI-D to HDMI lead.

If anybody else wants one, you can get them from Amazon; be careful, as there's one advert but two possible types (you have to email your GPU model to the seller).

:thumbsup:
 
