Why does Component video never look as good as VGA?

Seriously, it's depressing in a sense. The best video quality I've ever seen for standard definition sources on any TV is from my Sega Dreamcast (from the late 90s).

No console that outputs Component video has ever given quality as good as this, on any display I've used. It's pixel sharp, there's no blurring, ringing, or colour bleed.

I realise Component uses a compressed colour space whereas VGA is just pure RGB, but what gives? Shouldn't Component look nearly as good?

HDMI is the same - all of the problems, just none of the noise. Which seems wrong since HDMI was derived from a monitor connection (DVI). It just seems like anything designed for Video rather than computer use is automatically of a lower quality.
 
I doubt it can be that simple though - if companies were using processing surely they'd notice if it was making the picture worse. Or is that wishful thinking...

So what does everyone think, is my hunch that "video" is just done with far less care right?
 

Nielo TM

Novice Member
Finally, someone actually agrees with me. VGA is superior to component, except for VGA on the X360.

VGA is superior to component because it doesn’t go through any image enhancers or processors. When the display receives information from VGA, it converts it to digital and maps it to the corresponding pixels (1:1). However, some TVs actually process the VGA signal or do a horrible job of displaying it.



I suppose if the display has EDID on HDMI, it can display the information without processing, but I’m not sure about that.
 

Neil Davidson

Well-known Member
AVForums Sponsor
A lot of the problem is just pure bad maths!

In theory it should be possible to convert losslessly from RGB to YPbPr and vice versa, but as stated this is often dropped to 4:2:2 or less to save space. This instantly reduces the colour detail.

Next you are at the mercy of the decoder in the display. We have all heard of the tricks manufacturers use to compensate for an overly blue greyscale - these can lead to some horrible effects. Calibration can often help a lot here though :) RGB inputs bypass the encoding and decoding steps of course.

Remember that the decoding matrices rely on correct proportions of RGB for best results, but the actual gamut of every display is different and a lot of manufacturers seem to pay little heed to matching the Rec.601 or 709 coordinates. You should see how good a well-calibrated PJ with working colour management can look (aghhh, let's not get on to the subject of correct colourspace selection)!

Then you get into the numerous video processing monstrosities inflicted by the manufacturers. It is true that VGA input (or RGBHV sockets on some units) allows you to define a pretty much perfect pixel map, bypassing all the gubbins.

Basically pretty much a mess out of the box but some displays can be corrected up to a point. A good VP makes even more of a difference!

HTH

Neil
 

bidermaier

Standard Member
Nielo TM said:
Finally, someone actually agrees with me. VGA is superior to component, except for VGA on the X360.

VGA is superior to component because it doesn't go through any image enhancers or processors. When the display receives information from VGA, it converts it to digital and maps it to the corresponding pixels (1:1). However, some TVs actually process the VGA signal or do a horrible job of displaying it.

I suppose if the display has EDID on HDMI, it can display the information without processing but I'm not sure on that.
I am not so sure about the 360. When I tried the VGA cable on my 360 I was very disappointed. I noticed the contrast was very poor and the colors washed out. Exactly the same problem I had using my GameCube with a component cable and a component-to-VGA adaptor.

What I am suggesting is that the 360 does not have native VGA, but converts the signal from component.

Needless to say the VGA signal is already scaled, so it is going to look sharper and crisper, but the lack of contrast and the washed-out colours are still there.
 

Nielo TM

Novice Member
bidermaier said:
I am not so sure about the 360. When I tried the VGA cable on my 360 I was very disappointed. I noticed the contrast was very poor and the colors washed out. Exactly the same problem I had using my GameCube with a component cable and a component-to-VGA adaptor.

What I am suggesting is that the 360 does not have native VGA, but converts the signal from component.

Needless to say the VGA signal is already scaled, so it is going to look sharper and crisper, but the lack of contrast and the washed-out colours are still there.
That's because they actually use a video encoder instead of a RAMDAC, as it's easier for developers to work with a few resolutions. In other words, developers don’t have to spend time optimizing for every resolution below 1360 x 768. Therefore, if you select 1280 x 1024, games are still rendered at 1280 x 720 and output as 1280 x 1024. Anyway, I created a thread some time ago.




----------------------------------------------------------------------


Ok. Let’s start with the basics. RAMDAC stands for Random Access Memory Digital-to-Analog Converter, and it is primarily used in graphics cards. The purpose of this chip is to convert the digital images stored in video memory to analog. In other words, a RAMDAC converts the digital images to analog with no manipulation (pixel, aspect ratio etc...) and sends the information in the form of RGB through VGA to the display. This is why VGA has always produced ultra-sharp images. Now, why isn’t the X360 producing the same level of quality? That’s because the X360 is not using a RAMDAC for VGA; instead, it's using a video encoder similar to the one found in the original Xbox. This is why there are some issues and degraded image quality with the X360's VGA.


I found this info on the net while searching to see whether the X360 contains a RAMDAC, but all I could find was this article, which describes the video output functionality of the original Xbox. Unfortunately, I can’t provide the link to it due to the forum's rules, so here is a copy of the article.




Video Encoder:


The Xbox does not use an ordinary RAMDAC (http://en.wikipedia.org/wiki/RAMDAC) for video output. Instead, it employs a video encoder.

A video encoder is a chip that converts a digital pixel data stream (coming from the nVidia NV2A graphics processor) into an analog video signal, just like a RAMDAC would. An ordinary RAMDAC, however, can only output VGA-style RGB signal. The video encoder used in the Xbox is more flexible, and can generate several different types of signals that adhere to various video standards and color formats. These include, but are not necessarily limited to:

* VGA-style >31 kHz RGB, though only with Sync-on-Green sync signals. (If needed, separate HSYNC and VSYNC signals can be obtained from the motherboard, or by building a special video cable with active electronics for stripping and separating the Sync-on-Green sync signal. In any case, separate HSYNC and VSYNC are not available directly through the AV connector.)
* TV-compatible 15 kHz RGB (with composite sync) – suitable for European-style SCART RGB output (are progressive 625/50 signals supported?)
* Component (Y'PbPr) signal, both in SDTV and HDTV resolutions; suitable for American-style "component" output
* PAL color signal with typical PAL timings (including PAL60), in both composite (CVBS) and s-video (Y/C) formats
* SECAM color signal with typical SECAM timings, in both composite (CVBS) and s-video (Y/C) formats
* NTSC color signal with typical NTSC timings, in both composite (CVBS) and s-video (Y/C) formats
* Black and white composite video signal without a color carrier

The video encoder is also capable of PALplus style Line 23 Wide Screen Signalling (WSS), and the Xbox PIC is rigged with the capability of controlling Scart pin 8 (the function switching pin, which is used as an alternative method of Wide Screen Signalling) and pin 16 (the fast switching pin.)

The make and model of the video encoder has varied through the times – three different video encoders have been used this far. All three are very similar in their features; they support various modes and are flexible enough to be able to output a VGA compatible signal (which is not supported by the Xbox kernel.) They are, however, not register-compatible.

Two of the video encoders (namely, Conexant CX25871 and Focus FS454) also have extensive scaling and filtering functionality, which allows for overscan compensation (http://scanline.ca/overscan/) in desktop-style "TV out" usage. (This means that the GPU can output ordinary VGA resolutions with VGA timings and the video encoder can convert them to SDTV resolutions with TV-style timings on the fly, adding borders around the image so that a projection of the VGA framebuffer image falls within the "safe area" of the video signal.) The capabilities of the Xcalibur chip, however, remain a mystery in this regard: it is not known whether it has a scaler.

All video encoders are connected to (and controlled via) I²C/SMBus.



Conexant CX25871:


Conexant CX25871 (http://www.conexant.com/products/entry.jsp?id=278) is a close relative of the Brooktree BT868/BT869. There is also a sister model (CX25870) without the Macrovision capability. This chip was used in Xbox versions v1.0 through v1.3. If you follow the link, you will find a product brief and a complete data sheet, with register-level programming information.


Focus FS454:


Focus FS454 (http://www.focusinfo.com/solutions/catalog.asp?id=30) was used in v1.4 (and possibly v1.5) Xboxes. There is also a sister model (FS453) without the Macrovision capability. The data sheet containing the necessary programming information is available from the manufacturer by separate request. Copies of it have also been seen floating around the net.


Xcalibur:

The "Xcalibur" video encoder is a custom chip manufactured for Microsoft. It was first used in the Xbox hardware revision 1.6.

Nielo TM​


-----------------------------------------------------------------

Connect a PC and an X360 to a display via VGA and display some high-quality images (the resolution of those images must be 1280 x 720 or 1280 x 1024). You'll immediately notice that the PQ is extremely sharp on the PC, whereas on the X360 it’s blurred and has an unrealistic look.

I have a monitor with dual VGA and a single DVI. I connected my friend's X360 to VGA port 1 and my PC to VGA port 2. I displayed a number of very high quality images, and the PC always produced superb quality, whereas the X360 was blurred and lost some colors. There was also some visible video noise as well as ghosting (multiple transparent images on a static frame). I later found out that the Joytech cable caused the ghosting, so we tried the official cable, but it was still nowhere near as good as the PC.



To check the quality of my monitor’s ADC (Analog to Digital Converter), I connected one of my PCs to the display using VGA and the other using DVI. Amazingly, the image quality was virtually identical. There was no evidence of ghosting, video noise, color degradation, moiré or flickering.

Since most other explanations have been ruled out, I take it that the X360 has a programmable video encoder, maybe something similar to this Philips encoder.





PS: Let’s say you’ve selected 1280 x 1024 when playing HALO CE (PC) using VGA. Your GFX card renders frames at the chosen resolution, which is 1280 x 1024, converts them to analog using the RAMDAC (internal scaling is not possible with a RAMDAC as far as I’m aware), and sends them to the monitor to be displayed.
Now, if you select 1280 x 1024 on the X360, games are still rendered at 1280 x 720. This is clear evidence of an encoding chip at work.

Anyway, this is just a theory based on available facts.
 

hamster

Novice Member
Often the difference is because the different inputs go through different chips - in particular SCART RGB and Component. The differing performance of the ADCs, plus whether they are 8- or 10-bit, is the cause.

Therefore it depends very much on the ICs that the setmaker has chosen.

When well done, it is very hard to see the difference between good analog and HDMI.
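On the bit-depth point: going from 8 to 10 bits quarters the size of the smallest step an ideal ADC can resolve. A trivial sketch (idealised, ignoring real ADC noise and video levels):

```python
# Worst-case step size of an ideal N-bit ADC on a normalised 0..1 signal.
# The jump from 8 to 10 bits quarters the quantisation step, which is one
# reason two sets fed "the same" component signal can look different.
def quant_step(bits):
    return 1 / ((1 << bits) - 1)

for bits in (8, 10):
    print(f"{bits}-bit: step {quant_step(bits):.6f} ({(1 << bits) - 1} levels)")
```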
 

Woodywizz

Distinguished Member
Nielo TM said:
That's because they actually use a video encoder instead of a RAMDAC, as it's easier for developers to work with a few resolutions. In other words, developers don’t have to spend time optimizing for every resolution below 1360 x 768. Therefore, if you select 1280 x 1024, games are still rendered at 1280 x 720 and output as 1280 x 1024. Anyway, I created a thread some time ago. [...]



:eek:
 
hamster said:
Often the difference is because the different inputs go through different chips - in particular SCART RGB and Component. The differing performance of the ADCs, plus whether they are 8- or 10-bit, is the cause.

Therefore it depends very much on the ICs that the setmaker has chosen.

When well done, it is very hard to see the difference between good analog and HDMI.
Actually Hamster, this is why I ask. From looking at the service manual, it seems that Component video, HDMI, and the RGBHV signals from the PC input, are actually handled by the same Trident chip in the BRAVIA I have (surprised me as well). Only Composite and S-Video seem to be done on a separate chip.
 

pjclark1

Novice Member
My ATI graphics card outputs component, VGA and DVI.
Component looks vastly inferior to VGA and DVI (all fed into the same HDTV).
 

geordie10

Novice Member
I use VGA from my Xbox 360 to a Samsung 26-inch HD LCD TV.
I tried component but find VGA is much better.
The picture is sharper and the colours are not washed out.
Cheers, Ian.
 

hamster

Novice Member
Lyris said:
Actually Hamster, this is why I ask. From looking at the service manual, it seems that Component video, HDMI, and the RGBHV signals from the PC input, are actually handled by the same Trident chip in the BRAVIA I have (surprised me as well). Only Composite and S-Video seem to be done on a separate chip.
That's because Trident are not very good at making colour decoders... so Sony are using one of their own CXA devices or a Philips SAA7118/9, I guess.
Sony are very finicky about VCR playback and have a library of nightmare ex-rental tapes that have spent months in the sun on a car parcel shelf or some such, which they use to test good colour decoder and sync operation.

I'd have to check the Trident chip's block diagram, but I suspect that the VGA goes directly into the scaler. It's usual not to scale or do PQ processing on OSDs or graphics (as it usually looks worse). As the gaming machine is in theory not video, that's probably the difference.
I also agree that picture "improvement" often looks worse, especially as it's often very over-excitable in demo / factory preset mode. For Philips stuff I always prefer Pixel Plus switched off, Sony looks a bit over-processed to me, and Samsung DNIe just looks unnatural.
 

DanDT

Active Member
Let's also not forget that through VGA you get 1:1 pixel mapping, whereas "video" signals have to go through not only video processing, but also scaling.

HDMI on the X-series can take PC inputs, and 1:1 pixel mapping through HDMI at 1920x1080 on the X-series looks absolutely magnificent.
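The scaling cost is easy to demonstrate. Here's a minimal Python sketch of what a non-integer scale (like 1280 -> 1366 pixels wide) does to pixel-sharp detail - plain linear resampling, just for illustration; real scalers use fancier filters but face the same problem:

```python
# Sketch: linearly resample a row of pixels to a new, non-integer-related
# width, the way a TV scaler must when the input and panel don't match 1:1.
def resample_linear(row, new_len):
    out = []
    for i in range(new_len):
        pos = i * (len(row) - 1) / (new_len - 1)  # source position for this output pixel
        lo = int(pos)
        hi = min(lo + 1, len(row) - 1)
        t = pos - lo
        out.append(row[lo] * (1 - t) + row[hi] * t)  # blend the two neighbours
    return out

row = [0, 255] * 8                 # pixel-sharp black/white alternation, 16 samples
scaled = resample_linear(row, 17)  # non-integer ratio, like 1280 -> 1366
print([round(v) for v in scaled])  # intermediate greys appear: the detail is smeared
```

With 1:1 mapping every output pixel equals one input pixel; as soon as the ratio isn't 1:1, interpolation has to invent in-between values and the sharp alternation turns to grey mush.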
 

DanDT

Active Member
Lyris said:
Why do they do it...
Overscan, well, everyone uses it... As for the "little flaws" you see on the sets, I can only think that they're evil people who can't sleep at night if the product they make hasn't got "just that one (or two, or three...) little thing that keeps it from being perfect". Every set has them, and it's a guarantee for the manufacturers to secure sales of "the next set" with those niggles solved, but a few others added. :devil:
 

cooperda

Active Member
Lyris said:
Dan - you forgot overscan - that too :suicide:

Why do they do it...
With regards overscan:-

I have a Samsung DVD R125 Recorder connected to my LCD by HDMI (yes I know you're talking about VGA etc.) but one problem I have is that the Samsung drives the LCD with no overscan.

This means that all recordings show a visible narrow green line along the top edge of the picture. See:-

http://www.avforums.com/forums/attachment.php?attachmentid=23585&d=1137189969

If I watch the recording on a CRT TV the overscan hides it.

I do agree that some TVs seem to vastly overscan. On one friend's TV - when watching 'Who Wants to Be a Millionaire' - half of the lower answer choices are cut off.

Cheers, Dave C.
 
Well yeah, there are often ragged edges and gunk around the picture, so I see why an overscan option is good. But they should allow you to turn it off.
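For a sense of scale, here's a quick sketch of how much picture a typical overscan crop eats (the 5% figure is just an assumption - actual amounts vary per set):

```python
# Rough illustration: pixels hidden if pct% is cropped from every edge,
# for a few common resolutions.
def overscan_crop(width, height, pct):
    dx = round(width * pct / 100)   # pixels lost on each side
    dy = round(height * pct / 100)  # pixels lost top and bottom
    visible = (width - 2 * dx, height - 2 * dy)
    return dx, dy, visible

for w, h in [(720, 576), (1280, 720), (1920, 1080)]:
    dx, dy, vis = overscan_crop(w, h, 5)
    print(f"{w}x{h}: lose {dx}px left/right, {dy}px top/bottom -> visible {vis[0]}x{vis[1]}")
```

Even a modest 5% crop on a 1080 panel throws away roughly a tenth of the pixels in each direction, which is why lower answer choices and channel tickers vanish on heavily overscanning sets.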
 

bodgerx

Standard Member
pjclark1, I think you are confusing Composite with Component. I don't think there is an ATI card out there with a 3-socket Component group. Composite (a single yellow connector) will be a lot inferior to VGA or DVI on your card; Component will be closer in quality...

Incidentally, how does RGB SCART compare with Component? I would have thought they would be pretty much identical?
(other than the quality of the cable used in each connection, which is typically unshielded for each colour in RGB SCART)
 

cwick

Novice Member
bodgerx said:
pjclark1, I think you are confusing Composite with Component. I don't think there is an ATI card out there with a 3-socket Component group. Composite (a single yellow connector) will be a lot inferior to VGA or DVI on your card; Component will be closer in quality...
Nah, ATI have had a component dongle for their cards for ages now. That said, I always found the component output to look pretty asstastic using it.

Linky to ATI Component FAQ
 

pjclark1

Novice Member
bodgerx said:
pjclark1, I think you are confusing Composite with Component. I don't think there is an ATI card out there with a 3-socket Component group. Composite (a single yellow connector) will be a lot inferior to VGA or DVI on your card; Component will be closer in quality...
Newbies ........ don'tcha just love em:hiya: .
 

RockySpieler

Novice Member
Isn't component capable of 12-bit colour whereas RGB and DVI can only offer 8-bit (later, HDMI 1.3 will improve things)??

[drunken disclaimer........this is partly a question as opposed to a statement]

ps

just read page 1................I need another drink, you guy'ssssssss and your chips!
 

hamster

Novice Member
DanDT said:
Let's also not forget that through VGA you get 1:1 pixel mapping, whereas "video" signal have to go through not only video processing, but also scaling.
I don't see how you get 1:1 pixel mapping via VGA, as it's analogue.
 
I certainly get 1:1 pixel mapping through analogue VGA - unless I've been misled as to what "1:1" means?
 

DanDT

Active Member
Lyris said:
I certainly get 1:1 pixel mapping through analogue VGA - unless I've been misled as to what "1:1" means?
The previous poster is just a bit confused. VGA can and does give you 1:1 pixel mapping, always did and always will, since the days of VGA LCD computer monitors.
 
Could just be a mistake, but the reason I asked is because Hamster knows quite a bit about what makes TVs tick on the inside.
 
