Discussion in 'Projectors, Screens & Video Processors' started by CrispyXUK, Feb 5, 2004.
seriously! Is DVI better than VGA, if so why?
If all else is equal yes ..
VGA means a digital->analog->digital conversion cycle, losing PQ by introducing signal noise and other unwanted effects ... with DVI the picture remains in the digital domain from DVD to display.
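To make that round trip concrete, here's a toy sketch (not a model of any real hardware; the noise figure is made up) of an 8-bit value going through a DAC, picking up analogue noise, and being re-quantised by the display's ADC:

```python
import random

def dac_adc_roundtrip(pixels, noise=1.5):
    """Simulate the VGA path: 8-bit digital -> analogue (DAC output plus
    cable/board noise) -> re-quantised back to 8 bits by the display's ADC.
    'noise' is an arbitrary standard deviation in code values."""
    out = []
    for p in pixels:
        analog = p + random.gauss(0, noise)          # DAC output + signal noise
        out.append(max(0, min(255, round(analog))))  # display ADC re-quantises
    return out

src = [16, 64, 128, 200, 235]
print(dac_adc_roundtrip(src))  # values drift slightly from the source
# A pure DVI path would hand the source list to the panel unchanged.
```

With `noise=0` the round trip is lossless, which is the point: DVI skips the two conversions entirely, so there is nothing for noise to creep into.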
but things are never equal
............. like the distance between your HTPC and your projector..... mine runs across a 10m VGA cable; distance can be an issue with DVI, AFAIK.
Hence my caveat... 10m can apparently be done; I read someone saying they'd managed it, but not using the [in]famous Lindy 10m cables.
I've used a 10m Lindy cable successfully with many sources (except my latest, but that probably isn't the cable).
I've heard of plenty of others who have too.
The Lindy cables seem somewhat variable in quality; I recall several threads shortly after the Z2 started shipping from people who had problems with the 10m cable that disappeared with a shorter one.
I personally have never used one longer than 5m so can only go on what I read.
The Z2 seems to have weaker performance on long DVI cables than other hardware.
See my post in the Z2/10m DVI thread.
Anything over 5m is a 'bonus' with DVI, though I have heard of 25m runs from reliable sources (no optical repeaters). Nothing wrong with the Lindy cable, though there are better ones, like Bettercables.
I only ask as VGA looks great on my monitor and I can't see it improving.
In my messing with various systems VGA is always great too and if you are happy with what you have then enjoy it!
I couldn't really tell a huge difference until I got to pretty large screen sizes.
There are two issues here when it comes to what impact using an analog feed makes to the final PQ: the quality of the DAC in the player and the quality of the ADC in the display.
DACs are generally pretty good these days and noise injection isn't too much of an issue (from my limited experience, admittedly); what in my experience is more significant is the ADC in the display.
When you feed an analog signal to a digital panel, the display has to sync to the signal so it 'knows' where each pixel's data can be found. In a lesser-quality display this process can sometimes 'lose' the occasional pixel during a frame and so introduce what looks like noise.
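That sync step can be sketched in code. The following is a toy illustration (not a real ADC model): the display samples the analogue scanline at what it believes are the pixel positions, and a phase error blends neighbouring pixels together, which on screen reads as noise or softness:

```python
def sample_line(analog_line, phase_error=0.0):
    """Sample an 'analogue' scanline at integer pixel positions.
    A non-zero phase_error shifts the sample point between neighbouring
    pixels, linearly blending their values -- the kind of artefact a
    mis-synced display ADC produces."""
    sampled = []
    n = len(analog_line)
    for i in range(n):
        pos = i + phase_error
        lo = int(pos) % n
        hi = min(lo + 1, n - 1)
        frac = pos - int(pos)
        # Linear blend of the two pixels the sample point falls between
        sampled.append(round((1 - frac) * analog_line[lo] + frac * analog_line[hi]))
    return sampled

line = [0, 255, 0, 255, 0, 255]   # alternating single-pixel test pattern
print(sample_line(line, 0.0))     # perfect sync: pattern comes through intact
print(sample_line(line, 0.5))     # half-pixel error: pattern collapses toward grey
```

The worst case is exactly the AVIA-style one-pixel-on/one-pixel-off pattern mentioned later in the thread, which is why those test patterns expose a poor analogue sync so brutally, while DVI addresses each pixel directly and sidesteps the problem.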
As Marc says, if you don't see a difference then what does the theory matter? If you're not seeing a difference then you have a good DVD player and display device.
Well put. The work needs to be done in the display devices.
I'm running a 10 metre Lindy from a Momitsu V880 to an Epson TW100 with no problems, and the PQ is jaw-dropping.
Must be one of the lucky ones.
Ah yes, I remember this very well: shortly after the jaw drops, a load of sick comes out. If you manage to get DVI working properly without any side effects then consider yourself lucky. The other negative side of DVI is that it is 8-bit colour. Although that's not really a problem if the display device is natively 8-bit anyway (i.e. it can't be improved).
If you're sourcing from DVD then 8 bits per component shouldn't ever be an issue. Although if you're doing a bunch of processing player-end that tweaks any of the picture params, e.g. brightness, contrast, saturation etc., then you're asking for problems...
It is very rare for the picture to remain untouched throughout the digital domain; it might not even be possible. Also remember that scaling is usually done at some stage, and it definitely benefits from better than 8-bit colour.
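A toy example of why processing at 8 bits hurts (this is not any real player's pipeline, just the rounding effect in isolation): apply a gain and then undo it, re-quantising to 8 bits after each step, and distinct source codes collapse together, which appears on screen as banding:

```python
def adjust_brightness_8bit(values, gain):
    """Apply a gain and then its inverse, re-quantising to 8 bits after
    EACH step -- the repeated rounding merges neighbouring code values."""
    out = values
    for g in (gain, 1 / gain):   # e.g. darken, then brighten back
        out = [max(0, min(255, round(v * g))) for v in out]
    return out

src = list(range(100, 110))      # ten distinct 8-bit codes
print(src)
print(adjust_brightness_8bit(src, 0.5))  # fewer distinct codes survive
```

Doing the same maths with a few spare bits of headroom (10 or 12 bits, or floating point) and only quantising once at the very end keeps all ten codes distinct, which is the argument for >8-bit internal processing even with an 8-bit source.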
However, the quality of PC DACs is pretty average, so the noise introduced into the VGA output usually still makes the final PQ worse than the bit-constrained DVI output, I'd say.
No, not really; the DACs are fine. The problem, when it exists, has more to do with the overall board design and the limitations of the D15 VGA connector with high-resolution video. Give me 10-bit colour analogue over 8-bit digital any day. Analogue is so much more flexible. Of course, if your display is natively digital then most of the time the benefits go out of the window, but not always. I have an LCD screen that pretty much syncs to any resolution and refresh rate like a CRT; seeing it do its stuff at 48 and 50Hz brings me hope for the future.
I've compared DVI and VGA from a Radeon 9600 to my NEC HT1000, and the difference is negligible at best.
Ah but that's cheating .. the HT1000 is a bloody good projector and obviously has first-class ADCs, not all are so well endowed.
Are we still talking projectors here?
Personally I think 'digital' like DVI has much 'growing up' to do yet; at a decent level of kit (used by most here) there are far too many issues with digital. Analogue is the stable brother you know; digital is far more 'flighty' and often gets things wrong, horribly wrong. Give it two years for HDMI to 'bed in'.
It also depends on how the manufacturers designed their projectors.
Did they make it to deliver a solid component image (the S3 seems to do a wonderful job with component, in fact superior to DVI),
or did they have in mind that DVI was their input of choice?
DVI can wash out colour and, on some projectors, soften the image. I see this on my H56: the Denon A11 via component is better than my HTPC via DVI, which is a formidable beast, but the image is inferior to the Denon's (not by much).
But on a HT1000 I definitely got a cleaner, less noisy pic via DVI with the Denon.
I think the main advantage of DVI is the lack of picture noise, but the downside is a less filmic image, depending on the PJ.
DVI has no future for consumer PJs; it will move to HDMI, which will benefit from its increased 12-bit colour, but again more focus by manufacturers is needed to get the best out of these inputs.
I must say that I can confirm Gary Lightfoot's conclusion that the differences between DVI and VGA are slight at best. I recently had the pleasure of reviewing the new SIM2 HT300 Xtra extensively and it was EXTREMELY difficult to tell the images apart after hours of going back and forth between VGA and DVI. As an owner of the original HT300, which does not have DVI, I was naturally particularly curious about this difference. What I did notice was that at low IRE values, especially 20 and 30 IRE, the VGA had a kind of noisy interference, like faint ripples or waves going across the screen. The DVI was as clean as a whistle at these IREs. This does suggest that DVI could produce the least noisy picture of the two, but I could never actually see this difference when viewing DVDs, including films like Dark City. On the other hand, I felt that VGA had slightly better colour saturation at D65, keeping all the picture adjustments the same. I could easily match the colour by pushing up colour saturation for DVI slightly. My conclusion is that those of us stuck with VGA can rest easy for the time being.
With regard to colour bit depth, as I understand it, DVDs are currently encoded with 8 bits of colour. In addition, most current projectors only have an 8-bit capability, and even if they had more, this would only be effective if your source were encoded with 10 or 12 bits. I think that higher colour bit depths are really for the future generation of HD DVDs and next year's, or even the year after's, projectors. Only time will tell.
The comparison you described was between DVI and VGA; did you get a chance to compare DVI to component? If so, how would you rate a well set up HTPC (via DVI/VGA) against a good DVD player via component?
I did compare using the old Pioneer 717 RGBS input. This is a little worse than component based on the AVIA resolution test pattern at 6.75 MHz. DVI and VGA show the vertical lines in a razor-sharp way; the RGBS resolved none of them, while component does resolve some (via component on the NEC HT1000 on a 10ft-wide screen). Having said this, the RGBS did look surprisingly good compared to HTPC VGA and DVI. However, details in the background are less well delineated, and overall there is generally less transparency in the image and significantly more video noise in some scenes. Also, component and RGBS will utilise the Faroudja chipset. On NTSC this is generally fine. However, on PAL the Faroudja processing seems to struggle to resolve twitter. It takes a few seconds for the Faroudja to click in and resolve the muddle. This problem is common to all DLPs using the Faroudja chipset. One useful test for this is during the arena scene in Gladiator: Maximus asks if anyone has been in the army. Look at the chain mail on the German. Using a PAL disc via component, or progressive also using Faroudja in the DVD player, the chain mail jitters. After a few seconds the Faroudja locks in and the twitter is resolved. The advantage of VGA and DVI is that the Faroudja is bypassed.
Basically, component is very good as long as you don't compare it with VGA/DVI, which is my favourite. But I did live with the RGBS solution for over a year and was delighted with the image. Hope this helps.
What about smoothness during pans ?
I'm on a quest. I'm looking for an image that has no noticeable aspects that draw your attention to something not being quite right. That doesn't necessarily mean the biggest image, and it doesn't necessarily mean the most detailed; it means one you can sit and watch for two hours without being aware of the means it was produced by. I can cope with sins of omission better than sins of commission.
Right now, for me, the biggest hurdle to overcome with HTPC is micro-stutter. If a really good DVD player can produce butter-smooth pans with minimal loss of detail and clarity compared to HTPC then it's probably worth investigating.
THE best and most natural images I have seen involve using the new Pioneer player (I can't remember the model name) and hooking it up to an HDMI-compliant projector. The PJ I saw was the SIM2 HT300 Link projecting onto a 1.3-gain screen, 104 inches wide. We compared it with DVI out of an HTPC and it really did look noticeably better.