Is DVI better?


Feb 5, 2003
Seriously! Is DVI better than VGA, and if so, why?
If all else is equal, yes...

VGA means a digital->analog->digital conversion cycle losing PQ by introducing signal noise and other unwanted effects ... using DVI the picture remains in the digital domain from DVD to display.
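To illustrate that conversion cycle, here's a minimal Python sketch of a digital→analogue→digital round trip, assuming a made-up 0-700mV analogue scale and a simple Gaussian noise model (the numbers are illustrative, not any particular player's spec):

```python
import random

def dac(code, noise_mv=4.0, rng=random):
    """Turn an 8-bit code (0-255) into an analogue level on a 0-700mV
    scale, with a little Gaussian noise picked up on the way."""
    millivolts = code / 255.0 * 700.0
    return millivolts + rng.gauss(0.0, noise_mv)

def adc(millivolts):
    """Re-digitise the analogue level back to the nearest 8-bit code."""
    return min(255, max(0, round(millivolts / 700.0 * 255.0)))

rng = random.Random(42)  # fixed seed so the run is repeatable
errors = sum(1 for v in range(256) if adc(dac(v, rng=rng)) != v)
print(errors, "of 256 codes changed in the analogue round trip")
```

With `noise_mv=0.0` every code survives the round trip; with a few millivolts of noise (the 8-bit step here is only about 2.7mV) plenty of codes come back changed, which is exactly the degradation that staying in the digital domain avoids.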
but things are never equal :)

...like the distance between your HCPC and your projector. Mine runs across a 10m VGA cable; distance can be an issue with DVI, AFAIK.

Sean G.
Originally posted by theritz
...like the distance between your HCPC and your projector. Mine runs across a 10m VGA cable; distance can be an issue with DVI, AFAIK.
Hence my caveat. ;) 10m can apparently be done; I read someone saying they'd managed it, but not using the [in]famous Lindy 10m cables.
I've used a 10m Lindy cable successfully with many sources (except my latest, but that probably isn't the cable).

I've heard of plenty of others who have too.

The Lindy cables seem somewhat variable in quality; I recall several threads shortly after the Z2 started shipping from people who had problems with the 10m cable that disappeared with a shorter one.

I personally have never used one longer than 5m so can only go on what I read. :)
The Z2 seems to have weaker performance on long DVI cables than other hardware.

See my post in the Z2/10m DVI thread.

Anything over 5m is a 'bonus' with DVI, though I have heard of 25m from reliable sources (no optical repeaters). Nothing wrong with the Lindy cable, though there are better ones, like Bettercables.
I only ask as VGA looks great on my monitor and I can't see it improving.
In my messing with various systems VGA has always been great too, and if you are happy with what you have then enjoy it!

I couldn't really tell a huge difference until I got to pretty large screen sizes.

There are two issues here when it comes to what impact using an analog feed makes to the final PQ: the quality of the DAC in the player and the quality of the ADC in the display.

DACs are generally pretty good these days and noise injection is not too much of an issue, from my limited experience admittedly; what IMX is more significant is the ADC in the display.

When you feed an analog signal to a digital panel the display has to sync to the signal so it 'knows' where each pixel's data can be found. In a lesser-quality display this process can sometimes 'lose' the occasional pixel during a frame and so introduce what looks like noise.
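That syncing problem can be sketched in Python, assuming a made-up pixel-clock frequency error and a one-line black/white test pattern (purely illustrative, not modelled on any real display):

```python
def sample_line(pixels, clock_error=0.0, phase=0.0):
    """Re-sample a scanline the way a panel ADC does: pick a level at each
    of its own sample points. A small pixel-clock frequency error makes
    those points drift across the source pixel boundaries."""
    out = []
    for i in range(len(pixels)):
        t = (i + 0.5) * (1.0 + clock_error) + phase  # where pixel i gets sampled
        out.append(pixels[min(len(pixels) - 1, int(t))])
    return out

# A 1280-pixel alternating black/white pattern: the worst case for mis-sampling.
line = [0 if i % 2 == 0 else 255 for i in range(1280)]
wrong = sum(a != b for a, b in zip(line, sample_line(line, clock_error=0.001)))
print(wrong, "of", len(line), "pixels picked up the neighbour's value")
```

With a perfectly locked clock the line comes through untouched; even a 0.1% frequency error drifts the sample points into neighbouring pixels partway across the line, which shows up on screen as the noise-like artefacts described above.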

As Marc says, if you don't see a difference then what does the theory matter? If you're not seeing one, you have a good DVD player and display device :).
Well put. :) The work needs to be done in the display devices.
I'm running a 10 metre Lindy from a Momitsu V880 to an Epson TW100 with no problems, and the PQ is jaw-dropping.

Must be one of the lucky ones. :clap:
Ah yes, I remember this very well: shortly after the jaw drops, a load of sick comes out. ;) If you manage to get DVI working properly without any side effects then consider yourself lucky. The other negative side of DVI is that it is 8-bit colour, although that's not really a problem if the display device is natively 8-bit anyway (i.e. it can't be improved).
If you're sourcing from DVD then 8 bits per component shouldn't ever be an issue, although if you're doing a bunch of processing at the player end that tweaks any of the picture params (e.g. brightness, contrast, saturation) then you're asking for problems...

It is very rare for the picture to remain untouched throughout the digital domain; it might not even be possible. Also remember that scaling is usually done at some stage, and it definitely benefits from better than 8-bit colour.
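A quick way to see the bit-depth point: the Python sketch below applies a brightness cut and then restores it, once requantising to 8 bits at every step and once using a 10-bit working depth (the 1.3 factor and the 10-bit depth are arbitrary illustrative choices):

```python
def adjust(levels, gain, bits):
    """Apply a gain and requantise the result to the given bit depth."""
    top = (1 << bits) - 1
    return [min(top, max(0, round(v * gain))) for v in levels]

ramp = list(range(256))  # a clean 8-bit grey ramp

# Cut the level and restore it, requantising to 8 bits at every step.
in_8bit = adjust(adjust(ramp, 1 / 1.3, 8), 1.3, 8)

# The same two operations at a 10-bit working depth, converted back at the end.
wide = adjust(adjust([v * 4 for v in ramp], 1 / 1.3, 10), 1.3, 10)
in_10bit = [min(255, max(0, round(v / 4))) for v in wide]

print(len(set(in_8bit)), "distinct levels survive 8-bit processing")
print(len(set(in_10bit)), "distinct levels survive 10-bit processing")
```

The 8-bit path merges neighbouring levels for good, which shows up as banding on smooth gradients; the wider working depth hands back the full ramp.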
However, the quality of PC DACs is pretty average, so the noise introduced into the VGA output usually still makes the final PQ worse than the bit-constrained DVI output, I'd say.
No, not really; the DACs are fine. The problem, when it exists, has more to do with the overall board design and the limitations of the D15 VGA connector with high-resolution video. Give me 10-bit analogue colour over 8-bit digital any day. Analogue is so much more flexible. Of course if your display is natively digital then most of the time the benefits go out of the window, but not always. I have an LCD screen that pretty much syncs to any resolution and refresh rate like a CRT; seeing it do its stuff at 48 and 50Hz gives me hope for the future. :)
I've compared DVI and VGA from a Radeon 9600 to my NEC HT1000, and the difference is negligible at best.

Ah but that's cheating :D .. the HT1000 is a bloody good projector and obviously has first-class ADCs; not all are so well endowed. :)
Personally I think 'digital' like DVI has much growing up to do yet. At a decent level of kit (the sort used by most here) there are far too many issues with digital; analogue is the stable brother you know, while digital is far more 'flighty' and often gets things wrong, horribly wrong. Give it 2 years for HDMI to 'bed in'. :)
It also depends on how the manufacturers designed their projectors.
Did they make it to deliver a solid component image (the S3 seems to do a wonderful job with component, in fact superior to DVI),
or did they have in mind that DVI was their output of choice?

DVI can wash out colour and on some projectors soften the image. I see this on my H56. The Denon A11 via component is better than my HTPC via DVI, which is a formidable beast, but the image is inferior to the Denon's (not by much).
But on an HT1000 I definitely got a cleaner, less noisy picture via DVI with the Denon.

I think the main advantage of DVI is the lack of picture noise, but the downside is a less filmic image, depending on the PJ.

DVI has no future for consumer PJs; it will move to HDMI, which will benefit from its increased 12 bits, but again more focus from manufacturers is needed to get the best out of these inputs.
Hello Guys,

I must say that I can confirm Gary Lightfoot's conclusion that the differences between DVI and VGA are slight at best. I recently had the pleasure of reviewing the new SIM2 HT300 Xtra extensively and it was EXTREMELY difficult to tell the images apart after hours of going back and forth between VGA and DVI. As an owner of the original HT300, which does not have DVI, I was naturally particularly curious about this difference. What I did notice was that at low IRE values, especially 20 and 30 IRE, the VGA had a kind of noisy interference, like faint ripples or waves going across the screen. The DVI was as clean as a whistle at these levels. This does suggest that DVI could produce the less noisy picture of the two, but I could never actually see this difference when viewing DVDs, including films like Dark City. On the other hand, I felt that VGA had slightly better colour saturation at D65, keeping all the picture adjustments the same. I could easily match the colour by pushing up the colour saturation for DVI slightly. My conclusion is that those of us stuck with VGA can rest easy for the time being.

Best Wishes,

Paul H
