DVI versus analogue

Discussion in 'Projectors, Screens & Video Processors' started by philipb, Mar 31, 2005.

  1. philipb

    philipb
    Active Member

    Joined:
    Mar 6, 2002
    Messages:
    1,986
    Products Owned:
    1
    Products Wanted:
    0
    Trophy Points:
    63
    Location:
    Swindon
    Ratings:
    +132
    Just thought I'd share this, and get some comments hopefully.

    Since installing my Cinemateq and hooking it up to SDI DVD and Sky+, I've been watching via the DVI out to my Pio 43MXE1. All digital and therefore best - certainly the manual suggests this. PQ is excellent. I had to adjust the signal to the Pio - the scaler's XGA output did not fit the screen exactly.

    But being an inveterate fiddler, I wanted to try some of the other resolutions the scaler can output such as doubling, quadrupling, 720p and 1080i. Trouble is the Pio DVI won't take these because it only wants PC not video signals. So I connected scaler to plasma via component and had a play around.

    Input was still SDI, and on Sky+ PQ was good over analogue but no better than DVI, maybe slightly worse. From the DVD however (Denon 2900) the PQ was stunning and probably better than DVI. 720p seemed to give best results but all the resolutions were fantastic.

    I have heard that digital isn't always better than analogue. This might all be my imagination but any comments would be useful, maybe from those who know the foibles of the 43MXE1?
     
  2. Thunder

    Thunder
    Active Member

    Joined:
    Feb 26, 2004
    Messages:
    1,763
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    41
    Ratings:
    +14
    It's true that digital isn't always better than analogue. It all depends on the implementation :) I would say on average, though, that it's harder and generally more expensive to build a high-quality analogue output than a digital one :smashin: especially when working with a digital source, e.g. DVD.
     
  3. ihan

    ihan
    Active Member

    Joined:
    Sep 10, 2004
    Messages:
    624
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    28
    Location:
    Stockport
    Ratings:
    +25
    Hi,

    I prefer the YUV component card or the VGA input of my panasonic TH50PHD7 over the DVI card. Colour banding is more evident with the DVI card. This isn't to say that the DVI or HDMI format is worse, just that Panasonic's implementation is poor.

    Regards,
    Ian
     
  4. Fidelio

    Fidelio
    Active Member

    Joined:
    Jul 26, 2004
    Messages:
    142
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    21
    Location:
    Bristol
    Ratings:
    +5
    That's funny, I prefer the DVI input to the VGA on the HD7 42". This is from a HDP using Wireworld. Much cleaner, less noise and more natural looking.

    Fidelio
     
  5. ihan

    ihan
    Active Member

    Joined:
    Sep 10, 2004
    Messages:
    624
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    28
    Location:
    Stockport
    Ratings:
    +25
    I reckon it's a question of what you personally prefer - I agree with you that the noise is slightly less with the DVI card. With DVI, there appear to be fewer luminance & chrominance levels, leading to additional banding.

    Ian
     
  6. Chris5

    Chris5
    Active Member

    Joined:
    Mar 21, 2004
    Messages:
    1,163
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    63
    Location:
    Herts., uk
    Ratings:
    +122
    Is that because the DVI can resolve more bits (steps/detail) than the analogue, which may have a noise floor greater than 1 bit, I wonder?
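    The bit-depth idea above can be sketched numerically. This is only an illustration of why fewer quantisation levels produce visible banding on a smooth gradient - the 8-bit and 6-bit depths are hypothetical examples, not the actual precision of any DVI or component implementation discussed here.

    ```python
    def quantise(value, bits):
        """Round a 0..1 value to the nearest of 2**bits levels."""
        levels = (1 << bits) - 1
        return round(value * levels) / levels

    ramp = [i / 255 for i in range(256)]  # a smooth luminance gradient

    # Count how many distinct output levels survive quantisation.
    steps_8bit = len({quantise(v, 8) for v in ramp})
    steps_6bit = len({quantise(v, 6) for v in ramp})

    print(steps_8bit)  # 256 distinct levels -> smooth ramp
    print(steps_6bit)  # 64 distinct levels  -> visible bands
    ```

    With only 64 levels, each step in the gradient is four times wider, which is exactly what shows up on screen as banding; analogue noise can actually dither those steps and hide them.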
     
  7. gizlaroc

    gizlaroc
    Well-known Member

    Joined:
    Jul 26, 2001
    Messages:
    8,767
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    136
    Location:
    Norwich
    Ratings:
    +645
    I have a Philips 963 SDI and Sky SDI going into a Crystalio and then back out to the screen with 5xBNC from scaler to screen, and also DVI.
    I actually prefer the 5xBNC to DVI and also to the VGA input: less noise and/or artifacts in the black areas, and the light areas are pretty much the same.

    It all depends on the display. If I had a Pioneer FDE/XDE I would try and use DVI, as the analogue inputs are awful, but the Panasonic's analogue inputs are superb.
     
  8. StooMonster

    StooMonster
    Well-known Member

    Joined:
    Nov 20, 2002
    Messages:
    4,970
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    106
    Location:
    Kent
    Ratings:
    +314
    Or IMO it's more likely that the ISF Calibration settings for greyscale are different on the DVI input to the Component input (the DVI default was probably assumed to be PC), which shows up as banding.

    ISF Calibration made all banding disappear on my plasma, cheers Gordon (although it was long, long ago).

    StooMonster
     
  9. ihan

    ihan
    Active Member

    Joined:
    Sep 10, 2004
    Messages:
    624
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    28
    Location:
    Stockport
    Ratings:
    +25
    Both inputs were set up with a test disc, so different calibration wasn't the issue.

    Ian
     
  10. philipb

    philipb
    Active Member

    Joined:
    Mar 6, 2002
    Messages:
    1,986
    Products Owned:
    1
    Products Wanted:
    0
    Trophy Points:
    63
    Location:
    Swindon
    Ratings:
    +132
    It's a Pio 43MXE1 and the analogue inputs are excellent.
     
  11. Welwynnick

    Welwynnick
    Well-known Member

    Joined:
    Mar 16, 2005
    Messages:
    7,084
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    133
    Location:
    Welwyn, Herts
    Ratings:
    +825
    It's good to hear people questioning the superiority of digital interfaces, and DVI in particular. If it makes you think about what is happening to the signal, it may stop people shovelling anything into the display and hoping it will look after everything.

    Of course, if you have a digital source and display, any analogue conversion is bound to degrade the signal. But that only applies if you compare like-with-like. Analogue interfaces have much more flexibility over the video format than DVI, which is very restricted in the combinations of video format, frame rate, line rate and pixel rate it can carry. The digital interface will often be unable to deliver a native signal to the display. That may stop the scaler taking all the de-interlacing, frame rate conversion, filtering, bug-fixing and scaling away from the source and display, and you only want to do that processing once.

    Worse, most DVD players don't output all the pixels in the frame; most slice a few rows off the edges, so their outputs are rarely the same. Good scalers can live with all of this, but only if their interfaces work at the native formats of the source and display. This can be done with analogue interfaces with customised resolutions and positioning that are essentially unique to a particular source/display combination.

    Otherwise, video processing will be performed in three different boxes and the picture will suffer. (I'll preach about the downsides of DSP in another post.) I don't think you can expect an off-the-shelf set-up to give you optimal performance, and if you're not getting it, you won't go WOW!
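    The "only scale once" argument above can be demonstrated with a toy resampler. This sketch compares one scaling pass (576 samples straight to a hypothetical 768-pixel panel) against two cascaded passes (576 down to 480 in the player, then 480 up to 768 in the panel); the resolutions and the linear-interpolation resampler are illustrative assumptions, not any real player's or panel's processing.

    ```python
    import math

    def resample(signal, new_len):
        """Resample a 1-D signal to new_len samples via linear interpolation."""
        old_len = len(signal)
        out = []
        for i in range(new_len):
            pos = i * (old_len - 1) / (new_len - 1)
            lo = int(pos)
            hi = min(lo + 1, old_len - 1)
            frac = pos - lo
            out.append(signal[lo] * (1 - frac) + signal[hi] * frac)
        return out

    def sine(n):
        """Six cycles of a sine wave sampled at n points - a stand-in for picture detail."""
        return [math.sin(12 * math.pi * i / (n - 1)) for i in range(n)]

    src = sine(576)      # a PAL-height line of detail
    ideal = sine(768)    # the same detail rendered natively at panel resolution

    one_pass = resample(src, 768)                  # scaler converts once
    two_pass = resample(resample(src, 480), 768)   # player and panel both scale

    # Mean absolute error against the ideal: each extra pass adds its own blur.
    err_one = sum(abs(a - b) for a, b in zip(one_pass, ideal)) / 768
    err_two = sum(abs(a - b) for a, b in zip(two_pass, ideal)) / 768
    print(err_one, err_two)
    ```

    The cascaded version always comes out worse here, because every interpolation pass smears detail a little; a single conversion at a scaler that knows both native formats only pays that cost once.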
     
  12. Eddy Boy

    Eddy Boy
    Active Member

    Joined:
    Dec 9, 2004
    Messages:
    937
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    28
    Location:
    London
    Ratings:
    +27
    In the case of DVI or HDMI, if the player does the scaling or de-interlacing, the display only, well, displays (I'm keeping it simple, I know). It's when you feed it an interlaced signal at 480 or 576 lines that the TV then needs to scale the image and in some cases de-interlace it itself. Hence when testing a screen you want to feed it an interlaced signal at 480 lines to see how good it is. Otherwise what you are mostly seeing is your DVD player's ability.
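    What the TV has to do with that interlaced feed can be sketched with two classic de-interlacing strategies: "weave" (recombine the two fields) and "bob" (line-double a single field). The tiny 8-line frame is an arbitrary illustration, not a broadcast format, and real sets use far more sophisticated motion-adaptive versions of these.

    ```python
    # A static 8-line "frame"; each entry stands in for one scan line.
    frame = [[r * 10 + c for c in range(4)] for r in range(8)]
    even_field = frame[0::2]   # lines 0, 2, 4, 6
    odd_field = frame[1::2]    # lines 1, 3, 5, 7

    # Weave: interleave the two fields back into a full frame.
    weave = [None] * 8
    weave[0::2] = even_field
    weave[1::2] = odd_field

    # Bob: repeat each line of one field to fill the frame.
    bob = []
    for line in even_field:
        bob.extend([line, line])

    print(weave == frame)   # True  - weave is perfect for static content
    print(bob == frame)     # False - bob halves the vertical detail
    ```

    This is why de-interlacing quality varies so much between displays: weave combs badly on motion, bob softens static detail, and how well a set decides between them (per pixel, ideally) is exactly what an interlaced test signal exposes.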
     
  13. Welwynnick

    Welwynnick
    Well-known Member

    Joined:
    Mar 16, 2005
    Messages:
    7,084
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    133
    Location:
    Welwyn, Herts
    Ratings:
    +825
    It depends what you want to test the screen for. Do you want to look at the quality of the display (which will be the ultimate limit of performance), or of the video processing as well (in which case it will be difficult to distinguish the performance of the processing from the display)? The answer has probably got to be whatever represents your intended application. No point going gooey over a DLP rear-projection TV with a pixel-perfect feed from an HTPC if you are only going to feed it S-video from Freeview.

    DVI or HDMI don't really enter into this, and in any case the player (even if it does do scaling) is unlikely to output video at the display's native resolution, so further scaling will usually be required. And de-interlacing will always be done by a PJ or flat panel if you feed it interlaced. Which partly explains what you usually see in Currys.

    Best not to worry about 480 lines (NTSC) in this country, though. When we watch PAL TV/DVD on a plasma, it is usually 576 lines scaled down to 480 lines.
     
