VGA to RGB Scart & DigiTV issue

Discussion in 'Desktop & Laptop Computers Forum' started by wyerd, Aug 9, 2004.

  1. wyerd

    I've successfully managed to hook up my HCPC to my Sony KD32DX40 WS TV using the VGA to RGB SCART adapter as described in the idiot's guide and on this forum. I had to tweak the PowerStrip settings to get it to work on my TV. I've set it to 720x540:

    PowerStrip timing parameters:
    720x540=720,77,40,131,540,19,5,60,15101,312

    Generic timing details for 720x540:
    HFP=77 HSW=40 HBP=131 kHz=16 VFP=19 VSW=5 VBP=60 Hz=25

    Linux modeline parameters:
    "720x540" 15.101 720 797 837 968 540 559 564 624 interlace +hsync +vsync

    The image is great, but I had to configure DigiTV to use Line Average de-interlacing, otherwise the picture is all jaggy. The trouble is that now I get motion blur. Is this down to the de-interlacing method? Is there another way to improve the picture? It's really noticeable when viewing studio programmes such as Coronation Street and EastEnders (not that I watch them!). I don't appear to have a problem with DVD playback.

    The other thing is that it's buggered up the AV signal coming into AV2, which is where I've now relegated Sky pending its removal. I get a white scanning line at 45 degrees moving right to left. Any ideas?

    Cheers,
    David.
     
  2. MikeTV

    I could be talking out of my hat here, but I think the TV supports NTSC, which may be why 720x540i works. It may also be that if you can get 720x576i working, you may not need to deinterlace (because the TV should do it, I'd have thought). If the DigiTV software is scaling the picture (in order to fit into 540 lines), then the TV's deinterlacing won't work (because the odd and even lines will be jumbled up or merged together instead of being consecutive discrete lines, hence the fuzziness). But I'm really just guessing. If the lines don't match exactly what the TV is expecting, my suggestion won't work, so I'm not even sure it's possible.

    Now, when I use bob deinterlacing and scale the DigiTV picture into a window smaller than 720x576, the picture goes jaggy (well, stripey, actually), which is probably what you are seeing. I wish they'd fix that.

    On film-based material you should be able to select no deinterlacing, and that should work fine.

    Otherwise, I think your only option is to use line averaging for video-based material, which isn't very good for motion, as you have observed.
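
    Just to put some numbers on "isn't very good for motion", here's a toy Python sketch - nothing to do with DigiTV's internals, purely an illustration of the idea - showing why blending two fields smears a moving object, while bob keeps it sharp:

        # Two fields of the same scene, captured 20 ms apart: a bright bar that
        # has moved from column 2 to column 5 between fields. One image row each.
        width = 8
        even_field_row = [1.0 if x == 2 else 0.0 for x in range(width)]  # earlier field
        odd_field_row  = [1.0 if x == 5 else 0.0 for x in range(width)]  # later field

        # Line averaging blends vertically adjacent lines, which come from the two
        # different fields, so the bar appears half-bright in BOTH positions.
        line_avg_row = [(a + b) / 2 for a, b in zip(even_field_row, odd_field_row)]

        # Bob builds each output frame from a single field (line-doubled), so the
        # bar stays sharp in one position, at the cost of vertical resolution.
        bob_row = list(even_field_row)

        print("line average:", line_avg_row)  # 0.5 at columns 2 and 5 -> motion smear
        print("bob         :", bob_row)       # 1.0 at column 2 only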
     
  3. groovyclam

    Just to back Mike up really.

    I find that I have to use a vertical resolution of 576 in order to watch video material ( e.g. Big Brother ) without the stripes.

    Try to get 720x576 ( or 1024x576 if it's a widescreen TV ) working on your TV, then use BOB deinterlacing in DigiTV and all will be well.

    See this thread http://www.avforums.com/frame.html?http://www.avforums.com/forums/showthread.php?t=136811&page=3
    for tips on how to get the resolutions working.

    And don't use version 3.52 of PowerStrip ( the above thread explains why ).
     
  4. wyerd

    Thanks for the info guys.

    I did try the 720x576 timings as described at http://ryoandr.free.fr/english.html but I couldn't get a decent image - the picture went all bendy and got lost at the top of the screen. I also tried 960x540, but the picture was all over the place. I suppose it's just a matter of tweaking and, with luck, you'll get a picture.

    I'm using v3.47
     
  5. wyerd

    I've managed to get 960x540 working

    PowerStrip timing parameters:
    960x540=960,96,56,176,540,21,9,58,20222,312

    Generic timing details for 960x540:
    HFP=96 HSW=56 HBP=176 kHz=16 VFP=21 VSW=9 VBP=58 Hz=25

    Linux modeline parameters:
    "960x540" 20.222 960 1056 1112 1288 540 561 570 628 interlace +hsync +vsync

    I did try 1024x576, but there were borders on the sides and overscan at the top and bottom that I couldn't get rid of.

    I still get a bit of motion blur on video content via DigiTV when using line average de-interlacing. When using Bob, I get a sort of dark wave going through the picture. Any ideas?
     
  6. groovyclam

    You will have to use a native 576 vertical resolution to cure it.

    The only other way is to not use the Nebula DigiTV software to play recordings - DScaler has lots of options for better video output and you may be able to configure it to play VIDEO recordings nicely at 540 vertical resolution.

    This doesn't help if you are watching TV live on the Nebula since you have to use the DigiTV software for that - in which case you will have to suffer with what you have or switch to the 1024x576 res.

    A little overscan is usual at 576 - that is what happens with a CRT TV's "normal" broadcast signals.
     
  7. wyerd

    Okay - got 1024x576 working, sort of!

    PowerStrip timing parameters:
    1024x576=1024,120,48,208,576,-1,5,39,21665,312

    Generic timing details for 1024x576:
    HFP=120 HSW=48 HBP=208 kHz=15 VFP=0 VSW=5 VBP=39 Hz=25

    Linux modeline parameters:
    "1024x576" 21.665 1024 1144 1192 1400 576 576 581 619 interlace +hsync +vsync

    Video now looks great using Bob, but I'm getting a slight flicker of about 10 pixels at the top, which is really noticeable when there are vertical lines in the picture, i.e. doors, pillars etc. If I can get that fixed the picture will be perfect. Any ideas?

    Thanks for your help Groovyclam.
     
  8. groovyclam

    Not sure what you mean about flicker - I need a better description really. Is it only when watching the Nebula?

    Is it there on your static desktop as well? Try a desktop wallpaper with a vertical line pattern ( like the doorframes you mention ), at high contrast like black and white lines.

    If it's there on your desktop it will be your PowerStrip settings stressing your TV, and you will need to adjust them or risk mucking your tube up.

    If it isn't on the desktop, is it only when playing video material via DigiTV? Try playing a DVD full screen and see if it happens with that.
     
  9. wyerd

    It's on the desktop and the Nebula. Back to fiddling with PowerStrip! (Which bits?)
     
  10. groovyclam

    Start with the 1024x576 you have and then open display configuration in PowerStrip. Click "advanced options".

    Type 25 directly into the vertical frequency box and then tab out.

    Now tick the "Lock frequencies" box.

    Now use the two sets of four arrow buttons on the left to alter the size and position of your desktop.

    The left four are the position, the right four are the size.

    Make sure you have backed up your powerstrip.ini first (see the sketch just below for one way to do that); you can press ESC and start again if you get an unviewable desktop.

    When you have it as good as you can get it, save the config as your new 1024x576.
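
    Since PowerStrip keeps these settings in powerstrip.ini, the backup step is worth automating. Here's a minimal Python sketch; the install path below is only an assumption - point it at wherever your copy of PowerStrip actually keeps its ini file.

        # Back up powerstrip.ini with a timestamp before experimenting with timings.
        import shutil
        import time
        from pathlib import Path

        # Assumed location - adjust to your own PowerStrip install folder.
        ini_path = Path(r"C:\Program Files\PowerStrip\powerstrip.ini")
        backup = ini_path.with_name("powerstrip-" + time.strftime("%Y%m%d-%H%M%S") + ".ini.bak")

        shutil.copy2(ini_path, backup)
        print("Backed up", ini_path, "->", backup)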
     
  11. wyerd

    Ahhh... ticking that box makes the whole configuration a lot easier.

    Well, the best I can come up with now is:

    PowerStrip timing parameters:
    1024x576=1024,101,48,195,576,1,5,41,21307,312

    Generic timing details for 1024x576:
    HFP=101 HSW=48 HBP=195 kHz=16 VFP=1 VSW=5 VBP=41 Hz=25

    Linux modeline parameters:
    "1024x576" 21.307 1024 1125 1173 1368 576 577 582 623 interlace +hsync +vsync

    It's a bit wavy in Nebula when using Bob, but DVD/video playback using Zoom Player is perfect.

    I'm still trying to get my head around this de/interlace thing. I understand the basics, but why are the PowerStrip timings set to interlaced and Nebula set to deinterlace, if the output goes to the TV, which is an interlaced device?

    Thanks again for your help.
     
  12. groovyclam

    As far as I understand it...

    The Nebula *has to* deinterlace because it is going to present the picture to Windows on your HTPC, which it assumes is, at the end of the day, a "normal" computer setup producing a progressive-scan signal that normally gets sent to a non-interlaced monitor.

    This would not be a problem if you were going to squirt your desktop to a progressive device ( plasma, LCD, projector ), but you ( and I, and others ) are choosing to squirt it to our CRT TVs ( a 25Hz interlace device ).

    So we use PowerStrip to take the output and convert it from a progressive picture at the operating system level into a 25Hz interlaced signal, which leaves via the VGA port, travels along the adapted SCART lead and goes into the telly.

    At least I think that is the order of events, someone correct me if there is a better explanation.
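
    For anyone who wants the standard numbers behind "a 25Hz interlace device", this tiny Python sketch is just the usual PAL arithmetic - nothing specific to PowerStrip or the Nebula:

        # PAL: 625 lines per frame, scanned as two fields of 312.5 lines each.
        lines_per_frame = 625
        frames_per_sec = 25

        fields_per_sec = frames_per_sec * 2                        # 50 Hz field rate
        lines_per_field = lines_per_frame / 2                      # 312.5 (hence the 312 vertical total in PowerStrip)
        line_rate_khz = lines_per_frame * frames_per_sec / 1000    # 15.625 kHz

        print(fields_per_sec, "fields/s,", lines_per_field, "lines/field,", line_rate_khz, "kHz")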
     
  13. jwexqm

    I have also noticed that the Nebula software is pretty awful when using direct VGA to RGB SCART.

    My guess is that the DigiTV software was written with progressive output devices in mind. This makes sense on many levels because the vast majority of PCs are hooked-up to progressive displays (modern analogue VGA, TFTs etc).

    Ideally the Nebula would not deinterlace video when presenting over an RGB SCART connection. It would not resize the frame vertically and would present each field on alternating scanlines of the 576-line display mode. It would also be synchronised to the VGA card's scanning clock and be field-order aware. I suspect that getting all this right isn't a trivial software problem - particularly in Windows.

    As mentioned already in this thread, the jaggies are an artefact experienced when an arbitrary vertical resize is performed on interlaced video without a deinterlacing algorithm. Really a 1:1 relationship between video frame lines and field scan lines on the output device needs to be preserved.
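
    Here's a toy Python sketch of that point (purely illustrative - it isn't how DigiTV does its scaling): resize 576 interlaced lines down to 540 and count how many output scanlines end up carrying a line from the wrong field.

        # Nearest-neighbour resize of a woven (interlaced) frame from 576 to 540 lines.
        src_lines, dst_lines = 576, 540
        wrong = 0
        for n in range(dst_lines):
            # nearest source line for output line n
            src = round(n * (src_lines - 1) / (dst_lines - 1))
            # output line n is displayed in field (n % 2); the source line came from field (src % 2)
            if src % 2 != n % 2:
                wrong += 1
        print(wrong, "of", dst_lines, "output scanlines carry a line from the wrong field")
        # roughly half of them - which is why the picture goes stripey without a deinterlacer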

    DigiTV judders so badly when using the 25Hz (50 fields/sec interlaced) display mode because it's not attempting to synchronise to the VGA card's frame scanning rate. Drift between the clock references then results in temporal jitter and smooth video is lost. The PC has always been bad at this sort of thing. It was something the Amiga had licked right from day one (hence its strong video heritage).

    One piece of software, called ReClock, goes a long way to solving this issue on the PC. It's a DirectShow component and works by synchronising the timing reference used by the player software to the actual presentation frequency of the graphics card. When configured correctly, ReClock ensures that no frames are dropped and so video plays smoothly.
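
    The judder arithmetic is worth spelling out. Here's a quick Python sketch; the 50.05 Hz figure is just an example of a slightly-off VGA clock, not a measured value:

        # If the broadcast delivers exactly 50.000 fields/s but the graphics card
        # scans out at, say, 50.05 Hz, the two clocks drift apart by a whole field
        # every few seconds and something has to be dropped or repeated.
        source_fields_per_sec = 50.000
        display_fields_per_sec = 50.05   # example only

        drift = abs(display_fields_per_sec - source_fields_per_sec)
        print(f"one dropped/repeated field every {1.0 / drift:.0f} seconds")   # ~20 s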

    You can see the effect of ReClock on playback of some MPEG2 files saved from DigiTV. You need an MPEG2 DirectShow filter that can be configured *not* to deinterlace (e.g. Elecard). Wire it to VMR9 (I had trouble with overlay) and finally make sure Zoom Player isn't doing any kind of vertical resize operation on the video (e.g. aspect distortion). If you run at 720x540 then the Zoom Player preset should be 576 lines high with something like 18 lines of overscan clipping at the top and bottom.

    Record some video footage with plenty of field motion from DigiTV (sports and scrolling news tickers are good) and play it with this setup. Note that when you hit 'play' you get a 50% chance that playback starts with the correct field-order presentation; if it's wrong you'll see rapid backwards/forwards judder in motion scenes. In that case either cycle play/pause until it's correct or nudge the frame up or down by one scanline.

    When this works it really does look every bit as good as a DVB set-top box would on your TV and you immediately start wanting something that looks this good all the time.

    I would imagine that Nebula could re-code DigiTV using DirectShow, possibly licensing ReClock or designing something similar themselves.

    Another option would be to support some kind of DScaler interface where high-quality (and CPU-intensive) deinterlacing and field-interpolation algorithms bridge the gap between incoherent clock frequencies.

    It's also possible that even ReClock may not be suitable for live stream viewing, because there's also the broadcaster's clock to worry about. I noticed on a forum that someone was wondering if it would be possible to dynamically modulate the VGA card's clocks to sync to the incoming video. This is probably the Rolls-Royce solution, though.

    I also noticed that DigiTV doesn't handle 50Hz progressive modes well either. When I tried it on a projector that used a 50Hz mode it stuttered badly. In reality 50Hz is the field rate of PAL, so theoretically there should be no stutter. Interestingly, the ShowShifter DVB software seemed to perform differently - not perfectly, but certainly better.

    DigiTV works best at higher 'desktop' resolutions. Judder in these modes isn't such an issue because the time gap between frames being scanned out by the VGA controller is shorter.
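
    A rough way to see why (plain Python, example refresh rates only):

        # The higher the refresh rate, the shorter the gap between scan-outs,
        # so a mistimed frame is displayed late by a smaller amount.
        for refresh_hz in (25, 50, 60, 85):
            print(f"{refresh_hz} Hz -> one scan-out every {1000 / refresh_hz:.1f} ms")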

    It'd be great if Nebula could address this. They've added a lot of fancy features to DigiTV, but for me this fix is the most important.

    James
     
  14. groovyclam

    Thanks for a very informative post James.

    I shall investigate ReClock immediately.

    Surely, as another solution, we should be petitioning the DScaler developers to support DVB hardware directly in future versions, rather than hoping the Nebula team eventually get around to better decoding.
     
  15. MikeTV

    Yes. Terrific reply, James!

    I agree that DScaler support for the Nebula would be the icing on the cake.

    I've tried running DigiTV at 640x480 @ 60Hz, and it had weird interlace-related artefacts (even though I had bob deinterlacing selected). The results were really quite poor, even though it looks great at higher resolutions. My feeling now is that the DigiTV scaling algorithm is wrong somehow, but that it isn't noticeable at higher resolutions. That doesn't quite add up either, though, because I'd imagine it's the graphics card doing the resizing, not DigiTV. So it is strange, in any case.
     
  16. wyerd

    Thanks for the in-depth explanation, James.

    Has anyone emailed Nebula about this? Perhaps they could modify the s/w somehow?
     
  17. groovyclam

    The Nebula team have a new BOB decoding algorithm in their longer-term development roadmap, and also a move to use DirectX instead of their bespoke graphics I/O, so I suppose ReClock could be used with it then.

    Their development roadmap is here:

    http://www.nebula-electronics.com/beta/roadmap.asp

    We just have to sit and wait to see what they come up with and how long it takes them.

    I would imagine that if the DScaler collective added DVB hardware support to DScaler, you couldn't get better output.

    Maybe it would be worth searching the DScaler forums and posting a query about it there.
     
