
Dell 2405 and DVI problems

Discussion in 'Desktop & Laptop Computers Forum' started by fraggle, Jun 23, 2005.

  1. fraggle (Member, Milton Keynes)

    I've just got one of Dell's 24" TFT monitors, 1920x1200 res. :)

    Graphics card in the PC is an Inno3D GeForce FX 5900 XT which has VGA and DVI outputs.

    Connecting with VGA there are no problems; everything works great, apart from text not being as sharp/clean as DVI.

    Connecting with DVI there's a very nice improvement in quality. *But* there's a really annoying problem, which is hard to describe.

    On, say, a dark grey background with a dialog box, the right (and sometimes left) sides of the dialog box will slightly "tear" by maybe 5 pixels. If you move another dialog box around on top of the first one, the "tearing" moves around. The tearing sparkles all the time too, like TV static, whether you're doing anything or not.

    Another way this problem shows is when you have slight changes in colour in an image: say you've got the standard Microsoft "Teletubbies" hill wallpaper set, open a window and move it around, and you can see bright cyan 'lines' coming and going and sparkling in the sky and clouds of the wallpaper.

    Also text looks cruddy sometimes.

    The effect seems to be triggered when there's a solid horizontal line that's a different colour to the rest of the background; to the right of this line it's as if the pixel clock loses lock and the line shifts rapidly left/right.

    I'm aware that 1920x1200 is actually beyond the spec of a single DVI-D link, and they need to tweak the timing settings, etc., to fit the extra res in. But could this be down to a cheapo graphics card? Poor quality DVI cable? Monitor problem? Graphics card problem?
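
    To put rough numbers on that: a single DVI link tops out at a 165MHz pixel clock, and whether 1920x1200@60 fits depends entirely on how much blanking the timings carry. Here's a quick Python sketch; the pixel clocks are the commonly quoted VESA figures, so treat them as approximate:

        # Single-link DVI sanity check: compare each mode's pixel clock
        # against the 165MHz TMDS ceiling of one link.
        SINGLE_LINK_MAX_MHZ = 165.0

        modes = {
            "1600x1200@60 (standard blanking)":     162.0,   # the classic single-link maximum
            "1920x1200@60 (standard CVT blanking)": 193.25,  # over the limit
            "1920x1200@60 (CVT reduced blanking)":  154.0,   # squeezes under the limit
        }

        for mode, clock_mhz in modes.items():
            verdict = "fits on one link" if clock_mhz <= SINGLE_LINK_MAX_MHZ else "needs dual link"
            print(f"{mode}: {clock_mhz}MHz -> {verdict}")

    So the "tweaked timings" are just reduced blanking: the same pixels with less dead time between lines, giving a lower pixel clock.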

    I'm going to have a play with PowerStrip and see if that can improve things, but there are so many timing settings it could take me ages :-(

    Anyone close by got a GeForce (or ATI) card with DVI output I could borrow for half a day or so?

    :confused:
     
  2. Kopite4Ever (Member, Liverpool)

    Sounds like a poor quality cable to me. Obviously it could be any of the components, but every time I've seen a problem with "sparklies" it's been a cable problem, whether it be too long or really bad quality. It's not expensive for a decent dual-link DVI-D cable. I want one of those monitors badly :(
     
  3. fraggle (Member, Milton Keynes)

    I used the cable that came with it. It's a single-link DVI cable.

    Looking at Aria/Scan, dual-link Belkin 'Pro' DVI cables are about £20 delivered, so I thought they'd be maybe £30 from PC World; went down there and they're £50 :eek: :censored:

    I'm trying to scrounge another card to test with and I'll order a decent DVI cable in a bit.

    What's a decent make of dual-link DVI cable? I don't recall Belkin being known as a quality cable manufacturer.
     
  4. Kramer (Guest)

    I initially used the supplied DVI cable with my 2405 & had no such issues/problems.

    :smoke:
     
  5. Mr.D (Well-known Member)

    I read a review of this display where they had what sounds like the same problem. It turned out to be the driver for the graphics card. Is yours the latest driver?
     
  6. fraggle (Member, Milton Keynes)

    Latest drivers as of yesterday.
     
  7. Yaka (Active Member)

    Tried resetting the screen to factory defaults?
     
  8. fraggle (Member, Milton Keynes)

    Yup, tried that, same problems.

    I thought it might be fixed by more recent firmware (BIOS) on the GeForce card, so I found a site listing all the revisions, grabbed the latest one they had and flashed it. No difference on the DVI display, but the analogue D-SUB displays a little bit clearer :)

    Also tried a new dual-link DVI-D cable, no difference.

    Got to check for mobo BIOS updates, have a physical look at the card, and make sure it's seated properly and its power connector is connected OK.

    Then try another GFX card and if no joy contact Dell about the monitor :(

    Anyway, thanks for the suggestions, appreciated! :)
     
  9. fraggle (Member, Milton Keynes)

    Oh, one thing I have noticed is that if I set the number of colours to 16-bit (as opposed to 32-bit) the problem gets better. Similarly when I used PowerStrip to drop the refresh rate down to 48Hz (though the monitor then refused to show the DVI input once I'd switched away from it and back again). Dropping the resolution helps too.

    Which leads me to believe that something can't cope with the maximum data bandwidth too well, and I'm hoping it's the GFX card because that's cheaper than the monitor!
     
  10. Mr.D (Well-known Member)

    It does sound like the graphics card to be honest.
     
  11. Kramer (Guest)

    The 2405 does deserve a good (expensive!) graphics card anyway like a 6800GT etc. :thumbsup:

    :smoke:
     
  12. PinkPig (Standard Member)

    The 2405fpw usually works fine with a normal DVI connection, then?

    It's just that I've been reading things like, "Single Link DVI supports a maximum bandwidth of 165 MHz (1920x1080 at 60 Hz, 1280x1024 at 85Hz)". The 1920x1200 resolution of the Dell 2405 seems above that - and surely it runs at 60Hz normally?

    Why isn't a dual link DVI cable required? Have Dell found some magic solution for getting around the bandwidth limit?
     
  13. stlic (Active Member)

    The actual DVI specification tops out at 1600x1200/60 on a single link. A couple of years back it became clear that there was demand for 1920x1200 displays, and so many manufacturers increased the spec on their cards to support this resolution even though it isn't officially supported. The Dell, like any other DVI 23/24" screen, will work at native res with the right card, which in the past meant higher-end models only but is now pretty much across the board. For example, I know of some 9200s that do not support 1920x1200 over DVI, but the 9800 had no problem.
     
  14. Son of Shaft (Standard Member, Breda)

    Don't have the screen myself, but I'm interested. I read that when you enable reduced blanking you'll be within the DVI bandwidth spec. Don't know if reduced blanking is a menu option on the screen or a driver option of the VGA card.
     
  15. fraggle (Member, Milton Keynes)

    The card I have automatically sets up the reduced blanking so the signal is within the 165MHz bandwidth limit, but it still has problems.

    I'm just assuming it's because it's a very old card, and maybe a cheap brand (Inno3D), so when it's operating very close to its maximum specs it's just pushed too far. Could be a faulty card of course, but too late now; it's not something I would ever have found out had I not got this nice DVI monitor.

    I've been reading up a lot about high res and GeForce 6800 cards and DVI monitors.

    The data is sent over DVI 'links'. Each DVI socket can carry one or two 'links', and some GFX cards have two DVI sockets. So if the manufacturer wanted, they could give you two physical sockets with two DVI links in each, allowing you to drive two very high res monitors.

    Each link can drive up to a 165MHz pixel clock, or approx 1600x1200@60Hz@32bpp max res. By tweaking the timing settings this can be pushed up to 1920x1200@60Hz@32bpp, but the DVI chip is then operating absolutely flat out.

    Normal 6800 cards have one socket, which has a single DVI link in it.

    Some 6800 cards have two DVI sockets, each with a single link in them.

    A very few have two DVI sockets, one with a single link and the other with a dual link, and only those are able to drive monitors like the Apple 30" monster.

    To confuse matters further, Silicon Image, one of the companies that makes the chips that encode the DVI link signal, has produced a new version of its chip that raises the 165MHz limit up to 225MHz. If a transmitter and receiver pair of these were used it'd be perfectly possible to run 1920x1200@60Hz with absolutely no problem, but no one's heard of the receiver chip being used in any monitors yet. Just using the transmitter chip on a GFX card does help though: the Asus LE9999 6800GE uses this enhanced chip, and people think the Asus LE9999 6800GT 256MB card does too (but not the older 128MB version, just the newer 256MB card). The 6800GE definitely works with the Apple 30" monitor (though one user has reported minor problems).

    The new 7xxx GeForce cards supposedly do dual-link on a single DVI socket too.
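
    As a minimal sketch of the link arithmetic (assuming the usual 165MHz-per-link ceiling; the 2560x1600 clock is the commonly quoted reduced-blanking figure):

        import math

        def links_needed(pixel_clock_mhz: float, per_link_max_mhz: float = 165.0) -> int:
            """Dual-link DVI alternates pixels across two TMDS links,
            so each link only has to carry half the pixel rate."""
            return math.ceil(pixel_clock_mhz / per_link_max_mhz)

        print(links_needed(154.0))   # 1920x1200@60 reduced blanking -> 1 link (the 2405)
        print(links_needed(268.5))   # 2560x1600@60 reduced blanking -> 2 links (Apple 30")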

    Anyway, as far as I'm concerned I'm just after a decent 6800GT card, preferably with dual DVI sockets on it. If I ever get a second TFT monitor I don't want to have to buy another card to drive it using DVI! :)

    Decent 6800GT card... I'd like the Asus, but the GPU and memory clock speeds aren't very fast (325 & 700) and it's only got single DVI :(
     
  16. Son of Shaft (Standard Member, Breda)

    I've read that the 6x00 nVidia chips have one analogue and one digital transmitter on die, and that the digital transmitter isn't of very good quality. Because of that, people with higher-res LCD panels need to use cards with two DVI outputs, since the second DVI output uses a separate transmitter and is of better quality.
     
  17. stlic (Active Member)

    Don't think so.
     
  18. fraggle (Member, Milton Keynes)

    From what I've read, the part about the on-die DVI transmitter being poor quality is true. It just can't handle operating at or near its max frequency (i.e. high res).

    If you look at the reference design for the 68xx boards, they've got space for another two Silicon Image TMDS transmitters, and apparently the front one is nearly always populated to get over this problem; the on-die one just isn't used. On dual-socket boards there's an external one on both the front and the back of the board.

    So if you see someone in a shop with a 6800GT card in their hands peering all over it, it'll be me trying to see how many TMDS chips there are :)
     
  19. stlic (Active Member)

    I would be surprised if a 6800GT card, even one with single DVI and D-Sub, were unable to drive 1920x1200/60 unless it's faulty. I guess they could be out there though.
     
  20. Son of Shaft (Standard Member, Breda)

    Well, there are people who get shimmering and shadows with the on-die transmitter at 1600x1200. So, as fraggle said, you have to check the number of TMDS transmitters if you want to use DVI at higher res.
     
  21. fraggle (Member, Milton Keynes)

    Sorted the problem now.

    Got a new XFX GeForce 6800GT, two DVI outputs (single link on each output)

    Took a lot of fiddling around to get it to work though, due to me having 3GB of RAM and using the '/3GB' switch in boot.ini.

    With that switch present, nine times out of ten when I booted the machine the nVidia driver didn't load and I got a bog-standard VGA adapter at 1280x1024!

    Tried all sorts of different settings (BIOS and Windows), ForceWare driver versions, etc., but nothing worked. I eventually tried taking out the '/3GB' switch and it's now working very nicely (apart from Windows programs only being able to use 2GB now rather than 3GB...).

    I notice that setting the AGP aperture size to 512MB reduced the 3GB of memory to 2.5GB on the BIOS boot screen and in the Windows machine details. So I presume the 6800GT's I/O area is mapped somewhere in the 2GB -> 3GB address range, and with Windows told to make that range available to programs, it couldn't lock down the AGP aperture and GFX card memory regions, so the driver couldn't work. Letting Windows do its normal thing (reserving the 2GB -> 3GB region for its own exclusive use), I assume it can map the AGP aperture and the 6800GT's memory-mapped I/O into that region no problem...
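
    A toy illustration of that theory (the sizes are assumptions for the sake of the sums, and real kernel address-space accounting is far more involved than this):

        MB = 1024 * 1024

        def kernel_va_left(use_3gb_switch: bool) -> int:
            """32-bit Windows splits the 4GB virtual address space between
            user programs and the kernel; /3GB moves the split from 2GB/2GB
            to 3GB/1GB, shrinking the space drivers can map things into."""
            kernel_share = (1024 if use_3gb_switch else 2048) * MB
            agp_aperture = 512 * MB   # aperture size set in the BIOS
            vram_window  = 256 * MB   # assumed 6800GT frame buffer mapping
            return kernel_share - (agp_aperture + vram_window)

        print(f"without /3GB: {kernel_va_left(False) // MB}MB of kernel space left")  # 1280MB
        print(f"with /3GB:    {kernel_va_left(True) // MB}MB of kernel space left")   # 256MB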

    Computers, don't ya love em?!?!

    (I haven't dared plug the original GFX card back in to see if that was the cause of the original problem - the 6800GT works v nicely with HL2 and Doom3 at high res :) )
     
  22. VanAsh (Guest)

    I know this question has been asked a billion times, I'm sure, so sorry to be asking again. I'm building a new gaming system and have just ordered the Dell 24", and I'm trying to decide what graphics card to buy so I can run games and other apps over DVI at 1920x1200. I would like to spend under £300 (around £250) if possible. Does anyone have any advice or experience, and could you recommend the best graphics card for the job, or at least a few different cards that would be capable of it? My new PC supports PCI-E, so I cannot use any AGP cards.

    Thank you in advance.....
     
  23. Kopite4Ever (Member, Liverpool)

    If you want the full shebang at that res, you need the new 7800 GFX cards. I'm talking high details and AF, because at that res you don't need AA. Seeing as that's well above your budget, the minimum you want is an nVidia 6800 Ultra or ATi X850.
     
  24. VanAsh (Guest)

    Thanks for the very quick reply....
    Sorry for being ignorant, but what are AF and AA? Are there any particular manufacturers of the 6800 cards you would recommend?
     
  25. Kopite4Ever (Member, Liverpool)

    Don't be offended, but seeing as you don't know, I'll try to keep it in layman's terms.

    AA stands for anti-aliasing, also known as full-screen AA (FSAA). Basically what it does is smooth out 'jagged' edges on shapes, textures, objects, etc. The higher the res you go, the less likely these 'jaggies' are to appear, because you're using a lot more pixels on the screen, so they're a lot less noticeable; but higher res comes at a price in performance. Technically speaking, it blends the surrounding colours or greys around these graphical deficiencies to create a smooth appearance. You will not really see jaggies at 1920x1200, I can assure you of that.
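
    As a toy illustration of that blending (this is just supersampling in miniature, not what the card's driver literally runs):

        # Render at 2x resolution, then average each 2x2 block down to one
        # output pixel; hard edges pick up intermediate greys.
        def downsample_2x(hi_res):
            h, w = len(hi_res), len(hi_res[0])
            return [[(hi_res[y][x] + hi_res[y][x + 1] +
                      hi_res[y + 1][x] + hi_res[y + 1][x + 1]) / 4.0
                     for x in range(0, w, 2)]
                    for y in range(0, h, 2)]

        # A hard black/white diagonal edge at the high resolution...
        hi = [[0, 0, 0, 1],
              [0, 0, 1, 1],
              [0, 1, 1, 1],
              [1, 1, 1, 1]]

        # ...comes out with blended greys where the 'jaggy' was:
        print(downsample_2x(hi))   # [[0.0, 0.75], [0.75, 1.0]]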

    AF stands for anisotropic filtering. Basically this improves perceived image quality on textures: the higher the setting, the sharper and more detailed the textures it renders. As more texture samples are filtered, the image quality improves. Again this comes at a performance price, but not as much as FSAA.

    As for manufacturers, there isn't much in them at all. Some cards come pre-overclocked, but you can't really go wrong with any of the top brands.

    Hope this all helps.
     
  26. VanAsh (Guest)

    Really excellent help, thank you very much. I will let you know how I get on and post some pics once I have the system set up. :)
     
  27. Kopite4Ever (Member, Liverpool)

    No probs, glad to help. Good luck with it.
     
  28. Son of Shaft (Standard Member, Breda)
  29. fraggle (Member, Milton Keynes)

    Well, I can run Half-Life 2 at 1920x1200 with everything at full settings (actually I put AA at 4x because it looks no better at 6x; I think the 6800s only do 2x, 4x and 8x, so I suspect 6x is doing nothing).

    I can run Doom3 at 1600x1200, but reducing it to 1280x1024 raises the frame rate to very nicely playable. Quality is set one down from maximum, with a few things turned up more in the advanced menu.

    So I'd disagree: the 6800GT plays HL2 and Doom3 very well at high res; a 7800 card is overkill (and an absolute fortune!).

    AA stands for anti-aliasing (smooths out jagged lines and edges of things that are not perfectly horizontal or vertical)

    AF stands for Anisotropic Filtering.

    See here for a good explanation of both :-
    http://www.tweaktown.com/document.php?dType=review&dId=601

    If you've got the money and can afford a 7800, go for it. The 6800 Ultra only gives you a 10% increase in memory and graphics processor speed, so in my opinion it's not worth the extra money over a 6800GT (many of which can be safely overclocked to Ultra speeds anyway).

    The only one I can recommend is the XFX 6800GT. It comes in PCI-E flavours too, and you can get it with twin DVI outputs or single DVI plus VGA (both with a TV output as well). There's a retail pack including three games, one of them Doom3, selling fairly cheap at the moment.
     
  30. Mr.D (Well-known Member)

    I got a really nice X850XT PE (off the classifieds for a song). It's the AGP version, although the differences from the PCI-E flavour are minor.

    Can do 1920x1200 in Far Cry with most settings on high/medium, but it gets a little choppy when there are a lot of transparent objects in the scene. (Looks frickin' great though... the bad guys are slimy with bump mapping; never noticed that before.)

    The HL2 engine seems to like this card and I can max everything out. (CS:S seems to offer a large advantage at high resolutions... head shots, even on the run, seem a lot easier to pull off.)

    BF2 (ironically the main reason I bought the card): 1280x960 with most things on high or medium... any higher and I get stutters and terrible lag :rolleyes:

    Doom3 (can't be bothered firing it up again to be honest; once through was enough, in spite of the lush visuals).
     
