Interesting info (ATI v nVidia) regarding Half Life 2.

Discussion in 'PC Gaming & Rigs' started by Kramer, Sep 11, 2003.

  1. Kramer (Guest)
  2. russraff (Active Member)
    So if you have any Nvidia hardware, and want to run Half Life 2, you will need to run the game without DirectX 9 optimisations. Otherwise it'll run at sub 30fps.

    Or buy a superior ATI card, one of the two.

    Russell
     
  3. Kramer (Guest)
    So it's a big :smashin: for ATI hardware then :)

    Lucky I've a 9800 Pro here ready & waiting... Looking forward to the release - it's been a long time coming.
     
  4. gingercat (Active Member)
  5. Mr.The.Spoon (Active Member)
    I'm a Counter-Strike freak, so it'll be a weird experience playing HL2 and then going back to old-graphics Counter-Strike.
    Wonder how long it'll be before a Half Life 2 version of CS is available....
     
  6. HMHB (Distinguished Member)
    Very interesting to say the least.
    The only thing that puts me off ATI graphics cards is the drivers. One of my PCs has an ATI card and the others all have nVidia (none of the nVidia cards are the latest generation) - the drivers load fine on the nVidia ones but I have nothing but trouble with the computer with the Radeon 9000. Whatever I do, it always has 2 unknown devices (which I assume is the VIVO bit). So I was leaning towards the FX5900 Ultra - but now I'm not so sure :confused:
     
  7. Kramer (Guest)
    The Radeons always install 2 display adaptors, Primary & Secondary (DVI & VGA).

    They should show up as Radeon 9000/9500/9700/9800/Pro etc...

    No problems with drivers here for me. Up to Cat 3.7 now, they are releasing drivers relatively often.

    I think you'll be OK with a 9800 Pro. I have 2 & they're grrrrrrrrrrrrrrreat :smashin:
     
  8. betamac (Guest)
    Apparently the beta 50.00 drivers which Nvidia gave to Valve offer huge improvements for the FX cards in Half Life 2, but for some reason Valve decided to use the old drivers for the comparison?

    Nvidia was not very happy at all.
     
  9. GrahamC (Active Member)
    Every new game published is marketed with the 'pushes the boundaries of gaming' tag attached to it, but I think that this time it might just be true... :clap:
     
  10. grey torq (Active Member)
    If you follow the reports at Tom's Hardware, this might not just be HL2 - it could be the case for all next-gen DirectX 9 games like Doom 3, Deus Ex 2 etc.

    There is still debate as to whether the 50+ drivers really sort out these problems, as the design of the Nvidia cards does not follow the defined DirectX 9 spec and the drivers are software fixes.

    My fundamental problem with all this is that PC gaming could suffer badly and become marginalised on grounds of cost. A new ATI Radeon 9800 Pro costs in the region of £300, and you would probably also need an MB & CPU upgrade costing in the region of £200, so to play the latest killer applications on a PC you need to fork out £500+ unless you already have a high-spec PC.
    Compare this to console prices for the PS2, Gamecube and Xbox and PC users are being ripped off. In the end this could kill the PC games market: £500 to upgrade my PC or £180 for an Xbox - it's a no-brainer.

    Torq
     
  11. russraff (Active Member)
    Well, yes, if you want the latest and greatest you will be paying over the odds. However, the 9600 Pro, at £120, would appear to have good performance. MB and CPU upgrades depend on what you already have, and on whether the game is more reliant on graphics card or CPU power. If a new CPU is required, that may not necessarily mean a 3200+ XP, or a new MB. You could get a 2600+ and may still play the game at acceptable fps. Even if you did need a new MB, it would cost about £260 for the whole lot. Still more expensive than an Xbox, but then I use the PC for more than games, and there are more than a few good games for the PC to start with.

    Russell
     
  12. Sinzer (Well-known Member)
    This has always been the case with PCs: you are always faced with heavy upgrade costs to play the latest games at the latest speeds.

    Suffice it to say that no current console will be able to match the graphics that will be displayed in HL2...... you get what you pay for!
     
  13. grey torq (Active Member)
    Russell,

    I appreciate where you're coming from. I strongly suspect that when these new games are released there will be pressure to cut the prices of the hardware that lets them be played properly, so I'll hold out for a 9800 Pro at the £150 mark.

    Torq
     
  14. KraGorn (Active Member)
    I'm getting awfully bored with the pitiful in-fighting between ATI and nVidia; they've made the idea of a vendor-neutral API, aka DirectX, meaningless and we're back in the good old days of Tseng4000 vs. Hercules vs. ATI vs. Trident vs. ...

    :mad:
     
  15. russraff (Active Member)
    I don't think there is any infighting per se. ATI and nVidia (possibly 3dLabs in the future) are different chipset manufacturers and are in opposition by default. I think that ATI have produced a cracking product in the R3xx chipsets and nVidia are lagging behind performance- and quality-wise. nVidia are responding to this in the only way they can in the absence of new hardware: by issuing a statement saying "don't worry - we'll fix it with software".

    Russell
     
  16. CENSORED (Guest)
    Russell,

    I agree. ATI can't really put a foot wrong at the moment and the 9800 Pro seems to be getting all the plaudits in the latest round of graphics card shootouts, even if it isn't strictly the fastest card out there. Pricing is good and they have sorted out their driver situation, which was arguably holding the cards back from their full potential. There does appear to be a worrying trend, particularly with Nvidia, to address framerate issues by releasing optimised drivers. Time was that a card's settings were 'tweaked' by successive driver updates which generally improved playability over a wide range of games. Now we potentially have a situation where game-specific drivers could become the norm. Isn't that the tail wagging the dog?

    Oh, and Nvidia appear to think it acceptable to drop the colour depth of textures down to 16-bit at certain points in a game to maintain framerate. They argue that as long as image quality is maintained this is not an issue. Well, if I had just shelled out £400 for a 5900 Ultra I guarantee it would be a bloody issue with me. Sounds like an act of desperation.
     
  17. russraff (Active Member)
    Desperate may be a bit OTT, but they are in a bit of a pickle. The way I see it, nVidia make most of their money from home-spec 3D accelerator chipsets (NOT flagship models), with the workstation 3D accelerators and MB chipsets some way behind. Now Valve, the developer of arguably the most anticipated game for some time, has effectively spurned nVidia. That they have also suggested all DirectX 9 games will have similar performance characteristics to Half Life 2 would imply a fair few people will be looking to ATI for their next card. After all, it is only now that Deus Ex: Invisible War, Half Life 2, Doom3 et al can make use of DirectX 9, and a lot of people won't have upgraded their cards until these games are released. Why bother beforehand, when performance was perfectly acceptable with older technology? I know I haven't upgraded for that very reason, and I also know my next card won't be an nVidia one. nVidia will be quite worried about this trend, especially as the affordable ATI 9600 Pro would appear to outstrip the shockingly expensive flagship FX5900 card.
    Let's look at something else. The Xbox currently uses an nVidia GPU, but Xbox2 won't, as ATI is supplying the new GPU. Again, more money lost to ATI. Long-term vision, yes, but this vision still matters to nVidia, who will need a plan, now, to make up this future deficit. The only way nVidia can do this is to placate their existing user base by releasing almost bespoke drivers (nVidia have a large software development team, as I seem to remember) for current hardware. I think they hope that fps results, more than outright quality, will make people stick with them until a more pragmatic solution can be found for their existing customer base to migrate to. If that means 16-bit textures, then so be it. After all, I reckon that ATI have been releasing cards capable of better quality graphics than nVidia for a while now, but most people still seem to buy nVidia.

    Russell
     
  18. CENSORED (Guest)
     
  19. CENSORED (Guest)
    Oops, don't know what happened there. Yeah, I'm one of those people, and in fairness they have made some great cards, but the balance of power is shifting and hopefully they won't have things all their own way. This is good: it's good for competition, good for consumers, and it will make for better products, provided there are no knee-jerk reactions to release new hardware for the sake of a few more frames.
     
  20. Mr.D (Well-known Member)
    I wouldn't worry about the textures dropping down to 16-bit (or even 8-bit) as long as the image quality holds up. You can easily drop the bit depth of an image and maintain image quality, depending on its content. It's a smart move if you ask me, and much more elegant: use the precision where you need it rather than lazily keeping it at the max.
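
    (A toy example of my own - nothing to do with the actual drivers - just to show what a drop to 16-bit looks like: packing a 24-bit RGB colour into RGB565. On flat or noisy textures you barely notice the missing bits; on smooth gradients you get banding, which is why it only works depending on the content.)

        #include <stdint.h>

        // Keep the top 5 bits of red and blue and the top 6 bits of green.
        uint16_t PackRGB565(uint8_t r, uint8_t g, uint8_t b)
        {
            return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
        }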
     
  21. Amino (Guest)
    I've said this before, and I'll say it again. Simply put, PCs are more expensive because they are better. They are more versatile in terms of the things they can do besides games. You can build yourself a PC for less than a PS2 costs, but it'll suck as much as a PS2 does when compared to a proper PC. I mean, I own an Xbox, PS2 and a high-end PC because I have the money for all that, but I'd much rather play PC games than Xbox/PS2 games (which is why I'm trying to get my PC to work on my HDTV :D).

    They're making HL2 for the PS2 I think, but for god's sake, the PS2 doesn't even support DX8 with its GF2-based graphics card, so it's clearly not going to look anything like it will on the PC. And that's what you're paying for.

    Sony is coming out with the PSX, which will basically be a computer, but it's still not going to be anything like the PC. And it's gonna cost upwards of $400, maybe $500...

    But to address the post's topic (LOL), I think nVidia deserves this for having their slogan "The Way It's Meant To Be Played.":smashin:
     
  22. CENSORED (Guest)
    Hmm, I think it's meant to be played at the proper colour depth at a given resolution. I don't think it's meant to be played with a colour-depth fudge to maintain framerate - no matter how clever it might be. If ATI can do it, what possible excuse is there for nVidia (the market leader) not being able to do it?

    The console angle is very telling. Todd Hollenshead and Tim Willits' recent interview confirmed that the Xbox is within the performance requirement for a version of Doom III; in fact work has already started, albeit at an early stage. PS2 and Gamecube will miss out as they are unable to process the 3D maths calculations required to run the game.

    PC owners may have to look to component upgrades to get the new generation of DX9 games to run at an acceptable framerate/colour depth/bells-and-whistles level. Console owners (Xbox excepted... for now) may need to look to a completely new machine for their fix. Regardless of the type of machine you own, the process will be bloody expensive.
     
  23. CENSORED (Guest)
    Here's an interesting snippet. After the driver cheating debacle that left nVidia with a lot of egg on its face, Futuremark (makers of 3DMark) have issued a press release on the way forward for benchmarking, with drivers expected to adhere to the following guidelines:

    'In order to clarify its stance on driver optimizations and to help those companies who wish to have their products benchmarked with its industry standard 3DMark benchmark, Futuremark hereby publishes the following set of guidelines for creating drivers.

    1. It is prohibited to change the rendering quality level that is requested by 3DMark.

    2. It is prohibited to detect 3DMark directly or indirectly. In its sole discretion, Futuremark may approve detection in order to fix a specified hardware error.

    3. Optimizations that utilize the empirical data of 3DMark are prohibited.

    4. Generic optimizations that do not violate the above rules and benefit applications in general are acceptable only if the rendering is mathematically consistent with that of Microsoft® DirectX® reference rasterizer.

    As a summary, all 3DMark specific optimizations are prohibited. Additionally, all generic optimizations that change the rendering quality requested by 3DMark are prohibited.'

    Looks like nVidia's proposed fiddling with colour depth to maintain framerate is getting kicked into touch, and bloody right too. This should establish an even playing field for hardware benchmarking that avoids underhand driver optimizations and doesn't take the p*ss out of people like us, the buyers. Penny for the thoughts of FX5600/5900 owners. That isn't a gloat, by the way: if you're happy then great. If you're not then I don't blame you one bit.
     
  24. Erpland (Guest)
    Not entirely true - you get more than you pay for with a console, as the hardware is subsidised :smashin:

    Don't forget that the design of the graphics chips used in current consoles is very old compared to what is on the table now for the rest of the industry; surely it is not the Half Life 2 designers who are forcing us to upgrade?
    They are just taking advantage of the hardware they know will be available when the game is released; it's the graphics chip companies who are pushing the boundaries, thanks to the consumer being ever willing to fork out for the latest and fastest in eye candy.
    If Nintendo, for example, allowed the ATI chip in the Gamecube to be swapped for a latest chip from ATI, then the next incarnation of Metroid, Mario etc. would include many of the new effects.
    The trouble with comparing graphics on both is that it is not a level playing field, because of the TV vs. SVGA monitor issue.

    How many people can name the effects used in Metroid Prime, or the new kick-ass ones about to appear in HL2?
     
  25. chicken balti (Guest)
    I agree up to a point. Let's not forget how much the PS2 was when it was first released. Price reductions come about when the costs of R&D have been recovered and when economies of scale kick in. OK, there is obviously an argument that despite this the hardware is still sold at a loss, but the sales of games are where it gets clawed back - they get you in the end.

    Since the average price of PS2 games is about £40 and PC games about £30, it could also be argued that, over a period of time, the savings made buying PC games actually finance the hardware upgrades required for future software releases.

    The difference in terms of cost therefore isn't that wide between the formats, but the PC will ALWAYS have the most powerful hardware and the innovative software to exploit it.
     
  26. Erpland (Guest)
    It's a bit strong saying they get you in the end; the hobby still has to be paid for, and how the companies make their money is up to them - a bit like the satellite TV company Sky giving away free boxes and dishes.

    This could be true, but I have mates who scoff at the price of a PC and yet own 20-30 PS2 games. PC games are usually so involving they can last ages, what with all the add-ons and everything; I expect Half Life 2 to last me a couple of years, whereas I buy console games for a quick arcadey blast or for some social gaming with mates.

    I'm not sure about this. PC games have been criticised over the years for lack of innovation, but I expect you are talking about advances in graphical techniques. The PC may be more powerful at the higher end, or as far as processor speed goes, but when the SNES came out I can't remember there being a PC available with a decent graphics card, let alone sound. The same goes for the release of the PlayStation in 1994: most people I knew had 486 PCs, and a top-ender was around 2 GRAND! And that was what... a Pentium 100?
     
  27. chicken balti (Guest)
    Precisely my point. :smashin:
     
  28. CENSORED (Guest)
    Yes. There wasn't much in the way of dedicated gaming hardware for the PC back then. What gave the whole industry a kick up the arse was the PSX; it clearly showed what was possible and gave a strong indication of the way forward. PC gaming benefited too, and dedicated 3D hardware, coupled with multiple-voice quality audio hardware, provided the impetus for a slew of upgrades which culminated in 3DFX and the AWE32. Oddly, consoles then started to exploit the potential of proper 3D-rendered graphics, and so the N64, Dreamcast, PS2, Gamecube and Xbox were born.

    I believe that Half Life 2 and Doom III will turn out to be watersheds in gaming as they are adopted as the reference standards for graphics. I talk purely from the technical standpoint; gameplay will always be subjective, but the potential the hardware provides will, in my opinion, create a yardstick for future hardware. Isn't that a good thing for everyone?
     
  29. micb3rd (Active Member)
    nVidia has been top dog for a few years now; they have had top-notch graphics cards in the past, but times are changing.

    In the past there were two main components to a graphics card's speed: "fillrate" and "memory bandwidth".

    So graphics cards were limited by their fillrate ability and memory bandwidth.

    This is the case in DirectX 7 and DX8 games.

    A third limitation has now arrived: the card's "shading ability", i.e. its ability to render and support special per-pixel effects.

    DX9 is the new standard for Direct3D games.

    DX9 walks hand in hand with HLSL (High Level Shader Language), which is the new industry standard for game shading.

    nVidia does not support HLSL that well; they push Cg, which is a more limited and less efficient proprietary shader language.

    Developers do not want to use Cg; they are mostly using HLSL or custom shader code.
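
    (For anyone curious, here's a rough sketch of my own - not code from any actual game - of how an HLSL pixel shader gets compiled against the DX9 ps_2_0 profile with D3DX. The D3DXSHADER_PARTIALPRECISION flag is the kind of hint that lets hardware fall back to 16-bit floats where it can, which is roughly the concession the FX series needs to go fast.)

        #include <d3dx9.h>
        #include <string.h>

        // A trivial ps_2_0 shader, just enough to feed the compiler.
        static const char* kShaderSrc =
            "float4 main(float2 uv : TEXCOORD0) : COLOR\n"
            "{\n"
            "    return float4(uv, 0.0f, 1.0f);\n"
            "}\n";

        bool CompilePs20(bool partialPrecision)
        {
            LPD3DXBUFFER code = NULL;
            LPD3DXBUFFER errors = NULL;
            // Ask for 16-bit float precision throughout if requested.
            DWORD flags = partialPrecision ? D3DXSHADER_PARTIALPRECISION : 0;
            HRESULT hr = D3DXCompileShader(kShaderSrc, (UINT)strlen(kShaderSrc),
                                           NULL, NULL, "main", "ps_2_0",
                                           flags, &code, &errors, NULL);
            if (code)   code->Release();
            if (errors) errors->Release();
            return SUCCEEDED(hr);
        }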

    The main problem now shows itself: nVidia cards perform well in memory-bandwidth- and fillrate-limited situations, so there they can compare with ATI cards.

    NV cards, even the top-end 5900s, have serious hardware problems when running advanced DX9 features and pixel shading.

    The ATI cards - the 9700, 9800 and 9800 XT - all completely adhere to the DX9 standards and HLSL.

    The ATI cards are very good at DX9 and pixel shading.

    I feel nVidia will need a total redesign of their cards in the future to remain competitive.

    You may be wondering about the top dogs of the graphics card industry: the NV 5800/5900 Ultra and the ATI 9700/9800 Pro.

    Here is my quick and dirty round-up of the main hardware issues.


    NV30 (aka 5800):

    4 pipelines.

    In each pipeline:
    2 texture address processors, or a single FP32 ALU + FX12 + FX12.

    NV35 (aka 5900):

    4 pipelines.

    In each pipeline:
    2 texture address processors, or a single FP32 ALU + mini FP32 ALU + mini FP32 ALU.

    FX = fixed-point integer.
    FP = floating-point.

    The NV35's doubling of the memory bus from 128-bit (NV30) to 256-bit (NV35) mainly helps in memory-bandwidth-limited situations, for example multisample anti-aliasing.

    Both cards have an extremely limiting temporary register problem: if a shader uses more than 2 FP32 registers (or more than 4 FP16s) the quad pixel pipeline stalls.



    ATI

    R300/R350 (aka 9700/9800):

    8 pipelines.

    In each pipeline:
    texture address processor + FP24 ALU + mini FP24 ALU.

    It has none of the temporary register problems that the nVidia FX series cards constantly stall on.


    To sum the situation up:

    The FX 5900 Ultra is fast in DirectX 8.1 games that are vertex shader, fillrate or memory bandwidth limited.

    The 9800 Pro is always fastest in DirectX 8.1 AND DirectX 9 games that are Pixel Shader 1.1, 1.4 or 2.0 limited.

    The current range of FX 5900 cards supports some advanced DX9 features but runs them *very* slowly. Future games like HL2 have features such as High Dynamic Range Lighting which nVidia cards won't even be able to display.

    Again, the ATI 9800 Pro runs *all* the advanced DX9 features at playable frame rates.

    Games are moving towards DX9 and advanced pixel shading; unfortunately NV cards are lacking in such areas.
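
    (To make the DX8.1 vs DX9 split a bit more concrete: here's a small sketch, assuming a Direct3D 9 setup, of the caps check a game can use to see whether a card exposes Pixel Shader 2.0 at all. It's my own illustration, not anything from Valve or id.)

        #include <d3d9.h>

        // True if the adapter's HAL device reports at least Pixel Shader 2.0,
        // the feature level the new DX9 effects rely on.
        bool SupportsPixelShader20(IDirect3D9* d3d)
        {
            D3DCAPS9 caps;
            if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
                return false;
            return caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);
        }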



    Here is a small clarification on the coding of Doom 3 and Half Life 2 (credit should go to my brother for this part of the write-up).

    Let me expand on the Doom3 and HalfLife2 codepaths and pixel-shader precisions.

    Doom3 (OpenGL):

    If the NV30/NV35 (5800/5900) runs Doom3's default ARB2 codepath, then the performance is terrible (as it runs in full-precision FP32 very slowly).

    To compensate, JC wrote a specific NV30 codepath that runs much faster (as it uses FX12 integer and FP16 partial precision, which the hardware handles much quicker).

    If the R300/R350 (9700/9800) runs Doom3's default ARB2 codepath, then the performance is fast (as it runs its full-precision FP24 very well).

    It's arguable whether Doom3 really needs "full" or "partial" precision at all, because in essence D3 is only a "DX7-level game". What I mean by this is that all the effects in D3 can run on a regular DX7-level 3D card (using multiple passes); for performance reasons, pixel shaders are used to perform those effects in a single pass.
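
    (My guess at the mechanism rather than id's actual code: an OpenGL engine can sniff the extension string to decide whether the vendor-specific NV30 path is even available, roughly like this.)

        #include <GL/gl.h>
        #include <string.h>

        // Crude extension-string check; assumes a current GL context.
        static bool HasExtension(const char* name)
        {
            const char* ext = (const char*)glGetString(GL_EXTENSIONS);
            return ext != NULL && strstr(ext, name) != NULL;
        }

        // Prefer the vendor-specific NV30 path (FX12/FP16) when the extension
        // is there; otherwise fall back to the generic full-precision ARB2 path.
        bool UseNv30Path(void)
        {
            return HasExtension("GL_NV_fragment_program");
        }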

    HalfLife2 (DirectX):

    The NV30/NV35 (5800/5900) will use a specially coded DX9/DX8 "mixed-mode" codepath just to get acceptable performance.
    (Note: this took Valve five times as long to create as the standard DX9 path!)

    The R300/R350 (9700/9800) will use the default full DX9 codepath and will have by far the highest performance and image quality.

    HalfLife2 requires more than just floating-point precision: for its advanced DirectX 9 features such as High Dynamic Range Lighting it also needs floating-point render target support, which the current nVidia drivers for the NV30 and NV35 do not provide - and even worse, if it does get enabled then performance will very likely be very poor.

    As the R300/R350 runs the full DX9 codepath and also has floating-point render target support in the current drivers, High Dynamic Range Lighting will run fine.
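
    (Again a sketch of my own, not Valve's code: in Direct3D 9 a game can ask the driver up front whether floating-point render targets are on offer, something like this.)

        #include <d3d9.h>

        // True if the driver will accept a 64-bit floating-point surface
        // (D3DFMT_A16B16G16R16F) as a render target - the capability that
        // HDR-style effects lean on. displayFormat is the current display
        // mode format, e.g. D3DFMT_X8R8G8B8.
        bool SupportsFloatRenderTarget(IDirect3D9* d3d, D3DFORMAT displayFormat)
        {
            HRESULT hr = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                                displayFormat, D3DUSAGE_RENDERTARGET,
                                                D3DRTYPE_SURFACE, D3DFMT_A16B16G16R16F);
            return SUCCEEDED(hr);
        }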
     
  30. CENSORED (Guest)
    micb3rd,

    Great post. If I'm reading it right, it appears that John Carmack's NV30 codepath drops from 32-bit to 16-bit precision to maintain framerate, which is something that has previously been reported on and appears to be addressed in nVidia's beta 50 drivers. Wouldn't setting in-game colour to 16-bit have the same effect? Presumably 32-bit performance is acceptable in some scenes and it automatically switches to 16-bit as situations dictate. Unfortunately this sort of thing will be outlawed in future benchmarks, which leads me to assume that the D3 codepath and driver revision are stop-gap measures that can only be properly addressed with new hardware iterations. nVidia took a big gamble with their Cg shader language; considering no one is now likely to adopt it, they have a battle to get the hardware sorted, win back customer faith and get the product on the shelf. I wouldn't bet against them doing it.
     
