You'll see my own numbers tomorrow night at midnight, but we've been given the go-ahead to reveal a bit of information about Half-Life 2. I'll keep it brief and to the point and will explain it in greater detail tomorrow night:

- Valve is pissed at all of the benchmarking "optimizations" they've seen in the hardware community;
- Half-Life 2 has a special NV3x codepath that was necessary to make NVIDIA's architecture perform reasonably under the game;
- Valve recommends running GeForce FX 5200 and 5600 cards in DX8 mode in order to get playable frame rates;
- even with the special NV3x codepath, ATI is the clear performance leader under Half-Life 2, with the Radeon 9800 Pro hitting around 60 fps at 1024x768. The 5900 Ultra is noticeably slower with the special codepath and horrendously slower under the default DX9 codepath;
- the Radeon 9600 Pro performs very well - it is a good competitor to the 5900 Ultra;
- ATI didn't need these special optimizations to perform well, and Valve insists that they have not optimized the game specifically for any vendor.

There you have it folks, there's much more coming tomorrow.

  • Anonymous User - Thursday, September 11, 2003 - link

    I tried both a 5900 and a 5900 Ultra (both overclocked) and couldn't see any great difference over my Ti4600 - so I sent them both back. I'm glad I did.

    I'm gonna sit on my money until reviews/benchmarks for the R360/NV38 come in....

  • Anonymous User - Thursday, September 11, 2003 - link

    Should I run HL2 in DX8 or DX9 on my S3 Virge?
    Maybe I should have paid the extra dollars for a 486DX and not been the cheapo 486SX...

    FPC - Frames Per Century rules..
    ;)
  • Anonymous User - Thursday, September 11, 2003 - link

    hehe, I got a 9800 Pro 256MB (I know, I know, the extra 128MB makes no difference, but I'm a sucker for the numbers!) and I nearly fainted when I saw the price for the equivalent FX card. Now I'm LMAO about this, but I feel really sorry for the guys who paid a fortune for an FX and are probably in denial at the moment. It's a real pain that games are now fighting gfx card technology instead of being able to enhance their software with it. I think we will see the reverse of this FX situation when Doom III comes out, though!
  • Anonymous User - Thursday, September 11, 2003 - link

    Note that when anyone says their DX9 code is "not vendor-specific," keep in mind that the reason NVIDIA's been having so much trouble is that MS basically sold the DX9 spec to ATI, in no small part because of its constant squabbles with NVIDIA. Contrary to popular opinion, these hardware architectures were actually far along in development well before DX9 was nailed down. In a reversal of the DX8 development, DX9 was basically a software/API description of the R300.

    People bitch about NVIDIA using 16- and 32-bit FP and "not being to-spec," but you must realize that these major architectural decisions (all-FP24 vs. fixed+FP16+FP32, 64 instructions and multiple outputs vs. 4k instructions and one 128-bit output supporting packing of multiple smaller-precision elements, etc.) were being weighed and worked out well before MS came down with the DX9 spec. The spec was developed, as usual, with the involvement of all the players in the hardware world, but ALL of the bitchy specifics were handed to ATI. Admittedly, this has happened in the past with NVIDIA, but it's particularly problematic once the DX spec starts defining code paths and internal representations for these immensely complex stream programs in today's vertex and fragment units.

    As such, though it's clearly an important target which NVIDIA bungled largely in its business relationship with Microsoft, DX9 could easily be considered an ATI-specific codepath, as sticking to spec forces a very non-optimal code path for the way the NV3x pixel shader is architected.
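    [Editor's aside, not from the original commenter: the formats argued over here differ mainly in mantissa width, and that width directly sets the worst-case rounding error per shader operation. A minimal sketch, assuming the commonly cited bit layouts (FP16: 10 mantissa bits, ATI's FP24: 16, FP32: 23):]

```python
# Illustrative only: spacing of representable values near 1.0 ("epsilon")
# for the three pixel-shader precisions discussed above.
formats = {"FP16": 10, "FP24": 16, "FP32": 23}  # mantissa bits per format

for name, bits in formats.items():
    eps = 2.0 ** -bits  # worst-case relative rounding error per operation
    print(f"{name}: {bits} mantissa bits, eps = {eps:.1e}")
```

    FP24's epsilon sits roughly 64x below FP16's and 128x above FP32's, which is why an all-FP24 pipeline was a plausible middle ground between NVIDIA's fast-but-coarse FP16 and slow-but-precise FP32 modes.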
  • Anonymous User - Thursday, September 11, 2003 - link

    Chazz, very well put.
  • Adul - Thursday, September 11, 2003 - link

    I was post 73 btw
  • Anonymous User - Thursday, September 11, 2003 - link

    Expect Anand to have his numbers in about 21 hours and 10 minutes ;)
  • Anonymous User - Thursday, September 11, 2003 - link

    ummmm... my 2 cents:

    from what i've gathered.. HL2 = HL + new gfx + physics.

    by new gfx, i just mean that they finally figured out how to make the Quake engine load textures that are > 8-bit, and then they read some soft shadowing/bumpmapping tutorials and cut/pasted that code in there as well.

    concerning the confusingly low sys requirements #39 was referring to:

    if you're running a TNT/GF2 or possibly (?) GF3, you'll probably have to turn OFF the fancy gfx that have gotten HL2 half the hype. just so you can play it, instead of watch it play (ie slideshow). so you'll basically be playing a physics upgrade mod for HL. along with all the new content for HL2 (maps/textures/models/story etc).

    ------------------------

    as for the comparison between HL2/Doom3 that you lamers can't give up on:

    HL2 undoubtedly has more dynamic gameplay than Doom3. Doom3 definitely has more atmospheric, mood-driven gameplay than HL2.

    imo, there is no such thing as better gameplay. just as there is no such thing as a more fun game. it's just a matter of preference.

    a product is what a product is.. if you prefer apples, eat apples.. if you prefer oranges, eat oranges....

    if it's so important to you to argue why one is better than the other, then ur a politician..

    terrorists are politicians too y'know.

    as for all u who bought new hardware to play the game before it comes out... i just avoid gambling altogether and wait for the game to come out first.

    -Chazz
  • Anonymous User - Thursday, September 11, 2003 - link

    #62

    DX9 is based on 24-bit floating-point precision.

    NVIDIA's hardware renders at 16 and 32 bit: 32-bit is too slow, and 16-bit renders with IQ loss thanks to the lower precision.

    Also, R3x0 hardware renders 8 textures per pass, while NVIDIA renders 4 or 8 textures per pass depending on the code. Using single texturing and advanced DX code (i.e. DX9), the engine works at 4 textures per cycle, even when using smaller-precision shader code. The problem is the hardware, not the drivers.
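    [Editor's aside, not from the original commenter: to make the IQ-loss point concrete, here is a small sketch, assuming only the mantissa widths of the formats (10 bits for FP16, 16 for FP24, 23 for FP32), that quantizes a value to each precision and measures the resulting error.]

```python
import math

def quantize_mantissa(x, mantissa_bits):
    """Round x to the nearest value representable with the given number
    of mantissa bits -- a rough model of FP16/FP24/FP32 shader math."""
    m, e = math.frexp(x)             # x = m * 2**e with 0.5 <= |m| < 1
    scale = 2.0 ** mantissa_bits
    return math.ldexp(round(m * scale) / scale, e)

x = 1.0 / 3.0  # a value no binary float represents exactly
for name, bits in [("FP16", 10), ("FP24", 16), ("FP32", 23)]:
    err = abs(quantize_mantissa(x, bits) - x)
    print(f"{name}: absolute error {err:.2e}")
```

    Per-operation errors like these compound as shader instructions chain, so the gap between FP16 and FP24 is what shows up on screen as banding in long DX9 shaders.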
  • Anonymous User - Thursday, September 11, 2003 - link

    #54.

    WTF are you talking about?

    I've run games here as old as Rogue Squadron and DarkStone, and the not-so-old HomeWorld... they all run fine even when using AA and AF.
