You'll see my own numbers tomorrow night at midnight, but we've been given the go-ahead to reveal a bit of information about Half-Life 2. I'll keep it brief and to the point, and will explain it in greater detail tomorrow night:

- Valve is pissed at all of the benchmarking "optimizations" they've seen in the hardware community;
- Half-Life 2 has a special NV3x codepath that was necessary to make NVIDIA's architecture perform reasonably under the game (a hypothetical sketch of this kind of codepath selection follows the list);
- Valve recommends running GeForce FX 5200 and 5600 cards in DX8 mode in order to get playable frame rates;
- even with the special NV3x codepath, ATI is the clear performance leader under Half-Life 2, with the Radeon 9800 Pro hitting around 60 fps at 1024x768. The 5900 Ultra is noticeably slower with the special codepath and is horrendously slower under the default DX9 codepath;
- the Radeon 9600 Pro performs very well - it is a good competitor to the 5900 Ultra;
- ATI didn't need these special optimizations to perform well, and Valve insists that they have not optimized the game specifically for any vendor.
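
To make that second point concrete: none of what follows is Valve's actual code, and every name in it is hypothetical. It is just a minimal sketch of what a vendor-specific codepath, plus a DX8 fallback for the low-end FX parts, might look like at the selection level:

```cpp
// Hypothetical sketch, not Valve's Source engine code. Illustrates
// picking a rendering codepath per GPU class: full-precision DX9 for
// ATI's R3xx parts, a special mixed-precision path for high-end NV3x,
// and a DX8 fallback for the GeForce FX 5200/5600.
#include <iostream>

enum class GpuClass { AtiR3xx, NvGeForceFxHighEnd, NvGeForceFxLowEnd, Dx8Class };
enum class CodePath { Dx9Full, Nv3xMixed, Dx8 };

CodePath SelectCodePath(GpuClass gpu) {
    switch (gpu) {
        case GpuClass::AtiR3xx:            return CodePath::Dx9Full;   // default DX9 codepath
        case GpuClass::NvGeForceFxHighEnd: return CodePath::Nv3xMixed; // special NV3x codepath
        case GpuClass::NvGeForceFxLowEnd:  return CodePath::Dx8;       // Valve's 5200/5600 advice
        case GpuClass::Dx8Class:           return CodePath::Dx8;       // older hardware
    }
    return CodePath::Dx8; // unreachable; keeps compilers happy
}

int main() {
    // Example: a GeForce FX 5600 would land on the DX8 path.
    CodePath path = SelectCodePath(GpuClass::NvGeForceFxLowEnd);
    std::cout << (path == CodePath::Dx8 ? "DX8 path\n" : "DX9-class path\n");
    return 0;
}
```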

There you have it, folks. There's much more coming tomorrow.
169 Comments

  • Anonymous User - Thursday, September 11, 2003 - link

Ha... I am glad I don't have the money to buy all those new cards... sticking with my "old" Ti4200 :)
  • Anonymous User - Thursday, September 11, 2003 - link

#106 -

    The reason Valve went with DX9 over the new OGL engine is that DX9 is more mature than the new OGL standard... which isn't even officially released yet.
  • Icewind - Thursday, September 11, 2003 - link

#106, Doom 3 is built on the new OpenGL engine, duh.
  • Anonymous User - Thursday, September 11, 2003 - link

In HL1 we had the choice of OGL or DX. Why can't a game support both now?

    Would the game have to be completely redesigned to support OGL, or are the two APIs so much more different from each other now than they were in 1998?
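
    (For context: supporting both APIs usually means writing the game against a renderer interface and implementing one backend per API, so it's less a full redesign than an up-front architectural choice. A minimal, hypothetical sketch; none of these class names come from any shipping engine:)

    ```cpp
    // Hypothetical sketch of a renderer abstraction layer. The game
    // code only ever talks to IRenderer; the Direct3D or OpenGL
    // backend is chosen once at startup.
    #include <iostream>
    #include <memory>

    class IRenderer {
    public:
        virtual ~IRenderer() = default;
        virtual void DrawFrame() = 0;
    };

    class D3DRenderer : public IRenderer {
    public:
        void DrawFrame() override { std::cout << "rendering via Direct3D\n"; }
    };

    class OpenGLRenderer : public IRenderer {
    public:
        void DrawFrame() override { std::cout << "rendering via OpenGL\n"; }
    };

    // Pick the backend once; everything downstream is API-agnostic.
    std::unique_ptr<IRenderer> CreateRenderer(bool useOpenGL) {
        if (useOpenGL) return std::make_unique<OpenGLRenderer>();
        return std::make_unique<D3DRenderer>();
    }

    int main() {
        auto renderer = CreateRenderer(/*useOpenGL=*/false);
        renderer->DrawFrame();
    }
    ```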

  • AgaBooga - Thursday, September 11, 2003 - link

#104, they did find a way to run Nvidia cards smoothly: DX8. They can't guarantee full support with all cards, but they have done their best. I don't think HL2 was deliberately geared towards ATI cards; that's just how it worked out.

    I wonder if Doom 3 will have any similar problems with Nvidia cards.
  • Anonymous User - Thursday, September 11, 2003 - link

Well, IMHO Valve will suffer more because they aren't reaching the whole market of Nvidia users out there. For those guys who want to spend the extra money on hardware just to play HL2, good for you, but I personally don't have the money to throw around like that every time I want to play the latest game.
  • Anonymous User - Thursday, September 11, 2003 - link

    Yes, they are. I doubt saying this is going to affect anything, but seriously, these flames are getting WAY out of hand.

People with GeForce FXes who are sore about their cards being slow in HL2, them's the breaks, it's not your fault. Who knew? But don't accuse Valve of going Futuremark on the community until there's at least a grain of evidence. And don't tell me "Valve and ATi working together is evidence enough," because IIRC, the partnership happened after Valve saw how the FXes did with HL2... and, thus, after NV30. Ahem.

    As for you ATi folks, yes, it's nice. There is no need to gloat to nVidia card owners, because you'd have to have a LOT of contacts to know that this was coming. The FXes do perfectly fine in normal benchmarks, with the obvious exception of NV30. (The 5200 Ultras and 5600 regular are pretty bad cards too, in my opinion, since the 5200U is as expensive as the Ti4200 and a lot slower, and the 5600 regular is pricier than the Ti4200 and slightly slower. You know, the Ti4200 really is a good card. .. uh oh. I've gotten sidetracked.) The FX 5600 Ultra 400/800 was the best midrange card around (well, since nobody knew about HL2 performance), even if it was tricky to find, and the FX 5900 Ultra dominated the high end. (The 9800 Pro came close, but unless one of them cheated I'd say it was a win for nVidia, albeit a small one.) They didn't make a stupid choice, they probably decided on an nVidia card because of benchmarks or because of good experiences with them. Okay? No more of this. It's stupid and immature.

    And to people on both sides of the line, just because someone says something stupid is no reason to flame them. Maybe they're trolling, probably not, but either way you'll do better just politely explaining why what they said was incorrect and/or illogical. Name-calling just makes you look worse. And if there's an all-out flame, ignore it.

    Why am I putting this in a comment thread? Hmm. I guess I have too much time on my hands. OTOH, this HAS gotten sort of ridiculous... well, whatever. It's not as if anyone's going to pay attention to any of what I typed, they'll just skip over it and say something about how stupid those goddamned blind fanATIcs have to be if they don't realize that Valve is totally being bribed by ATi and the Evil Moon People to cripple FXes in HL2 or how stupid those goddamned blind nVidiots are to buy GeForce FXes when they obviously should have a tech demo of HL2 on hand. Eh, I tried.

    Whoever made the comment about OpenGL and DirectX was very right; Doom III is a very different game, and the FXes seem to only fail with lots of DX9 usage. They certainly perform well in OGL, though, looks like.

    God, I remember all the reviews saying the FX 5200 Ultra was decent because while it was slower than the comparably priced Ti4200, it was DX9. Ha. =(

    Since this is a video card thread-thingy, I guess I should end by stating what sort of video card I have and either insinuating that my use of it makes me unbiased (if I use a card from the company that I just explained my problems with, or if I use some cruddy aging integrated thing) or explaining that just because I use it doesn't mean I'm biased (if I use a card from the company I backed up). (You're supposed to include that in posts on these things, usually at the end, just like how in CPU-related posts you have to make a joke about cooking food on a Prescott, or how in heatsink-related posts you have to mention that your current [insert cooling solution here] does just fine for you-- and if it's a non-air-cooled system, you are required to make a happy emoticon afterwards, possibly one with sunglasses. If you don't do these things your opinion is automatically invalidated.) Well, I'm not going to, because then someone would almost certainly call me a fanboy.

    This post is way too long. It ends now.
  • dvinnen - Thursday, September 11, 2003 - link

They should make you register to post. These kiddie flames are getting annoying.
  • Anonymous User - Thursday, September 11, 2003 - link

    #27, here is something that explains some of the differences between DX8 and DX9 with visuals.

    http://ati.com/vortal/r300/dx9demo/main.html
  • Anonymous User - Thursday, September 11, 2003 - link

I'd actually put some value in what Gabe says if he weren't on the payroll of ATI. ATI and Valve have been working together for quite some time now. Now let's really think about this: Gabe = former Microsoft worker. Microsoft = known for making BS claims and undercutting the competition. Valve = makes more money if ATI cards sell better. Hmmm, should I really trust this guy? Probably not. I'll wait until a non-biased third party says Nvidia fucked up DX9. Till then, I put Gabe right next to the stats on AMD's site comparing the P4 and Athlon, and the study sponsored by Microsoft that shows Windows 2003 is faster than Linux.

    As for the "5 times longer in development" claim, I have very little respect for the staff at Valve in that area. They have repeatedly shown that they aren't competent when it comes to coding.
