You'll see my own numbers tomorrow night at midnight, but we've been given the go-ahead to reveal a bit of information about Half-Life 2. I'll keep it brief and to the point here, and will explain everything in greater detail tomorrow night:

- Valve is pissed at all of the benchmarking "optimizations" they've seen in the hardware community;
- Half-Life 2 has a special NV3x codepath that was necessary to make NVIDIA's architecture perform reasonably under the game (a sketch of this kind of path selection follows the list);
- Valve recommends running GeForce FX 5200 and 5600 cards in DX8 mode in order to get playable frame rates;
- Even with the special NV3x codepath, ATI is the clear performance leader under Half-Life 2, with the Radeon 9800 Pro hitting around 60 fps at 1024x768. The 5900 Ultra is noticeably slower with the special codepath and is horrendously slower under the default DX9 codepath;
- The Radeon 9600 Pro performs very well; it is a strong competitor to the 5900 Ultra;
- ATI didn't need these special optimizations to perform well, and Valve insists that it has not optimized the game specifically for any vendor.
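
To make the "special codepath" idea concrete, here is a minimal sketch in C++ against the Direct3D 9 API of how an engine can pick a render path from the capabilities a card reports. The tier names and the vendor check are illustrative assumptions, not Valve's actual logic:

```cpp
#include <d3d9.h>

// Hypothetical render-path tiers; the names are illustrative only.
enum RenderPath { PATH_DX8, PATH_DX9_FULL, PATH_NV3X_MIXED };

RenderPath ChoosePath(IDirect3D9* d3d, UINT adapter)
{
    // Ask the driver what the hardware supports.
    D3DCAPS9 caps;
    d3d->GetDeviceCaps(adapter, D3DDEVTYPE_HAL, &caps);

    // No Pixel Shader 2.0 support at all: the DX8 path is the only option.
    if (caps.PixelShaderVersion < D3DPS_VERSION(2, 0))
        return PATH_DX8;

    // Crude vendor sniff via the adapter's PCI vendor ID (0x10DE = NVIDIA).
    // A real engine might also force PATH_DX8 here for the slower NV3x
    // parts, as Valve recommends for the 5200/5600.
    D3DADAPTER_IDENTIFIER9 id;
    d3d->GetAdapterIdentifier(adapter, 0, &id);
    if (id.VendorId == 0x10DE)
        return PATH_NV3X_MIXED;  // lower-precision path tuned for NV3x

    return PATH_DX9_FULL;        // generic full-precision DX9 path
}
```

The key point is that the driver advertises what the silicon supports, but only the engine can decide how much of the DX9 feature set is actually worth using on a given part.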

There you have it, folks. There's much more coming tomorrow.
169 Comments

  • Anonymous User - Thursday, September 11, 2003 - link

    NVIDIA is saying that the Det50s will be "THEIR BEST DRIVERS EVER." I guess that means they'll be great at running normal DX9 code without "optimizations." HELL NO; what they're going to do is even more of it: no fog, 12-bit precision, crap IQ, no AF+AA at high res. ATI also claims "THEIR BEST DRIVERS EVER," and I know theirs will be plain DX9 with no optimizations, 24-bit precision, awesome IQ, and AF+AA at high res. Too bad for NVIDIA; the shit hit the fan, and this generation is crap. And the 9600 is coming to the sub-$100 market in a few months. NVIDIA has lost the high- and mid-range markets to ATI, and low-end DX9 will be ATI's domain too. If you own NVIDIA stock, sell it and stay out until NV40; this generation is crap!!
  • Anonymous User - Thursday, September 11, 2003 - link

    Basically, the nVidia performance stinks either way, IMHO. If the new 5x.xx drivers fix it, then so be it; that will be great for those cards, and then they can run future Valve games.
    The game runs fine on a Ti4600 using DX8.

    However, the new ATI cards only have 24-bit shaders!
    Would that leave ALL current ATI cards without any way to run future Valve titles?

    Perhaps I don't understand the technology fully; can someone elaborate on this?
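
Not an authoritative answer to the question above, but some context helps: DirectX 9 defines "full precision" in PS 2.0 as at least 24-bit floating point, so ATI's FP24 meets the spec, while NV3x hardware computes in FP16 (partial precision) or FP32 instead. As a rough sketch, assuming the commonly cited mantissa widths for each format, here is what those widths mean for worst-case rounding error in C++:

```cpp
#include <cmath>
#include <cstdio>

int main()
{
    // Mantissa (fraction) widths commonly cited for each shader format;
    // treat these as period-documented assumptions, not gospel.
    const struct { const char* name; int mantissaBits; } formats[] = {
        { "FP16 (NV3x partial precision)", 10 },
        { "FP24 (ATI R3xx)",               16 },
        { "FP32 (NV3x full precision)",    23 },
    };

    for (const auto& f : formats) {
        // Worst-case relative rounding error is roughly 2^-(bits + 1).
        double relErr = std::ldexp(1.0, -(f.mantissaBits + 1));
        std::printf("%-32s ~%.1e relative error\n", f.name, relErr);
    }
    return 0;
}
```

So FP24 isn't a dead end for future DX9 titles; under the current spec it is the full-precision baseline, sitting well above FP16 in accuracy and below FP32.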
  • Anonymous User - Thursday, September 11, 2003 - link

    Here is how your performance will be, depending on the card you have:
    If you have a DX9 ATI card, your performance will be ok.
    If you have a current DX9 nVidia card, your performance will not be nearly as good as an equivalently priced ATI card's. Of course nVidia will release a "fixed" card within about 12 months; it would be suicide not to.

    If you're using a DX8.x part, like a Ti4200 or an equivalent generation Radeon, then the performance will be roughly the same.

    Likewise, Doom 3 is rendered using OpenGL, so whatever card you own will run it as well as it can run OpenGL; DirectX compliance will have no effect on your FPS in Doom. Some websites have benchmarks for OpenGL games; you can review those to get a good impression of how each kind of card will perform.
  • Anonymous User - Thursday, September 11, 2003 - link

    lol@113
  • Anonymous User - Thursday, September 11, 2003 - link

    Well, I'm still running my Ti4400; I'll wait to see how "horrible" it is before I make any changes.

    I think it's funny, though. I've got some Radeon and nVidia cards here. (I'm never cutting-edge, though; my fastest PC is only a P4 1.8.)

    What a silly thing to gloat or argue about. I was never fond of ATI because I was never satisfied with the timeliness or reliability of their drivers in the past (maybe that's changed now; I'm not sure). When I upgrade, I just buy whatever the best card is in the $150-$175 range.

    As for who is conspiring with whom: that's silly as well. There is absolutely nothing wrong with a company partnering with another, or even making its product work better with another's, even if that's not what is going on here. There isn't anything illegal or nefarious about it; it's called the free market. So your silly talk about a class-action lawsuit against nVidia is meritless. They sold you a card that does work with (is compatible with) DirectX 9. Since the card came out BEFORE the standard, and DX9 games came out AFTER the card, it was your own choice to purchase a card that may not be the BEST card.

    Some of you need a serious lesson in market economics. The power you have is to vote with your wallets. Now that can start with a healthy debate of the facts, but this is a useless debate of speculation and conspiracy theories.

    Valve's biggest interest is to provide a game to the largest audience possible, and to make gaming folks happy. That nets them more profits. And that's the wonderful thing about a free market economy. I highly doubt Valve would want to intentionally do anything to alienate the nvidia market, since there is a gigantic install base of nvidia based GPUs.

    I'll play HL2 on my P4 1.8 with its GF4 Ti4400 128MB card. If I want to turn on ALL the eye candy and run at higher FPS, then I'll have to spend some cash and build myself a new gaming rig. My guess is that it will probably run pretty well and I'll be perfectly satisfied on my current machine. After the initial release, I'm guessing that future patches to HL2 and the Det5 drivers will eke out a couple of extra FPS, and that will just be an added bonus.

    I will buy HL2 for a couple of reasons. The first is rewarding Valve's loyalty to their customers. I spent probably $50 buying HL1 back in 1998, and I got 5 good years of gaming enjoyment. They maintained patches and support WAY BEYOND any reasonable expectations for a game. I got some REALLY good extended play out of their game with FREE mods like CS and DoD. I will buy HL2 to show that their loyalty to me will be rewarded in turn. I'd like to send a message to other game developers that open-platform games and long-term support are what we insist on and reward with our money. The other reason is that Valve has shown they will accept nothing less than cutting edge. HL1 was groundbreaking at the time, and HL2 looks like it will be the same.
  • PandaBear - Thursday, September 11, 2003 - link

    Dude, the problem with Nvidia is not that they intended to build a crappy card based on 3dfx technologies. They went with the 0.13-micron process at TSMC, which was rather new and too bleeding-edge at the time. ATI is using a more conservative 0.15-micron process that is easier to get good yields out of, so they are fine right now.

    From what I've heard, if Nvidia had been more conservative back when they designed NV30, they would have been OK. ATI decided to wait and let everyone else at TSMC (including Nvidia) work out all the process problems before jumping aboard.
  • Anonymous User - Thursday, September 11, 2003 - link

    It had taken Jen-Hsun Huang many a market cycle and an enormous amount of money, but on that fateful Friday in December, 3dfx was finally his. Yes, the ancient Voodoo technology was his at last... but, unbeknownst to him, so was its curse.
  • Anonymous User - Thursday, September 11, 2003 - link

    "There is no need to gloat to nVidia card owners, because you'd have to have a LOT of contacts to know that this was coming. "

    Not at all. All you had to do was pay attention to the supposedly "useless" synthetic Pixel Shader 2.0 benchmarks that have been around for MONTHS.
    They told exactly the same story: the GeForce FX family of cards has abominably bad PS2.0 performance unless you tailor-make a partial-precision path specifically for them. That the partial-precision path results in lower image quality isn't as important...
    ...when your drivers detect SCREEN CAPTURING and then MANIPULATE THE OUTPUT QUALITY to make it appear better!

    If nvidia had designed a part that ran DX9 code properly and with decent speed, there would be no discussion here.
    The fact is they didn't. And their only recourse until NV40 has been to manipulate their drivers to an almost unbelievable extent, reducing quality and introducing arbitrary clip planes at will.

    I don't own an ATI PS2.0 capable part, but it has been obvious since the introduction of the 9700 that it is the one to buy.
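
Worth noting for context: "partial precision" is not driver magic; it is something the shader author opts into per shader or per instruction. As a hedged illustration (the file name and entry point are hypothetical), a DX9-era developer might compile the same HLSL pixel shader twice with D3DX, once normally and once with the partial-precision flag for the NV3x path:

```cpp
#include <d3dx9shader.h>

// Compile one HLSL pixel shader either at full precision (generic DX9
// path) or with partial precision forced (NV3x path). "water.psh" and
// "PSMain" are made-up names for illustration.
ID3DXBuffer* CompilePixelShader(bool nv3xPath)
{
    ID3DXBuffer* code = nullptr;
    ID3DXBuffer* errors = nullptr;

    DWORD flags = nv3xPath ? D3DXSHADER_PARTIALPRECISION : 0;

    HRESULT hr = D3DXCompileShaderFromFile(
        "water.psh", nullptr, nullptr,    // source file, macros, includes
        "PSMain", "ps_2_0",               // entry point, target profile
        flags, &code, &errors, nullptr);  // flags, bytecode, errors, constants

    if (FAILED(hr)) {
        if (errors) errors->Release();    // a real engine would log these
        return nullptr;
    }
    return code;                          // caller owns the bytecode buffer
}
```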
  • dvinnen - Thursday, September 11, 2003 - link

    #106) The original HL is based on the original Quake engine, which was built around OpenGL. If you notice, the DX version of HL sucks; it sucks a lot. It looks terrible, it's buggy, and it isn't supported as fully as the OpenGL version. There are a bunch of lighting effects in the OpenGL version that aren't in DX mode. The only time I use DX mode is to debug. Now imagine porting the vastly more complex HL2 over.
  • Anonymous User - Thursday, September 11, 2003 - link

    To get a good overview of this thread minus most of the fanboyism, use this index with my opinions:

    #24 right on! too bad this thread didn't end here though.
    #33 .. um...no
    #39 320x240 (just like in Quake on 486 days)
    #42 agree
    #44 1993
    #62 because of IPC (efficiency)
    #71 correct!
    #72 LOL
    #76 BINGO
    #80 correct
    #81 it is ALWAYS a marketing war, silly.
    #86 heavy denial (and when the fairy godmother came, she turned him into a real DX9 card...)
    #93 e x a c t l y
    #103 DON'T READ THIS ONE... I actually did, and I almost fell asleep and fell out of my chair. Nah, just kidding... but it's the best "summary" of this thread.
    #106 Actually, if you've ever done Fortran/VB/C/C++/Java/Perl, then you would know that "programming" isn't just fun-time... it's a lot of work, and it sucks if you have to do it twice.

    Personally I think everybody should chill and read this article: http://www.3dcenter.org/artikel/cinefx/index_e.php
