Yes, NVIDIA leads the way in performance. They own the fastest single-GPU card, the fastest multi-GPU single card, and the fastest multi-card configurations. People who want the best of the best do pay a premium for the privilege, but that isn't something everyone is comfortable with. Most of us would much rather see a high end card that doesn't totally depart from sanity in terms of the actual value gained through the purchase. Is the 9800 GTX that solution? That's what we're here to find out.

We've gotten a lot of feedback lately about our test system. Yes, at the very high end we haven't seen what we would have expected if all things were equal between the platforms. But the fact is that making a single platform work for apples-to-apples comparisons between CrossFire and SLI is worth it. With this review, we aren't quite there, as we just uncovered a huge issue that has been holding back performance on our high end hardware. We do have some numbers showing what's going on, but we didn't have time to re-run all of our hardware after we discovered the solution to the issue. We'll get to that shortly.

The major questions we want to answer with this review are mostly about value. This card isn't a new architecture, and it isn't really faster than other single-card, single-GPU solutions. But the price point does make a difference here. At about $400, AMD's Radeon HD 3870 X2 will be a key comparison point for this new $300 part. With the 8800 Ultra and GTX officially leaving the scene, the 9800 GX2 and 9800 GTX are the new top two in NVIDIA's high end lineup. The price gap between the two is very large (the 9800 GX2 costs about twice as much as a stock clocked 9800 GTX), and the 3870 X2 falls right in between them. Does this favor AMD or NVIDIA in terms of value? Does either company need to adjust its price point?

Things are rarely straightforward in the graphics world, and with the crazy price points and multi-GPU solutions that have recently burst onto the scene, we've got a lot of stuff to try to make sense of. Let us take you through the looking glass...

The 9800 GTX and EVGA’s Cards

48 Comments


  • Jangotat - Friday, April 18, 2008 - link

    The way they're setting this up is great, but they need to fix a few things: 1) use a 790i ASUS motherboard, 2) use OCZ 1600 Platinum memory, 3) let us see some benchmarks with 3-way 8800 Ultra cards. That would be sweet.

    Platinum memory has custom timings for ASUS, and ASUS doesn't have issues like EVGA and XFX do. And we really need to see the 3-way Ultra setup to see what's really the best for Crysis and everything else.

    You guys could do this right?
  • LSnK - Wednesday, April 02, 2008 - link

    What, are you guys running out of zeros or using some ancient text mode resolution?
  • Mr Roboto - Thursday, April 03, 2008 - link

    Derek, you say that a 25% decrease in performance resulted from disabling VSYNC in Crysis and WiC. However, you then say in the next sentence that performance gains can be had by disabling VSYNC? Maybe I'm misunderstanding?

    "Forcing VSYNC off in the driver can decrease performance by 25% under the DX10 applications we tested. We see a heavier impact in CPU limited situations. Interestingly enough, as we discussed last week, with our high end hardware, Crysis and World in Conflict were heavily CPU and system limited. Take a look for yourself at the type of performance gains we saw from disabling VSYNC."
  • Evilllchipmunk89 - Wednesday, April 02, 2008 - link

    Seriously, what about the AMD 790FX board? You'll test the NVIDIA cards on their "home platform" (790i), but why not the ATI cards' home platform? Obviously you could get more performance with the 790FX board that was made specifically for the Radeon 3870s,
    where you can tweak more aspects of the card. In an earlier review you showed us that, with nothing changed but the board, the 780i outperformed Skulltrail on the NVIDIA cards, but you won't even mess with the ATI boards.
  • just4U - Tuesday, April 01, 2008 - link

    I don't quite understand why they just didn't go with a 512-bit interface like on the X2's. That's what I was expecting anyway.

    One thing that has me surprised. I was checking my local store on the web for "new arrivals" (a feature where new listings appear daily) and saw the GTX, and was thinking hey wait .. Anand hasn't even reviewed this yet and it's in stock???! Wow. I immediately came here and there the review was :D So NVIDIA is trying to stay on top of the hard launch, which is nice to see, but mmmm.. still troubled by that missing 512-bit interface. To me it still seems like a GTS/512.
  • 7Enigma - Wednesday, April 02, 2008 - link

    And yet the GTS wasn't included in the review...
  • deeznuts - Tuesday, April 01, 2008 - link

    It's actually "lo and behold" and I'm not even sure it's being used right. You probably are, but essentially you're saying, "look, see, I looked, and saw ..."
  • Olaf van der Spek - Tuesday, April 01, 2008 - link

    So what is the cause of the vsync issue? I don't see an explanation of that.
    It'd be interesting to know why performance drops with vsync off.
  • finbarqs - Tuesday, April 01, 2008 - link

    Haha, Happy April Fools' Day!
  • prophet001 - Tuesday, April 01, 2008 - link

    You guys write some nice reviews on this website, but the visuals are a little lacking. I guess when I read an RSS feed that talks about 9800 GTX triple SLI, I kinda expect to see at least a picture of a mobo with 3 cards on it and a uranium IV. I know, it's about the results, but more neat pictures would be nice :)
