Final Words

So, now that we have the 9800 GTX in the mix, what has changed? Honestly, less in the performance stack than in price. Yes, the 8800 Ultra is better than the 9800 GTX where memory bandwidth is a factor, but beyond that the relationship of the 9800 GTX to the 3870X2 is largely the same. Of course, NVIDIA would never sell the 8800 Ultra below the 3870X2's $400 price (the binned 90nm G80 glued on there didn't come cheap).

The smaller die of the G92-based 9800 GTX takes away one victory AMD had over NVIDIA: until now, NVIDIA's more expensive 8800 Ultra was slower than AMD's top of the line. Without significantly improving performance over the 8800 Ultra (and sometimes even hurting it, since NVIDIA didn't really need to with the 9800 GX2 in its pocket), NVIDIA has brought stiffer competition to AMD's lineup, which AMD will definitely not be happy about.

It is nice to have this card come in at the $300 price point with decent performance, but the most exciting thing about it is that picking up two of them gives you better performance than a single 9800 GX2 for the same amount of money. Two of them can even start to get by in Crysis at Very High settings (though the experience may be better with one or two features turned down a bit).

While our very limited and rocky experience with 3-way SLI may have been tainted by the engineering sample board we used, the fact that we can get near 9800 GX2 Quad SLI performance for three quarters of the cost is definitely a good thing. The requirement that this setup run on an nForce board is a drawback, as we would love to test on a system that can run every configuration under the sun. We're getting closer with Skulltrail, and we haven't missed our readers' concerns over its use. But we're confident that we can push its performance up and turn it into our graphics workhorse, especially now that the VSYNC issue has been cleared up.

While testing this group of cards was difficult with all the problems we ran into, we are very happy to have a solid explanation for the decreased performance we were seeing. Now all we need is an explanation for why forcing VSYNC off in the driver causes such a huge performance hit.


49 Comments

  • Jangotat - Friday, April 18, 2008 - link

    The way they're setting this up is great, but they need to fix a few things: 1) use a 790i Asus motherboard, 2) use OCZ 1600 Platinum memory, and 3) let us see some benchmarks with 3-way 8800 Ultra cards. That would be sweet.

    Platinum memory has custom timings for Asus, and Asus doesn't have issues like EVGA and XFX do. And we really need to see the 3-way Ultra setup to see what's really the best for Crysis and everything else.

    You guys could do this right?
  • LSnK - Wednesday, April 2, 2008 - link

    What, are you guys running out of zeros or using some ancient text mode resolution?
  • Mr Roboto - Thursday, April 3, 2008 - link

    Derek, you say that a 25% decrease in performance resulted from disabling VSYNC in Crysis and WIC. However, you then say in the next sentence that performance gains can be had by disabling VSYNC. Maybe I'm misunderstanding?

    "Forcing VSYNC off in the driver can decrease performance by 25% under the DX10 applications we tested. We see a heavier impact in CPU limited situations. Interestingly enough, as we discussed last week, with our high end hardware, Crysis and World in Conflict were heavily CPU and system limited. Take a look for yourself at the type of performance gains we saw from disabling VSYNC".
  • Evilllchipmunk89 - Wednesday, April 2, 2008 - link

    Seriously, what about the AMD 790FX board? You test the Nvidia cards on their home platform (790i), but why not the ATI cards on their home platform? Obviously you could get more performance with the 790FX board that was made specifically for the Radeon 3870s, where you can tweak more aspects of the card. In an earlier review you showed us that, with nothing changed but the board, the 780i outperformed Skulltrail with the Nvidia cards, but you won't even mess with the ATI boards.
  • just4U - Tuesday, April 1, 2008 - link

    I don't quite understand why they just didn't go with a 512-bit interface like on the X2's. That's what I was expecting anyway.

    One thing that has me surprised: I was checking my local store on the web for "new arrivals" (a feature where new listings appear daily) and saw the GTX, and was thinking, hey wait... Anand hasn't even reviewed this yet and it's in stock?! Wow. I immediately came here and there the review was :D So Nvidia is trying to stay on top of the hard launch, which is nice to see, but mmm... I'm still troubled by that missing 512-bit interface. To me it still seems like a GTS/512.
  • 7Enigma - Wednesday, April 2, 2008 - link

    And yet the GTS wasn't included in the review...
  • deeznuts - Tuesday, April 1, 2008 - link

    It's actually "lo and behold" and I'm not even sure it's being used right. You probably are, but essentially you're saying, "look, see, I looked, and saw ..."
  • Olaf van der Spek - Tuesday, April 1, 2008 - link

    So what is the cause of the vsync issue? I don't see an explanation of that.
    It'd be interesting to know why performance drops with vsync off.
  • finbarqs - Tuesday, April 1, 2008 - link

    Haha, Happy April Fools' Day!
  • prophet001 - Tuesday, April 1, 2008 - link

    You guys write some nice reviews on this website, but the visuals are a little lacking. I guess when I read an RSS feed that talks about 9800 GTX triple SLI, I kinda expect to see at least a picture of a mobo with 3 cards on it and a uranium IV. I know, it's about the results, but more neat pictures would be nice :)
