The 9800 GTX and EVGA’s Cards

The 9800 GTX is a 128-shader, G92-based card (yes, another one) that comes in at a 675MHz core clock, a 1.69GHz shader clock, and a 2.2GHz (effective) memory clock. That puts its raw shader power above even the 8800 Ultra's, but there is one major drawback to this high-end part: it has only a 256-bit memory bus, hooked up to 512MB of RAM.
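
As a quick sanity check on that claim, the shader math is easy to run (a back-of-the-envelope sketch in Python; the 3 FLOPs per shader per clock figure assumes NVIDIA's MAD+MUL counting convention for G80/G92, and 1.512GHz is the Ultra's stock shader clock):

    # Theoretical shader throughput: shaders * shader clock * FLOPs per clock.
    # G80/G92 are counted at 3 FLOPs/clock (a MAD plus a MUL per shader).
    def shader_gflops(shaders, shader_clock_ghz, flops_per_clock=3):
        return shaders * shader_clock_ghz * flops_per_clock

    print(f"9800 GTX:   {shader_gflops(128, 1.688):.0f} GFLOPS")  # ~648
    print(f"8800 Ultra: {shader_gflops(128, 1.512):.0f} GFLOPS")  # ~581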

The 8800 Ultra's extra 256MB of memory might not come into play often, but the fact that it has essentially 50% more effective memory bandwidth does put it at an advantage in bandwidth-limited situations. This means there is potential for performance loss at high resolutions, with high levels of AA, or in games with memory-intensive effects. While we get that $300 US puts this card in a different class than the 8800 Ultra, and thus NVIDIA is targeting a different type of user, we would have liked to see a card with more bandwidth and more memory (especially when we look at the drop-off in Crysis performance between 19x12 and 25x16).
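
For reference, the bandwidth gap works out as follows (a quick sketch using the cards' published clocks; the 8800 Ultra pairs a 384-bit bus with a 2.16GHz effective memory clock):

    # Peak memory bandwidth = bus width in bytes * effective memory clock.
    def mem_bandwidth_gb_s(bus_bits, effective_clock_ghz):
        return (bus_bits / 8) * effective_clock_ghz

    gtx   = mem_bandwidth_gb_s(256, 2.2)   # 9800 GTX:   70.4 GB/s
    ultra = mem_bandwidth_gb_s(384, 2.16)  # 8800 Ultra: 103.7 GB/s
    print(f"Ultra advantage: {ultra / gtx - 1:.0%}")  # ~47%, i.e. "essentially 50%"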

9800 GTX cards are capable of 3-way SLI via the two SLI connectors along the top edge. Of course, NVIDIA requires the use of an NVIDIA motherboard for this. We are not fans of artificial technical limitations imposed for marketing reasons, and we would much prefer to see SLI run on any platform that offers multiple PCIe x16 slots. For normal SLI we do have the Skulltrail option, but NVIDIA has chosen not to enable 3-way capability on that board either.

We wanted to be able to include 3-way SLI numbers in our launch review (which has been one incredible headache, but more on that later), and EVGA was kind enough to help us out by providing the hardware. We certainly appreciate them enabling us to bring you numbers for this configuration today.

We were also able to get our hands on a C0 engineering sample 790i board for testing. Let's just say that the experience was … character building. Running a QX9770 with 1333MHz DDR3 at 9-9-9-24, we had what could best be described as a very rough time getting 3-way SLI, and even quad SLI with two 9800 GX2 boards, working on this system. We were lucky to get the numbers we did. Let's take a look at what we tested with.

Comments

  • Jangotat - Friday, April 18, 2008

    The way they're setting this up is great, but they need to fix a few things: 1) use a 790i ASUS motherboard, 2) use OCZ 1600 Platinum memory, 3) let us see some benchmarks with 3-way 8800 Ultra cards. That would be sweet.

    Platinum memory has custom timings for ASUS, and ASUS doesn't have issues like EVGA and XFX do. And we really need to see the 3-way Ultra setup to see what's really the best for Crysis and everything else.

    You guys could do this, right?
  • LSnK - Wednesday, April 2, 2008

    What, are you guys running out of zeros or using some ancient text mode resolution?
  • Mr Roboto - Thursday, April 3, 2008

    Derek, you say that a 25% decrease in performance resulted from disabling VSYNC in Crysis and WIC. However, you then say in the next sentence that performance gains can be had by disabling VSYNC? Maybe I'm misunderstanding?

    "Forcing VSYNC off in the driver can decrease performance by 25% under the DX10 applications we tested. We see a heavier impact in CPU limited situations. Interestingly enough, as we discussed last week, with our high end hardware, Crysis and World in Conflict were heavily CPU and system limited. Take a look for yourself at the type of performance gains we saw from disabling VSYNC".
  • Evilllchipmunk89 - Wednesday, April 2, 2008

    Seriously, what about the AMD 790FX board? You'll test the NVIDIA cards on their "home platform" (790i), but why not the ATI cards on their home platform? Obviously you could get more performance with the 790FX board that was made more specifically for the Radeon 3870s, where you can tweak more aspects of the card. In an earlier review you showed us that, with nothing changed but the board, the 780i outperformed Skulltrail with the NVIDIA cards, but you won't even mess with the ATI boards.
  • just4U - Tuesday, April 1, 2008

    I don't quite understand why they just didn't go with a 512-bit interface like on the X2's. That's what I was expecting anyway.

    One thing that has me surprised: I was checking my local store on the web for "new arrivals" (a feature where new listings appear daily) and saw the GTX, and was thinking hey wait .. Anand hasn't even reviewed this yet and it's in stock???! Wow. I immediately came here and there the review was :D So NVIDIA is trying to stay on top of the hard launch, which is nice to see, but mmmm.. still troubled by that no 512-bit interface. To me it still seems like a GTS/512.
  • 7Enigma - Wednesday, April 2, 2008

    And yet the GTS wasn't included in the review...
  • deeznuts - Tuesday, April 1, 2008

    It's actually "lo and behold" and I'm not even sure it's being used right. You probably are, but essentially you're saying, "look, see, I looked, and saw ..."
  • Olaf van der Spek - Tuesday, April 1, 2008

    So what is the cause of the vsync issue? I don't see an explanation of that.
    It'd be interesting to know why performance drops with vsync off.
  • finbarqs - Tuesday, April 1, 2008

    Haha, happy April Fools' Day!
  • prophet001 - Tuesday, April 1, 2008

    You guys write some nice reviews on this website, but the visuals are a little lacking. I guess when I read an RSS feed that talks about 9800 GTX triple SLI, I kinda expect to see at least a picture of a mobo with 3 cards on it and a uranium IV. I know, it's about the results, but more neat pictures would be nice :)
