The Test

Once again we used the Skulltrail system for most of our comparisons. We’ve added the 790i board for 3-way SLI performance scaling tests.

I didn’t believe I would be saying this so soon, but our experience with 790i and SLI has been much, much worse than on Skulltrail. We were plagued by power failure after power failure. With three 9800 GTX cards plugged in, the system never drew more than 400W when booting into Windows, but after a few minutes the power would just flicker and cut out.

It didn’t make sense that it was the PSU size, because it wasn’t even being loaded. We did try augmenting the PSU with a second one to run one of the cards, but that didn’t work out either. The story is really long and arduous and for some reason involved the Power of the Dark Side, but our solution (after much effort) was to use one power supply for the system and graphics cards and one power supply for the drives and fans. Each PSU needed to be plugged into its own surge protector and needed to be on different breakers.

The working theory is that power here isn’t very clean, and the 790i board is more sensitive to fluctuations in the quality of the power supplied (which is certainly affected by the AC source). Isolating breakers and using surge protectors was the best we could do, and we are very thankful it worked out. It seems likely that a good quality 1000-1500 VA UPS would have been enough to provide cleaner power and solve the issue, but we didn’t have one to test with.

Once we handled this we were mostly able to benchmark. We could get a good 15 minutes of uptime out of the system, but after repeated benchmarking instability crept back in and we’d need to wait a while before trying again. The majority of these problems were with 3-way and Quad SLI, but we did have a hiccup with a two-card SLI configuration as well. We didn’t have any trouble at all with single card solutions (even single 9800 GX2 solutions).

Before anyone says heat: we were testing in an open air environment in a room with an ambient temperature of about 15 degrees C, with one 120mm fan blowing straight into the back of the GPUs and another blowing across the memory (we also took care not to interfere with the CPU HSF airflow). The graphics cards did get warm, but if heat were the issue here, I’d better get an LN2 bath ready to run this thing submerged.

It is very important that we note one more time that this is the C0 engineering sample stepping and that NVIDIA explicitly told us that stability might be an issue in some situations. The retail C1 stepping should not have these issues.

Here’s our test setup:

Test Setup
CPU: 2x Intel Core 2 Extreme QX9775 @ 3.20GHz
Motherboard: Intel D5400XS (Skulltrail)
Video Cards: ATI Radeon HD 3870 X2
             NVIDIA GeForce 8800 Ultra
             NVIDIA GeForce 9800 GTX
             NVIDIA GeForce 9800 GX2
Video Drivers: Catalyst 8.3
               ForceWare 174.74
Hard Drive: Seagate 7200.9 120GB 8MB 7200RPM
RAM: 2x Micron 2GB FB-DIMM DDR2-800
Operating System: Windows Vista Ultimate 64-bit SP1

Comments

  • crimsonson - Tuesday, April 1, 2008 - link

    Although the graphs work well, they get very difficult to read when there are a lot of test subjects. Can you guys find another way? Trying to trace a dozen lines and make a distinction between them is rather hard and defeats the whole purpose of a visual AID. I end up reading the spreadsheet instead.

    .02
  • Spacecomber - Wednesday, April 2, 2008 - link

    These graphs are beginning to resemble the ones from AnandTech's heatsink reviews, which is not a good thing. Jamming as much information as you can into a single graph serves no purpose. They're unreadable. The reader is forced to use the table instead, which means the graph has failed as an illustration.
  • geogaddi - Tuesday, April 1, 2008 - link

    Seconded. And reading spreadsheets for me is like shooting pool with a piece of rope...

    .04
  • jtleon - Tuesday, April 1, 2008 - link

    I must second that sentiment. Isn't the objective of web content to clearly communicate a message and minimize confusion? There are many data communication tools available, one example is DaDISP (www.dadisp.com) which is extremely powerful and rather cost effective. I don't work for dadisp, only use it on a regular basis. The free evaluation version should satisfy your needs easily.

    Regards,
    jtleon
  • 7Enigma - Tuesday, April 1, 2008 - link

    I'm really disappointed to see the 8800GT not present in this review. As a person just getting ready to build a system, I am on the fence between purchasing this new 9800GTX, or saving >$100 and going with the 8800GT until we actually get a next-gen part.

    Since the platforms and their issues differ, I cannot really compare these results to previous reviews. If you could please comment, or throw the 8800GT in for a couple of quick gaming benchmarks, I (we) would be greatly appreciative!
  • Spuke - Tuesday, April 1, 2008 - link

    I was hoping I could finally see a comparison between the 8800GT 512MB and the 9600GT. All AnandTech has is a review of the 8800GT 256MB versus the 9600GT.
  • 7Enigma - Tuesday, April 1, 2008 - link

    8800GT and 9600GT are directly compared here:

    http://www.xbitlabs.com/articles/video/display/gai...">http://www.xbitlabs.com/articles/video/...ainward-...

    "As we have seen in the gaming tests, 64 execution units are enough for most of modern games. We’ve only seen a serious performance hit in comparison with the Nvidia GeForce 8800 GT 512MB in such games as Bioshock, Crysis, Unreal Tournament 3 and Company of Heroes: Opposing Fronts. In a few tests the GeForce 9600 GT was even faster than the more expensive and advanced GeForce 8800 GT 512MB due to the higher frequency of the core. "

    So basically it's a good stop-gap solution for right now, but I would probably go with the 8800GT while waiting for the next-gen cards (even at typical 19" LCD resolutions). If ATI/AMD were competitive currently, I think the 9600GT would be the perfect card, but we have no idea how long Nvidia will milk their crown, and in turn how long that next-gen card will actually take to come out.
  • bill3 - Tuesday, April 1, 2008 - link

    So you're saying Nvidia doesn't like more profits? And they like to help AMD?

    Because that's what not bringing out a next gen card does.

    Is Intel going to delay Penryn's successor now to milk their lead also?

    Nvidia doesn't have any other card ready, period. Because they are too slow, period.
  • 7Enigma - Wednesday, April 2, 2008 - link

    Lol Bill, you need to learn something about business. If you have the lead virtually across the board, any system builder, but far more importantly OEMs, will purchase your card if the price is right (i.e., AMD/ATI not undercutting for the sake of survival as they are with their Phenom CPUs). It doesn't matter if the top of the line is 2X as fast as the competition or 5X as fast; it will be purchased because it, for the time being, is the best. The same follows for the lower-grade cards. There is no reason to bring out the next-gen card killer until the competition brings something to the table that is actually competitive (or, gasp... better). And funny you mention Intel, because I believe they are doing EXACTLY that with their new quads that aren't the very top of the line. The long delays and extremely limited availability seem to smack of milking it for all it's worth.

    That's good business; I can't fault them for doing it. Sucks for us, but smart for them.
  • Jovec - Tuesday, April 1, 2008 - link

    Yes, AT should pick 2-3 of the most common cards (how many of us are still running 8800GTS cards?) and include their numbers as a baseline. Or have a low-mid and a mid-high system that all cards get tested on for easy comparison in ongoing graphs. The question most of us have is "I have card X; if I buy card Y, how much of an improvement will it be, and is it worth the cost?"
