Final Words

So, now that we have the 9800 GTX in the mix, what has changed? Honestly, less in the performance stack than in price. Yes, the 8800 Ultra is better than the 9800 GTX where memory bandwidth is a factor, but otherwise the relationship of the 9800 GTX to the 3870 X2 is largely the same. Of course, NVIDIA could never have sold the 8800 Ultra below the 3870 X2's $400 price point (the binned 90nm G80 glued on there didn't come cheap).

The smaller die of the G92-based 9800 GTX takes away one victory AMD held over NVIDIA: AMD's top-of-the-line card outperformed the more expensive 8800 Ultra. Without significantly improving performance over the 8800 Ultra (and sometimes hurting it), because NVIDIA didn't really need to with the 9800 GX2 in its pocket, NVIDIA has brought more price competition to AMD's lineup, and that is definitely not something AMD will be happy about.

It is nice to have this card come in at the $300 price point with decent performance, but the most exciting thing about it is that picking up two of them will give you better performance than a single 9800 GX2 for the same amount of money. Two of them can even start to get by in Crysis at Very High settings (though the experience might be better with one or two features turned down a bit).

While our very limited and rocky experience with 3-way SLI may have been tainted by the engineering sample board we used, the fact that we can get near 9800 GX2 Quad SLI performance for three quarters of the cost is definitely a good thing. The requirement that this setup run on an nForce motherboard is a drawback, as we would love to test on a system that can run every configuration under the sun. We're getting closer with Skulltrail, and we aren't missing the fact that some of our readers have concerns over its use. But we're confident that we can push performance up and turn it into our graphics workhorse, especially now that the VSYNC issue has been cleared up.

While testing this group of cards was difficult with all the problems we experienced, we are very happy to have a solid explanation for the decreased performance we were seeing. Now all we need is an explanation for why forcing VSYNC off in the driver causes such a huge performance hit.

49 Comments

  • nubie - Tuesday, April 1, 2008 - link

    It is all well and good to bash nVidia for lack of SLI support on other systems, but why can't the AMD cards run Crossfire on an nVidia motherboard? Kill two birds with one stone there: lose your FB-DIMMs and test on a single platform.

    Apples to apples, you can't say nVidia is at fault when the blame isn't entirely theirs (besides, isn't the capability to run Crossfire and SLI on one system a little beyond the needs of most users?).

    Is it that AMD allows Crossfire on all except nVidia motherboards? (Do VIA or SiS make a multi-PCIe board?) If so, then we are talking Crossfire availability on Intel and AMD chipsets but not nVidia's, whereas nVidia allows only its own. That sounds like 30/70 blame AMD vs. nVidia.
  • PeteRoy - Tuesday, April 1, 2008 - link

    It is too hard to understand these graphs; I can't figure out how to compare the different systems in them. Use the ones you had in the past.

    Use the graphs from the past with the best on top and the worst on bottom.
  • araczynski - Tuesday, April 1, 2008 - link

    I'm using a 7900 GTX right now, and the 9800 GTX isn't impressing me enough to warrant $300. I might just pick up an 8800 GTS 512 in a month when they're all well below $200; overclocking one would be more than "close enough" for me.
  • araczynski - Tuesday, April 1, 2008 - link

    ... in any case, I'd much prefer to see benchmarks comparing a broader range of cards than seeing this SLI/tri/quad crap. Your articles assume that everyone upgrades their cards every time nVidia/ATI shits something out on a monthly basis.
  • Denithor - Tuesday, April 1, 2008 - link

    ...because they set out to accurately compare nVidia's latest high end card to other high end options available.

    I'm sure in a few days there will be a followup article showing a broader spectrum of cards at more usable resolutions so we (the common masses) can see whether or not this $300 card really brings any benefit with its high price tag.
  • Ndel - Tuesday, April 1, 2008 - link

    Your benchmarks don't even make any sense =/

    Why use a system 99.9 percent of people don't have?

    This is not even representative of what other enthusiasts currently have, so how are we supposed to believe these benchmarks at all?

    grain of salt...
  • SpaceRanger - Tuesday, April 1, 2008 - link

    I believe he used the fastest processor out there to eliminate it as the bottleneck for the benchmark.
  • Rocket321 - Tuesday, April 1, 2008 - link

    Derek -
    I enjoyed this article for a few reasons that made it different.

    First, the honesty and discussion of the problems experienced. This helps to convey the many issues still present with multi-GPU solutions.

    Second, the YouTube video. This is a neat use of available technology. Could it be useful in other ways? Maybe in the next low/mid GPU roundup it could be used to show a short clip of each card playing a game at the same point.
    This could visually show where one card gets choppy and a better card doesn't.

    Finally, using a poll in the forums: a really great idea to gather relevant info and then add it to an article.

    Thanks for the good article!
  • chizow - Tuesday, April 1, 2008 - link

    Thanks for making AT reviews worth reading again, Derek. You addressed many of the problems I've had with the ho-hum reviews of late, like emphasizing major problems encountered during testing and dropping some incredibly insightful discoveries backed by convincing evidence (the VSYNC issue). Breakthroughs such as this are part of what makes PC hardware fun and exciting.

    A few things you touched on but didn't really clarify were performance on Skulltrail vs. NV chipsets, and memory bandwidth/amount on the 9800 vs. the Ultra. I'd like to see a comparison of Skulltrail vs. 780/790i and then just future disclaimers like "Skulltrail is ~20% slower than the fastest NV solutions."

    With the 9800 vs. the Ultra, I'm a bit disappointed you didn't really dig into overclocking at all, or investigate further how much some of the issues you talked about, like memory bandwidth, hurt or helped performance. I think it's safe to say the 9800 GTX, as a refined G92 8800 GTS, has significant overclocking headroom while the Ultra does not (it's basically an overclocked GTX). It would have been nice to see how much memory overclocks alone would have benefited overall performance, and then max overclocks on both the core/shader and memory.

    But again, great review, I'll be reading over it again to pick up on some of the finer details.
  • lopri - Tuesday, April 1, 2008 - link

    How many revisions has the 790i been through already? Major ones, at that. Usually minor revisions go A0->A1->A2, I thought. As a matter of fact, I don't even remember any nVidia chip reaching a 'C' revision, except maybe MCP55 (570 SLI).
