The Test

With the recent launch of Intel's Core 2 Duo, affordable CPU power is easier to come by than ever. While the midrange GPUs we are testing will more than likely be paired with a midrange CPU, we will be testing with high end hardware. Yes, this is a point of much contention, as it has always been. There are valid arguments on both sides of the aisle, and there is a place for both system level and component level reviews. What matters most is that the reviewer and readers understand what the tests are really measuring and what the numbers mean.

For this article, one of the major goals is to determine which midrange card offers the best quality and performance for the money at stock clock speeds right now. If we were to test with a well aged 2.8GHz Netburst era Celeron, much of our testing would show every card performing the same until the games became severely graphics limited. Of course, it would be nice to know how a graphics card performs in a common midrange PC, but that doesn't always help us get to the bottom of the value of a card.
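
To illustrate why (with made-up numbers, not measurements): the frame rate a system actually delivers is roughly capped by whichever of the CPU or GPU is the bottleneck, so a slow CPU hides real differences between cards. Here is a minimal sketch of that model in Python; the FPS ceilings below are hypothetical, chosen only to show the effect:

# Simplified bottleneck model (illustrative only): delivered frame rate is
# roughly capped by the slower of the CPU-bound and GPU-bound rates.
def delivered_fps(cpu_fps_limit: float, gpu_fps_limit: float) -> float:
    return min(cpu_fps_limit, gpu_fps_limit)

old_celeron = 40.0    # hypothetical CPU-bound ceiling for an aging Celeron
core2_x6800 = 200.0   # hypothetical CPU-bound ceiling for a Core 2 Extreme

gpu_limits = {"midrange card A": 55.0, "midrange card B": 90.0}  # hypothetical

for card, gpu_limit in gpu_limits.items():
    print(f"{card}: {delivered_fps(old_celeron, gpu_limit):.0f} FPS with the Celeron, "
          f"{delivered_fps(core2_x6800, gpu_limit):.0f} FPS with the X6800")

# Both cards sit at ~40 FPS on the slow CPU (CPU limited), while the fast
# CPU exposes the real gap between them (55 vs. 90 FPS, GPU limited).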

For instance, if we are faced with two midrange graphics cards that cost the same and perform nearly the same on a midrange CPU, does it really matter which one we recommend? In our minds, it absolutely does. Value doesn't end with the performance the average person will get from the card when they plug it into a system. What if the user wants to upgrade to a faster CPU before the next GPU upgrade? What about reselling the card when it's time to buy something faster? We feel that it is necessary to test with high end platforms in order to offer the most complete analysis of which graphics solutions are actually the best in their class. As this is our goal, our test system reflects the latest in high end performance.

CPU: Intel Core 2 Extreme X6800 (2.93GHz/4MB)
Motherboard: Intel D975XBX (LGA-775)
Chipset: Intel 975X
Chipset Drivers: Intel 7.2.2.1007
Hard Disk: Seagate 7200.7 160GB SATA
Memory: Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2)
Video Card: Various
Video Drivers: ATI Catalyst 6.7; NVIDIA ForceWare 91.33
Desktop Resolution: 1920 x 1440 - 32-bit @ 60Hz
OS: Windows XP Professional SP2


The games we have chosen represent a wide variety of engines and styles. We have included some familiar faces along with some long overdue additions. All told, we are testing 9 games, less than half of which are first person shooters. As interest in HDR and advanced visual effects continues to rise, the performance tradeoff required for antialiasing is often overshadowed by the quality available from other options. This is especially true in games like Splinter Cell: Chaos Theory, Oblivion, and Black & White 2. Thus, we limited AA testing to 3 games: Battlefield 2, Half-Life 2: Episode One, and Quake 4. We chose BF2 because aliasing makes snipers go crazy, HL2: Episode One because the Source engine does HDR and AA on all hardware (and does it with great performance), and Quake 4 because we wanted to include AA with an OpenGL title.

To improve readability, we will report our results as a snapshot of one resolution in our standard bar graphs alongside a resolution scaling line graph.

74 Comments

  • gmallen - Friday, August 11, 2006 - link

    Most of the PC enthusiast population interested in mid-range cards is still running AGP motherboards (this is based on sales of PCIe motherboards vs. AGP motherboards). Where are these cards?
  • Josh7289 - Friday, August 11, 2006 - link

    quote:

    Where are these cards?


    They don't exist.
  • arturnowp - Friday, August 11, 2006 - link

    Hi

    It's written that all cards in Oblivion were tested with HDR lighting, which the X800 GTO doesn't support. I think your results are misleading. The same goes for SC: Chaos Theory...

    BTW: Who plays Oblivion with Actor Fade at 20%, Item Fade at 10%, and Object Fade at 25%? You get better graphics and performance setting those options to 50-60% and turning off grass, which consumes a lot of performance and doesn't look good. In foliage it's better to see your enemies from a greater distance, and the same goes when riding a horse ;-)

  • arturnowp - Friday, August 11, 2006 - link

    OK, there is a note about SC: Chaos Theory, but all in all the conclusion is misleading: "Owners of the X800 GTO may have a little more life left in their card depending on how overclocked the card is, but even at stock clocks, it might be wise to hang on for another product cycle if possible", when the GeForce 6600 GT performs on par with the X800 GTO. It would be better to exclude the X800 GTO from the charts or mark it as an SM2.0 card. Better yet, the GeForce 6600 GT should be tested in SM2.0 mode...
  • nv40 - Friday, August 11, 2006 - link

    I don't know why:
    http://www.xbitlabs.com/articles/video/display/pow...
    Some of the differences between tests are so large that they almost shocked me.
    For instance:
    the 7900GT on the 84.21 drivers with an FX-60 runs 54 FPS average at 1600x1200 with 4xAA/16xAF at X-bit labs;
    the 7900GT on the 91.33 drivers with an X6800 manages just 35 FPS average at 1600x1200 with only 4xAA at AnandTech.
    A problem with 91.33? The Intel 975X? The X6800? NVIDIA?
    That's more than a 40% performance difference, despite the X6800 being far superior to the FX-60.

  • coldpower27 - Friday, August 11, 2006 - link

    They probably aren't running the same timedemo sequences.
  • nv40 - Friday, August 11, 2006 - link

    Maybe... but there's only a 9% difference for the X1900GT (41 vs. 38).
    And the 7900GT definitely performed much worse at AnandTech than at X-bit labs in general.
    There's no telling which is correct, but if both are right, the conclusion can probably be drawn like so:
    1. Driver problem: 91.33 is much slower than 84.21 (an NV cheat, or a 91.33 problem)
    2. CPU problem: the X6800 is much inferior to the FX-60 in games (ridiculous, and far from true in every test)
    3. Platform problem: NVIDIA cards perform much worse on the Intel chipset (975X)
  • Sharky974 - Friday, August 11, 2006 - link

    I agree. I clearly remember Xbit declaring the 7900GT the winner of the vast majority of benches vs. the X1900GT.

    In fact, overall the X1900GT wasn't warmly received. I really feel this deserves some looking into.

    For example, I'll have to go look, but I think Firing Squad also showed the X1900GT as inferior to the 7900GT.

    As it stands now, it's like Anand's platform is somehow ATI biased; on the other hand, I believe Xbit's platform is NVIDIA biased. Xbit reviews nearly always show NVIDIA winning.
  • Sharky974 - Friday, August 11, 2006 - link

    http://www.firingsquad.com/hardware/sapphire_radeo...

    I started on the first page of benches.

    As one glaring example:

    Firing Squad: Quake 4, 1280x1024, 4xAA/8xAF: 7900GT - 87.2, X1900GT - 60.6

    http://www.firingsquad.com/hardware/sapphire_radeo...

    Anand: Quake 4, 1280x1024, 4xAA: 7900GT - 45.1, X1900GT - 49.8

    http://images.anandtech.com/reviews/video/roundups...

    With similar settings, FS has the 7900GT getting nearly double the frames Anand does. The X1900GT also gets significantly more in the FS review, from 49 to 60 FPS, but nowhere near the change the 7900GT sees, with the net effect that the X1900GT ekes out a win at Anand but loses by nearly 27 FPS at FS.

    The X1900GT is definitely a better card than I had remembered, even in the FS benches though.

    Also, FS was using an FX-57, while Anand used a much more powerful CPU, making the results all the more puzzling.

    In addition to some of the other suggestions, I'd question drivers. FS was using older drivers on both cards since it is an older review. Perhaps NVIDIA drivers have seen a large performance decrease, or ATI's a similar increase? This seems fairly unlikely, though, as I don't think you normally get huge differences from driver to driver.

    Unless NVIDIA really was cheating re: 16-bit filtering as the INQ claimed a while back, so they fixed it, causing a massive performance decrease? :) Again though, that suggestion is made half-jokingly.

    This definitely needs a lot of looking into, I feel. Anand's results are quite different from others around the web at first blush.
  • JarredWalton - Friday, August 11, 2006 - link

    Levels can make a huge difference in performance. For example, Far Cry has segments that max out around 80 FPS on any current CPU (maybe higher with an overclocked Core 2 Extreme...), but other areas of the game run at 150+ FPS on even a moderate CPU like a 3500+. I don't have a problem providing our demo files, but some of them are quite large (Q4 is about 130 MB if I recall). SCCT, FEAR, and X3 provide a reference that anyone can compare to, if they want. The only other thing is that ATI driver improvements are certainly not unlikely, especially in Quake 4.
