The Test

With the recent launch of Intel's Core 2 Duo, affordable CPU power isn't hard to come by. While the midrange GPUs we will be testing will more than likely be paired with a midrange CPU, we will be testing with high end hardware. Yes, this is a point of much contention, as has always been the case. Both sides of the debate have valid points, and there are places for both system level reviews and component level reviews. What matters most is that the reviewer and readers understand what the tests are really measuring and what the numbers mean.

For this article, one of the major goals is to determine which midrange card offers the best quality and performance for the money at stock clock speeds at this point in time. If we tested with a well-aged 2.8GHz Netburst-era Celeron, much of our testing would show every card performing the same until games became heavily graphics limited. Of course, it would be nice to know how a graphics card performs in a common midrange PC, but this doesn't always help us get to the bottom of a card's value.

For instance, if we are faced with two midrange graphics cards which cost the same and perform nearly the same on a midrange CPU, does it really matter which one we recommend? In our minds, it absolutely does. Value doesn't end with the performance the average person will get from the card when plugging it into a system. What if the user wants to upgrade to a faster CPU before the next GPU upgrade? What about reselling the card when it's time to buy something faster? We feel it is necessary to test with high end platforms in order to offer the most complete analysis of which graphics solutions are actually the best in their class. As this is our goal, our test system reflects the latest in high end performance.

CPU: Intel Core 2 Extreme X6800 (2.93GHz/4MB)
Motherboard: Intel D975XBX (LGA-775)
Chipset: Intel 975X
Chipset Drivers: Intel 7.2.2.1007
Hard Disk: Seagate 7200.7 160GB SATA
Memory: Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2)
Video Card: Various
Video Drivers: ATI Catalyst 6.7; NVIDIA ForceWare 91.33
Desktop Resolution: 1920 x 1440 - 32-bit @ 60Hz
OS: Windows XP Professional SP2

The games we have chosen to test represent a wide variety of engines and styles. We have included some familiar faces, along with some long overdue additions. All told, we are testing 9 games, fewer than half of which are first person shooters. As interest in HDR and advanced visual effects continues to rise, the tradeoff required for antialiasing is often overshadowed by the quality available from other options. This is especially true in games like Splinter Cell: Chaos Theory, Oblivion, and Black & White 2. Thus, we limited testing with AA to 3 games: Battlefield 2, Half-Life 2: Episode One, and Quake 4. We chose BF2 because aliasing makes snipers go crazy, HL2:Ep1 because the Source engine does HDR and AA on all hardware (and does so with great performance), and Quake 4 because we wanted to include AA with an OpenGL title.

In reporting our results, to improve readability, we will include a snapshot of one resolution using our standard bar graphs alongside a resolution-scaling line graph.

Comments (74)

  • augiem - Thursday, August 10, 2006 - link

    I wonder which of these cards would accelerate Maya's 3D viewport performance the most...
  • PrinceGaz - Thursday, August 10, 2006 - link

    If you're a casual Maya user, then look at the OpenGL performance (Quake 4) for a rough guide. I'm tempted to think though that the GeForce cards should still have the edge in most OpenGL situations so Quake 4 might not be representative.

    If you use Maya professionally, then none of the cards looked at are for you. A good Quadro or FireGL card will render scenes far faster than any consumer card, and as time is money, will more than pay for itself despite their high cost if that is what you do for a living.
  • Calin - Friday, August 11, 2006 - link

    There was a time when it was possible (although not very easy) to mod a Radeon 9700 into the corresponding FireGL card. This would have been great for you (though by now a FireGL based on a 9700 could be slower than consumer cards).
  • PrinceGaz - Thursday, August 10, 2006 - link

    I've only read the first two pages of the article, up to and including the list of prices at the bottom of the second page, and haven't read any comments here, but it already seems pretty clear that the X1900GT is going to be the obvious winner in terms of value for money.

    I'll be back in half an hour or so after I've read the rest of it.
  • Gondorff - Friday, August 11, 2006 - link

    Indeed, the X1900GT looks very good... which makes me very happy b/c I just bought it a week or so ago (damned slow shipping though...). For those who do care about rebates, the x1900gt can be had on newegg for $200 right now (a connect3d one). I was lucky and got it at $175 before they raised the price... for $15 more than the 7600gt I was going to get otherwise, that's pretty damn good if I may say so myself.

    Anyway... excellent article; if only it were out earlier so I could worry less about a slightly blind choice... but c'est la vie and it turned out well anyway :).
  • Kougar - Thursday, August 17, 2006 - link

    Good grief, I just found it for $199... and it was previously $175!? Incredible... :(
  • PrinceGaz - Thursday, August 10, 2006 - link

    Yep, pretty much as I suspected: the X1900GT is best at stock speeds. Things become a little blurred when factory-overclocked 7900GTs are brought into the picture, but while they're faster, they're also more expensive by a similar amount. Both offer great value for money if you need to buy a card now.

    One thing the article seemed to overlook is that many people who visit sites like this will overclock cards themselves, factory overclocked or not, and this is likely to reduce the advantage of already overclocked cards like the 7900GTs you recommend. I imagine there is a bit more headroom in a stock X1900GT than a factory overclocked 7900GT (especially a 7900GT with a core clock of 580 like you used). Those of us willing to take a chance on how much extra a card has available may well find a user-overclocked X1900GT to be a match for what an overclocked (user or factory) 7900GT can achieve.
  • coldpower27 - Friday, August 11, 2006 - link

    The problem with this is that you're comparing assumed performance against the guaranteed performance of factory overclocked units, so they aren't comparable.

    The point provided is something to keep in mind, but shouldn't be recommended to anyone other than those who know what they are doing. Not to mention that doing what you suggest voids the warranty.
  • DerekWilson - Friday, August 11, 2006 - link

    Also, if you look around, increasing voltage and cooling for 7900 GT cards can yield results better than a 7900 GTX. Buying a factory overclocked 7900 GT gives you a card that a manufacturer binned as a part able to hit higher than stock clocks at stock voltage and temperature. So you should get a more easily overclockable card if you really want to push it to its limits.
  • Genx87 - Thursday, August 10, 2006 - link

    2nd from the top for ATI is considered mid grade?

    Guess that 7950GX2 is pushing them down from the top.
