Quake 4 Performance

There has always been a lot of debate in the community surrounding pure timedemo benchmarking. We have opted to stick with the timedemo test rather than the nettimedemo option for benchmarking Quake 4. To be clear, this means our test results focus mostly on the capability of each graphics card to render frames generated by Quake 4. The frame rates we see here don't directly translate into what one would experience during gameplay.

Additionally, Quake 4 caps the frame rate at 60 fps during gameplay whether or not VSync is enabled, so the performance characteristics of a timedemo do not reflect actual gameplay. Why run timedemos at all, then? Because the questions we are trying to answer concern only the graphics subsystem. We want to know which graphics card is better at rendering Quake 4 frames, and any card that renders those frames faster will play Quake 4 better than a slower card. While that doesn't mean the end user will see higher performance in the game today, the potential for more performance is there, for instance if the user upgrades the CPU before the next graphics card upgrade.

Timedemos do walk a fine line between synthetic and real-world benchmarks. While we tend to favor real-world data here at AnandTech, this type of benchmark is very capable of using a real-world data set to test the maximum capabilities of the graphics cards under a particular workload without bottlenecking at other points in the system. To be sure, even timedemos can see memory and CPU bottlenecks, as data must be transferred to the graphics card somehow, but that impact is much lower than the impact of running AI, physics, script management, I/O, and other game code at the same time.

What this means to the end user is that in-game performance will almost always be lower than timedemo performance. It also means that a graphics card that does slightly better than another in a timedemo will not always show a tangible performance increase on an end user's system. As long as we keep these things in mind, we can draw informed conclusions from the data we collect.

Our benchmark consists of the first few minutes of the first level, which includes both indoor and outdoor sections along with the initial few firefights. We test the game with Ultra Quality settings and enable all the advanced graphics options except for VSync and antialiasing. Anisotropic filtering is manually set to 8x. id does a pretty good job of keeping the frame rate very consistent, and in-game frame rates of 25 fps are acceptable. While we can't map that figure directly onto the timedemo test, our experience indicates that a timedemo result of about 35 fps translates into an enjoyable experience on our system. This will certainly vary on other systems, so take it with a grain of salt. The important thing to remember is that this is a test of the relative performance of graphics cards when it comes to rendering Quake 4 frames; it doesn't directly translate into the Quake 4 experience.
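For readers who want to script a similar run, here is a minimal sketch of one way to launch a Quake 4 timedemo with these settings and pull the average frame rate out of a saved console dump. The install path, demo name, exact cvar names, and console output format below are assumptions for illustration rather than anything we verified for this article, so treat it as a starting point only.

import re
import subprocess

QUAKE4_EXE = r"C:\Games\Quake4\Quake4.exe"   # hypothetical install path
DEMO_NAME = "benchdemo"                      # hypothetical recorded demo

def launch_timedemo():
    """Start Quake 4 with the settings used here: no AA, no VSync, 8x AF."""
    args = [
        QUAKE4_EXE,
        "+set", "r_multiSamples", "0",      # antialiasing off
        "+set", "r_swapInterval", "0",      # VSync off
        "+set", "image_anisotropy", "8",    # 8x anisotropic filtering
        "+set", "com_allowConsole", "1",    # open the console with a single keypress
        "+timedemo", DEMO_NAME,             # render the demo as fast as the card allows
    ]
    subprocess.run(args, check=True)

def parse_fps(console_dump_path):
    """Pull the average fps out of a console dump saved after the run."""
    text = open(console_dump_path, encoding="utf-8", errors="ignore").read()
    match = re.search(r"=\s*([\d.]+)\s*fps", text)
    return float(match.group(1)) if match else None

if __name__ == "__main__":
    launch_timedemo()
    print("average fps:", parse_fps("rundump.txt"))  # dump saved manually after the demo finishes

As with any timedemo, averaging several passes and throwing out the first one (which pays texture and disk caching costs) smooths out run-to-run variance.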

[Graph: Quake 4 performance, no AA]

The Doom 3 engine was once NVIDIA's stomping grounds, but Quake 4 performance is now dominated by ATI's Radeon X1900 and X1950 series. The X1950 CrossFire manages a 26% performance advantage over the GeForce 7900 GTX SLI, while the X1900 CF setup pulls ahead by just under 16%.

Among single cards, the X1950 XTX manages about an 11% performance advantage over the 7900 GTX, meaning that, for whatever reason, CrossFire actually scales much better than SLI in Quake 4. The new 256MB X1900 XT is outperformed by its 512MB sibling by a sizable 16%. Honestly, we were not expecting to see such a large gap between the 256MB and 512MB cards, especially with AA disabled. It's good to see that games are actually making use of all the frame buffer memory being thrown at them.
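For reference, the percentage advantages quoted here are just relative deltas between average frame rates; the arithmetic looks like the sketch below. The fps values in the example are placeholders for illustration, not our measured results.

def percent_advantage(fps_a, fps_b):
    """How much faster card A is than card B, expressed as a percentage."""
    return (fps_a / fps_b - 1.0) * 100.0

# Placeholder values only -- not the figures from our charts.
print(percent_advantage(111.0, 100.0))  # prints ~11.0, i.e. an 11% advantage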

[Graph: Quake 4 performance, AA enabled]

With AA enabled, ATI does even better, with the X1950 CrossFire outperforming the 7900 GTX SLI by over 57%. ATI has done a lot of work on its OpenGL performance lately, and we're currently investigating whether that work is behind such a stellar showing in Quake 4 today.

Comments

  • Vigile - Wednesday, August 23, 2006 - link

    My thought exactly on this one Anand...
  • Anand Lal Shimpi - Wednesday, August 23, 2006 - link

    You can run dual monitors with a CrossFire card as well, the CrossFire dongle that comes with the card has your 2nd DVI output on it :)

    Take care,
    Anand
  • kneecap - Wednesday, August 23, 2006 - link

    What about VIVO? The Crossfire Edition does not support that.
  • JarredWalton - Wednesday, August 23, 2006 - link

    For high-end video out, the DVI port is generally more useful anyway. It's also required if you want to hook up to a display using HDCP - I think that will work with a DVI-to-HDMI adapter, but maybe not? S-VIDEO and Composite out are basically becoming seldom used items in my experience, though the loss of component out is a bit more of a concern.
  • JNo - Thursday, August 24, 2006 - link

    So if I use DVI out and attach a DVI to HDMI adaptor before attaching to a projector or HDTV, will I get a properly encrypted signal to fully display future blu-ray/hd-dvd encrypted content?

    The loss of component is a bit of a concern as many HDTVs and projectors still produce amazing images with component and, in fact, I gather that some very high resolutions+refresh rates are possible on component but not DVI due to certain bandwidth limitations with DVI. But please correct me if I am wrong. I take Anandtech's point on the crossfire card offering more, but with a couple of admittedly small question marks I see no reason not to get the standard card and crossfire for the second later if you decide to go that route...
  • JarredWalton - Thursday, August 24, 2006 - link

    I suppose theoretically component could run higher resolutions than DVI, with dual-link being required for 2048x1536 and higher. Not sure what displays support such resolutions with component inputs, though. Even 1080p can run off of single-link DVI (a quick pixel-clock check follows this comment).

    I think the idea with CF cards over standard is that they will have a higher resale value if you want to get rid of them in the future, and they are also more versatile -- TV out capability being the one exception. There are going to be a lot of people that get systems with a standard X1950 card, so if they want to upgrade to CrossFire in the future they will need to buy the CrossFire edition. We all know that at some point ATI is no longer going to make any of the R5xx cards, so if people wait to upgrade to CrossFire they might be forced to look for used cards in a year or two.

    Obviously, this whole scenario falls apart if street prices on CrossFire edition cards end up being higher than the regular cards. Given the supply/demand economics involved, that wouldn't be too surprising, but of course we won't know for another three or four weeks.
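As a quick sanity check on the single-link DVI point above: single-link DVI tops out at a 165 MHz pixel clock, and the standard 1080p60 timing (2200x1125 total pixels including blanking) needs about 148.5 MHz, so it fits with room to spare. The sketch below just runs that arithmetic; the timing figures are the common CEA numbers, not something measured for this article.

SINGLE_LINK_DVI_LIMIT_MHZ = 165.0

def pixel_clock_mhz(total_width, total_height, refresh_hz):
    """Pixel clock a mode needs, counting its blanking intervals."""
    return total_width * total_height * refresh_hz / 1e6

# 1920x1080 @ 60 Hz with standard CEA blanking is 2200x1125 total.
clock_1080p = pixel_clock_mhz(2200, 1125, 60)
print(clock_1080p, clock_1080p <= SINGLE_LINK_DVI_LIMIT_MHZ)  # 148.5 True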
  • UNESC0 - Wednesday, August 23, 2006 - link

    thanks for clearing that up Anand, news to me!
  • TigerFlash - Wednesday, August 23, 2006 - link

    I was wondering if anyone thinks it's wise to get an Intel Core 2 Duo motherboard with CrossFire support now that AMD is buying out ATI. Do you think ATI would stop supporting Intel motherboards?
  • johnsonx - Wednesday, August 23, 2006 - link

    quote: "Do you think ATI would stop supporting Intel motherboards?"

    Of course not. AMD/ATI isn't stupid. Even if their cross-licensing agreement with Intel didn't prevent them from blocking Crossfire on Intel boards (which it almost surely does), cutting out that part of the market would be foolish.
  • dderidex - Wednesday, August 23, 2006 - link

    What's with the $99 -> $249 gap?

    Weren't we supposed to see an X1650XT, too? Based on RV570? ...or RV560? Something?
