Gaming Performance

As usual, gaming performance was tested with a variety of current games. We ran benchmarks at our standard 1280x1024 resolution with 4x antialiasing and 8x anisotropic filtering enabled (where the game supports them). Given the number of users running 19" LCDs these days, 1280x1024 represents one of the most commonly used resolutions. Since we are benchmarking the core logic chipset to compare the performance of ATI CrossFire on the P35 and 975X platforms, we also decided to stress the graphics subsystem.

To do this, we increased the resolutions and graphics settings, adding 1600x1200 4xAA/8xAF and 1920x1200 4xAA/8xAF to our tests. We feel these resolutions will be the best indicator of performance for users with a high-end CrossFire and CPU setup. Our review is based on the capabilities of the Intel P35 chipset and how well R600 CrossFire works on the ASUS P5K Deluxe with the current 8.37.4.3 drivers. We will examine other GPU solutions with this chipset in our P35 article.

Battlefield 2

This benchmark is performed using DICE's built-in demo playback functionality with additional capture capabilities designed in-house. When using the built-in demo playback features of BF2, frames rendered during the loading screen are counted in the benchmark. To get a true picture of performance, we take the instantaneous frame time and frames per second data generated during our benchmark run, discard the data collected during the loading screen, and calculate a result that represents actual game play. While DICE maintains that results over 100fps aren't always reliable, our methods have allowed us to get useful data from high performing systems.
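
As a rough illustration of that post-processing step, the calculation boils down to something like the Python sketch below. The log format and the "LOAD_END" marker are simplified stand-ins invented for the example rather than the output of our actual capture tool; the point is simply to show the idea of discarding the loading-screen frames and then dividing frames rendered by elapsed time.

    def average_fps(log_lines):
        # Average fps over actual game play, ignoring everything logged before
        # the (hypothetical) "LOAD_END" marker that separates the loading
        # screen from the game play portion of the run.
        frame_times_ms = []
        in_gameplay = False
        for raw in log_lines:
            line = raw.strip()
            if line == "LOAD_END":
                in_gameplay = True
                continue
            if in_gameplay and line:
                frame_times_ms.append(float(line))  # one instantaneous frame time (ms) per line
        total_seconds = sum(frame_times_ms) / 1000.0
        return len(frame_times_ms) / total_seconds  # frames rendered / elapsed game play time

    # Example: three 10ms frames after the loading screen works out to 100 fps
    print(average_fps(["45.2", "LOAD_END", "10.0", "10.0", "10.0"]))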

During the benchmark, the camera switches between players and vehicles in order to capture the most action possible. There is a significant amount of smoke, explosions, and vehicle usage, as this is a very GPU-intensive Battlefield 2 benchmark. We run Battlefield 2 using the highest quality graphics settings available in the video settings. The game itself is best experienced with average in-game frame rates of 40 and up.


The P35 CrossFire solution performs well at 1280x1024, but as we crank up the resolution our scores start to drop, ending up about 5% worse at 1920x1200. We still found the P35 setup to be perfectly playable and did not witness any stutters or graphical anomalies during testing. We noticed the same results in our initial XP testing, and since the single card results are actually better than those on the 975X, we believe this issue is a limitation of the x16/x4 design. However, we firmly believe that driver optimizations could regain a couple of percent, based on our P965 testing with an X1950 XTX setup.
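
To put the x16/x4 concern in rough numbers: assuming PCIe 1.x signaling at about 250MB/s per lane in each direction, the x4 slot used for the second card on the P5K Deluxe tops out around 1GB/s per direction, and that traffic also has to share the DMI link with everything else hanging off the ICH9. The 975X splits its MCH graphics lanes x8/x8, giving each card roughly 2GB/s per direction straight into the northbridge. These are theoretical peaks rather than measured figures, but they illustrate why CrossFire workloads with heavy inter-card traffic can favor the 975X as the resolution climbs.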

Serious Sam 2

This benchmark is performed using Croteam's built-in demo capability in the Serious Sam 2 engine. We utilize the included Branchester Demo and capture the playback results using the Ctrl-~ function. The benchmark features a large number of combatants, explosions, and general mayhem. The benchmark is primarily GPU sensitive, with the actual percentage of GPU/CPU/audio activity displayed during the benchmark run. We typically find this game is very playable at average in-game rates of 60 and above. We maximize all settings except antialiasing and anisotropic filtering within the general and advanced video settings.


We see our P35 CrossFire setup performing 5% better at 1280x1024 and then following the same pattern as in BF2, trailing the 975X platform by 10% at 1920x1200 as the game becomes GPU limited. As in the Battlefield 2 testing, game playback was perfect and generated all of the correct visuals. Considering the single card scores favor the P35 setup, we once again believe the differences at the higher resolutions are due to a combination of driver maturity and the x16/x4 limitation in high data traffic games. The initial XP testing reveals the same pattern, although the difference at 1920x1200 is 5%, which leads us to believe Vista driver improvements are still required for CrossFire operation on the P35.

F.E.A.R.

F.E.A.R. uses a built-in performance test that generates graphical test scenes based upon the actual game engine. This test consists of a couple of different action sequences, a stressful water flyby, and heavy use of shadows while traveling through hallways. F.E.A.R. is a very graphics intensive game, and we switch all settings to maximum for both the system and GPU. During our testing of F.E.A.R., we noted that the "soft shadows" don't really look soft and the performance hit is drastic, so we disable this setting. Frame rates in F.E.A.R. can dip drastically during game play, which is not good for a first person shooter, but the game is still playable at around 35fps, although we prefer a solid 45fps.


This game is still a GPU crusher when the settings are dialed up, and our P35 CrossFire setup falls flat on its face when compared to the 975X. The single GPU scores indicate the problem is a combination of driver optimizations and throughput limitations on the DMI link. The minimum and maximum frame rates followed the same pattern and will be presented in the P35 chipset article. Our initial testing on the P965 platform revealed a minor difference in frame rates between the P965 and 975X chipsets, so we think driver optimizations can close the gap in this game.

Quake 4

Our benchmark utilizes the IdNetDemo with the playnettimedemo option. The demo consists mainly of outdoor areas with numerous players trying to kill each other. We tested the game with Ultra Quality settings (uncompressed normal maps), and we enabled all the advanced graphics options except for VSync. id Software does a pretty good job of keeping frame rates consistent, so in-game frame rates above 30 are acceptable for single player and 60 for multiplayer. The important thing to remember is that this test directly translates to the actual Quake 4 experience.


After seeing a consistent pattern in our other game tests, we now see a complete flip in the CrossFire results. The P35 setup is around 8% faster on average across the CrossFire results, although it loses in the single card results. We ran this test several times with different settings, but the results were always the same. We attribute the differences largely to driver optimizations and game code, as the single card scores are better than the CrossFire scores.

During actual game play, the perception was that the CrossFire setups were more fluid and "seemed" faster than the single card setups, although the net timedemos said otherwise. We also ran our custom timedemo, and the scores were just the opposite, with the 975X scoring up to 12% better. However, this was one of our benchmarks that had issues consistently finishing under Vista and showed signs of rendering corruption during playback. At present, we would have to say that our Q4 test is almost completely CPU limited, and this is often not the case during actual gaming. We are working on a new benchmark at this time.

29 Comments

  • vailr - Thursday, May 17, 2007 - link

    miss an outing of lifetime with friends
    [outing of a lifetime]
    We are not here to single handily knock AMD
    [single-handedly]

    System Platform Drivers Intel - 8.3.0.1013
    [Version 8.4.0.1010 Beta:
    http://www.station-drivers.com/telechargement/inte...
    Note: running the .exe installer may NOT update existing installed drivers. Must be manually updated for each device in Device Manager. See the readme.txt file:
    "INF files are copied to the hard disk [Program Files/Intel/INFInst folder] after running the Intel(R) Chipset Device Software executable with an '-A'
    flag (i.e., "INFINST_AUTOL.EXE -A"]
  • Paradox999 - Thursday, May 17, 2007 - link

    Wow, who would have thought ATI might have *immature* drivers for the x2900 at this point? Duh. Moreover, why even try Crossfire when the cards in single configuration have been little more than a major league flop (don't bother spamming me, I'm an ATI fanboy). Given the poor performance (vs a much cheaper 8800GTS) and insane power requirements of a single card, you might be able to count on one hand the people eager to rush out to get a Crossfire setup. This kind of article is more in the category of 'curiosity' (like those guys that tried overclocking an x2900 with liquid nitro). Anand should be publishing more articles of a practical nature. If you want to try Crossfire and the x2900....at least wait for a few driver revisions AND then a head-to-head against the 8800gts. That *might* provide more useful information, albeit for a very small segment of the enthusiast market.

    I have to totally agree with some of the previous posters and say SLI and Crossfire are overkill and a waste of money. Buy the best card you can afford now. When it doesn't work for you any more, replace it with the best NEW generation card you can buy.

    I'm still annoyed that the better motherboards (like my P5B Dlx Wi-Fi) come with 2 PCIE-16 slots. I use 1 x1900XTX and I'll replace it one day with one (1) much better card. The way I see it, ASUS robbed me of a PCI slot for my many expansion cards.
  • lopri - Thursday, May 17, 2007 - link

    quote:

    I'm still annoyed that the better motherboards (like my P5B Dlx Wi-Fi) come with 2 PCIE-16 slots. I use 1 x1900XTX and I'll replace it one day with one (1) much better card. The way I see it, ASUS robbed me of a PCI slot for my many expansion cards.

    You must be joking, I assume? I think all PCI-E slots should be full length (x16) even though they are not electrically so. The only PCI card worth buying (for who needs one, that is) at this time would be X-Fi and that's just because of Creative's incompetency and monopoly in the market. I've ditched my X-Fi and refuse to buy Creative products until they get their act straight.
  • TA152H - Thursday, May 17, 2007 - link

    I read stuff like this and I wonder what people are thinking. Why would Creative make such a mistake as you suggest?

    Let's see, every motherboard comes with PCI slots, and there are tons of motherboards that people use that don't have PCI-E slots. They are selling upgrade parts, and PCI-E does NOTHING for these parts that they can't get from PCI. It's not like if they were using PCI-E they would get better performance or it would work better in some way. So, naturally, they are making PCI cards. Duh.

    Maybe down the road when Intel stops supporting PCI, or when motherboards come out without PCI slots Creative will start making PCI-E, but until then, who needs them? They don't hurt in any way, not in performance, not in reliability. If they made them PCI-E so soon, they'd invest money in a product that currently makes no sense, and it would just jack up the costs.
  • lopri - Thursday, May 17, 2007 - link

    I somehow doubt that those 'tons of' folks with a mobo without PCI-E would mind on-board sound. Heck. I have an SLI board and I'd rather make do with on-board sound than deal with Creative garbage. X-F.. what? I also doubt X-Fi's target market is folks using 5 year old motherboards. Don't get me wrong. Their SB Live! is still decent and perfectly suited for older motherboards.

    And.. Mistake? Umm.. I wouldn't argue about PCI-E vs PCI here, but it's not exactly the case that Creative's PCI products and support (which is non-existent, btw) are spectacular. They didn't even have a full driver download link until very recently. (They had no choice but to upload the drivers thanks to Vista)
  • TA152H - Friday, May 18, 2007 - link

    I'm not sure we're on the same page here. I thought you were implying that Creative needed to get their act together and get on the PCI-E bandwagon since that was what you were talking about. Apparently, you just don't like Creative and that was just kind of thrown in without respect to PCI-E.

    If so, I agree, they blow. Their software is horrible, and their hardware is overpriced. I don't know that I'd go so far as to say that they are a monopoly; there are a lot of choices in the low end, but at the high end you can get a card from any maker you want - as long as it's Creative. I am really, really particular with respect to sound too; I have no tolerance for bad speakers or noisy computers because I listen to music on my computer, so it's extremely quiet. Unfortunately, I have to buy Creative and I have a love/hate feeling towards them. They do make the best stuff, but it's expensive, difficult and buggy. So, I know where you're coming from. Maybe NVIDIA should move into that market too. I think they'd eat up a half-rate company like Creative. How about AMD? Hell, if they're going to get into Fusion, why not do it right and put the sound processor there too? It's probably a matter of time. Sound is very important to gaming, and of course to watching TV and listening to music. Makes you wonder why more attention hasn't been placed on it, and substandard companies like Creative are given free rein.
  • PrinceGaz - Thursday, May 17, 2007 - link

    Although it was a nicely presented article on a product which is not exactly revolutionary, I must take issue with the game benchmarks which were included.

    Out of the seven games tested, only two of them had any results where the average was below 60fps; one where the lowest was 54fps and the other (which was the only one with meaningful framerates) being Supreme Commander where the P35 Crossfire configuration had driver issues.

    I know you might say that results of 80 vs 100 vs 120fps do still provide useful information regarding likely performance in future games, but the fact is that they don't, as the demands made on the CPU, mobo, and graphics card by a much more demanding game running at 40fps tend to be quite different from those of a current game running at 120fps. I appreciate you must have spent an awful lot of time running them all (five times each for every setting, no less) but at the end of the day they didn't really provide any meaningful information other than that there are driver issues which need to be resolved (which is what we would expect).

    By the way, since you already went to the trouble of running every test five times, and discarded the two highest and lowest results to prevent them from unduly affecting an average; wouldn't it be a good idea to run the tests a sixth time so that the score you used is based on the average of two results rather than just the one in the middle? I imagine the 2nd-3rd-4th places were pretty close anyway (hopefully almost identical, with 1st place being very similar, and only 5th place somewhat slower because it was the first run), but for the sake of an extra 20% testing time a sixth run would involve, the statistical accuracy of using the mean of two results would be significantly improved.

    I will reiterate though that overall the review was informative and well written; it was only the benchmarks themselves which were a bit pointless.
  • DigitalFreak - Thursday, May 17, 2007 - link

    This is yet another perfect example of why ATI needs to open up Crossfire support to NVidia chipset motherboards. In the Intel space, the only supported chipsets that actually give them the bandwidth they need are the 975X and X38. I would think they would want to sell as many cards as possible.
  • OrSin - Thursday, May 17, 2007 - link

    SLI and Crossfire, as far as I can see, are not needed for almost anything. I have a 6800 and a 7900, and when I was shopping around I could not find a single reason to get another 6800 and go SLI instead of just getting a 7900. That's the same for Crossfire. SLI and Crossfire support in games is just not good enough. The 6800 would have been 30% less than the 7900, but the gains would have been 60% less on a good day and no gain at all for several games.

    With all that rambling, it just means that the P35 is a great board, so unless you need Crossfire (and most should not) get it. And don't wait for the next over-hyped product (X38). How's that? :)
  • PrinceGaz - Thursday, May 17, 2007 - link

    SLI/Crossfire is never a good upgrade path if the next-gen product is already out. You're almost always much better off selling your old card and buying a new one from the current generation, as it works out no more expensive, but provides better performance, uses less power, makes less noise, and has none of the compatibility issues associated with twin graphics-card configurations.

    However, that does not make SLI and Crossfire useless. They are needed for bragging rights by people whose weekly shopping list includes having a large tub of liquid-nitrogen delivered, and by those who are worried about the size of their ePenis. The rest of us have no need of going the twin graphics-card route unless the money is burning a hole in our pocket anyway and we've nothing better to do with it.
