General Graphics Performance

The 3DMark series of benchmarks from Futuremark is among the most widely used tools for benchmark reporting and comparisons. Although the benchmarks are very useful for providing apples-to-apples comparisons across a broad array of GPU and CPU configurations, they are not a substitute for actual application and gaming benchmarks. In this sense we consider the 3DMark benchmarks to be purely synthetic in nature, but still valuable for providing consistent measurements of performance.

Graphics Performance - General [3DMark01 results chart]

The results of this test are confusing on the surface, although driver maturity and memory sensitivity across the DMI interface have a great deal to do with the P35 results. The P35 chipset scores about 2% better in single-card operation than in CrossFire mode and also leads the 975X CrossFire setup. In testing we found the P35 CrossFire scores in each scene were slightly higher until the Nature test, where the single card scored about 12% better. The 975X CrossFire setup just barely edges past its single-card results. This benchmark is currently a better indicator of CPU, chipset, and memory performance than of graphics performance. In that regard, the P35's single-card results lead the 975X slightly in platform performance, as our game benchmark testing will indicate shortly. In unbuffered memory testing the P35 was generally about 5% faster than the 975X across the board, with CPU throughput also higher with a quad-core processor.
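As a rough illustration of where a figure like that 5% comes from, unbuffered memory throughput can be approximated with a simple large-buffer copy test. The sketch below is purely illustrative and is not the tool used for our measurements; the buffer size and iteration count are arbitrary.

    import time
    import numpy as np

    def copy_bandwidth_gbs(size_mb=256, iterations=10):
        """Estimate sustained memory-copy bandwidth in GB/s.

        Each iteration reads size_mb from src and writes size_mb to dst,
        so every pass moves 2 * size_mb of data across the memory bus.
        """
        n = size_mb * 1024 * 1024
        src = np.ones(n, dtype=np.uint8)
        dst = np.empty_like(src)

        start = time.perf_counter()
        for _ in range(iterations):
            np.copyto(dst, src)
        elapsed = time.perf_counter() - start

        return (2 * n * iterations) / elapsed / 1e9

    print(f"~{copy_bandwidth_gbs():.1f} GB/s sustained copy bandwidth")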

Graphics Performance - General [3DMark05 results chart]

Graphics Performance - General [3DMark06 results chart]

The DirectX 9-centric tests in 3DMark05 benefited greatly from the improved chipset throughput of the P35 at stock settings with our quad-core processor. The P35 CrossFire results are up to 7% faster than the 975X results, with the single-card P35 setup once again finishing ahead of the 975X CrossFire setup. Although we have found the P35 chipset to be a fierce competitor to the 975X in initial testing, we think some additional BIOS and driver tuning would allow 975X performance to improve by a few percent in these tests.

In our more strenuous graphics test utilizing 3DMark06, we find the P35 results once again lead the 975X chipset, but the margin of difference is a negligible 1~2%. We decided to see why the results were so close in this particular test. Looking over the results, we found that in the SM2.0 tests the P35 solution was about 2% behind the 975X scores, the P35 CPU score was slightly better, and the HDR/SM3.0 tests showed a 4% advantage for the P35. Since the HDR/SM3.0 tests heavily stress both the CPU and the graphics bus, we expected the x4 PCI Express lane limitation to cause a bottleneck in this test.
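For readers who want to check the margin math, the percentages above are simple relative deltas between the two platforms' subscores. A minimal sketch follows; the subscore figures in it are placeholders chosen to mirror the relative gaps described above, not our measured results.

    def pct_delta(a, b):
        """Relative difference of a versus b, in percent."""
        return (a - b) / b * 100.0

    # Hypothetical subscore pairs (P35, 975X) -- placeholders, not measured data.
    subscores = {
        "SM2.0":     (2350, 2400),   # P35 about 2% behind
        "CPU":       (2110, 2090),   # P35 slightly ahead
        "HDR/SM3.0": (2600, 2500),   # P35 about 4% ahead
    }

    for test, (p35, i975x) in subscores.items():
        print(f"{test:>10}: P35 {pct_delta(p35, i975x):+5.1f}% vs 975X")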

Our initial assumptions turned out to be incorrect. After working with ASUS we discovered they had noticed the same issue in their internal testing, and they decided to see what would happen on the 975X if the MCH was programmed for x16/x4 operation between the two GPU slots instead of x8/x8. Their test results revealed a surprise: the difference in throughput in all areas of testing was less than 1%. The issue lies in the limited bandwidth and speed of the Direct Media Interface between the P35 MCH and ICH9R. The time required to simultaneously move data between the two chips imposes a significant overhead and bandwidth penalty in memory-sensitive applications, hence our issues in the memory-sensitive 3DMark01 benchmark. Of course, 3DMark performance does not necessarily reflect actual game performance anyway, so these results are only mildly interesting.
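To put the DMI constraint in rough numbers: DMI on this platform is electrically similar to a PCI Express 1.x x4 link, roughly 1 GB/s in each direction, and the second graphics slot's traffic shares that link with storage and the rest of the ICH9R's I/O. The back-of-the-envelope sketch below assumes those link figures and a hypothetical payload size; it is an estimate, not a measurement.

    PCIE1_LANE_GBS = 0.25  # PCIe 1.x: ~250 MB/s per lane, per direction (assumed)

    # One-direction peak bandwidth for each second-slot configuration.
    links = {
        "975X slot 2: x8 off the MCH":     8 * PCIE1_LANE_GBS,
        "P35 slot 2: x4 behind DMI/ICH9R": 4 * PCIE1_LANE_GBS,
    }

    payload_gb = 0.5  # hypothetical burst of texture/vertex traffic

    for name, gbs in links.items():
        ms = payload_gb / gbs * 1000
        print(f"{name}: {gbs:.2f} GB/s peak -> {ms:.0f} ms for {payload_gb * 1000:.0f} MB")

In practice the P35 figure is a ceiling; anything else moving across DMI at the same time cuts into it, which is consistent with the memory-sensitive behavior we saw in 3DMark01.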

While ASUS has optimized this link and will continue to do so, it appears we are now near the maximum efficiency of this interface. This simply means that as games become increasingly complex and their bandwidth demands grow, the differences between the P35 and 975X in CrossFire operation will widen. Let's see how this potential issue and driver maturity affect our initial gaming benchmarks. We would like to stress once again that synthetic benchmark results do not necessarily translate into real application results.

Comments

  • vailr - Thursday, May 17, 2007 - link

    miss an outing of lifetime with friends
    [outing of a lifetime]
    We are not here to single handily knock AMD
    [single-handedly]

    System Platform Drivers Intel - 8.3.0.1013
    [Version 8.4.0.1010 Beta:
    http://www.station-drivers.com/telechargement/inte...
    Note: running the .exe installer may NOT update existing installed drivers. Must be manually updated for each device in Device Manager. See the readme.txt file:
    "INF files are copied to the hard disk [Program Files/Intel/INFInst folder] after running the Intel(R) Chipset Device Software executable with an '-A'
    flag (i.e., "INFINST_AUTOL.EXE -A"]
  • Paradox999 - Thursday, May 17, 2007 - link

    Wow, who would have thought ATI might have *immature* drivers for the x2900 at this point? Duh. Moreover, why even try Crossfire when the cards in single configuration have been little more than a major league flop (don't bother spamming me, I'm an ATI fanboy). Given the poor performance (vs. a much cheaper 8800GTS) and insane power requirements of a single card, you might be able to count on one hand the people eager to rush out and get a Crossfire setup. This kind of article is more in the category of 'curiosity' (like those guys that tried overclocking an x2900 with liquid nitro). Anand should be publishing more articles of a practical nature. If you want to try Crossfire and the x2900... at least wait for a few driver revisions AND then a head-to-head against the 8800GTS. That *might* provide more useful information, albeit for a very small segment of the enthusiast market.

    I have to totally agree with some of the previous posters and say SLI and Crossfire are overkill and a waste of money. Buy the best card you can afford now. When it doesn't work for you anymore, replace it with the best NEW generation card you can buy.

    I'm still annoyed that the better motherboards (like my P5B Dlx Wi-Fi) come with 2 PCIE-16 slots. I use 1 x1900XTX and I'll replace it one day with one (1) much better card. The way I see it, ASUS robbed me of a PCI slot for my many expansion cards.
  • lopri - Thursday, May 17, 2007 - link

    quote:

    I'm still annoyed that the better motherboards (like my P5B Dlx Wi-Fi) come with 2 PCIE-16 slots. I use 1 x1900XTX and I'll replace it one day with one (1) much better card. The way I see it, ASUS robbed me of a PCI slot for my many expansion cards.

    You must be joking, I assume? I think all PCI-E slots should be full length (x16) even if they are not electrically so. The only PCI card worth buying (for whoever needs one, that is) at this time would be the X-Fi, and that's just because of Creative's incompetence and monopoly in the market. I've ditched my X-Fi and refuse to buy Creative products until they get their act together.
  • TA152H - Thursday, May 17, 2007 - link

    I read stuff like this and I wonder what people are thinking. Why would Creative make such a mistake as you suggest?

    Let's see, every motherboard comes with PCI slots, and there are tons of motherboards that people use that don't have PCI-E slots. They are selling upgrade parts, and PCI-E does NOTHING for these parts that they can't get from PCI. It's not like if they were using PCI-E they would get better performance or it would work better in some way. So, naturally, they are making PCI cards. Duh.

    Maybe down the road when Intel stops supporting PCI, or when motherboards come out without PCI slots, Creative will start making PCI-E cards, but until then, who needs them? They don't hurt in any way, not in performance, not in reliability. If they made them PCI-E this soon, they'd be investing money in a product that currently makes no sense, and it would just jack up the costs.
  • lopri - Thursday, May 17, 2007 - link

    I somehow doubt that those 'tons of' folks with motherboards without PCI-E would mind on-board sound. Heck, I have an SLI board and I'd rather make do with on-board sound than deal with Creative garbage. X-F.. what? I also doubt X-Fi's target market is folks using 5-year-old motherboards. Don't get me wrong: their SB Live! is still decent and perfectly suited for older motherboards.

    And... mistake? Umm, I wouldn't argue about PCI-E vs. PCI here, but it's not exactly the case that Creative's PCI products and support (which is non-existent, btw) are spectacular. They didn't even have a full driver download link until very recently. (They had no choice but to upload the drivers, thanks to Vista.)
  • TA152H - Friday, May 18, 2007 - link

    I'm not sure we're on the same page here. I thought you were implying that Creative needed to get their act together and get on the PCI-E bandwagon since that was what you were talking about. Apparently, you just don't like Creative and that was just kind of thrown in without respect to PCI-E.

    If so, I agree, they blow. Their software is horrible, and their hardware is overpriced. I don't know that I'd go as far as to say they are a monopoly; there are a lot of choices at the low end, but at the high end you can get a card from any maker you want - as long as it's Creative. I am really, really particular about sound too. I have no tolerance for bad speakers or noisy computers because I listen to music on my computer, so it's extremely quiet. Unfortunately, I have to buy Creative, and I have a love/hate feeling towards them. They do make the best stuff, but it's expensive, difficult, and buggy. So, I know where you're coming from. Maybe NVIDIA should move into that market too; I think they'd eat up a second-rate company like Creative. How about AMD? Hell, if they're going to do Fusion, why not do it right and put the sound processor there too? It's probably a matter of time. Sound is very important to gaming, and of course to watching TV and listening to music. Makes you wonder why more attention hasn't been paid to it, and why substandard companies like Creative are given free rein.
  • PrinceGaz - Thursday, May 17, 2007 - link

    Although it was a nicely presented article on a product which is not exactly revolutionary, I must take issue with the game benchmarks which were included.

    Out of the seven games tested, only two had any results where the average was below 60fps: one where the lowest was 54fps, and the other (the only one with meaningful framerates) being Supreme Commander, where the P35 Crossfire configuration had driver issues.

    I know you might say that results of 80 vs 100 vs 120fps still provide useful information regarding likely performance in future games, but the fact is that they don't, as the demands a much more demanding game running at 40fps makes on the CPU, mobo, and graphics card tend to be quite different from those of a current game running at 120fps. I appreciate you must have spent an awful lot of time running them all (five times each for every setting, no less), but at the end of the day they didn't really provide any meaningful information other than that there are driver issues which need to be resolved (which is what we would expect).

    By the way, since you already went to the trouble of running every test five times, and discarded the two highest and two lowest results to prevent them from unduly affecting the average, wouldn't it be a good idea to run the tests a sixth time so that the score you use is based on the average of two results rather than just the one in the middle? I imagine the 2nd-3rd-4th places were pretty close anyway (hopefully almost identical, with 1st place being very similar, and only 5th place somewhat slower because it was the first run), but for the sake of the extra 20% testing time a sixth run would involve, the statistical accuracy of using the mean of two results would be significantly improved.

    I will reiterate though that overall the review was informative and well written; it was only the benchmarks themselves which were a bit pointless.
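PrinceGaz's statistical point is easy to make concrete: with five runs and the top two and bottom two discarded, the reported score is simply the sample median, while a sixth run would allow the middle two values to be averaged. A minimal sketch of the two estimators, using made-up framerate data:

    import statistics

    runs_5 = [98.2, 101.5, 101.9, 102.3, 104.0]  # hypothetical fps results
    runs_6 = sorted(runs_5 + [102.1])            # one extra hypothetical run

    median_of_five = statistics.median(runs_5)     # drop top two and bottom two of five
    mean_of_middle = statistics.mean(runs_6[2:4])  # drop top two and bottom two of six

    print(f"score from 5 runs (median):           {median_of_five:.2f}")
    print(f"score from 6 runs (mean of middle 2): {mean_of_middle:.2f}")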
  • DigitalFreak - Thursday, May 17, 2007 - link

    This is yet another perfect example of why ATI needs to open up Crossfire support to NVidia chipset motherboards. In the Intel space, the only supported chipsets that actually give them the bandwidth they need are the 975X and X38. I would think they would want to sell as many cards as possible.
  • OrSin - Thursday, May 17, 2007 - link

    SLI and Crossfire, as far as I can see, are not needed for almost anything. I have a 6800 and a 7900, and when I was shopping around I could not find a single reason to get another 6800 and go SLI instead of just getting a 7900. The same goes for Crossfire. SLI and Crossfire support in games is just not good enough. The 6800 would have been 30% less than the 7900, but the gains would have been 60% less on a good day, and there was no gain at all in several games.

    With all that rambling it just means the P35 is a great board, so unless you need Crossfire (and most should not), get it. And don't wait for the next over-hyped product (X38). How's that? :)
  • PrinceGaz - Thursday, May 17, 2007 - link

    SLI/Crossfire is never a good upgrade path if the next-gen product is already out. You're almost always much better off selling your old card and buying a new one from the current generation, as it works out no more expensive but provides better performance, uses less power, makes less noise, and has none of the compatibility issues associated with twin graphics-card configurations.

    However, that does not make SLI and Crossfire useless. They are needed for bragging rights by people whose weekly shopping list includes having a large tub of liquid-nitrogen delivered, and by those who are worried about the size of their ePenis. The rest of us have no need of going the twin graphics-card route unless the money is burning a hole in our pocket anyway and we've nothing better to do with it.
