Gaming with Core 2 and CrossFire on 975X

We were so used to getting excited over AMD processor launches that we almost forgot what an important Intel CPU launch was like. You see, AMD and Intel behave very differently when at a dinner table preparing to eat their meals. AMD will eat when its partners eat; companies like ATI and NVIDIA get to share in the joy of a new AMD product launch as they are busy building chipsets for the new platform. That's why we get a new nForce chipset whenever AMD launches a new CPU. Intel on the other hand isn't as generous; Intel likes to eat first, and then whatever remains after it's nice and full can be scraped off the table and given to its partners. This is why today's launch is taking place pretty much exclusively on Intel chipsets, with retail products based on ATI/NVIDIA chipsets shipping in the coming months.

Intel's table manners aren't as nice as AMD's, largely because they don't have to be. Intel has a lot more fabs than AMD, but they aren't all pumping out 65nm Core 2 Duos on 300mm wafers; many of them are still using older 90nm or 130nm process technology. It's not economically feasible to convert every fab to the latest process as soon as it's available, so Intel uses up excess capacity in its older fabs by producing chipsets. AMD doesn't have this luxury; it depends on companies like ATI, NVIDIA, SiS and VIA for the platform side of things, and is thus much nicer at the dinner table.

Eating habits aside, what this means for us is that our only real option for testing Core 2 Duo is an Intel chipset. NVIDIA's nForce 590 SLI reference board for Core 2 Duo is in our labs, but its BIOS isn't finalized yet, so NVIDIA has asked us to hold off on using it for a couple more weeks. At the same time, we're hearing that we shouldn't expect any retail motherboards using ATI chipsets for Core 2 Duo until September at the earliest, once again leaving us with Intel.

Don't get us wrong; Intel chipsets are far from a terrible option. In fact, Intel continues to make extremely trouble-free platforms. It's not stability or performance that we're concerned about, as Intel has both of those down pat. The issue, however, is multi-GPU compatibility.

You see, NVIDIA is a lot like Intel in that it wants to eat first or maybe, if the right people are at the table, at the same time as its partners. The problem with two companies that have identical eating habits is that no one ends up eating, and thus we have no SLI support on Intel chipsets. NVIDIA views this as an upper hand because honestly it's the only tangible advantage anyone has ever held over an Intel chipset since the days when Intel and Rambus were inseparable. If you want the best multi-GPU solution on the market you buy NVIDIA graphics cards, but they won't run (together) on Intel chipsets so you've got to buy the NVIDIA chipset as well - sounds like NVIDIA is trying to eat some of Intel's dinner, and this doesn't make Intel very happy.

Luckily for Intel, there's this little agreement it has with NVIDIA's chief competitor - ATI. Among other things, it ensures that Intel platforms (or platform in this case, since it only officially works on the 975X) can support CrossFire, ATI's multi-GPU technology. Unfortunately, CrossFire isn't nearly as polished as NVIDIA's SLI. Case in point: the benchmarking for this Core 2 Duo article, which used a pair of X1900 XTs running in CrossFire mode. During our testing, CrossFire decided to disable itself after a simple reboot - twice. No warnings, no hardware changes, just lower frame rates after a reboot and a CrossFire enable checkbox that had become unchecked. Needless to say, it was annoying, but by now we already know that CrossFire needs work, and ATI is on it.

More than anything, this is simply a message to ATI and Intel: if CrossFire had been in better shape, high-end gaming enthusiasts could have been satisfied today; instead, they will have to wait a little longer for the first nForce 500 motherboards with Core 2 support to arrive (or settle for an nForce 4 board with Core 2 support).

Why does multi-GPU even matter? Given how fast Intel's Core 2 processors are, we needed to pair them with a well-matched GPU setup - in this case, a pair of X1900 XTs running in CrossFire mode. That setup let us run at 1600 x 1200 in all of our gaming tests, striking a good balance between CPU and GPU load and adequately characterizing the gaming performance of Intel's Core 2 line.

Comments

  • Anand Lal Shimpi - Friday, July 14, 2006 - link

    Corrected, it was a misprint.

    Take care,
    Anand
  • Zorba - Friday, July 14, 2006 - link

    Why is the article talking about how Intel is killing AMD on power consumption when AMD is on top for both idle and load? If you are doing a performance/watt ratio, you need to show that on the graph. This page (page 7) just makes the whole article look completely biased.
  • Calin - Friday, July 14, 2006 - link

    Because the EE SFF processors were hard for Anandtech to obtain even for testing purposes. I'm not sure they are any more available in the retail market than Conroe is.
  • Anand Lal Shimpi - Friday, July 14, 2006 - link

    The Core 2 Extreme X6800 has a performance per watt score of 0.3575 in WME9 compared to 0.2757 for the X2 3800+ EE SFF. I'll put together a performance per watt graph now and see if I can stick it in there.

    Take care,
    Anand
  • Anand Lal Shimpi - Friday, July 14, 2006 - link

    I included the performance per watt scores I mentioned above in the review now, hopefully that will make things a little more clear.

    Take care,
    Anand
  • JarredWalton - Friday, July 14, 2006 - link

    I don't see the chart, Anand - I hope I didn't accidentally overwrite your change. Sorry!
  • MrKaz - Friday, July 14, 2006 - link

    Don't put it in; it's a biased chart.

    Why base it on the WME9 benchmark? Why not one of the others?

    Why put it in now, if you never put one in when the A64 was killing the P4s?
  • coldpower27 - Friday, July 14, 2006 - link

    Because AMD didn't really make a big deal about the performance-per-watt initiative back in the day. They focused on price/performance instead.
  • MrKaz - Friday, July 14, 2006 - link

    So?

    Just because Intel focuses on that now, is Anandtech obliged to include it?

    So where were the price/performance (A64 vs P4) charts in Anandtech reviews back then?
  • coldpower27 - Friday, July 14, 2006 - link

    Yeah, because Intel has been making people aware of it, it has now become an issue.

    It was only after Prescott that we became more aware that thermals were starting to get out of control, and we began paying more attention to wattage numbers.

    Price/Performance is not as hard to calculate.
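
For anyone wanting to sanity-check the numbers being argued over above, here's a minimal sketch in Python. It assumes the review's performance-per-watt score is simply benchmark throughput divided by measured system power draw; the only hard inputs are the two WME9 scores Anand quotes, and the relative advantage follows directly from them:

    # Performance-per-watt scores quoted in the comments above (WME9 test).
    # Assumed definition: score = benchmark throughput / system power (W).
    x6800_ppw = 0.3575           # Core 2 Extreme X6800
    x2_3800_ee_sff_ppw = 0.2757  # Athlon 64 X2 3800+ EE SFF

    # The relative advantage falls straight out of the two scores.
    advantage = x6800_ppw / x2_3800_ee_sff_ppw - 1
    print(f"X6800 perf/watt advantage in WME9: {advantage:.1%}")  # ~29.7%

By this measure the X6800 comes out roughly 30% ahead of the X2 3800+ EE SFF in WME9, which matches the scores Anand added to the review.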
