Gaming with Core 2 and CrossFire on 975X

We were so used to getting excited over AMD processor launches that we almost forgot what an important Intel CPU launch was like. You see, AMD and Intel behave very differently when at a dinner table preparing to eat their meals. AMD will eat when its partners eat; companies like ATI and NVIDIA get to share in the joy of a new AMD product launch as they are busy building chipsets for the new platform. That's why we get a new nForce chipset whenever AMD launches a new CPU. Intel on the other hand isn't as generous; Intel likes to eat first, and then whatever remains after it's nice and full can be scraped off the table and given to its partners. This is why today's launch is taking place pretty much exclusively on Intel chipsets, with retail products based on ATI/NVIDIA chipsets shipping in the coming months.

Intel's table manners aren't as nice as AMD's largely because they don't have to be. Intel has a lot more fabs than AMD, but they aren't all pumping out 65nm Core 2 Duos on 300mm wafers; many of them are still using older 90nm or 130nm process technology. It isn't economically feasible to convert every fab to the latest process as soon as it's available, so Intel uses up the excess capacity in its older fabs by producing chipsets. AMD doesn't have this luxury, so it depends on companies like ATI, NVIDIA, SiS and VIA for the platform side of things, and is thus much nicer at the dinner table.

Eating habits aside, what this means for us is that our only real option for testing Core 2 Duo is an Intel chipset. NVIDIA's nForce 590 SLI reference board for Core 2 Duo is in our labs, but its BIOS isn't finalized yet, so NVIDIA is asking us to hold off on using it for a couple more weeks. At the same time, we're hearing that we shouldn't expect any retail motherboards using ATI chipsets for Core 2 Duo until September at the earliest, once again leaving us with Intel.

Don't get us wrong; Intel chipsets are far from a terrible option. In fact, Intel continues to make extremely trouble-free platforms. It's not stability or performance that we're concerned about, as Intel has both of those down pat. The issue, however, is multi-GPU compatibility.

You see, NVIDIA is a lot like Intel in that it wants to eat first or maybe, if the right people are at the table, at the same time as its partners. The problem with two companies that have identical eating habits is that no one ends up eating, and thus we have no SLI support on Intel chipsets. NVIDIA views this as an upper hand because honestly it's the only tangible advantage anyone has ever held over an Intel chipset since the days when Intel and Rambus were inseparable. If you want the best multi-GPU solution on the market you buy NVIDIA graphics cards, but they won't run (together) on Intel chipsets so you've got to buy the NVIDIA chipset as well - sounds like NVIDIA is trying to eat some of Intel's dinner, and this doesn't make Intel very happy.

Luckily for Intel, there's this little agreement it has with NVIDIA's chief competitor - ATI. Among other things, it makes sure that Intel platforms (or platform in this case, since it only officially works on the 975X) can support CrossFire, ATI's multi-GPU technology. Unfortunately, CrossFire isn't nearly as polished as NVIDIA's SLI. A case in point is the benchmarking for this Core 2 Duo article, which used a pair of X1900 XTs running in CrossFire mode. During our testing, CrossFire decided to disable itself after a simple reboot - twice. No warnings, no hardware changes, just lower frame rates after a reboot and a CrossFire enable checkbox that had become unchecked. Needless to say it was annoying, but by now we already know that CrossFire needs work, and ATI is on it.

More than anything this is simply a message to ATI and Intel: if CrossFire had been in better shape, high-end gaming enthusiasts could have been satisfied today, but instead they will have to wait a little longer for the first nForce 500 motherboards with Core 2 support to arrive (or settle for an nForce 4 board with Core 2 support).

Why does multi-GPU even matter? Given how fast Intel's Core 2 processors are, we needed to pair them with a GPU setup that was well matched - in this case, a pair of X1900 XTs running in CrossFire mode. With that setup we could run at 1600 x 1200 in all of our gaming tests, achieving a good balance between CPU and GPU loads and adequately characterizing the gaming performance of Intel's Core 2 line.

Comments

  • arachimklepeto - Tuesday, July 25, 2006 - link

    And what about the noise from the Core 2 Duo fan (decibels)?
  • bmaamba - Tuesday, July 25, 2006 - link

    Hi,
    According to Tom's Hardware, for EIST to work the power scheme in Control Panel has to be changed from "desktop" to "portable/laptop". AT guys, was this done? If not, how about including it in the "Power consumed" graphs? (According to Tom's, if I remember right, the lowest power in this mode is about 25 watts for a Core 2 Duo!) Also, for anyone knowledgeable: is this setting available in Linux?
    Also, how about putting the X2 3800+ EE in the encoding benchmarks (along with the Core 2 Duo E6300)?
    Thanks
    Ed
    PS. Price and power consumption at idle are very important to me.
  • herkulease - Thursday, July 20, 2006 - link

    Unless I missed it, what are the temps like on these?

  • Justin Case - Monday, July 17, 2006 - link

    What the heck is a "composite score"...? What are the units? How about giving us rendering times (you know, minutes, seconds) and render settings, so the numbers actually mean something...?
  • rahvin - Monday, July 17, 2006 - link

    Where's a good 64-bit comparison on Linux, with a LAMP stack run at 64-bit? There hasn't been a serious Linux server benchmark posted.
  • BikeDude - Sunday, July 16, 2006 - link

    I'd love to see some timings from a C++ compiler or two... Looks like I'll have to revise our standard developer PC configuration.

    --
    Rune
  • kmmatney - Sunday, July 16, 2006 - link

    "Jarred that would be great to see. The E6300 and X2 3800+ seem close, but the final AMD pricing and the overclocking potential of each could really make either the clear winner for performance per dollar in the midrange segment."

    Yes - this is the test that most people want to see. I'm sure a lot of people are like me and don't much care about any processors over $200. We want to see what the low end can do!! The AMD X2 3800+ is going to be even lower priced than the E6300, so there may be a good battle at the low cost end.
  • aznskickass - Sunday, July 16, 2006 - link

    Battle? What battle? The war is over my friend. ;)

    The E6300 wins hands down vs X2 3800+, even more so once both are overclocked:

    http://xbitlabs.com/articles/cpu/display/core2duo-...
  • Jeff7181 - Saturday, July 15, 2006 - link

    Would have been nice to see a Core Duo CPU in there too, just for comparison for those of us with laptops who might consider spending $200 on a Merom if it would increase performance 10-20% over a Yonah with the same power consumption.
  • IntelUser2000 - Thursday, July 20, 2006 - link

    quote:

    Would have been nice to see a Core Duo CPU in there too, just for comparison for those of us with laptops who might consider spending $200 on a Merom if it would increase performance 10-20% over a Yonah with the same power consumption.


    Link: http://www.trustedreviews.com/article.aspx?art=316...
    http://www.hardware.fr/articles/623-10/intel-core-...

    The Core 2 Duo E6400 at 2.13GHz is approximately 15% faster than the Core Duo T2600 at 2.13GHz. Add the fact that the 4MB cache versions are about 3% faster on average, and the estimate of 10-20% faster per clock than Yonah looks right, even for the 2MB cache version.
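
    To make that estimate explicit, here's a quick back-of-the-envelope check (just a sketch; the 15% and 3% figures are the approximate gains quoted above, not exact measurements):

    ```python
    # Rough per-clock estimate for Core 2 (Conroe) vs. Core Duo (Yonah) at the same frequency.
    # Figures are the approximate gains quoted in the comment above.
    e6400_vs_t2600 = 1.15   # E6400 (2MB cache) vs. T2600 at 2.13GHz: ~15% faster
    cache_bonus = 1.03      # 4MB-cache Core 2 parts: ~3% faster on average than 2MB parts

    combined = e6400_vs_t2600 * cache_bonus
    print(f"Estimated per-clock advantage: ~{(combined - 1) * 100:.0f}%")  # prints ~18%, within the 10-20% range
    ```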
