Test Setup

In deciding what sort of systems to include in our FX530 comparison, the logical choice was the various quad core systems tested in our AMD Quad FX article. That allows us to compare the overclocked QX6700 against a stock configuration, and we also get to look at how the current AMD quad core (dual socket) solution stacks up. Needless to say, given that the QX6700 was already faster in almost every benchmark, the results aren't going to be too favorable towards AMD right now. The fact that a major OEM is willing to release an overclocked QX6700 system, however, is a testament to the excellent design of Intel's Core 2 architecture. The overclocked QX6700 definitely consumes a lot of power compared to many of the Core 2 Duo solutions, but the truth is that it's really not that much more power-hungry than many of the faster Pentium D offerings.

Gateway FX530XT
Motherboard: Intel BTX 975X (custom)
Processor: Core 2 Extreme QX6700 Overclocked
(12 x 266MHz = 3.20 GHz, 2 x 4MB shared L2 cache)
Heatsink/Cooling: Custom BTX CPU HSF with dual 120mm fans at front and rear of case
RAM: 2x1024MB Hynix DDR2-667 5-5-5-15
Graphics: ATI Radeon X1950 XTX CrossFire with ATI Radeon X1950 XTX
EVGA GeForce 8800 GTX
Hard Drives: 2x150GB Western Digital Raptor 16MB 10000 RPM in RAID 0
Optical Drives: HL Data Storage GSA-H11N 16X DVD+RW
Lite-On SOHC-4836V 16X DVD-ROM/CD-RW Combo
Audio: Creative Sound Blaster X-Fi
TV Tuner: ATI Theater 550
Power Supply: 700W Delta Electronics
Operating System: Windows XP Media Center Edition 2005 SP2b

We will be running most of the benchmarks used in the AMD Quad FX article, although we will have to omit the Blu-ray plus video encoding multitasking benchmarks, as we did not have a Blu-ray drive on hand for this review. What we're primarily interested in determining is how much of a performance increase we get from the 20% overclock, given that we're also running slightly slower memory than in previous tests. We expect to see an average performance improvement of around 15% in applications that are CPU limited. We will also include a few additional synthetic benchmarks to further investigate the FX530's performance, and in a few instances we will provide results from previously tested systems for comparison.
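For those wondering where the 15% estimate comes from, here is a quick back-of-the-envelope sketch; the clock speeds are the stock and overclocked QX6700 values, and the 15% figure is our rough expectation rather than a measured result.

```c
/* Back-of-the-envelope look at the overclock: a sketch, not a measurement. */
#include <stdio.h>

int main(void)
{
    const double stock_ghz = 2.66; /* QX6700 default: 10 x 266MHz      */
    const double oc_ghz    = 3.20; /* Gateway's overclock: 12 x 266MHz */

    /* Raw clock speed increase: roughly 20%. */
    printf("Clock speed increase: %.1f%%\n", (oc_ghz / stock_ghz - 1.0) * 100.0);

    /* CPU-limited applications rarely scale perfectly with clock speed, and
       this system also runs slightly slower DDR2-667 memory than previous
       test beds, so around 15% is a more realistic real-world expectation. */
    printf("Expected CPU-limited improvement: ~15%%\n");
    return 0;
}
```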

The system we tested initially shipped with Windows XP MCE 2005 installed, but Vista has since officially become available. Right now the choice to get XP MCE is still there, but in the near future the FX530 will likely become a Vista-only solution. Those looking to move to the new OS will not have a problem with this, though we're still not ready to make the switch on most of our PCs. Vista has its good points, but the driver situation still needs work. If you choose to get XP installed, a free upgrade to Vista is currently available; that offer is set to end in March, after which customers may have no choice but to get Vista. Most people are eventually going to move to Vista, but waiting for things to settle down a bit more certainly won't hurt.


26 Comments


  • akers - Tuesday, March 20, 2007 - link

    Can anyone shed some light on why Gateway is delaying shipment of the FX530? I have had two delays so far, and they cannot promise that it will be delivered by the second delay date. I have heard that there were some Vista problems, but those were supposedly fixed by now.
  • rfaster - Thursday, March 22, 2007 - link

    My system arrived last week - I ordered it bare bones with the quad core OC'd to the 3.2.

    Specs - I put in a 8800GTX (fac OC'd to 600) - 2nd slot so its only running at 4X ( I did not realize this until I read the great article on this site), I'm running 2 150 10K raptor's, 4GB 667 ram. The best I can do is low 9K's on 3DMark 2006 (Running Vista Ultimate 32bit). I'm seeing easy 1,200's from other folks with similar setups.

    Question - Is the 4X for Slot 2 causing the SLOWNESS? As you pointed out in your article there is NO way to fit the 8800GTX into slot 1 - so I am trying to decide if I should accept the 4X speed on my $699 8800GTXOC - or ship this pc back. I hate to think that my $699 video card is a WASTE on this system due to the 4X?
  • rfaster - Thursday, March 22, 2007 - link

    Akers - on the delay, I was told they are having a difficult time sourcing the parts needed to build this system. I was a bit put off by the delay in getting this box - it reminded me of my experience with Alienware a few years back.
  • JarredWalton - Monday, March 26, 2007 - link

    Right now, Vista plus 8800 GTX is probably going to be a bit slower than normal. Still, I wouldn't worry too much - you can see that your low 9000s score matches what I got in 3DMark06... which is really just a benchmark and not an actual game.
  • Darkskypoet - Thursday, February 15, 2007 - link

    Now, correct me if I am wrong... but one of the major hindrances to the Quad FX platform (yes, I realize 2 dual core chips is power hungry and inelegant vs a dual die quad core) is NUMA, or rather the lack of proper NUMA support in XP. Looking at the benchmarks (and in fact all Quad FX benchmarks), sites continue to use XP variants to benchmark the Quad FX systems vs Conroe. XP does not support NUMA; one article in particular I had read mentioned this fact explicitly, and also mentioned that in many cases accessing data in memory in a NUMA-dumb system increased memory latency SUBSTANTIALLY. Consider that in a NUMA-dumb OS, the data required for a process/thread assigned to one chip could inadvertently be stored in memory directly linked to the other CPU. This alone hurts benchmarking scores like crazy. In reality a Quad FX setup, if benched with real SMP/SMC aware software, should eke out a bigger per-core gain vs quad core Conroe than an X2 does vs dual core Conroe.

    I say this because the interconnect superiority (when run with a NUMA-aware OS: Vista, Linux, etc.) will show itself vs the somewhat limited FSB used in quad core C2D implementations, thus increasing performance per core vs quad core C2D.

    I'm not saying we're gonna see Quad FX systems outperform C2D systems here. However, given proper NUMA support, the benches will be a lot closer. Added to that, we can use 2xxx series Opterons in Quad FX, and it starts to become a bit of a nicer picture for AMD. The icing on the cake, however, would be that one should expect to be able to drop two native quad cores onto the Quad FX boards in the near future.

    I believe Nintendo summed it up for us previously: "Now you're playing with power". If AMD follows this track, then they have a platform out that is fully tested and stable, running two native quad core chips for the enthusiast market. As unknown as the performance of K10 is at this stage, 8 cores should be mighty interesting. Mighty interesting indeed.

    Anyone know of a proper NUMA-aware OS used in quad core C2D vs Quad FX benchmarking?
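    To make the local vs remote memory point concrete, here's a minimal sketch of node-local allocation. It uses Linux's libnuma purely as an illustration (not something from the article): a NUMA-aware OS or allocator keeps a thread's data on its own node like this, while a NUMA-dumb one may just as easily place it on the other socket's memory, adding a HyperTransport hop to every access.

    ```c
    /* Illustrative only: Linux/libnuma sketch of node-local allocation on a
       dual socket (two NUMA node) system. Build with: gcc numa_local.c -lnuma */
    #include <numa.h>
    #include <stdio.h>

    int main(void)
    {
        if (numa_available() < 0) {
            fprintf(stderr, "No NUMA support on this system/kernel\n");
            return 1;
        }

        /* Each socket plus its directly attached DIMMs is one NUMA node. */
        int node = numa_node_of_cpu(0);          /* node that owns CPU 0  */
        size_t size = 64UL * 1024 * 1024;        /* 64MB test buffer      */

        /* Explicitly place the buffer on CPU 0's own node; a NUMA-unaware
           OS might just as easily have put it on the remote node.        */
        void *buf = numa_alloc_onnode(size, node);
        if (buf == NULL) {
            perror("numa_alloc_onnode");
            return 1;
        }

        printf("Allocated %zu MB on node %d (local to CPU 0)\n",
               size >> 20, node);
        numa_free(buf, size);
        return 0;
    }
    ```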

  • Tuvoc - Saturday, February 17, 2007 - link

    Windows XP x64 Edition DOES support NUMA. I have dual Opteron 265s (nicely overclocked from 1.8 to 2.2GHz), and as long as the BIOS is set correctly, Sandra reports the NUMA status.

    I also have an Intel Quad core, and it is blindingly fast....
  • roflsaurus - Tuesday, February 13, 2007 - link

    BTX case?
  • JarredWalton - Tuesday, February 13, 2007 - link

    BTX is a newer form factor that Intel came up with a couple of years ago, but computer parts manufacturers have been relatively slow to adopt it. Basically, it reorganizes the locations of various parts in order to allow for better cooling. Motherboards are also mounted on the opposite side of the case compared to ATX. So where you would open the left side of the case on an ATX system, on a BTX case you would open the right side. If you were to put an ATX motherboard and a BTX motherboard next to each other, everything on the BTX motherboard would appear to be "backwards".
  • Tuvoc - Sunday, February 11, 2007 - link

    They say Gateway had to increase the voltage to make the overclock stable - but by how much? That would have been interesting to know. Also, core temps under full load certainly would have been very interesting.
  • JarredWalton - Monday, February 12, 2007 - link

    The motherboard doesn't appear to work all that well at higher FSB speeds, so Gateway's overclocking is accomplished via changing the multiplier. More on this in a moment.

    Voltages are also a bit odd. CPU-Z reports 1.238V, but the BIOS is set to 1.450V. Obviously, there's a pretty big difference, and which is more accurate I cannot say. That illustrates the problem with reporting CPU temperatures as well: the BIOS/motherboard implementation will have an impact, as they can read the thermistor differently. Basically, you only end up comparing the Gateway results to itself, and the important thing is that there were no issues with stability when running overclocked.

    Back to the FSB stuff. The BIOS has support for adjusting FSB speed and RAM speed, but only in large steps. The FSB can be set to 533, 800, 1067, and 1333 - default being 1067 for Core 2. The RAM can be set to DDR2-400, 533, and 667 (or Automatic). Basically, all of these items select a ratio and bus speed. DDR2-533 represents a 1:1 bus/RAM ratio, while 400 is 3:4 and 667 is 5:4. Using those ratios, you can use the FSB-1333 speed to modify the overclocks a bit. I was able to run the bus at 1333 with DDR2-533 and a 10X multiplier to end up at a 3.33 GHz CPU speed (and a real DDR2 speed of 667).
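    For reference, here's a quick sketch of how those settings multiply out; the bus speeds, ratios, and multipliers are the ones mentioned above, and the code itself is purely illustrative.

    ```c
    /* Sketch of the FSB multiplier / RAM ratio math described above. */
    #include <stdio.h>

    /* CPU speed: base bus clock (quad-pumped FSB / 4) times the multiplier.
       RAM speed: bus clock times the selected RAM:bus ratio, double data rate. */
    static void show(double fsb, double multiplier, double ram_ratio)
    {
        double bus = fsb / 4.0;
        printf("FSB-%.0f, %.0fx, RAM ratio %.2f -> CPU %.2f GHz, DDR2-%.0f\n",
               fsb, multiplier, ram_ratio,
               bus * multiplier / 1000.0, bus * ram_ratio * 2.0);
    }

    int main(void)
    {
        show(1066.67, 12, 5.0 / 4.0); /* stock FSB, 12x, DDR2-667 (5:4) -> 3.20 GHz */
        show(1333.33, 10, 1.0);       /* FSB-1333, 10x, DDR2-533 setting (1:1) ->
                                         3.33 GHz CPU with a real DDR2-667 speed    */
        return 0;
    }
    ```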

    RAM voltages can be adjusted as well, but only to 1.8, 1.9, 2.1, or 2.2V. I didn't play with these at all. No point in trying to fry Gateway's equipment. I would venture to guess that the CPU could run at 3.3-3.5GHz if you want to push things (3.33 seemed perfectly fine in somewhat limited testing), but again I don't want to push too hard and end up with a dead PC/CPU/RAM/mobo/whatever.

    Hope that helps,
    Jarred Walton
    Editor
    AnandTech.com
