GPU Performance vs. CPU Clock Speed

For these tests, we took a single-core AMD Athlon 64 (1MB L2) and increased its clock speed from 1.8GHz all the way up to 2.6GHz, measuring performance at each step along the way. The image quality settings haven't changed; what we're looking for here is whether there's a pattern in the CPU/GPU relationship.

We picked four GPUs to look at their CPU dependency: the Radeon X1900 XT CrossFire, X1900 XT, X1800 XT and X1800 XL. We chose these four configurations because they represent the best ultra high end, high end, upper mid-range and mid-range GPUs for Oblivion. Our main interest is in finding out whether there is a point at which the best mid-range GPU ends up being faster than the best high end GPU simply because it is paired with a faster CPU, or whether having a faster GPU is really all that matters in Oblivion.
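As a rough illustration of the crossover we're looking for, here's a small Python sketch. The frame rates in it are made up purely for illustration (they are not our measured results), and the crossover_points helper is just a hypothetical way of flagging the CPU speeds at which the cheaper GPU pulls ahead.

```python
# Hypothetical illustration only: these frame rates are made up, not measured
# results. Each GPU maps Athlon 64 clock speed (GHz) -> average FPS.
results = {
    "X1900 XT": {1.8: 26, 2.0: 29, 2.2: 32, 2.4: 35, 2.6: 38},
    "X1800 XT": {1.8: 25, 2.0: 28, 2.2: 31, 2.4: 34, 2.6: 36},
}

def crossover_points(results, faster_gpu, slower_gpu):
    """Return (slower_gpu clock, faster_gpu clock) pairs where the nominally
    slower GPU, given a faster CPU, beats the faster GPU on a slower CPU."""
    pairs = []
    for fast_clock, fast_fps in results[faster_gpu].items():
        for slow_clock, slow_fps in results[slower_gpu].items():
            if slow_clock > fast_clock and slow_fps > fast_fps:
                pairs.append((slow_clock, fast_clock))
    return pairs

# With CPU-bound numbers like these, the X1800 XT at 2.0GHz already beats an
# X1900 XT that is held back by a 1.8GHz CPU.
print(crossover_points(results, "X1900 XT", "X1800 XT"))
```

In a GPU-bound test the list would come back empty; in a CPU-bound test like the one sketched above, the faster CPU matters more than the faster GPU.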

What this graph shows is that our Oblivion Gate benchmark is really only CPU bound if you've got a pair of X1900 XTs in CrossFire. That does mean that if you've got a low end Athlon 64, you won't see much of a performance difference between a single X1900 XT and a pair of them running in CrossFire mode. But for the most part this benchmark is no different from what we've seen in other games, with the X1900 XT, X1800 XT and X1800 XL being basically GPU bound - let's see if our other two tests show the same picture.

The Town benchmark is extremely CPU bound, as this graph shows, and in Oblivion you do spend quite a bit of time walking around in towns. Isolating the individual lines in this graph isn't very important because they all show essentially the same scaling; what matters is comparing points across both axes at once. What this graph shows us is that a single X1800 XT paired with a 2.4GHz CPU offers much better performance than an X1900 XT with a 1.8GHz Athlon 64, stressing the need for a balanced CPU and GPU setup in order to avoid wasting money on a fast GPU. We already saw in our GPU performance article that CrossFire (and SLI) do nothing for performance in our Town/Dungeon benchmarks, so the behavior here is not surprising.

Much to our surprise, the Dungeon benchmark ended up being a lot more GPU bound than the Town test, but the conclusions we can draw are very similar. The Radeon X1800 XT does extremely well when paired with a high end CPU and will actually offer the same performance as a Radeon X1900 XT with a lower end or mid range CPU; it isn't until you give the X1900 XT a faster CPU as well that it can really stretch its legs and offer the performance advantage we know it has over its predecessor.

Comments

  • RyanHirst - Sunday, April 30, 2006 - link

    hehe it's ok. Just chompin' at the bit curious, that's all. If anyone on eBay wanted the collectible books I no longer need, I'd have turned them into a magic pair of 275s and I'd know. Alas. Pantone 072 Blue.
  • bob661 - Saturday, April 29, 2006 - link

    I guess this round of testing only applies to ATI video cards. I guess us Nvidia owners are left to guess how CPU performance affects GPU performance. Oh well. :(
  • PrinceGaz - Sunday, April 30, 2006 - link

    Just refer to part 1 of the Oblivion article, find out where your nVidia card of choice lies in relation to the four ATI cards tested this time, and it is easy to see how it will perform with various CPUs.
  • Bladen - Saturday, April 29, 2006 - link

    My guess as to why CPUs help so much in towns is because the Radiant AI takes a fair amount of power.
  • BitByBit - Saturday, April 29, 2006 - link

    I don't know why, but the benchmark graphs didn't appear at all in that review, nor did they in the previous Oblivion review; I get nothing where the graphs should be.
    Has anyone else had this problem?
  • blackbrrd - Saturday, April 29, 2006 - link

    If you have turned off referrers you won't see any images.
  • JarredWalton - Saturday, April 29, 2006 - link

    Someone else had a problem with Norton Internet Security blocking images for some reason.
  • Madellga - Monday, May 1, 2006 - link

    That was me. Turn off privacy control.
  • RyanHirst - Saturday, April 29, 2006 - link

    This article still left me with the nagging question about multithreaded performance. The Oblivion team made reference to the game being heavily optimized for multithreaded performance because they knew from the beginning they'd be writing it simultaneously for the Xbox 360.
    So the debate about the potential of multithreaded code in games has been going on for a while, and here we have the perfect game test, and we happen to know AnandTech had a couple of four-way servers in the shop over the last few weeks... but the CPU guide leaves that question unanswered.
    If it's not unreasonable to heap $1200 in graphics hardware onto a motherboard for a game that is GPU bound only half of the time (outdoors), is it too much to ask that a $1200 pair of Opteron 275s be tested to see how multithreaded the first advertised multithreaded game really is? Is it not possible that the game can offload a large number of threads to extra cores?
    If we can talk about throwing over $1K at a game, isn't anyone the least bit curious how a 4-core Opteron rig with 4 gigs of RAM in NUMA might handle this game?
  • JarredWalton - Saturday, April 29, 2006 - link

    P4 EE 955 runs up to four threads, and it doesn't seem to get any help from the extra capability. It could be that further INI tweaks would allow Oblivion to run better on multi-core (more than 2 core) systems, but if going from 1 to 2 cores gives 10 to 20% more performance, there's a very good chance that moving from 2 to 4 cores wouldn't give more than another 5%. Ideally, a CPU-limited game should be able to get as much as 50% or more performance from multi-threading, but rarely can we realize the ideal case.

    Also, FYI, the servers are at a completely different location than the GPUs for this testing. They also don't support dual X1900 cards in CrossFire - they might not even have x16 PCIe slots, let alone two of them. Server boards are, quite simply, not designed with gaming performance in mind. There are a few out there targeting the 3D graphics workstation that might support SLI, but not CrossFire. More than two cores will really only become important on the desktop once we have CPUs with more than two cores per socket; the desktop/home PC market isn't interested in multiple socket motherboards (other than a few extreme enthusiasts).
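As a back-of-the-envelope check on the diminishing returns described in the comment above, here is a simple Amdahl's law sketch. It is our own illustration, not part of the article's testing: it infers the parallel fraction implied by a 10-20% dual-core gain and shows what that same fraction would predict for four cores.

```python
# Amdahl's law sketch (illustration only, not from the article's testing):
# infer the parallel fraction p from an observed two-core speedup, then see
# what that same p predicts for four cores.

def parallel_fraction(speedup_2core: float) -> float:
    """Solve speedup = 1 / ((1 - p) + p/2) for p."""
    return 2.0 * (1.0 - 1.0 / speedup_2core)

def speedup(p: float, cores: int) -> float:
    """Amdahl's law: speedup over a single core with parallel fraction p."""
    return 1.0 / ((1.0 - p) + p / cores)

for gain_2core in (1.10, 1.20):          # the 10-20% dual-core range quoted above
    p = parallel_fraction(gain_2core)
    s4 = speedup(p, 4)
    extra = s4 / gain_2core - 1.0        # additional gain going from 2 to 4 cores
    print(f"2-core gain {gain_2core - 1:.0%}: p = {p:.2f}, "
          f"4-core speedup {s4:.2f} ({extra:.0%} more than 2 cores)")
```

With a 10% dual-core gain this predicts only about 5% more from two extra cores, and about 11% in the 20% case. Real games rarely follow Amdahl's law this cleanly, so treat the numbers as a rough bound rather than a prediction.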
