GPU Performance vs. CPU Clock Speed

For these tests, we took a single core AMD Athlon 64 (1MB L2) and increased its clock speed from 1.8GHz all the way up to 2.6GHz, measuring performance at each step along the way. The image quality settings haven't changed; what we're looking for here is whether there's a pattern in the CPU/GPU relationship.

We picked four GPU configurations to look at their CPU dependency: the Radeon X1900 XT CrossFire, X1900 XT, X1800 XT and X1800 XL. We chose these four because they represent the best ultra high end, high end, upper mid-range and mid-range options for Oblivion. Our main interest is in finding out whether there is a point at which the best mid-range GPU ends up being faster than the best high end GPU because it's paired with a faster CPU, or whether having a faster GPU is really all that matters in Oblivion.

What this graph shows is that our Oblivion Gate benchmark is really only CPU bound if you've got a pair of X1900 XTs in CrossFire. That does mean that if you've got a low end Athlon 64, you won't see much of a performance difference between a single X1900 XT and a pair of them running in CrossFire mode. But for the most part this benchmark is no different from what we've seen in other games, with the X1900 XT, X1800 XT and X1800 XL being basically GPU bound - let's see if our other two tests show the same picture.

The Town benchmark is extremely CPU bound, as you can see from this graph, and in Oblivion you do spend quite a bit of time walking around in towns. Being able to isolate the individual lines in this graph isn't very important because they basically all show the same thing; what is important is to compare points across both dimensions of the graph, GPU and CPU speed. What the graph shows us is that a single X1800 XT paired with a 2.4GHz CPU offers much better performance than an X1900 XT with a 1.8GHz Athlon 64, which stresses the need for a balanced CPU and GPU setup if you want to avoid wasting money on a fast GPU. We already saw in our GPU performance article that CrossFire (and SLI) do nothing for performance in our Town/Dungeon benchmarks, so the behavior here is not surprising.

Much to our surprise, the Dungeon benchmark ended up being a lot more GPU bound than the Town test, but the conclusions we can draw are very similar. The Radeon X1800 XT does extremely well when paired with a high end CPU and will actually offer the same performance as a Radeon X1900 XT with a lower end or mid-range CPU; it isn't until you give the X1900 XT a faster CPU as well that it can really stretch its legs and offer the performance advantage we know it has over its predecessor.
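
If you want a rough mental model of what's going on in these graphs, think of each frame as requiring some CPU work (AI, physics, feeding the GPU) and some GPU work (rendering), with the slower side setting the frame rate. The short sketch below is purely illustrative - the per-frame costs in it are made-up numbers, not figures from our benchmarks - but it reproduces the general shape of the results.

```python
# Toy frame-time model: a frame is finished only when both the CPU work
# (AI, physics, draw-call submission) and the GPU work (rendering) are done,
# so the slower of the two sets the frame rate.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Approximate frames per second when CPU and GPU work overlap."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical per-frame costs, for illustration only (not measured data).
gate_gpu_ms = 40.0   # Oblivion Gate scene: heavy rendering load
town_gpu_ms = 15.0   # Town scene: lighter rendering, lots of NPC AI

for cpu_ghz in (1.8, 2.2, 2.6):
    # CPU frame time scales inversely with clock speed (36ms at 1.8GHz here)
    cpu_ms = 36.0 * 1.8 / cpu_ghz
    print(f"{cpu_ghz:.1f}GHz: Gate {fps(cpu_ms, gate_gpu_ms):4.1f} fps, "
          f"Town {fps(cpu_ms, town_gpu_ms):4.1f} fps")
```

In this toy model the GPU-heavy Gate scene barely moves as the CPU gets faster, while the lighter-rendering Town scene scales almost linearly with clock speed, which is roughly the pattern of the graphs above.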

36 Comments

  • feraltoad - Friday, May 5, 2006 - link

    I would have liked to see some high resolution benchmarks, or at least one. At 1680*1050, would this show that the CPU was no longer the weakest link, with the video card scrambling to render the scene? Would the extra MHz of the CPUs still have a marked effect? Maybe you see the answers to these questions as obvious but not me. :( I play at this resolution. I'm sure some people complain, but I've tried running around with FRAPS in different places and as long as the FPS doesn't seem to drop below 24fps then it isn't bad at all. If I look at an Oblivion gate I get 39 fps steady, granted this isn't with Dremoras trying to club me. I find that long views in towns (Anvil) are rougher than the Oblivion gates. Perhaps it's all the NPCs or the shadows of buildings?

    However, very nice article, and VERY DEPRESSING! Sad to see the Kings of the Video Cards struggle with a game this much. Also insanely pathetic to see that an FX-60 will net you a whopping 9 fps in the most demanding part of the game. 9FPS?!? That's a frickin' $1000 chip. You could buy a $115 A64 3000+ and OC it (just 250HTT) to 3500+ speeds, and you only have a 9 fps difference between a $100 and a $1000 chip? That is messed up. (Yeah, I know you can OC the FX too, but still, at $1000 would you expect that to be necessary? It should brew coffee w/the other core while using office apps. Also, I don't believe anybody buys a 3000+ w/out planning to OC.) So much for "getting what you pay for". I guess it is comforting to normal people who can't drop a grand on the CPU alone.

    I too would have liked to see if an Ageia PhysX chip would have made a difference by freeing up the CPU for other things. Ageia could have really gained some momentum if they could have had this game support their chip & had a board for around $150.

    People keep talking about patches. Can a patch really significantly optimize a game any way other than decreasing quality for performance?

    Jarred, what is the biggest increase in performance in a single game that you have seen with new video card drivers? Can we expect Nvidia or ATI to dramatically increase performance with a driver update?

    After all, this will hurt vid card sales if people can expect sub-mediocre performance from the best graphics cards. It will also boost 360 sales if they can't increase performance. Especially when I release my patent pending super dongle that will allow two Xbox 360s to be used together in an SLI/CrossFire configuration. haha (j/k. but my dongle is quite super, I assure you. ;)
  • moletus - Tuesday, May 2, 2006 - link

    2 months ago, I was about to go shop for a 1900XTX for 600 euros. Then I saw the Xbox 360.
    Guess what! (hint: with all the bells and whistles it cost 550 euros :)

    Why on earth burn so much money, and still not be able to run the game properly, when you can have all the chocolate (+AA and HDR) in the world for 1/3 of the money?

    I'm no MickeySoft fanboy, but the 360 ride has been surprisingly smooth.
  • The Black Cursor - Monday, May 1, 2006 - link

    Both articles were quite informative, but I don't quite have enough information to make a decision between the upper mid-range GPUs and CPUs at similar price points:

    ATI X1800XT vs. NVIDIA 7900GT

    AMD A64 3700+ vs. AMD A64 X2 3800+

    Any opinions?


    Be seeing you...

    ---> TBC (To Buy Components)
  • xFlankerx - Monday, May 1, 2006 - link

    I say the X2 3800+ with the 7900GT. While the X2 most likely will not make a difference in the game itself, as noted in my last comment, it WILL provide an overall smoother experience from your PC.
  • xFlankerx - Sunday, April 30, 2006 - link

    I just came from reading the AnandTech Oblivion CPU testing. I beg to differ with the conclusion that you need a fast CPU to get the most out of your GPU. I'm more of the opinion that the CPU and GPU handle different tasks. We know that the CPU is much stronger than the GPU in MOST cases. Now, the CPU is what provides the GPU with instructions for what the GPU needs to do. If the CPU is already feeding the GPU more instructions than it can handle, then there is no point in having a faster CPU. See, there was hardly a difference in FPS between a 1.8GHz A64 and a 2.6GHz A64 in areas that are GPU-intensive. The Oblivion gates are by far the most intensive part of the game, GPU-wise. Now, every single one of the single-card solutions was the bottleneck in these cases, not the CPU. Only dual X1900XTXs were able to take advantage of the CPU.

    On the other hand, the Town benchmarks are influenced quite a bit by the CPU. This is easily explained when you notice that in a town there is a lot for the CPU to calculate that is not in the Oblivion Gates. There are hordes of NPCs in a city, and the CPU has to control every single one of them (Oblivion NPCs lead lives of their own, by far the most complicated AI in a game yet). Thus, the stronger the CPU, the better your frames per second in a crowded area will be. How closely all of the video cards' results are clumped together here somewhat reinforces the point.

    The Dungeon benchmarks did surprise me. There is a decent amount for the GPU to render, though not as much as in the other areas. However, there is very little for the CPU to calculate. And yet, we see quite a bit of improvement with a faster CPU. I'm not entirely sure how to explain that.

    My point is, basically, that "You don't need a fast CPU to get the most out of your GPU." If anything, it should be the other way around, since the GPU is what can't keep up with the CPU. I think the conclusion should be more like, "You need a fast GPU to handle the graphics of the game, but you will suffer a drop in FPS if your CPU cannot keep up with the AI in the game in crowded areas."
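
The AI argument above can be sketched with a similarly rough toy model in which per-NPC simulation work is added to the CPU side of the frame. Every number below is hypothetical rather than measured; it is only meant to illustrate the commenter's reasoning.

```python
# Toy model of the argument above: in towns the CPU cost grows with the
# number of NPCs being simulated, while the GPU cost stays fairly modest.
def town_fps(npcs: int, cpu_ghz: float,
             base_cpu_ms: float = 10.0, ai_ms_per_npc: float = 1.0,
             gpu_ms: float = 15.0) -> float:
    """FPS for a hypothetical town scene; the slower side sets the pace."""
    cpu_ms = (base_cpu_ms + npcs * ai_ms_per_npc) * 1.8 / cpu_ghz
    return 1000.0 / max(cpu_ms, gpu_ms)

for npcs in (5, 15, 30):
    print(f"{npcs:2d} NPCs: 1.8GHz -> {town_fps(npcs, 1.8):4.1f} fps, "
          f"2.6GHz -> {town_fps(npcs, 2.6):4.1f} fps")
```

With only a handful of NPCs both clock speeds end up GPU limited and identical, but as the crowd grows the faster CPU pulls ahead, which is essentially the crowded-town behavior described above.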
  • kuk - Sunday, April 30, 2006 - link

    Any idea on how a Core Duo setup would perform?
  • ballero - Sunday, April 30, 2006 - link

    Anand, are you sure that the performance increases are due to the game?
    AFAIK the Catalyst drivers are multithreaded.
  • RyanHirst - Saturday, April 29, 2006 - link

    Ok, I was just needling a bit about the Opteron servers; I didn't really think they'd be a part of the test. :>

    First, PCIe x16 is standard on all dual socket 940 nForce Professional boards, and several have two x16 slots. I did forget that you can't do CrossFire on an Nvidia board -- but even just checking a server board with one, two, and four cores as a different test, would that not be worth a look?

    Hyperthreading doesn't really answer the question. A (perhaps) 5% improvement with HT is consistent with the "6 of one...." results HT gives in dynamic multithreaded environments (chess, say, or video games -- where the thread results must be compared to values from other threads, or completed within a certain time). There has never been any reason to assume that, because a program does not get a boost from HyperThreading, it will not benefit from an additional full processor.
    I'm not arguing that a dual socket 940 rig is mainstream, but should we argue about which is less relevant to 99.9% of the gaming public, an Opteron or a $1200 CrossFire setup? The point I'm trying to make is that regardless of how common it is, a multicore system could answer some questions that remain open. Namely,
    --What are the maximal advantages of multithreading in this first generation of games?

    Every review site seemed happy to pontificate on this subject when there was virtually no data, but now that there is a test candidate, no one seems interested in actually testing it.
    Was it ever in doubt that you can improve a game's performance by throwing a CrossFire x1900xtx at it? No one would argue with this premise. *THE* question that chipmakers are taking revolutionary stands on for video games... is about cores. And I would just REALLY like someone to take a look at it.
    Sorry that was long; I realized the first post sounded grumpy because I didn't explain my reasoning. Cheers, and I take the time to write this because your reviews are always the best; when I ask myself what information I would need to make an objective judgement about something, it's always information you present and analyze. Thanks again!
  • RyanHirst - Saturday, April 29, 2006 - link

    Oh yeah, and about the small boost in performance from the second core: since the multithreaded coding in Oblivion is (by their own statements) a result of their Xbox 360 version, I was assuming that the game would be generating low-complexity, nearly branchless threads, none of which would tax the extra processor at anything close to 100%... but that the game engine can generate a LOT of them. Is that not more or less the model of the Xbox 360 processor?
  • JarredWalton - Saturday, April 29, 2006 - link

    The Xbox 360 processor is quite a bit different, handling up to six threads over three cores. The problem is, it runs a completely different machine architecture, so it goes through a different compiler. Even if their code takes full advantage of the Xbox 360's capabilities, they might not bother porting the entire spectrum over to x86. More likely -- and as far as I know there's no way to prove this -- the Xbox 360 version doesn't use more than about half the potential of the Xenon CPU. (Go ahead and call me "doubting Thomas". LOL)

    Anyway, maybe Anand will find time to try running a few tests on a quad core versus dual core setup. I unfortunately don't have access to any quad core systems. :(
