Oblivion CPU Performance

Here's one thing we really didn't expect: our most GPU intensive test turned out to be extremely CPU dependent as well. In its natural state with no out-of-game tweaks, Oblivion gives dual-core CPUs about a 10% performance increase over their single core counterparts at the top of our charts. Moving down the graphs, the X2 3800+ has a 15% performance advantage over the 3200+, while the Pentium D 930 has a 20% advantage over the higher clocked Pentium 4 641.

While 10% may not sound like much, especially given that our FRAPS benchmarks can vary by up to 5% between runs, keep in mind that this is an extremely GPU intensive benchmark; a 10% difference at the fastest clock speeds available is quite significant. Intel's processors clearly need more help, and the larger 15-20% boost they get from dual cores is nice to see. Unfortunately, it's not nearly enough to catch up to AMD's competing offerings.
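For clarity on how the percentages in these charts are computed, the math is just the relative difference between two average FPS figures. The frame rates below are hypothetical placeholders for illustration, not numbers from our benchmarks:

```python
def pct_advantage(faster_fps: float, slower_fps: float) -> float:
    """Percent advantage of one average FPS result over another."""
    return (faster_fps - slower_fps) / slower_fps * 100.0

# Hypothetical FRAPS averages, not actual benchmark results:
dual_core_fps = 44.0
single_core_fps = 40.0
print(f"{pct_advantage(dual_core_fps, single_core_fps):.1f}%")  # 10.0%
```

Given the up-to-5% run-to-run variance noted above, differences near that margin should be treated with caution; the 10-20% deltas discussed here are well outside it.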

As we've seen in other games, AMD's Athlon 64 X2 and vanilla Athlon 64 are the way to go, taking up the overwhelming majority of the top spots in this graph. The Pentium Extreme Edition 955 is about the only Intel CPU that is significantly competitive here; the Pentium D 930 is the next Intel CPU to make an appearance, and it offers performance lower than the single core Athlon 64 3500+.

Later in this review we will look at another way of characterizing CPU performance in Oblivion, but rest assured that this graph does far more than simply state for the millionth time that the Athlon 64 FX-60 is faster than the Athlon 64 3500+.

Once again, we see a 10-15% boost from dual core CPUs in our Town benchmark. The rest of the standings echo what we saw in the Oblivion Gate test above. The usefulness of the Radeon X1900 XT CrossFire setup diminishes significantly as you move down the list of contenders; you'll actually lose over 20% of your real world frame rate with an Athlon 64 3500+ compared to an Athlon 64 FX-60. These aren't low resolution tests designed to isolate the impact of your CPU; they are tests at reasonable display settings for the GPU setup, and these are real world results.

Oblivion also clearly benefits from larger CPU caches. The abysmal performance of the Celeron D 351 is the clearest example, but you can also see it in higher performing parts: the difference between the Athlon 64 3700+ and the 3500+, which run at the same clock speed but differ in L2 cache size, is around 5-7%, more than many other titles show.

Our final Oblivion benchmark delivers the same message we've seen in the first two: CPU performance matters. But the real question is, how fast of a CPU do you really need to make your GPU investment worthwhile? That's the question we're answering next...

Comments

  • RyanHirst - Sunday, April 30, 2006 - link

    hehe it's ok. Just chompin' at the bit curious, that's all. If anyone on eBay wanted the collectible books I no longer need, I'd have turned them into a magic pair of 275s and I'd know. Alas. Pantone 072 Blue.
  • bob661 - Saturday, April 29, 2006 - link

    I guess this round of testing only applies to ATI video cards. I guess us Nvidia owners are left to guess how CPU performance affects GPU performance. Oh well. :(
  • PrinceGaz - Sunday, April 30, 2006 - link

    Just refer to part 1 of the Oblivion article and find out where your nVidia card of choice lies in relation to the four ATI cards tested this time, and it is easy to see how it will perform with various CPUs.
  • Bladen - Saturday, April 29, 2006 - link

    My guess as to why CPUs help so much in towns is because the Radiant AI takes a fair amount of power.
  • BitByBit - Saturday, April 29, 2006 - link

    I don't know why, but the benchmark graphs didn't appear at all in that review, nor did they in the previous Oblivion review; I get nothing where the graphs should be.
    Has anyone else had this problem?
  • blackbrrd - Saturday, April 29, 2006 - link

    If you have turned off referrer headers in your browser you won't see any images.
  • JarredWalton - Saturday, April 29, 2006 - link

    Someone else had a problem with Norton Internet Security blocking images for some reason.
  • Madellga - Monday, May 01, 2006 - link

    That was me. Turn off privacy control.
  • RyanHirst - Saturday, April 29, 2006 - link

    This article still left me with the nagging question about multithreaded performance. The Oblivion team made reference to the game being heavily optimized for multithreading because they knew from the beginning they'd be writing it simultaneously for the Xbox 360.
    So the debate about the potential of multithreaded code in games has been going on for a while, and here we have the perfect test game, and we happen to know AnandTech had a couple of four-way servers in the shop over the last few weeks.... but the CPU guide leaves that question unanswered.
    If it's not unreasonable to heap $1200 in graphics hardware onto a motherboard for a game that is GPU bound only half of the time (outdoors), is it too much to ask that a $1200 pair of Opteron 275s be tested to see how multithreaded the first advertised multithreaded game really is? Is it not possible that the game can offload a large number of threads to extra cores?
    If we can talk about throwing over $1K at a game, isn't anyone the least bit curious how a four-core Opteron rig with 4GB of RAM in NUMA might handle this game?
  • JarredWalton - Saturday, April 29, 2006 - link

    P4 EE 955 runs up to four threads, and it doesn't seem to get any help from the extra capability. It could be that further INI tweaks would allow Oblivion to run better on multi-core (more than 2 core) systems, but if going from 1 to 2 cores gives 10 to 20% more performance, there's a very good chance that moving from 2 to 4 cores wouldn't give more than another 5%. Ideally, a CPU-limited game should be able to get as much as 50% or more performance from multi-threading, but rarely can we realize the ideal case.

    Also, FYI, the servers are at a completely different location than the GPUs used for this testing. They also don't support dual X1900 cards in CrossFire - they might not even have X16 PCIe slots, let alone two of them. Server boards, quite simply, aren't designed with gaming performance in mind. There are a few out there targeting the 3D graphics workstation market that might support SLI, but not CrossFire. More than two cores will really only become important on the desktop once single socket quad core CPUs arrive; the desktop/home PC market isn't interested in multiple socket motherboards (other than a few extreme enthusiasts).
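For readers wanting to experiment with the INI tweaks Jarred mentions, community tweak guides cite a handful of threading-related flags in Oblivion.ini. The names, sections, and defaults below come from third-party documentation rather than this review, so verify them against your own install before relying on them:

```ini
; Threading-related Oblivion.ini tweaks (names per community tweak guides; verify locally)
[General]
bUseThreadedAI=1        ; default 0 - run Radiant AI processing on a separate thread
iNumHavokThreads=5      ; default 1 - worker threads for Havok physics
bUseThreadedBlood=1     ; default 0
bUseThreadedMorpher=1   ; default 0
```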
