The Test

Thankfully, ATI's CrossFire runs on ATI's own chipsets as well as Intel's 975X, so we were able to use our ultra high end GPU of choice to compare CPU performance under Oblivion. Remember that, just like in our first Oblivion article, we're manually walking through portions of the game and using FRAPS to generate our results; the margin for error in our tests is thus much higher than normal, and differences in performance of 5% or less aren't significant and shouldn't be treated as such.
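When post-processing manual walkthrough results like these, it helps to apply the margin-of-error rule consistently rather than eyeballing it. Here's a minimal sketch of that screening step; the function names and the FPS figures are illustrative only, not taken from our benchmark data:

```python
def percent_difference(fps_a: float, fps_b: float) -> float:
    """Relative difference between two average FPS results,
    expressed as a percentage of the slower result."""
    slower = min(fps_a, fps_b)
    return abs(fps_a - fps_b) / slower * 100.0

def is_significant(fps_a: float, fps_b: float, margin_pct: float = 5.0) -> bool:
    """Treat any gap inside the run-to-run margin of error as a tie."""
    return percent_difference(fps_a, fps_b) > margin_pct

# Illustrative numbers only: 40.0 vs 41.5 is a 3.75% gap (a tie under
# a 5% margin), while 30.0 vs 36.0 is a 20% gap (a real difference).
print(is_significant(40.0, 41.5))
print(is_significant(30.0, 36.0))
```

The same logic applies regardless of the capture tool; the only input is a pair of averaged FPS numbers and the margin you trust your methodology to.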

While we tested with a number of AMD CPUs, issues with our Intel test bed prevented us from adjusting clock multipliers to cover the full spread of Intel CPU options, so we were only able to highlight the performance of a handful of Intel CPUs. With what we had, however, we were able to adequately characterize the performance offered by Intel solutions under Oblivion. We also didn't have an Extreme Edition 965 on hand, so the EE 955 is the fastest Intel offering in the test; based on the tests we've done, the EE 965 should offer another 5% or so over the EE 955, in case you're curious.

CPU: AMD Athlon 64 and Athlon 64 X2s; Intel Pentium Extreme Edition, Pentium D and Pentium 4
Motherboard: ASUS A8R32-MVP (AMD); Intel D975XBX (Intel)
Chipset: ATI CrossFire Xpress 3200 (AMD); Intel 975X (Intel)
Chipset Drivers: ATI Catalyst 6.4
Hard Disk: Seagate 7200.9 300GB SATA
Memory: 2 x 1GB OCZ PC3500 DDR 2-3-2-7 1T (AMD); 2 x 1GB OCZ PC8000 DDR2 4-4-4-12 (Intel)
Video Card(s): ATI Radeon X1900 XT CrossFire
Video Drivers: ATI Catalyst 6.4 w/ Chuck Patch
Desktop Resolution: 1280 x 1024 - 32-bit @ 60Hz
OS: Windows XP Professional SP2

Armed with a pair of X1900 XTs running in CrossFire mode - the clear GPU performance leader in our first Oblivion article - we set out to run some additional tests. Pay attention to the rest of the system as well: we've installed 2GB of high quality (i.e. low latency) RAM, which also helps performance. 1GB is sufficient, but Oblivion appears to do a good job of making use of additional memory; load times and area transitions are noticeably quicker with 2GB of RAM. We used the same "High Quality" settings we introduced in the last review:

Oblivion Performance Settings - High Quality:
Resolution: 1280x1024
Texture Size: Large
Tree Fade: 50%
Actor Fade: 65%
Item Fade: 65%
Object Fade: 65%
Grass Distance: 50%
View Distance: 100%
Distant Land: On
Distant Buildings: On
Distant Trees: On
Interior Shadows: 50%
Exterior Shadows: 50%
Self Shadows: On
Shadows on Grass: On
Tree Canopy Shadows: On
Shadow Filtering: High
Specular Distance: 50%
HDR Lighting: On
Bloom Lighting: Off
Water Detail: High
Water Reflections: On
Water Ripples: On
Window Reflections: On
Blood Decals: High
Anti-aliasing: Off
Comments

  • feraltoad - Friday, May 5, 2006 - link

    I would have liked to see some high resolution benchmarks, or at least one. At 1680*1050, would this show that the CPU was no longer the weakest link, with the video card scrambling to render the scene? Would the extra MHz of the CPUs still have a marked effect? Maybe you see the answers to these questions as obvious but not me. :( I play at this resolution. I'm sure some people complain but I've tried running around with FRAPS in different places and as long as the FPS doesn't seem to drop below 24fps then it isn't bad at all. If I look at an Oblivion gate I get 39 fps steady, granted this isn't with Dremoras trying to club me. I find that long views in towns (Anvil) are rougher than the Oblivion gates. Perhaps it's all the NPCs or shadows of buildings?

    However, very nice article, and VERY DEPRESSING! Sad to see the kings of the video cards struggle with a game this much. Also insanely pathetic to see that an FX-60 will net you a whopping 9 fps in the most demanding part of the game. 9FPS?!? That's a frickin' $1000 chip. You could buy a $115 A64 3000+ and OC it (just 250HTT) to 3500+ speeds, and you'd only have a 9 fps difference between a $100 chip and a $1000 chip? That is messed up. (Yeah, I know you can OC the FX too, but at $1000 would you expect that to be necessary? It should brew coffee w/the other core while using office apps. Also, I don't believe anybody buys a 3000+ w/out planning to OC.) So much for "getting what you pay for". I guess it is comforting to normal people who can't drop a grand on the CPU alone.

    I too would have liked to see if an Ageia PhysX chip would have made a difference by freeing up the CPU for other things. Ageia could have really gained some momentum if they could have had this game support their chip & had a board for around ~$150.

    People keep talking about patches. Can a patch really significantly optimize a game in any way other than trading quality for performance?

    Jarred, what is the biggest increase in performance in a single game that you have seen with new video card drivers? Can we expect NVIDIA or ATI to dramatically increase performance with a driver update?

    After all, this will hurt vid card sales if people can expect sub-mediocre performance from the best graphics cards. It will also boost 360 sales if they can't increase performance. Especially when I release my patent pending super dongle that will allow two Xbox 360s to be used together in an SLI/CrossFire configuration. haha (j/k. but my dongle is quite super, I assure you. ;)
  • moletus - Tuesday, May 2, 2006 - link

    2 months ago, I was about to go shop for a 1900 XTX for 600 euros. Then I saw the Xbox 360.
    Guess what! (hint: it cost, with all the bells and whistles, 550eur :)

    Why on earth burn so much money when you can have all the chocolate (+AA and HDR) in the world for 1/3 of the money, while the expensive card still can't run the game?

    I'm no MickeySoft fanboy, but the 360 ride has been surprisingly smooth.
  • The Black Cursor - Monday, May 1, 2006 - link

    Both articles were quite informative, but I don't quite have enough information to make a decision between the upper-mid range GPUs and CPUs at similar price points:

    ATI X1800XT vs. NVIDIA 7900GT

    AMD A64 3700+ vs. AMD A64 X2 3800+

    Any opinions?

    Be seeing you...

    ---> TBC (To Buy Components)
  • xFlankerx - Monday, May 1, 2006 - link

    I say the X2 3800+ with the 7900GT. While the X2 most likely will not make a difference in the game itself, as noted in my last comment, it WILL provide an overall smoother experience from your PC.
  • xFlankerx - Sunday, April 30, 2006 - link

    I just came from reading the AnandTech Oblivion CPU testing. I beg to differ with the conclusion that you need a fast CPU to get the most out of your GPU. I'm more of the opinion that the CPU and GPU handle different tasks. We know that the CPU is much stronger than the GPU in MOST cases. The CPU is what provides the GPU with instructions for what it needs to do; if the CPU is already feeding the GPU more instructions than it can handle, then there is no point in having a faster CPU. See, there was hardly a difference in FPS even with a 1.8GHz A64 compared to a 2.6GHz A64 in areas where it's GPU-intensive. The Oblivion gates are by far the most intensive part of the game, GPU-wise, and every single one of the single-card solutions bottlenecked the CPU in those cases. Only dual X1900 XTXs were able to take advantage of the CPU.

    On the other hand, the Town benchmarks are influenced quite a bit by the CPU. This is easily explained when you notice that in a town there is a lot for the CPU to calculate that is not in the Oblivion gates. There are hordes of NPCs in a city, and the CPU has to control every single one of them (Oblivion NPCs lead lives of their own, by far the most complicated AI in a game yet). Thus, the stronger the CPU, the better your frames per second in a crowded area will be. The clustering of all the video cards' performance here somewhat reinforces the point.

    The Dungeon benchmarks did surprise me. There is a decent amount for the GPU to render, though not as much as in the other areas, and there is very little for the CPU to do. And yet we see quite a bit of improvement with a faster CPU. I'm not entirely sure how to explain that.

    My point is, basically, that "You don't need a fast CPU to get the most out of your GPU." If anything, it should be the other way around, since the GPU is what can't keep up with the CPU. I think the conclusion should be more like, "You need a fast GPU to handle the graphics of the game, but you will suffer a drop in FPS if your CPU cannot keep up with the AI in crowded areas."
  • kuk - Sunday, April 30, 2006 - link

    Any idea on how a Core Duo setup would perform?
  • ballero - Sunday, April 30, 2006 - link

    Anand, are you sure that the performance increases are due to the game?
    AFAIK the Catalyst drivers are multithreaded.
  • RyanHirst - Saturday, April 29, 2006 - link

    Ok, I was just needling a bit about the opteron servers; I didn't really think they'd be a part of the test. :>

    First, PCIe x16 is standard on all dual-socket 940 nForce Professional boards, and several have two x16 slots. I did forget that you can't do CrossFire on an Nvidia board -- but even just checking a server board with one, two, and four cores as a different test, would that not be worth a look?

    Hyperthreading doesn't really answer the question. A (perhaps) 5% improvement with HT is consistent with the "6 of one...." results HT gives in dynamic multithreaded environments (chess, say, or video games -- where the thread results must be compared to values from other threads, or completed within a certain time). There has never been any reason to assume that, because a program does not get a boost from Hyper-Threading, it will not benefit from an additional full processor.
    I'm not arguing that a dual socket-940 rig is mainstream, but should we argue about which is less relevant to 99.9% of the gaming public, an Opteron or a $1200 CrossFire setup? The point I'm trying to make is that regardless of how common it is, a multicore system could answer some questions that remain open. Namely,
    --What are the maximal advantages of multithreading in this first generation of games?

    Every review site seemed happy to pontificate on this subject when there was virtually no data, but now that there is a test candidate, no one seems interested in actually testing it.
    Was it ever in doubt that you can improve a game's performance by throwing a CrossFire X1900 XTX setup at it? No one would argue with that premise. *THE* question that chipmakers are taking revolutionary stands on for video games... is about cores. And I would just REALLY like someone to take a look at it.
    Sorry that was long; I realized the first post sounded grumpy because I didn't explain my reasoning. Cheers, and I take the time to write this because your reviews are always the best; when I ask myself what information I would need to make an objective judgement about something, it's always information you present and analyze. Thanks again!
  • RyanHirst - Saturday, April 29, 2006 - link

    Oh yeah, and about the small boost in performance from the second core: since the multithreaded code in Oblivion is (by their own statements) a result of their Xbox 360 version, I was assuming that the game would be generating low-complexity, nearly branchless threads, none of which would tax the extra processor at anything close to 100%... but that the game engine can generate a LOT of them. Is that not more or less the model of the Xbox 360 processor?
  • JarredWalton - Saturday, April 29, 2006 - link

    The Xbox 360 processor is quite a bit different, handling up to six threads over three cores. The problem is, it's a completely different machine architecture, so the game goes through a different compiler. Even if their code takes full advantage of the Xbox 360's capabilities, they might not bother porting the entire spectrum over to x86. More likely -- and as far as I know there's no way to prove this -- the Xbox 360 version doesn't use more than about half the potential of the Xenon CPU. (Go ahead and call me a "doubting Thomas". LOL)

    Anyway, maybe Anand will find a time to try running a few tests on a quad core versus dual core setup. I unfortunately don't have access to any quad core systems. :(
