The Elder Scrolls IV: Oblivion CPU Performance
by Anand Lal Shimpi on April 28, 2006 10:00 AM EST
Whenever a PC game pushes the limits of what current hardware can do, it generally ends up being fairly GPU bound. In the past, as long as you had pretty much any Socket-939 Athlon 64 you had enough CPU power to drive even the fastest single GPU video cards. You would typically be running at fairly GPU-bound graphics settings - even if you were CPU-bound, frame rates would be high enough that it wouldn't really matter. However, every now and then there comes a game that is an equal opportunity stress test on your system, requiring an extremely fast CPU as well as a high end GPU. Bethesda Softworks' latest hit title, The Elder Scrolls IV: Oblivion, is such a game.
In our initial article on Oblivion performance we compared high end and mid range PCI Express GPUs, discovering that we had finally found a game that was stressful enough to truly demand more GPU power than what is currently available on the market. Today's article uses the same benchmarks that we used in our first article, but focuses on finding the right mix of CPU and GPU performance for the best Oblivion experience.
It's worth stating up-front that we are not going to attempt to find ideal settings for every possible CPU/GPU configuration available. There are many tweaks that can be made that will dramatically improve performance on slower CPUs. Reducing the height of the grass as well as the density - or turning off grass entirely - will help a lot. Running without HDR, using medium textures, turning off shadow filtering... you can easily get performance to a level that many people will find acceptable, but it always comes at the cost of reducing the quality of the graphics - or at least the complexity of the graphics. We're interested in characterizing CPU performance under identical configurations for this article, providing an apples-to-apples look at how the Oblivion engine runs on a variety of processors.
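For reference, the grass and quality tweaks described above are usually made by editing Oblivion.ini (found under My Documents\My Games\Oblivion). The following is a sketch using the setting names most commonly circulated in community tweak guides; the exact names, sections, and defaults may vary by game version, so verify them against your own file before editing:

```ini
; Oblivion.ini - community-reported grass tweaks (illustrative values)
[Grass]
iMinGrassSize=120              ; default is reportedly 80; higher = sparser grass
fGrassEndDistance=3000.0000    ; draw grass over a shorter distance
fGrassStartFadeDistance=2000.0000
```

Raising iMinGrassSize thins the grass without removing it entirely, which is why it shows up in so many tweak guides as the first change to try on a slower system.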
Comments
RyanHirst - Sunday, April 30, 2006
hehe it's ok. Just chompin' at the bit curious, that's all. If anyone on eBay wanted the collectible books I no longer need, I'd have turned them into a magic pair of 275s and I'd know. Alas. Pantone 072 Blue.
bob661 - Saturday, April 29, 2006
I guess this round of testing only applies to ATI video cards. I guess us Nvidia owners are left to guess how CPU performance affects GPU performance. Oh well. :(
PrinceGaz - Sunday, April 30, 2006
Just refer to part 1 of the Oblivion article and find out where your nVidia card of choice lies in relation to the four ATI cards tested this time, and it is easy to see how it will perform with various CPUs.
Bladen - Saturday, April 29, 2006
My guess as to why CPUs help so much in towns is because the Radiant AI takes a fair amount of power.
BitByBit - Saturday, April 29, 2006
I don't know why, but the benchmark graphs didn't appear at all in that review, nor did they in the previous Oblivion review; I get nothing where the graphs should be.
Has anyone else had this problem?
blackbrrd - Saturday, April 29, 2006
If you have turned off referring (the HTTP referer header) you won't see any images.
JarredWalton - Saturday, April 29, 2006
Someone else had a problem with Norton Internet Security blocking images for some reason.
Madellga - Monday, May 1, 2006
That was me. Turn off privacy control.
RyanHirst - Saturday, April 29, 2006
This article still left me with the nagging question about multithread performance. The Oblivion team made reference to the game being heavily optimized for multithread performance because they knew from the beginning they'd be writing it simultaneously for the Xbox 360.
So the debate about the potential of multithreaded code in games has been going on for a while, and here we have the perfect game to test, and we happen to know AnandTech had a couple of four-way servers in the shop over the last few weeks.... but the CPU guide leaves that question unanswered.
If it's not unreasonable to heap $1200 in graphics hardware onto a M/B for a game that is GPU bound only half of the time (outdoors), is it too much to ask that a $1200 pair of Opteron 275's be tested to see how multithread the first advertised multithread game really is? Is it not possible that the game can offload a large number of threads to extra cores?
If we can talk about throwing over $1K at a game, isn't anyone the least bit curious how a 4-core opteron rig with 4 gigs of RAM in NUMA might handle this game?
JarredWalton - Saturday, April 29, 2006
P4 EE 955 runs up to four threads, and it doesn't seem to get any help from the extra capability. It could be that further INI tweaks would allow Oblivion to run better on multi-core (more than 2 core) systems, but if going from 1 to 2 cores gives 10 to 20% more performance, there's a very good chance that moving from 2 to 4 cores wouldn't give more than another 5%. Ideally, a CPU-limited game should be able to get as much as 50% or more performance from multi-threading, but rarely can we realize the ideal case.
Also, FYI, the servers are at a completely different location than the GPUs for this testing. They also don't support dual X1900 cards in CrossFire - they might not even have X16 PCIe slots, let alone two of them. Server platforms are, quite simply, not designed with gaming performance in mind. There are a few out there targeting the 3D graphics workstation that might support SLI, but not CrossFire. Four cores will really only become important once we have quad-core CPUs in a single socket; the desktop/home PC market isn't interested in multiple-socket motherboards (other than a few extreme enthusiasts).
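For readers wondering what the INI tweaks mentioned in this thread actually look like: the switches below are the threading-related settings most often cited in community tweak guides for Oblivion. Their effectiveness is debated (as Jarred notes, the gains beyond two cores appear small), and the names and values here are as reported by the community rather than officially documented, so treat this as a sketch to verify against your own Oblivion.ini:

```ini
; Oblivion.ini - community-reported multi-threading switches (unverified)
[General]
bUseThreadedAI=1               ; reportedly moves some AI work to another thread
iNumHavokThreads=2             ; physics threads; some guides raise this on dual-core
bUseThreadedBlood=1
bUseThreadedMorpher=1
bUseThreadedParticleSystem=1
```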