When both Doom 3 and Half Life 2 came out we burned the midnight oil trying to put together guides for CPU and GPU performance in the games as soon as they were released. Much to our surprise, especially given the performance hype that had preceded both of them, both games ran relatively well on most mainstream hardware that was available at the time. One GPU generation later, the worries about performance under Doom 3 and Half Life 2 were yesterday's news. The same, unfortunately, cannot be said about Bethesda Softworks' latest immersive RPG: Oblivion.

The game itself is more addictive and immersive than any of its predecessors, and its reviews confirm that. But we're not here to tell you that the game is great, we're here to tell you what you need to run it. The fact of the matter is that Oblivion is the most stressful game we've ever encountered, taking the crown away from F.E.A.R. as something that simply doesn't run well on anything. Obtaining good performance under Oblivion is so hard that a number of optimization guides have popped up helping users do whatever it takes to make the game playable. At AnandTech we've been using the Oblivion Tweak Guide from Tweakguides.com and recommend reading it if you're looking to get a good idea of the impact of the many visual settings available in the game.

Just as we've done in our previous articles on Doom 3 and Half Life 2, we're splitting our Oblivion performance coverage into multiple parts. This first part will focus on high-end and mid-range PCIe GPU performance, and future articles will look at CPU performance as well as low-end GPU and AGP platform performance if there is enough demand for the latter two. Where we take this series of articles in the future will depend largely on your demands and requests, so please make them heard.

Benchmarking Oblivion

There are really three types of areas you encounter while playing Oblivion: you'll find your character either 1) outdoors, 2) inside a town but still outdoors, or 3) inside a building or dungeon. Interestingly enough, our seemingly haphazard list of Oblivion locales is actually organized in ascending order of performance. You'll encounter your absolute highest performance inside buildings, while you'll actually contemplate spending $1200 on graphics cards whenever you find yourself outside. It only made sense that we benchmarked in each of those three areas, so we constructed manually scripted (read: walk-throughs by hand) benchmarks taking us through one of each type of area in Oblivion.


Oblivion Gate Benchmark

The first test is our Oblivion Gate benchmark, which just so happens to be the most stressful of all three. In this test we spot an Oblivion gate in The Great Forest and walk towards it as scamps attempt to attack our character. The benchmark takes place in a heavily wooded area with lots of grass; combined with the Oblivion gate itself, even the fastest GPUs will have trouble breaking 30 fps here.


Town Benchmark

The next test takes place in the town of Bruma and simply features our character walking through a portion of the town. There are a few other characters on screen but no major interaction takes place. Despite the simplicity of the test, since it takes place outdoors it is already quite stressful for some mid-range GPUs.


Dungeon Benchmark

Our final test takes place in the Sanctum on our way to the Imperial City prisons; this "Dungeon" benchmark showcases indoor area performance and consists of our character sneaking through the dimly lit Sanctum. There are guards around, but none appear in our character's view. Many cards will do well in this test, but unless they can pass the first benchmark their performance here is meaningless.

We measured frame rates using FRAPS and reported both the minimum and average frame rates in our charts (we left out maximum frame rates because they simply aren't as important and they made the graphs a little too difficult to read when we included them). The minimum frame rates are indicated by a vertical white line inside the bar representing average frame rate.

Since we measured performance using FRAPS and not through a scripted timedemo sequence, the amount of variance between runs is higher than normal; differences in performance of 5% or less aren't significant and shouldn't be treated as such.
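The frame-rate methodology above can be sketched in code. The following Python fragment is a hypothetical illustration, not our actual tooling: the `fps_stats` and `runs_differ` helpers and the sample frame times are made up for this sketch. It shows how average and minimum frame rates fall out of a FRAPS-style per-frame time log, and how the roughly 5% run-to-run variance threshold might be applied when comparing results:

```python
def fps_stats(frametimes_ms):
    """Return (average_fps, minimum_fps) from a list of per-frame times in ms."""
    total_s = sum(frametimes_ms) / 1000.0
    avg_fps = len(frametimes_ms) / total_s
    # The single slowest frame sets the minimum instantaneous frame rate.
    min_fps = 1000.0 / max(frametimes_ms)
    return avg_fps, min_fps

def runs_differ(fps_a, fps_b, threshold=0.05):
    """Treat two runs as meaningfully different only if they diverge by more
    than the ~5% run-to-run variance inherent to manual walk-throughs."""
    return abs(fps_a - fps_b) / max(fps_a, fps_b) > threshold

# Hypothetical frame times (ms) from a short walk-through segment.
times = [16.7, 20.0, 33.3, 16.7, 25.0]
avg, minimum = fps_stats(times)
```

The key point the sketch captures is that the minimum is driven by the worst single frame, which is why outdoor scenes with sudden detail pop-in punish it so badly even when the average looks healthy.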

100 Comments

  • smitty3268 - Friday, April 28, 2006 - link

    Well, all the tests that had the XT ahead of the XTX were obviously CPU bound, so for all intents and purposes you should have read the performance as being equal.

    I would like to know a bit about the drivers though. Were you using Catalyst AI and does it make a difference?
  • coldpower27 - Thursday, April 27, 2006 - link

    Quite a nice post there, well said Jarred.
  • JarredWalton - Thursday, April 27, 2006 - link

    LOL - a Bolivian = Oblivion. Thanks, Dragon! :D (There are probably other typos as well. Sorry.)
  • alpha88 - Thursday, April 27, 2006 - link

    Opteron 165, 7800GTX 256meg

    I run at 1920x1200 with every ingame setting set to max, HDR, no AA, (16x AF)

    The game runs just fine.

    I don't know what the framerates are, but whatever they are, it's very playable.

    I have a few graphics mods installed (new textures), and the graphics are good enough that I randomly stop and take screenshots, the view looked so awesome.
  • z3R0C00L - Thursday, April 27, 2006 - link

    The game is a glimpse at the future of gaming. The 7x00 series is old. True, nVIDIA were able to remain competitive with revamped 7800's which they now call 7900's but consumers need to remember that these cards have a smaller die space for a reason... they offer less features, less performance and are not geared towards HDR gaming.

    Right now nVIDIA and ATi have a complete role reversal from the x800XT PE vs. 6800 Ultra days. The 6800 Ultra performed on par with or beat the x800XT PE. The kick was that the 6800 Ultra produced more heat (larger die) and was louder (larger cooler), but had more features and was more forward-looking. Right now we have the same thing.

    ATi's x1900 series has a larger die, produces more heat (larger die means more voltage to operate) and comes with a larger cooler. The upside is that it's a better card. The x1900 series totally dominates the 7900 series. Some will argue about OpenGL, others will point to nonexistent flaws in ATi drivers... the truth is those who make these comments on both sides are hardware fans. Product-wise, the x1900 series should be the card you buy if you're looking for a high-end card... if you're looking more towards the middle of the market the x1800XT is better than the 7900GT.

    Remember performance, features and technology.. the x1k series has all of them above the 7x00 series. Larger die space.. more heat. Larger die space.. more features.

    Heat/Power for features and performance... hmmm fair tradeoff if you ask me.
  • aguilpa1 - Thursday, April 27, 2006 - link

    Inefficient game programming is no excuse to go out and spend $1200 on a graphics system. Games built on the old Crytek CryEngine have proven they can provide 100% of the Oblivion immersion and eye candy without crippling your graphics system and bringing your computer to a crawl. Ridiculous game and test,... nuff said.
  • dguy6789 - Thursday, April 27, 2006 - link

    The article is of a nice quality, very informative. However, what I ponder more than GPU performance in this game is CPU performance. Please do an in-depth CPU performance article that includes Celerons, Pentium 4s, Pentium Ds, Semprons, Athlon 64s, and Athlon 64 X2s. FiringSquad did an article, however it only contained four AMD CPUs that were of roughly the same speed in the first place. I, as well as many others, would greatly appreciate an in-depth article speaking of CPU performance, dual core benefits, as well as anything else you can think of.
  • coldpower27 - Thursday, April 27, 2006 - link

    I would really enjoy a CPU scaling article with Intel based processors from the Celeron D's, Pentium 4's, and Pentium D's in this game.

  • frostyrox - Thursday, April 27, 2006 - link

    It's something I already knew but I'm glad Anandtech has brought it into full view. Oblivion is arguably one of the best PC games I've seen in 2006, and could very well turn out to be one of the best we'll see all year. Instead of optimizing the game for the PC, Bethesda (and Microsoft indirectly) bring to the PC a half *ss, amateur, embarrassing, and insanely bug-ridden 360 port. I think I have the right to say this because I have a relatively fast PC (a64 3700+, x800 xl, 2gb Corsair, sata2 hdds, etc) and I'm roughly 65hrs into Oblivion right now. Next time Bethesda should use the Daikatana game engine - that way gamers with decent PCs might not see framerates of 75 go to 25 every time an extra character came onto the screen and sneezed. Right now you may be thinking that I'm mad about all this. Not quite. But I will say this much: next time I get the idea of upgrading my PC, I'll have to remember that upgrading the videocard may be pointless if the best games we see this year are 360 ports running at 30 frames. So here's to you Bethesda and Microsoft, for ruining a gaming experience that could've been so much more if you gave a d*mn about PC gamers.
  • trexpesto - Thursday, April 27, 2006 - link

    Maybe Oblivion should be $100?
