Setting Expectations

While we're used to running high end SLI setups at 3MP resolutions and still getting reasonable frame rates in games, the same is not true for Oblivion. In fact, there isn't a single GPU or pair of GPUs today that will run Oblivion smoothly at 1600 x 1200 with everything turned on. Our highest end X1900 XT CrossFire setup can't even run our most stressful real world Oblivion test at 1280 x 1024 with all of the detail settings at their highest values. That's right: $1200 worth of GPUs will get you less than 50 fps at less than the highest image quality, and we're not talking about having AA enabled either.

With Oblivion, you have to set your expectations appropriately for what good performance is. If your frame rate never drops below 30 fps, then you've put together a very fast system for Oblivion. The problem with Oblivion is that while performance may be in the 60 - 70 fps range indoors or while taking a stroll around town, as soon as you engage a few enemies or walk near an Oblivion gate your frame rate may drop into the teens. A choppy frame rate really impacts your ability to do things like slice the correct opponent rather than someone you're trying to protect. If a video card can maintain a minimum of 20 fps in our most strenuous test (the Oblivion Gate benchmark), it will serve you very well; otherwise, you may want to start turning down some of the visual quality options.
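
Because it's the minimum frame rate, not the average, that determines playability here, it's worth looking at a benchmark run's worst frames directly. Below is a minimal sketch of that idea; the log file name, its format (one per-frame render time in milliseconds per line, the kind of data a tool like FRAPS can capture) and the 20 fps cutoff are our illustrative assumptions, not part of the actual test harness.

```python
# Minimal sketch: judge a benchmark run by its worst frames, not its average.
# Assumes "frametimes.txt" holds one per-frame render time per line, in
# milliseconds; the file name and the 20 fps cutoff are illustrative.

def load_frame_times_ms(path):
    """Read per-frame render times (ms), one value per line."""
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

def fps_stats(frame_times_ms):
    """Return (average fps, worst-case fps) for a run."""
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    worst_ms = max(frame_times_ms)  # the single slowest frame
    return 1000.0 / avg_ms, 1000.0 / worst_ms

if __name__ == "__main__":
    avg_fps, min_fps = fps_stats(load_frame_times_ms("frametimes.txt"))
    print(f"average: {avg_fps:.1f} fps, minimum: {min_fps:.1f} fps")
    # Our rule of thumb: a card that holds 20 fps in the Oblivion Gate
    # test will serve you well; otherwise, turn some settings down.
    print("playable" if min_fps >= 20 else "turn some settings down")
```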

Oblivion is also one of those rare games where turning down all of the image quality options not only impacts how good the game looks, but can have a pretty serious impact on gameplay as well. Turning down your view distance is a sure-fire way to increase performance; however, the lower your view distance, the more difficult it is to spot landmarks you're searching for. You can decrease things like the distance at which characters, items and other objects appear, giving you better performance but also putting you at a disadvantage when you're looking for a particular item or when someone is about to attack you. With Oblivion, it's not just about trading slightly blurred textures and jagged edges for higher frame rates; the total experience of the game is very dependent on having a powerful system with a fast GPU.
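
Those draw-distance sliders are stored in the game's Oblivion.ini, so the performance-for-visibility tradeoff described above can be scripted. The sketch below scales a set of fade distances down by a fixed factor; note that the section and key names are hypothetical placeholders (we haven't verified them against a real Oblivion.ini), so treat this as an illustration of the approach, look up the real keys for your install, and back the file up first.

```python
# Minimal sketch: trade view distance for frame rate by scaling the
# distance/fade settings in an ini file. The section and key names below
# are hypothetical placeholders, not verified Oblivion.ini entries --
# find the real keys for your install and back the file up first.
import configparser

FADE_KEYS = [
    ("Display", "fActorFadeDistance"),   # hypothetical: NPC draw distance
    ("Display", "fItemFadeDistance"),    # hypothetical: item draw distance
    ("Display", "fObjectFadeDistance"),  # hypothetical: object draw distance
]

def scale_fade_distances(path, factor):
    """Multiply each fade distance by `factor` (e.g. 0.75 = 25% shorter)."""
    ini = configparser.ConfigParser()
    ini.optionxform = str  # preserve the file's key capitalization
    ini.read(path)
    for section, key in FADE_KEYS:
        if ini.has_option(section, key):
            old = ini.getfloat(section, key)
            ini.set(section, key, f"{old * factor:.1f}")
    with open(path, "w") as f:
        ini.write(f)

# Example: pull everything 25% closer for a frame rate boost.
# scale_fade_distances("Oblivion.ini", 0.75)
```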

The Test

Given that we're looking at GPU performance, we tested all graphics cards with the same very fast CPU to minimize any CPU bottlenecks. In future articles we will look at how the CPU impacts performance under Oblivion, but for now the GPU is our focus. All ATI cards were run on a CrossFire 3200 platform, while all NVIDIA cards were run on an nForce4 SLI x16 platform.

The latest drivers from ATI and NVIDIA were used, including ATI's Chuck patch for Oblivion, which enables CrossFire support and AA+HDR rendering. Given the low frame rates we're already talking about, enabling AA simply didn't make any sense, as you will see from the performance results on the coming pages. We would much rather increase detail settings than turn on AA in Oblivion.

CPU: AMD Athlon 64 FX-60 (2.6GHz/1MBx2)
Motherboards: ASUS A8N32-SLI (NVIDIA), ASUS A8R32-MVP (ATI)
Chipsets: NVIDIA nForce4 SLI x16, ATI CrossFire 3200
Chipset Drivers: nForce4 6.85 (NVIDIA), ATI Catalyst 6.4 (ATI)
Hard Disk: Seagate 7200.9 300GB SATA
Memory: 2 x 1GB OCZ PC3500 DDR (2-3-2-7)
Video Drivers: ATI Catalyst 6.4 w/ Chuck Patch, NVIDIA ForceWare 84.43
Desktop Resolution: 1280 x 1024 - 32-bit @ 60Hz
OS: Windows XP Professional SP2
Comments

  • bobsmith1492 - Wednesday, April 26, 2006 - link

    I'm playing with a 9700 Mobility (basically a 9600 Ultra) in my laptop with a P-M and a gig of RAM at 1024, medium settings about like you set it. Where in the world did all those extra settings come from, though (shadows, water)? Is that something outside the game itself?
  • ueadian - Thursday, April 27, 2006 - link

    I played this game fine on my X800XL with high settings. Yeah, it PROBABLY dipped into the 20s, but honestly I never really noticed "lag". I short-circuited my X800XL by stupidly putting a fan with a metal casing on top of it; it went ZZZZT and died. I bought a 7900 GT for 299.99 and voltmodded it to GTX speeds, and I really don't notice a difference while playing the game. Yeah, I'm sure if I paid attention to FPS I'd see it, but really, the only place I noticed lag with my X800XL at high settings was by Oblivion gates, and my 7900 GT at 680 core / 900 mem locks up near Oblivion gates as well. I was sort of forced to "upgrade" my card, but the 7900 GT is the best value for the money right now considering you can do a pen mod to get it to run PAST GTX speeds fairly easily.

    I have a crappy CRT whose max resolution is 1024x768 and don't plan on upgrading it anytime soon, so I don't need 512MB of memory to throw the resolution up to godly high settings; besides, I'm pretty blind, and I find it easier to play most online games like FPSes at lower resolution just to gain an advantage. Oblivion is near perfection as a GAME; it's the most enjoyable game I've ever played, and I've been playing games since Doom. Yeah, the engine does suck, and I was really disappointed to have my brand new top of the line video card actually STUTTER in a game, but really, does it completely ruin the whole game for you? If you have played it, you know that it doesn't.
  • thisisatest - Thursday, April 27, 2006 - link

    The 7900 series isn't what I consider to be top of the line. There is high end and there is top of the line, and the top of the line is clear.
  • poohbear - Wednesday, April 26, 2006 - link

    I'm really curious to see how dual-core CPUs perform, as Oblivion is supposed to take advantage of multithreading. If AnandTech could do a CPU performance chart, that'd be great. FiringSquad did a CPU performance chart but only @ 2 resolutions, 800x600 & 1280x1024; they found significant differences between dual-core and single-core at 800x600 but no diff at 1280x1024. Now, I play @ 1024x768 on my 6800GT, so I'm wondering if a dual-core would help at that resolution. Also, if you could investigate some of the supposed tweaks for dual-cores and whether they truly work, that'd be great too. Thanks.
  • Eris23007 - Wednesday, April 26, 2006 - link


    A friend of mine is playing it on a 3.4GHz Northwood; he told me that when he enabled HyperThreading he got an immediate ~10% improvement.

    That's a pretty good indication that dual cores will help a *lot*, in my view...
  • mpeavid - Thursday, April 27, 2006 - link

    10% is VERY poor multithreading performance. A decent multithreaded app should give 40-60% and higher for highly efficient code.

  • nullpointerus - Thursday, April 27, 2006 - link

    HT isn't the same as having dual cores. IIRC, ~10% improvement from HT is rather typical in certain areas where multiple cores have significantly better returns.
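
For context on the numbers in this thread: the gap between HyperThreading's ~10% and a real second core's potential gain is roughly what Amdahl's law predicts. Here's a quick worked example; the parallel fraction and the HT efficiency figure below are illustrative guesses, not measurements from this article or the game.

```python
# Rough Amdahl's-law sketch for the HT vs. dual-core discussion above.
# The parallel fraction and HT efficiency are illustrative guesses.

def speedup(parallel_fraction, effective_cores):
    """Amdahl's law: overall speedup given the parallelizable share."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / effective_cores)

p = 0.5  # assume half of each frame's work parallelizes (a guess)
# HyperThreading acts like a weak extra core (~1.2x one core here);
# a true second core is closer to the full 2x.
print(f"HT (~1.2 effective cores): {speedup(p, 1.2):.2f}x")  # ~1.09x
print(f"dual core (2.0 cores):     {speedup(p, 2.0):.2f}x")  # ~1.33x
```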
  • Akaz1976 - Wednesday, April 26, 2006 - link

    Anyone have any idea how the 9800 Pro compares to the X800?
  • hoppa - Friday, April 28, 2006 - link

    What this test fails to mention is that I'm running a 9800 pro, Athlon XP 3000+, 1.5 gigs of ram, at 1280x768, and the game runs quite well even at medium settings. This game is very stressful at maximum everything but still manages to run incredibly well on older rigs and lower settings. Had I not played this game, after seeing this article I would've thought that it'd be impossible on my rig, but the truth is I've got plenty of computing power to spare.
  • xsilver - Wednesday, April 26, 2006 - link

    The 9800 Pro is considered midrange/low-end now -- I guess that article is coming later.

    My guess is approx 10% less than the lowest card on each graph besides the 7300GS (also, you don't have HDR).
