Final Words

Looking at the performance offered by a variety of GPUs in Oblivion makes one thing clear: this game is the most stressful title on the market right now. We've focused primarily on stock performance using commonly available settings, but if you're serious about getting the most out of Oblivion we highly recommend looking at some of the tweak guides to help balance performance with appearance. For now, those of you hoping to run Oblivion at 1920x1200 with every detail setting at maximum will need to wait for future GPU generations. So how does the current crop of cards fare?

At the high end, there's no better solution than ATI's Radeon X1900 series. While NVIDIA can offer similar average performance with the GeForce 7900 GTX, its minimum frame rates aren't anywhere near as high as what ATI delivers, meaning the X1900 series will give you a much better overall experience. Oblivion is quite possibly the first game we've ever benchmarked where having multiple GPUs is almost necessary to get good frame rates at relatively common resolutions with most of the impressive visual effects turned on. The performance offered by a pair of X1900 XTs simply can't be matched by any single card, but as good a game as Oblivion is, you'd need a pretty serious computer budget to accommodate the $1200 that a pair of X1900 CrossFire GPUs will set you back.

Looking at mid range offerings, our recommendation sticks with ATI as the Radeon X1800 XT continues ATI's trend of offering absolutely stellar performance (relatively speaking) under Oblivion. At $200, the GeForce 7600 GT also proved to be a fairly strong contender in our medium quality tests.

In terms of upgrades, if you've got a CrossFire or SLI motherboard, adding a second GPU can improve performance by 25-50% in our Oblivion Gate test. X1600 CrossFire is probably the cheapest upgrade that offers a reasonable performance boost, assuming you already own a Radeon X1600 XT; at around $150 you really can't go wrong there. Moving from a single 6800 GS or 7600 GT to SLI also provides nearly 50% more performance, but the cost will be a bit higher.
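
To put those scaling figures in perspective, here's a minimal back-of-the-envelope sketch in Python; the baseline frame rate is an illustrative assumption, and the 25-50% scaling range is simply the figure quoted above for the Oblivion Gate test, not an additional benchmark result:

def fps_with_second_gpu(single_gpu_fps, scaling):
    # Frame rate after adding a second card, given a scaling factor
    # (e.g. 0.25 to 0.50, the range quoted above for the Oblivion Gate test).
    return single_gpu_fps * (1.0 + scaling)

baseline_fps = 24.0  # assumed single-card minimum frame rate (hypothetical)
for scaling in (0.25, 0.50):
    fps = fps_with_second_gpu(baseline_fps, scaling)
    print("%.0f%% scaling: %.0f fps -> %.1f fps" % (scaling * 100, baseline_fps, fps))

Even at the bottom of that range, a second card can push an otherwise borderline minimum frame rate into playable territory, which is why the dual-GPU upgrades look attractive despite the cost.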

If you own a slightly older card like something in the Radeon X800/X850 series, you honestly don't need an upgrade to get acceptable performance under Oblivion. Moving to a Shader Model 3.0 card like something in ATI's Radeon X1800 or X1900 series will give you HDR support and let you turn up some more eye candy, but then you're talking about a fairly significant upgrade investment. Moving to an X1800 XT will set you back more than $300 and still not offer tremendous performance at higher image quality settings; for that you'll have to turn to an X1900 XT or XTX. Owners of the GeForce 6 series are in a similar situation: lowering your expectations a bit may be better than spending a lot of money on an upgrade.

This is just the tip of the iceberg, however; we have a general idea of which GPUs fare best and worst in Oblivion, but what about CPUs? And at what point does it stop making sense to spend money on new graphics cards versus just going out and buying the Xbox 360 version instead? We'll be answering those questions and more as our Oblivion coverage continues...


100 Comments


  • bobsmith1492 - Wednesday, April 26, 2006 - link

    I'm playing with a Mobility 9700 (basically a 9600 Ultra) in my laptop with a Pentium M and a gig of RAM at 1024, with medium settings about like you set them. Where in the world did all those extra settings come from, though (shadows, water)? Is that something outside the game itself?
  • ueadian - Thursday, April 27, 2006 - link

    I played this game fine on my X800XL with high settings. Yeah, it PROBABLY dipped into the 20's, but honestly I never really noticed "lag". I short-circuited my X800XL by stupidly putting a fan with a metal casing on top of it; it went ZZZZT and died. I bought a 7900 GT for 299.99 and voltmodded it to GTX speeds, and I really don't notice a difference while playing the game. Yeah, I'm sure if I paid attention to FPS I'd see it, but really, the only place I noticed lag with my X800XL at high settings was by Oblivion gates, and my 7900 GT at 680 core / 900 mem locks up near Oblivion gates as well. I was sort of forced to "upgrade" my card, but the 7900 GT is the best value for the money right now considering you can do a pen mod to get it to run PAST GTX speeds fairly easily. I have a crappy CRT whose max resolution is 1024x768 and I don't plan on upgrading it anytime soon, so I don't need 512MB of memory to throw the resolution up to godly high settings. Besides, I'm pretty blind; I find it easier to play most online games like FPSes at lower resolutions just to gain an advantage. Oblivion is near perfection as a GAME; it's the most enjoyable game I've ever played, and I've been playing games since Doom. Yeah, the engine does suck, and I was really disappointed to have my brand new top-of-the-line video card actually STUTTER in a game, but really, does it completely ruin the whole game for you? If you have played it, you know that it doesn't.
  • thisisatest - Thursday, April 27, 2006 - link

    7900 series isn't what I consider to be the top of the line. There is high end and there is top of the line. The top of the line is clear.
  • poohbear - Wednesday, April 26, 2006 - link

    I'm really curious to see how dual-core CPUs perform, as Oblivion is supposed to take advantage of multithreading. If AnandTech could do a CPU performance chart, that'd be great. FiringSquad did a CPU performance chart but only at two resolutions, 800x600 and 1280x1024; they found significant differences between dual-core and single-core at 800x600 but no difference at 1280x1024. Now, I play at 1024x768 on my 6800GT, so I'm wondering if a dual core would help at that resolution. Also, if you could investigate some of the supposed tweaks for dual cores and whether they truly work, that'd be great too. Thanks.
  • Eris23007 - Wednesday, April 26, 2006 - link


    A friend of mine is playing it on a 3.4GHz Northwood; he told me that when he enabled HyperThreading he got an immediate ~10% (or so) improvement.

    That's a pretty good indication that dual cores will help a *lot*, in my view...
  • mpeavid - Thursday, April 27, 2006 - link

    10% is VERY poor multithreading performance. A decent multithreaded app should give 40-60% or higher for highly efficient code.

  • nullpointerus - Thursday, April 27, 2006 - link

    HT isn't the same as having dual cores. IIRC, ~10% improvement from HT is rather typical in certain areas where multiple cores have significantly better returns.
  • Akaz1976 - Wednesday, April 26, 2006 - link

    Anyone have any idea how 9800PRO compares to x800?
  • hoppa - Friday, April 28, 2006 - link

    What this test fails to mention is that I'm running a 9800 pro, Athlon XP 3000+, 1.5 gigs of ram, at 1280x768, and the game runs quite well even at medium settings. This game is very stressful at maximum everything but still manages to run incredibly well on older rigs and lower settings. Had I not played this game, after seeing this article I would've thought that it'd be impossible on my rig, but the truth is I've got plenty of computing power to spare.
  • xsilver - Wednesday, April 26, 2006 - link

    The 9800 Pro is considered midrange/low-end now -- I guess that article is coming later.

    My guess is approximately 10% less than the lowest card on each graph besides the 7300 GS (also, you don't have HDR).
