High End GPU Performance w/ Bloom Enabled

The only Shader Model 2.0 cards in this comparison are ATI's Radeon X800 series; the rest of the contenders are SM3.0 capable. While the SM2.0 vs. SM3.0 distinction doesn't matter for most of Oblivion, one feature does require the newer spec: HDR lighting. Oblivion's HDR lighting setting requires a Shader Model 3.0 capable card; otherwise you're left with a less precise lighting approximation called Bloom, or nothing at all. Bloom naturally runs faster on all GPUs, so we couldn't simply throw the X800 numbers in with the HDR results from the previous pages. Instead, we ran a second set of benchmarks with Bloom enabled on all GPUs to show X800 owners whether or not it makes sense to upgrade just to get a higher frame rate.

We left the multi-GPU solutions out of these graphs to save time and make them easier to digest; you've already seen how having multiple GPUs improves performance in these tests, so the focus here will be on the single card upgrade paths available to X800 and X850 series owners.


The white lines within the bars indicate minimum frame rate

ATI's X850 and X800 series perform quite well despite their age, with even the X800 GTO outpacing the GeForce 6800 GS. Unfortunately, if you want a meaningful upgrade from your X850/X800 card, you're going to have to set your sights (and budget) fairly high. The GeForce 7900 GT and Radeon X1800 GTO are probably your best upgrade bets, but if you have an X850 XT or X800 XT, don't expect the performance difference to be tremendous; for that, you'll have to look towards the Radeon X1900 series.

The trend of NVIDIA GPUs posting lower minimum frame rates than their ATI counterparts continues here, which, unfortunately for NVIDIA, leads us to strongly recommend ATI instead.


The white lines within the bars indicate minimum frame rate

Performance in our Town benchmark is pretty much as expected and as we've seen before: the very high end GPUs all hit a performance wall right around 50 fps. The Radeon X800 series starts to pull up the rear, but it still offers significantly better performance than NVIDIA's GeForce 6800 GS (which performs roughly on par with the GeForce 6800 GT/Ultra, depending on clock speeds).


The white lines within the bars indicate minimum frame rate

While all of the GPUs post similar minimum frame rates in our Dungeon test, there is a pretty clear breakdown in performance once we look at cards slower than the GeForce 7800 GT. The standings, however, don't really change from what we've already seen: the X850/X800 cards continue to significantly outperform the GeForce 6800 GS, while any upgrade path that yields a reasonable improvement remains fairly expensive.


100 Comments


  • bobsmith1492 - Wednesday, April 26, 2006 - link

    I'm playing with a 9700 mobility (basically 9600 ultra) in my laptop with a P-M and a gigger at 1024, medium settings about like you set it. Where in the world did all those extra settings come from though (shadows, water)? Is that something outside the game itself?
  • ueadian - Thursday, April 27, 2006 - link

    I played this game fine on my X800 XL with high settings. Yeah, it PROBABLY dipped into the 20s, but honestly I never really noticed "lag". I short-circuited my X800 XL by stupidly putting a fan with a metal casing on top of it; it went ZZZZT and died. I bought a 7900 GT for 299.99 and voltmodded it to GTX speeds, and I really don't notice a difference while playing the game. Yeah, I'm sure if I paid attention to FPS I'd see it, but really, the only place I noticed lag with my X800 XL at high settings was by Oblivion gates, and my 7900 GT at 680 core / 900 mem locks up near Oblivion gates as well. I was sort of forced to "upgrade" my card, but the 7900 GT is the best value for the money right now considering you can do a pen mod to get it to run PAST GTX speeds fairly easily. I have a crappy CRT whose max resolution is 1024x768 and don't plan on upgrading it anytime soon, so I don't need 512MB of memory to throw the resolution up to godly high settings. Besides, I'm pretty blind; I find it easier to play most online games like FPSes at lower resolution just to gain an advantage. Oblivion is near perfection as a GAME; it's the most enjoyable game I've ever played, and I've been playing games since Doom. Yeah, the engine does suck, and I was really disappointed to have my brand new top of the line video card actually STUTTER in a game, but really, does it completely ruin the whole game for you? If you have played it, you know that it doesn't.
  • thisisatest - Thursday, April 27, 2006 - link

    7900 series isn't what I consider to be the top of the line. There is high end and there is top of the line. The top of the line is clear.
  • poohbear - Wednesday, April 26, 2006 - link

    I'm really curious to see how dual-core CPUs perform, as Oblivion is supposed to take advantage of multithreading. If AnandTech could do a CPU performance chart, that'd be great. FiringSquad did a CPU performance chart but only at two resolutions, 800x600 and 1280x1024; they found significant differences between dual-core and single-core at 800x600 but no difference at 1280x1024. Now, I play at 1024x768 on my 6800 GT, so I'm wondering if a dual-core would help at that resolution. Also, if you could investigate some of the supposed tweaks for dual cores and whether they truly work, that'd be great too. Thanks.
  • Eris23007 - Wednesday, April 26, 2006 - link


    A friend of mine is playing it on a 3.4GHz Northwood; he told me that when he enabled HyperThreading he got an immediate ~10% (or so) improvement.

    That's a pretty good indication that dual cores will help a *lot*, in my view...
  • mpeavid - Thursday, April 27, 2006 - link

    10% is VERY poor multithreading performance. A decent multithreaded app should give 40-60% or higher for highly efficient code.

  • nullpointerus - Thursday, April 27, 2006 - link

    HT isn't the same as having dual cores. IIRC, ~10% improvement from HT is rather typical in certain areas where multiple cores have significantly better returns.
  • Akaz1976 - Wednesday, April 26, 2006 - link

    Anyone have any idea how the 9800 Pro compares to the X800?
  • hoppa - Friday, April 28, 2006 - link

    What this test fails to mention is that I'm running a 9800 pro, Athlon XP 3000+, 1.5 gigs of ram, at 1280x768, and the game runs quite well even at medium settings. This game is very stressful at maximum everything but still manages to run incredibly well on older rigs and lower settings. Had I not played this game, after seeing this article I would've thought that it'd be impossible on my rig, but the truth is I've got plenty of computing power to spare.
  • xsilver - Wednesday, April 26, 2006 - link

    The 9800 Pro is considered midrange/low-end now -- I guess that article is coming later.

    My guess is approximately 10% less than the lowest card on each graph besides the 7300 GS (also, you don't have HDR).
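As an editorial aside on the HyperThreading discussion above: Amdahl's law gives a rough sense of why a ~10% gain from HT doesn't cap what a real second core can deliver. The sketch below uses illustrative assumptions (a 20% threaded fraction of frame time, ~1.3x for HT and ~2x for a second core on that fraction); these numbers are hypothetical, not measurements from the article.

```python
# Amdahl's law: overall speedup = 1 / ((1 - p) + p / s), where p is the
# fraction of the workload that is accelerated and s is the speedup of
# that fraction. All fractions below are illustrative assumptions.

def amdahl_speedup(parallel_fraction: float, parallel_speedup: float) -> float:
    """Overall speedup when only part of the work is accelerated."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / parallel_speedup)

# If only ~20% of a game's frame time is threaded, HT's modest boost
# (~1.3x on the threaded portion) yields only ~5% overall.
ht = amdahl_speedup(0.20, 1.3)

# A true second core (~2x on the threaded portion) reaches ~11% overall
# with the same 20% threaded fraction -- better, but still bounded by
# how much of the engine is actually multithreaded.
dual = amdahl_speedup(0.20, 2.0)

print(f"HT-like: {ht:.3f}x, dual-core-like: {dual:.3f}x")
```

This is consistent with the comments above: a small HT gain mostly tells you how little of the engine is threaded, and the same math says a second core helps more, but not dramatically more, until the threaded fraction grows.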
