High End GPU Performance w/ HDR Enabled


The white lines within the bars indicate minimum frame rate

At the very high end, in our most strenuous benchmark, $1200 worth of graphics cards buys you less than 50 fps on average. Which vendor you go with matters little, as both ATI and NVIDIA offer similar average performance at the very high end, with one very important exception: ATI delivers much higher minimum frame rates than NVIDIA in this test. We tried adjusting NVIDIA's render-ahead setting but couldn't improve the situation, so while both vendors' best performers offer similar average frame rates, the ATI Radeon X1900 XT CrossFire setup is better overall thanks to its higher minimum frame rates.
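To see why we put so much weight on minimum frame rates, here's a minimal sketch; the frame-time traces below are hypothetical, not our benchmark data. Two cards can post identical averages while one stutters far harder, which is exactly the situation the top CrossFire and SLI setups are in here.

```python
# Hypothetical frame-time traces (in ms), not measured data: both "cards"
# average the same fps, but the spiky one bottoms out far lower.

def fps_stats(frame_times_ms):
    """Return (average fps, minimum fps) from per-frame render times in ms."""
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    worst_ms = max(frame_times_ms)      # slowest single frame -> minimum fps
    return 1000.0 / avg_ms, 1000.0 / worst_ms

steady = [22, 21, 23, 22, 22, 22]       # ~45 fps average, ~43 fps minimum
spiky  = [18, 18, 18, 18, 18, 42]       # ~45 fps average, ~24 fps minimum

for name, trace in (("steady", steady), ("spiky", spiky)):
    avg, low = fps_stats(trace)
    print(f"{name}: {avg:.0f} fps average, {low:.0f} fps minimum")
```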

Looking at single card performance, once again ATI takes the crown as the Radeon X1900 XTX has higher average and minimum frame rates than the GeForce 7900 GTX.

What really puts things into perspective, though, is the performance of the GeForce 7800 GTX: a GPU that was at one point a $500 king of the hill now falls in the lower half of the graph. Unable to average more than 20 fps, it simply can't cope with the settings we're running here. Given that we haven't turned up every feature and are running at a relatively mainstream 1280 x 1024 resolution, this chart alone gives you a good indication of exactly how stressful Oblivion actually is.

GeForce 6 owners should no longer consider their GPUs high end, because Oblivion certainly doesn't. Even a pair of GeForce 6800 GSes can't break 15 fps in this test, and with a minimum frame rate of 10 fps they are far from delivering playable performance at these settings. Believe it or not, the GeForce 6800 GS performs like a mid-range card at best under Oblivion.


The white lines within the bars indicate minimum frame rate

At our high quality 1280 x 1024 setting, virtually all of the cards offer nearly identical performance when walking around inside a town, even down to the minimum frame rates. The problem with these numbers is that you can't determine what settings to run Oblivion at based on in-town or in-dungeon performance alone, because the moment you step outside the town walls you'll find yourself watching a slide show. It's also worth noting that although many of these cards post average frame rates in the 50s, their minimums all drop to right around 30 fps. Crank up any of the detail settings and we'll be looking at even worse minimums, which matter just as much as the averages.

We see no benefit to SLI or CrossFire here, due to whatever limitation we're running into at these settings. Exactly what is causing this limitation is something we will investigate in future articles; we would assume we're CPU limited, even though we're already running an Athlon 64 FX-60. That doesn't bode well for slower processors, as there simply isn't much more we can throw at the game.
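As a rough illustration of the kind of sanity check that investigation will involve, the sketch below uses made-up numbers, not our results: if the average frame rate barely moves while resolution (and thus GPU load) climbs, the graphics card probably isn't the bottleneck.

```python
# Illustrative only: hypothetical fps readings across resolutions.
results = {            # resolution -> average fps
    (1024, 768):  52.0,
    (1280, 1024): 51.0,
    (1600, 1200): 50.0,
}

fps = list(results.values())
spread = (max(fps) - min(fps)) / max(fps)

if spread < 0.10:      # under ~10% variation despite a big pixel-count swing
    print("fps is flat across resolutions -> likely CPU/engine limited")
else:
    print("fps scales with resolution -> likely GPU limited")
```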

It isn't until we get below the GeForce 7800 GTX that performance begins to drop off for our contenders here, and once you get slower than the X1800 GTO the minimum frame rates begin to dip below 30 fps.


The white lines within the bars indicate minimum frame rate

Like our Town test, our Dungeon benchmark shows the cream of the crop performing very similarly, with performance only really dropping off below the 7800 GTX. Although the Dungeon test also runs into some sort of performance limiter, it appears to be a different one from what we saw walking around town, because the frame rate ceiling is now up around 80 fps instead of 50 fps.

What the combination of these three tests shows is the full gamut of these GPUs' performance under Oblivion, from the worst conditions to the best. And while everyone is fairly competitive indoors or walking around a town, once you journey beyond the town walls you can really start to appreciate a faster video card.

Comments

  • smitty3268 - Friday, April 28, 2006 - link

    Well, all the tests that had the XT ahead of the XTX were obviously CPU bound, so for all intents and purposes you should have read the performance as being equal.

    I would like to know a bit about the drivers though. Were you using Catalyst AI and does it make a difference?
  • coldpower27 - Thursday, April 27, 2006 - link

    Quite a nice post there, well said Jarred.
  • JarredWalton - Thursday, April 27, 2006 - link

    LOL - a Bolivian = Oblivion. Thanks, Dragon! :D (There are probably other typos as well. Sorry.)
  • alpha88 - Thursday, April 27, 2006 - link

    Opteron 165, 7800GTX 256meg

    I run at 1920x1200 with every ingame setting set to max, HDR, no AA, (16x AF)

    The game runs just fine.

    I don't know what the framerates are, but whatever they are, it's very playable.

    I have a few graphics mods installed (new textures), and the graphics are good enough that I randomly stop and take screenshots because the view looks so awesome.
  • z3R0C00L - Thursday, April 27, 2006 - link

    The game is a glimpse at the future of gaming. The 7x00 series is old. True, nVIDIA was able to remain competitive with revamped 7800's, which they now call 7900's, but consumers need to remember that these cards have a smaller die for a reason... they offer fewer features, less performance, and are not geared towards HDR gaming.

    Right now nVIDIA and ATi have a complete role reversal from the x800XT PE vs. 6800 Ultra days. The 6800 Ultra performed on par with or beat the x800XT PE. The kick was that the 6800 Ultra produced more heat (larger die), was louder (larger cooler), but had more features and was more forward-looking. Right now we have the same thing.

    ATi's x1900 series has a larger die, produces more heat (a larger die draws more power), and comes with a larger cooler. The upside is that it's a better card. The x1900 series totally dominates the 7900 series. Some will argue about OpenGL, others will point to nonexistent flaws in ATi's drivers... the truth is that those who make these comments on either side are hardware fans. Product-wise, the x1900 series should be the card you buy if you're looking for a high-end card; if you're looking more towards the middle of the market, the x1800XT is better than the 7900GT.

    Remember: performance, features, and technology... the x1k series has the 7x00 series beat on all of them. Larger die space... more heat. Larger die space... more features.

    Heat/Power for features and performance... hmmm fair tradeoff if you ask me.
  • aguilpa1 - Thursday, April 27, 2006 - link

    Inefficient game programming is no excuse to go out and spend $1200 on a graphics system. Games built on Crytek's older CryEngine have proven they can provide 100% of the Oblivion immersion and eye candy without crippling your graphics system and bringing your computer to a crawl. Ridiculous game and test... 'nuff said.
  • dguy6789 - Thursday, April 27, 2006 - link

    The article is of a nice quality, very informative. However, what I ponder more than GPU performance in this game is CPU performance. Please do an in-depth CPU performance article that includes Celerons, Pentium 4s, Pentium Ds, Semprons, Athlon 64s, and Athlon 64 X2s. FiringSquad did an article, however it only contained four AMD CPUs that were of relatively the same speed in the first place. I, as well as many others, would greatly appreciate an in-depth article covering CPU performance, dual core benefits, and anything else you can think of.
  • coldpower27 - Thursday, April 27, 2006 - link

    I would really enjoy a CPU scaling article for this game with Intel-based processors, from Celeron Ds to Pentium 4s and Pentium Ds.

  • frostyrox - Thursday, April 27, 2006 - link

    It's something I already knew, but I'm glad AnandTech has brought it into full view. Oblivion is arguably one of the best PC games I've seen in 2006, and could very well turn out to be one of the best we'll see all year. Instead of optimizing the game for the PC, Bethesda (and Microsoft indirectly) bring to the PC a half-*ssed, amateur, embarrassing, and insanely bug-ridden 360 port. I think I have the right to say this because I have a relatively fast PC (A64 3700+, X800 XL, 2GB Corsair, SATA2 HDDs, etc.) and I'm roughly 65 hours into Oblivion right now. Next time Bethesda should use the Daikatana game engine - that way gamers with decent PCs might not see framerates drop from 75 to 25 every time an extra character comes onto the screen and sneezes. Right now you may be thinking that I'm mad about all this. Not quite. But I will say this much: next time I get the idea of upgrading my PC, I'll have to remember that upgrading the video card may be pointless if the best games we see this year are 360 ports running at 30 frames. So here's to you, Bethesda and Microsoft, for ruining a gaming experience that could've been so much more if you gave a d*mn about PC gamers.
  • trexpesto - Thursday, April 27, 2006 - link

    Maybe Oblivion should be $100?
