Mid Range GPU Performance w/ HDR Enabled

Next we looked at mainstream GPU performance, targeting graphics cards priced at or below $300; for reference, we've tossed in a few pairs of cards running in SLI or CrossFire.

Not only have we reduced the resolution, but we've also significantly reduced the image quality settings here. The result is a good balance between image quality and performance; however, we would much rather play with our high quality settings, as Oblivion can be a very impressive looking game with the details cranked up.


The white lines within the bars indicate minimum frame rate

The king of the mid range is actually the Radeon X1800 XT, offering pretty much the best performance you can get for under $300 and even outperforming the GeForce 7900 GT. If you already have a Radeon X1600 XT and happen to own a CrossFire capable motherboard, then simply picking up one more X1600 XT will do wonders for your performance. Remember that the X1600 series can run in CrossFire mode without any external dongle, so all you need are two X1600 XTs and you'll be looking at fairly good performance. While we wouldn't recommend purchasing two X1600 XTs (you're far better off with a single 7900 GT), if you already have one, it's the cheapest way to get a good performance boost in Oblivion.

The GeForce 7600 GT proves to be a solid middle-of-the-road performer here, offering good performance while priced at under $200.

The GeForce 6600 GT is unfortunately overwhelmed by our medium quality settings, and has now become a low end contender as far as Oblivion is concerned. Running a pair of 6600 GTs in SLI improves performance a bit, but still nowhere near what a pair of X1600 XTs will do in CrossFire mode.


The white lines within the bars indicate minimum frame rate

Once again we see that there's no significant performance difference between the GPUs at the top of the charts, but performance really begins to drop off after the 7600 GT.


The white lines within the bars indicate minimum frame rate

The Medium Quality Dungeon benchmark is very friendly to the NVIDIA cards here, with the first ATI showing being the X1800 GTO halfway down the graph. The GeForce 7900 GT and Radeon X1800 XT continue to be the best performers here, but the 7600 GT isn't too far behind.

Comments

  • JarredWalton - Thursday, April 27, 2006 - link

    I can't say I'd even begin to consider DOAC as better looking graphics. But if that's what you like, more power to you.
  • dhei - Thursday, April 27, 2006 - link

    quote:

    I can't say I'd even begin to consider DOAC as better looking graphics. But if that's what you like, more power to you.


    That's easy to say when you probably don't play it currently. It has all the graphics features you see in Oblivion minus maybe HDR.
  • ueadian - Thursday, April 27, 2006 - link

    Lol, are you serious? DAoC might look better while you are smoking the reefer, bud, but I've played it many times and Oblivion blows it away graphically.
  • dhei - Thursday, April 27, 2006 - link

    Well, you're doing drugs if you think Oblivion has good graphics. I consider them mediocre compared to other games of the same kind, as I have seen people play the game.

    DAOC graphics look a ton better to me imo...drug free..
  • hondaman - Wednesday, April 26, 2006 - link

    Why play Oblivion instead of an MMO?

    1. No monthly fees
    2. Oblivion has an end. MMOs don't. That's a good thing for those of us with lives, but little self control.
    3. No annoying kids to deal with.
    4. No annoying cliques
    5. No annoying server downtimes.
    6. Not having to answer "a/s/l" every 30 seconds.

    There is a pleasant serenity about single player RPGs that is impossible with an MMO.
  • TejTrescent - Wednesday, April 26, 2006 - link

    But hmm.

    My rig is nowhere close to the rig that you guys used for comparison, and I don't know my exact framerates because I've not yet run FRAPS with Oblivion, but..

    My 3500+ Newcastle, not overclocked, with 2GB of Corsair/Mushkin running dual channel at 2.5-3-3-7, with my AGPx8 6800OC from BFGTech (not overclocked any further either).. I pull highly playable framerates (aka no choppiness unless I'm getting jumped by 6+ Daedric mages, that lightning is killer) at settings MILDLY better than the medium GPU ones (though still 1024x768, just higher fade rates), no tweaks on either. I can even run MediaMonkey for music in the background without any choppy feelings.

    I guess Oblivion isn't very CPU dependent or gains anything from multithreading really or something, because huh. I mean, I can't believe my crappy 3500+ is keeping decent pace with an FX-60. o_o And better RAM. Just huh. I can generally tell if a framerate falls below 24 thanks to FPS games being painful at any lower than that.. and just huh. Weiird.
  • JarredWalton - Wednesday, April 26, 2006 - link

    I would say there's a good chance your framerates are below 20 on a regular basis in the wilderness. FX-60, 7900 GT SLI, 2GB RAM, 2x150GB Raptors, and at Fort Bloodford looking towards the closest Oblivion Gate, I pull 13-15 FPS. (1920x1200, most detail settings at high.) I've also found that lowering a lot of settings doesn't have much of an impact. The various "fade" settings don't do much for me.

    Open the console and type "TDT" to see your frame rates. I personally find anything above 15 FPS to be acceptable for Oblivion, but opinions vary. :)
  • TejTrescent - Wednesday, April 26, 2006 - link

    Well.. I'll check it out later tonight, but if they are, it's absolutely the smoothest sub 20fps I've ever seen.

    Lot more playable than UT2004, Painkiller, FarCry, or... pretty much anything else on this comp. XD
  • TejTrescent - Thursday, April 27, 2006 - link

    Wish there was an edit because I feel stupid replying to myself, but. Huh.

    18-30 outside commonly, 18-35 in the city, and consistent 25-35 in dungeon areas.

    I am so so confused right now. The 18 isn't even noticeable. How did they.. what did they.. wha.. Guess it's just the slower pace making it less noticeable..
  • nullpointerus - Thursday, April 27, 2006 - link

    Uh...I'm not a graphics guru, but is it possible that the dips in fps are smoother? If we draw a graph with "1" indicating a frame draw and "x" indicating stalling of some kind - such as processing sound, physics, or waiting on the GPU hardware - then I can illustrate what I'm talking about.

    1xx1xx1xx1xx1xx
    1xx1xx1xx1xx1xx
    1xx1xx1xx1xx1xx
    1xx1xx1xx1xx1xx (20 fps on a 60Hz display - balanced)

    1xxxx11xx1xxx1x
    1xxxx11xx1xxx1x
    1xxxx11xx1xxx1x
    1xxxx11xx1xxx1x (20 fps on a 60Hz display - choppy)

    Or do 3D games not work like this?
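
    The diagram above boils down to frame-time spread: two runs can average the same FPS while one spaces its frames evenly and the other bunches them up, and the bunched-up run feels choppier. The following is a minimal Python sketch of that idea, using made-up frame times rather than anything measured in Oblivion:

        # Minimal sketch of the frame-pacing idea illustrated above.
        # Both sequences total one second across 20 frames (20 FPS average),
        # but the "choppy" one has much longer worst-case frames.
        # All numbers are made up for illustration.

        def frame_stats(frame_times_ms):
            """Return (average FPS, longest single frame in ms) for a run."""
            avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
            return avg_fps, max(frame_times_ms)

        # Evenly paced: every frame takes 50 ms (the "balanced" rows above).
        balanced = [50.0] * 20

        # Same average, uneven pacing: frames alternate 20 ms / 80 ms
        # (the "choppy" rows above), so some frames arrive noticeably late.
        choppy = [20.0, 80.0] * 10

        for name, times in (("balanced", balanced), ("choppy", choppy)):
            avg, worst = frame_stats(times)
            print(f"{name}: {avg:.1f} FPS average, worst frame {worst:.0f} ms")

    Both runs average exactly 20 FPS, but the longest frame in the choppy run is 80 ms (an instantaneous 12.5 FPS), which is the kind of dip that reads as a stutter even when the reported frame rate looks fine.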
