The Elder Scrolls IV: Oblivion Performance

We've talked a lot about The Elder Scrolls IV: Oblivion and how it's one of the most graphics-intensive games available right now. New games are always raising the bar for graphics hardware, but for now, Oblivion remains one of the most taxing titles a graphics card can run. Thankfully, Oblivion offers a very large selection of quality settings that can be tweaked, allowing users to optimize the game for their particular card. In our opinion, it's important to get the highest image quality possible out of Oblivion in order to fully enjoy the game, which is why we would suggest holding off on playing it with a lower-end graphics card until you can upgrade to something that can handle higher quality settings. These are the settings we used when testing Oblivion:

Oblivion Performance Settings
Texture Size: Large
Tree Fade: 100%
Actor Fade: 100%
Item Fade: 66%
Object Fade: 90%
Grass Distance: 50%
View Distance: 100%
Distant Land: On
Distant Buildings: On
Distant Trees: On
Interior Shadows: 95%
Exterior Shadows: 85%
Self Shadows: On
Shadows on Grass: On
Tree Canopy Shadows: On
Shadow Filtering: High
Specular Distance: 100%
HDR Lighting: On
Bloom Lighting: Off
Water Detail: High
Water Reflections: On
Water Ripples: On
Window Reflections: On
Blood Decals: High
Anti-aliasing: Off
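
As a purely illustrative aside, the snippet below shows one way these settings could be saved alongside each benchmark run so the exact test configuration travels with the results; the dictionary keys simply mirror the table above and are not Oblivion's .ini variable names.

    import json

    # Quality settings used for the Oblivion benchmark (mirroring the table above).
    OBLIVION_SETTINGS = {
        "Texture Size": "Large",
        "Tree Fade": "100%",
        "Actor Fade": "100%",
        "Item Fade": "66%",
        "Object Fade": "90%",
        "Grass Distance": "50%",
        "View Distance": "100%",
        "Distant Land": "On",
        "Shadow Filtering": "High",
        "HDR Lighting": "On",
        "Bloom Lighting": "Off",
        "Anti-aliasing": "Off",
        # ... remaining entries from the table
    }

    # Keep the configuration next to the FRAPS logs for the run.
    with open("oblivion_benchmark_settings.json", "w") as f:
        json.dump(OBLIVION_SETTINGS, f, indent=2)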


For our Oblivion benchmark, we use FRAPS to record the average frame rate during a walkthrough of a specific section of the game world. Because there are no console commands for recording or playing back demos, this is one of our more difficult games to benchmark. The benchmark takes place in the wilderness at night, walking toward an imposing Oblivion gate surrounded by flames. Several low-level enemies nearby notice and attack the player as the benchmark runs, and their actions vary slightly from run to run: sometimes they hit the player with a fireball, and sometimes they miss. This adds a bit of variance to our tests, but the enemy AI is fairly consistent, and to get more accurate results we run multiple tests at each resolution and take an average. This lets us be fairly confident that our results are accurate and repeatable.
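
As a rough illustration of that multi-run averaging, the sketch below shows one way the per-second FPS samples FRAPS logs for each walkthrough could be combined into a single number per resolution. The file naming, CSV layout, and directory structure are assumptions made for the example, not a description of our actual test harness.

    import csv
    import statistics
    from pathlib import Path

    def run_average_fps(log_path: Path) -> float:
        """Average the per-second FPS samples from one benchmark run's log."""
        samples = []
        with log_path.open(newline="") as f:
            for row in csv.reader(f):
                try:
                    samples.append(float(row[-1]))  # last column assumed to hold the FPS value
                except (ValueError, IndexError):
                    continue                        # skip header rows and blank lines
        return statistics.mean(samples)

    def benchmark_average(log_dir: str, pattern: str = "oblivion_run*_fps.csv") -> float:
        """Average the per-run averages across repeated walkthroughs at one resolution."""
        runs = [run_average_fps(p) for p in sorted(Path(log_dir).glob(pattern))]
        if not runs:
            raise FileNotFoundError(f"no logs matching {pattern} in {log_dir}")
        return statistics.mean(runs)

    if __name__ == "__main__":
        # e.g. three recorded walkthroughs at 1600x1200
        print(f"Average FPS: {benchmark_average('logs/1600x1200'):.1f}")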

The Elder Scrolls IV: Oblivion (frame rate charts at each tested resolution)


In Oblivion we see lower frame rates than in Battlefield 2, simply because Oblivion is a much more graphically demanding game. With lower quality settings these cards would post more playable frame rates (roughly 25 fps and up is playable in this particular benchmark), but we test at higher quality settings because we feel they are important to fully enjoying this game.

In this game we can really see the difference the 8800 makes over all of the other cards. Even though Oblivion tends to favor ATI hardware, the 8800 GTX blows away ATI's top card here, delivering a 102% performance increase over the Sapphire X1950 XTX at 1600x1200. The EVGA and BFG 7950 GX2s also beat the ATI X1950 XTX at both resolutions thanks to their dual-GPU advantage. We've found that ATI hardware generally handles Oblivion a little better than NVIDIA's, and this is evident when we compare the reference X1650 XT with its direct competitor, the 7600 GT. In Battlefield 2 the 7600 GT generally did better than the X1650 XT, but here in Oblivion the X1650 XT is the better performer of the two. The worst performers are again the Gigabyte 7600 GS and the PowerColor X1600 PRO, though a number of the cards here have a hard time running this game well. If Oblivion is your game of choice, you will probably need to upgrade to at least a 7900 GS or better.
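
For reference, the percentage figures in these comparisons are simple ratios of the average frame rates from the charts. A minimal sketch with made-up numbers rather than our measured results:

    def percent_increase(new_fps: float, baseline_fps: float) -> float:
        """How much faster new_fps is than baseline_fps, expressed in percent."""
        return (new_fps / baseline_fps - 1.0) * 100.0

    # Illustrative values only: roughly 40 fps against 20 fps works out to about a 100% increase.
    print(f"{percent_increase(40.4, 20.0):.0f}% faster")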

Comments (48)

  • rnemeth - Friday, December 1, 2006 - link


    I personally think this article was extremely on point with the direction of media components in general. I think there will be many people out there, like me, who are planning on the convergence of the Media Center PC & Gaming PC. Vista (Ultimate) will bring Media Center to the masses and why shouldn't you be able to play your favorite DX10 "Game for Windows" on your big HDTV as well?

    This article is ahead of its time. HTPC/DRM/HDMI/DVI/BR/HD-DVD/HDTV is all in the early stages. In the future, you will be able to buy or rent your high-def movie by downloading it to your PC with DRM finally figured out. This review shows us that it is not all there yet, but gives us an idea of who is doing what, and what we need to look for.

    You mentioned in the beginning of the article that you were looking for feedback to see how interested your audience is on this subject... count me as 1.
  • thejez - Sunday, November 26, 2006 - link

    HDCP is a joke... look at how hard it is to understand and get this stuff running... not to mention you have to upgrade all your hardware?? lol and who watches movies on their PC anyway?? do you people really sit huddled over your keyboard watching movies?? Why not invest in some better equipment for the family room and watch movies the way they were intended....

    but the difficulty in setting all this up makes it even more important that HDCP has already been cracked... I wouldn't let the movie industry tell you your hardware isn't good enough to buy their content... just buy what you want and "work" around the issue later... the cracks are only getting better.... we'll see HD-DVD Shrink soon enough...
  • KalTorak - Monday, November 20, 2006 - link

    Careful - it's not safe to assume that higher bitrate content is more computationally difficult to decode than lower bitrate content. [In fact, I suspect they're weakly correlated the other way - lower bitrate is harder.]
  • DerekWilson - Monday, November 20, 2006 - link

    ahh ... very interesting ...

    it would make sense to me to say that both are true.

    In the case where low bitrate means more aggressive high quality encoding, I absolutely see your point. But low bitrate can also mean lower quality (less information) at the same level of encoding -- in these cases lower bitrate will be easier to decode.

    Thanks for pointing this out.
  • Badkarma - Monday, November 20, 2006 - link

    Hi Derek,

    Can you comment on HD audio formats like Dolby TrueHD and DTS-HD for the HTPC? I know Blu-ray has yet to use these formats, but how about HD-DVD on the HTPC? From what I understand, you could get these formats output via the analog outputs on your sound card, but I'm interested in HDMI. I know some of the HDMI video cards you reviewed have S/PDIF passthrough via HDMI; however, S/PDIF cannot carry Dolby Digital Plus, TrueHD, or DTS-HD, and will only output DTS or DD. I'm holding back on an HDCP video card because HD audio is an important part of HD movies.

    Thanks.
  • Ajax9000 - Sunday, November 19, 2006 - link

    p.13 "Both the 8800 GTX and GTS are fully HDCP compatible, and HDCP is enabled through both DVI ports" p.22 "Some of the cards, like the HDMI Gigabyte 7600 GS and ASUS EN7600 GT, were only able to play our Blu-ray movies over HDMI and not through the DVI port. Conversely, we found that with our MSI NX7600 GT Diamond Plus, the Blu-ray content wouldn't play through the HDMI connection but it would through the DVI port."

    OK, so which cards (other than the 8800s) could do HDCP over both ports?

    ----

    p.19 "The end result is that an NVIDIA card with more pipelines that is better at 3D performance will not necessarily be better at video decoding."

    In other words overclocking (say) a 7600 is likely to give as good or better HD video results than using (say) a 7950GX2?

    ----

    p.20 "In the future, we could see power consumption go down with acceleration enabled. As graphics hardware is better suited to processing video than a CPU, efficiency should go up when using hardware acceleration."

    The results for the 7950 GTs already seem to show this: 147 W average non-accelerated vs. 142 W for EVGA & PNY (although Gigabyte > 148 W).

    Adrian
  • chucky2 - Friday, November 17, 2006 - link

    ...can we get it added to these results just for comparison?

    Also, you don't happen to know when that'd be, would you? :)

    Chuck
  • BigLan - Friday, November 17, 2006 - link

    I noticed that in the CPU utilisation tests you said it was around 51% for no acceleration - was this because the player software is single-threaded and so only used one core?

    Also, is Click encoded in H.264, or MPEG-2 like the initial Blu-ray discs?
  • DigitalFreak - Friday, November 17, 2006 - link

    Now that the Xbox 360 HD-DVD drive is available and is proven to work with a PC, any chance on doing another round-up "real soon now" using HD-DVD? I'd really love to see the numbers for VC1 and H.264 decoding.

    Still amazed that the lowly X1600 card spanked all Nvidia cards but the G80 boards in CPU utilization.

    Good job guys.
  • DigitalFreak - Friday, November 17, 2006 - link

    BTW, the new releases from Fox on Blu-Ray use the H.264 codec. Behind Enemy Lines, Fantastic 4, etc. I think Behind... is already out.
