Gaming Performance using Oblivion

We'll close out our gaming performance analysis with Oblivion. We ran at settings that more or less correspond to "medium quality", without antialiasing. This game is demanding of both CPUs and GPUs, though if you have to choose just one we would still recommend a faster GPU over a faster CPU. Remember, we are using arguably the fastest GPU setup available for running Oblivion; if you're only running a single GPU, your average frame rates will be far lower. That said, let's take a look at performance:

[Graphs: Gaming Performance - Oblivion (town and dungeon benchmark results)]

The additional L2 cache doesn't seem to matter as much in Oblivion, but faster processor speeds definitely help out. The performance spread is 67% in the town portion of the benchmark, but only 51% in the dungeons. This likely has a lot to do with the number of creatures present in most towns, as there are far more AI calculations to perform.
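
As a point of clarification, a percentage "spread" like this is typically the gap between the slowest and fastest results expressed as a percentage of the slowest; here's a minimal sketch of that calculation under that assumption (the function name and the sample frame rates are hypothetical, not values from the charts):

    def spread_percent(slowest_fps, fastest_fps):
        # How much faster the fastest result is than the slowest, in percent.
        return (fastest_fps - slowest_fps) / slowest_fps * 100

    # A 67% spread means the fastest CPU delivers ~1.67x the slowest's frame rate.
    print(spread_percent(60.0, 100.2))  # ~67.0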

The patterns we've seen in other games and applications continue here, and with some overclocking both of the slowest Core 2 processors are essentially out of reach of the fastest AM2 offerings. It will almost certainly take more than a die shrink and faster clock speeds for AMD to close the gap. Those of you interested in purchasing a high-performance CPU and keeping it for a while as you upgrade your graphics cards will not be disappointed with what you can get from Intel's Core 2 lineup.

Comments

  • Gary Key - Wednesday, July 26, 2006 - link

    quote:

    ...that indeed you all get weekly checks from Intel for the favorable press.


    Damn, Intel must have lost my address. ;-)
  • coldpower27 - Wednesday, July 26, 2006 - link

    This is just so sad, how far AMD fanboys will go. I really wish moderation was allowed here; the user point system is hardly effective enough.
  • JarredWalton - Wednesday, July 26, 2006 - link

    I'm pretty sure that was sarcasm, Coldpower. LOL :)
  • coldpower27 - Wednesday, July 26, 2006 - link

    Well, I guess my bad, though without a /sarcasm tag it's hard to tell. This isn't real life, where you can hear the tone of people's voices. :P
  • Sunrise089 - Wednesday, July 26, 2006 - link

    You were unable to tell "the Magic Money Fairy" was sarcasm?

    Why not just come clean and admit that you didn't read carefully.
  • coldpower27 - Wednesday, July 26, 2006 - link

    I read plenty carefully, thanks.
  • Sunrise089 - Thursday, July 27, 2006 - link

    Seriously?

    You were seriously unable to tell the following was sarcasm:

    "...you all get weekly checks from Intel"
    "...most Intel processors really don't even work at all"
    "...Intel pays off the companies to say they're Intel Inside"
    and of course
    "the Magic Money Fairy."?

    Dude, it's understandable that you were reading fast and thought the post was fanboy-ism, which there is indeed a lot of. Refusing to admit that and stating instead that the original (actually quite funny) post wasn't clear is insulting to that poster and frankly somewhat alarming.
  • coldpower27 - Thursday, July 27, 2006 - link

    You're taking this way too seriously. If I can't recognize sarcasm without smiley faces or a tag, that's acceptable considering this is written language; I rely on the tone of the conversation, which is absent here.

    There is nothing to admit. There continues to be a lot of AMD fanboyism at this site, and even reading carefully, it isn't always a simple task to separate the sarcasm from the rest of the fanboy drivel.

    I am not refusing to admit anything. The poster should have considered before making the post that not everyone would catch the sarcasm, and I already said my bad in the post above. You may think the established indicators are sufficient; for me they are not.

    Not everyone can perceive exactly the same things you can, all right, and assuming otherwise is ridiculous in itself.
  • lewisc - Thursday, July 27, 2006 - link

    lol - it was a bit of a knee-jerk response. I thought the same thing until I read all the replies before it, and then realised that it was indeed (hopefully!) sarcasm. You can't blame coldpower, given the amount of rubbish being spouted by some users about how 'biased' this site is.
  • VooDooAddict - Wednesday, July 26, 2006 - link

    Nice to see you get your own article again from time to time; it reminds me of when I started visiting AT.

    The following deals with the gaming performance question most people are asking. I understand that the true focus of the article is 2MB cache vs. 4MB cache and the impact of overclocking. The E6600 is still at the top of my list for a powerful new SFF, thanks to the article. Regardless, the article has prompted the following:

    (Maybe the following can be highlighted in a Budget/Midrange Gaming system buying guide...)

    I think it's undeniably clear that the Core 2 Duo Chips offer the most headroom for future GPUs and should therefore be at the top of most gamers’ lists if they can afford it.

    What I think some people may still find important is that with any of these amazing CPUs ... gaming is still GPU limited. It begs for a comparison of the E6300/E6400, the X2 3800+/4200+, and the single-core 3500+/3800+, with quality lower-cost boards and single video cards like the 7950 GX2 and X1900 XT. Do the lower-end CPUs really limit gaming with a single-card solution? I think the 7950 would also give a good showing of whether the new high-end dual cores are really needed for SLI, or if the new "low end" (which used to be the high end) is enough to keep that 7950 going. Most gamers I find at LANs still only run 1280x1024 without massive AA/AF, simply due to the popularity of cheap 17" and 19" sub-12ms LCDs. I also find that a large number of gamers (who enjoy gaming but don't really spend much time being enthusiastic about the hardware) never turn on AA/AF.

    I'm not saying you didn't state that most games are still GPU bound. You clearly tell gamers in the article that it's probably better to buy an E6300 with a high-end video card than an E6600 with a midrange video card. I just think it needs to be shown.
