POST A COMMENT

100 Comments

Back to Article

  • hkBst - Sunday, May 28, 2006 - link

    If, as you say, the error in your measurements is about 5%, which for a 30fps score is about 1.5fps, then it is not warranted to post measurements like 31.4 fps. The .4 in there means absolutely nothing. It is clearly not a significant digit. You should either perform multiple measurements to reduce the error or post the less accurate results which you _actually measured_. Reply
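The error-bar arithmetic in this comment can be sketched quickly (hypothetical numbers matching the ones above; Python used only for illustration):

```python
# With ~5% relative error, the tenths digit of a ~30 fps score is noise.
def error_margin(fps, relative_error=0.05):
    """Absolute error implied by a relative measurement error."""
    return fps * relative_error

measured = 31.4
margin = error_margin(measured)  # about 1.6 fps of uncertainty
# Only the whole-frame digits are significant, so a defensible report
# would be "31 +/- 2 fps" rather than "31.4 fps":
print(f"{measured} fps measured, +/- {margin:.1f} fps")
```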
  • markshrew - Sunday, May 14, 2006 - link

    I like the new graphs you've been using lately, what software is it? thanks :) Reply
  • Warlock15th - Monday, May 08, 2006 - link

    Check page 7 of this article, titled "Mid Range GPU Performance w/ Bloom Enabled"
    How come both the vanilla 6800 and the 6600GT outperform the 6800GS? I think the results for these cards were somehow entered wrongly while making the performance graphs...
    Reply
  • othercents - Thursday, May 04, 2006 - link

    Some people need to look at the overall picture on video cards instead of just saying that ATI is the best because it runs Oblivion the best. If you look at other reviews with games like HL2 and Doom3 you will start to get a different picture of what card is better for you. Don't take this one review, buy a new video card, and think it is a cure for all games. Especially whoever said that the 7900GT sucked. That is a great card and performs well in most games. If you don't want it, just mail it to me.

    Personally, I think I will wait until the new video cards come out before I purchase anything specifically for this game. Why pay $300-$400 for a new card that performs alright when there should be new ones out near the end of the year that will perform way better? Not to mention drivers that are in the works just for this game and the patches that are coming out. My x600 is playable, so I'm satisfied.

    I guess if you really need that cutting-edge performance then you won't mind spending the $1200 or more to buy SLI or Crossfire video cards. These are probably the same people who upgrade their desktops every 6 months because there is something else better available. However, for the rest of us, it's probably best to wait until the game has 6 months of patches and new video drivers before deciding on upgrading. Experience with HL2 and almost every other game tells me that performance will get better with time.

    Other
    Reply
  • araczynski - Friday, April 28, 2006 - link

    i would love to see these benchmarks compared against an otherwise identical run but WITHOUT grass enabled at all. everyone and their dog knows that the grass in this game is implemented very badly, and has a tremendous impact on performance.

    the worst part is that the grass is quite frankly FUGLY! it completely sticks out like a sore thumb in every scene i tried it in.

    in any case. i'm running a 2.4@3.2 northwood/1gig/6800gt@ultra+/1280x1024,hdr,nograss,noshadows and it plays very nicely for me.

    i would suspect that anyone with anything from the 7800+ & x1800+ families would be extremely happy with the game if they took off the grass and toned down the shadows.
    Reply
  • araczynski - Friday, April 28, 2006 - link

    oh yeah, and it looks teh sexy from my projector on the wall at 1280x1024 :) Reply
  • JarredWalton - Friday, April 28, 2006 - link

    I like having the grass. Without the grass, it's just yet another game with a flat, green texture representing the ground. That hasn't been acceptable since 2003. I do think the grass could do with a bit more variation (i.e., taller in some areas, shorter in others, more or less dense, etc.), but if the added variation also increased system demands, then forget it.

    Of course, the grass isn't perfect. It doesn't bend around your character's feet/legs, and when an object like heavy armor, a sword, or a dead body falls down in tall grass, the grass can make it a bit tricky to find. I do find myself disabling the grass at times just for that reason, not to mention shutting it off when there's too much going on and frame rates have slowed to a crawl. Every time I turn off the grass, though, it seems like I've started to turn Oblivion into Morrowind.

    Obviously, we're still at the stage where game engines are only giving a very rough approximation of reality. We won't even discuss how the shadowing/shading/texturing on some faces and objects can look downright awful. We're at the point now where every increase in graphical realism requires an exponential increase in time by the programmers, artists, modelers, etc. as well as an exponential increase in computational power required to render the scenes.
    Reply
  • thisisatest - Thursday, April 27, 2006 - link

    Wow, load up the bullshit. Not only are your framerates strange, but it seems that you somehow did something to make the x1900XTX perform worse than the x1900XT... which is impossible. The XTX bests the XT by at least 3-4 FPS in even the worst cases, so we can't even consider a performance decrease to be within the margin of error.

    We know all the x1k Radeons best any nvidia card out there now in this game because of the simple fact that SM3.0 was done better on ATI and because of the massive pixel shading power the x1900 cards have.

    I don't see any AA tests either. You forgot to mention that the game looks like garbage without AA, and I know many nvidia users who prefer bloom+AA to just HDR.

    Thanks for wasting my time.
    Reply
  • JarredWalton - Thursday, April 27, 2006 - link

    I'm not sure what benchmarks you're looking at, but the only places where the XT comes out ahead of the XTX are the city and dungeon benchmarks, and for all intents and purposes the two are tied. Due to the nature of benchmarking with a Bolivian (we're not playing back a recorded demo, but instead wandering through a location in a repeatable manner) the margin of error is going to be higher. The minimum frame rates are particularly prone to variations, as you might just happen to get a hard drive access that causes a frame rate drop in one run but not another.

    The most important tests are by far the Oblivion gate areas, and trust me, you will be seeing plenty of them in real gameplay. Who cares if you get 70 frames per second or whatever in the dungeon if your frame rate drops to 20 or lower outside? Looking at those figures, the XTX comes out about 3% faster, which is pretty much what you can expect from the clock speed increase. 625/1450 vs. 650/1550... if we're GPU limited (as opposed to bandwidth limited, which appears to be the case with all Bolivian), the XTX has a whopping 4% clock speed advantage. Throw in CPU and system limitations, and you're not going to be able to tell the difference between the XT and the XTX in actual use.

    As for your assertion that the game looks like garbage without anti-aliasing, I strongly disagree, as I'm sure many others do. Just because you think AA is required doesn't mean everyone feels the same. I would take HDR rendering in a heartbeat over AA in this game. That's how I've been playing the game for well over 100 hours. When running at the native panel resolution of my LCD (1920x1200), anti-aliasing is something I worry about only when everything else is maxed out. My point isn't that your preference is wrong, but merely that you're wrong to say that many people prefer bloom + anti-aliasing. *Some* people will prefer that, but far more likely is that people will prefer bloom without anti-aliasing over HDR, simply because it gives you higher frame rates.
    Reply
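The clock-speed comparison in the reply above can be checked with a couple of lines (the speeds are the ones quoted; this is just the arithmetic, not a benchmark):

```python
# XT: 625 MHz core / 1450 MHz memory; XTX: 650 / 1550.
def pct_gain(new, old):
    """Percentage increase of new over old."""
    return (new - old) / old * 100

core = pct_gain(650, 625)   # 4.0% core clock advantage
mem = pct_gain(1550, 1450)  # ~6.9% memory clock advantage
print(f"core: +{core:.1f}%, memory: +{mem:.1f}%")
# A GPU-limited game can gain at most ~4% from the core clock alone,
# which is consistent with the ~3% measured gap.
```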
  • thisisatest - Friday, April 28, 2006 - link

    At least 2 different "benchmarks" have the XT ahead of the XTX. Once I see crap like this, it sort of makes me doubt everything they say.

    Also we aren't told what settings were used in the drivers for each video card. That plays a huge role here.
    Reply
  • smitty3268 - Friday, April 28, 2006 - link

    Well, all the tests that had the XT ahead of the XTX were obviously CPU bound, so for all intents and purposes you should have read the performance as being equal.

    I would like to know a bit about the drivers though. Were you using Catalyst AI and does it make a difference?
    Reply
  • coldpower27 - Thursday, April 27, 2006 - link

    Quite a nice post there, well said Jarred. Reply
  • JarredWalton - Thursday, April 27, 2006 - link

    LOL - a Bolivian = Oblivion. Thanks, Dragon! :D (There are probably other typos as well. Sorry.) Reply
  • alpha88 - Thursday, April 27, 2006 - link

    Opteron 165, 7800GTX 256meg

    I run at 1920x1200 with every ingame setting set to max, HDR, no AA, (16x AF)

    The game runs just fine.

    I don't know what the framerates are, but whatever they are, it's very playable.

    I have a few graphics mods installed (new textures), and the graphics are good enough that I randomly stop and take screenshots, the view looked so awesome.
    Reply
  • z3R0C00L - Thursday, April 27, 2006 - link

    The game is a glimpse at the future of gaming. The 7x00 series is old. True, nVIDIA was able to remain competitive with revamped 7800's, which they now call 7900's, but consumers need to remember that these cards have a smaller die for a reason... they offer fewer features, less performance, and are not geared towards HDR gaming.

    Right now nVIDIA and ATi have a complete role reversal from the x800XT PE vs. 6800 Ultra days. The 6800 Ultra performed on par with or beat the x800XT PE. The kick was that the 6800 Ultra produced more heat (larger die) and was louder (larger cooler), but had more features and was more forward-looking. Right now we have the same thing.

    ATi's x1900 series has a larger die, produces more heat (a larger die means more voltage to operate) and comes with a larger cooler. The upside is that it's a better card. The x1900 series totally dominates the 7900 series. Some will argue about OpenGL, others will point to nonexistent flaws in ATi drivers... the truth is, those who make these comments on both sides are hardware fans. Product-wise, the x1900 series should be the card you buy if you're looking for a high-end card... if you're looking more towards the middle of the market, the x1800XT is better than the 7900GT.

    Remember performance, features and technology.. the x1k series has all of them above the 7x00 series. Larger die space.. more heat. Larger die space.. more features.

    Heat/Power for features and performance... hmmm fair tradeoff if you ask me.
    Reply
  • aguilpa1 - Thursday, April 27, 2006 - link

    inefficient game programming is no excuse to go out and spend 1200 on a graphics system. Games built on the old Crytek CryEngine have proven they can provide 100% of the Oblivion immersion and eye candy without crippling your graphics system and bringing your computer to a crawl. ridiculous game and test,... nuff said. Reply
  • dguy6789 - Thursday, April 27, 2006 - link

    The article is of a nice quality, very informative. However, what I ponder more than GPU performance in this game is CPU performance. Please do an in-depth CPU performance article that includes Celerons, Pentium 4s, Pentium Ds, Semprons, Athlon 64s, and Athlon 64 X2s. FiringSquad did an article, however it only contained four AMD CPUs that were of relatively the same speed in the first place. I, as well as many others, would greatly appreciate an in-depth article on CPU performance, dual core benefits, as well as anything else you can think of. Reply
  • coldpower27 - Thursday, April 27, 2006 - link

    I would really enjoy a CPU scaling article with Intel based processors from the Celeron D's, Pentium 4's, and Pentium D's in this game.

    Reply
  • frostyrox - Thursday, April 27, 2006 - link

    It's something I already knew, but I'm glad Anandtech has brought it into full view. Oblivion is arguably one of the best PC games I've seen in 2006, and could very well turn out to be one of the best we'll see all year. Instead of optimizing the game for the PC, Bethesda (and Microsoft indirectly) bring to the PC a half-*ssed, amateur, embarrassing, and insanely bug-ridden 360 port. I think I have the right to say this because I have a relatively fast PC (a64 3700+, x800 xl, 2gb Corsair, sata2 hdds, etc) and I'm roughly 65hrs into Oblivion right now. Next time Bethesda should use the Daikatana game engine - that way gamers with decent PCs might not see framerates of 75 go to 25 every time an extra character came onto the screen and sneezed. Right now you may be thinking that I'm mad about all this. Not quite. But I will say this much: next time I get the idea of upgrading my PC, I'll have to remember that upgrading the videocard may be pointless if the best games we see this year are 360 ports running at 30 frames. So here's to you, Bethesda and Microsoft, for ruining a gaming experience that could've been so much more if you gave a d*mn about PC gamers. Reply
  • trexpesto - Thursday, April 27, 2006 - link

    Maybe Oblivion should be $100? Reply
  • bobsmith1492 - Wednesday, April 26, 2006 - link

    I'm playing with a 9700 mobility (basically 9600 ultra) in my laptop with a P-M and a gigger at 1024, medium settings about like you set it. Where in the world did all those extra settings come from though (shadows, water)? Is that something outside the game itself? Reply
  • ueadian - Thursday, April 27, 2006 - link

    I played this game fine on my X800XL with high settings. Yeah, it PROBABLY dipped into the 20s, but honestly I never really noticed "lag". I short-circuited my X800XL by stupidly putting a fan with a metal casing on top of it; it went ZZZZT and died. I bought a 7900 GT for 299.99 and voltmodded it to GTX speeds, and I really don't notice a difference while playing the game. Yeah, I'm sure if I paid attention to FPS I'd see it, but really, the only place I noticed lag with my X800XL at high settings was by Oblivion gates, and my 7900 GT at 680 core / 900 mem locks up near Oblivion gates as well. I was sort of forced to "upgrade" my card, but the 7900 GT is the best value for the money right now considering you can do a pen mod to get it to run PAST GTX speeds fairly easily.

    I have a crappy CRT whose max resolution is 1024x768 and I don't plan on upgrading it anytime soon, so I don't need 512mb of memory to throw the resolution up to godly high settings. Besides, I'm pretty blind; I find it easier to play most online games like FPSes at lower resolution just to gain an advantage.

    Oblivion is near perfection as a GAME - it's the most enjoyable game I've ever played, and I've been playing games since Doom. Yeah, the engine does suck, and I was really disappointed to have my brand new top-of-the-line video card actually STUTTER in a game, but really, does it completely ruin the whole game for you? If you have played it, you know that it doesn't. Reply
  • thisisatest - Thursday, April 27, 2006 - link

    7900 series isn't what I consider to be the top of the line. There is high end and there is top of the line. The top of the line is clear. Reply
  • poohbear - Wednesday, April 26, 2006 - link

    im really curious to see how dualcore cpus perform, as Oblivion is supposed to take advantage of multithreading. if anandtech could do a cpu performance chart that'd be great. firingsquad did a cpu performance chart but only @ 2 resolutions, 800x600 & 1280x1024; they found significant differences between dualcore and singlecore @ 800x600 but no diff @ 1280x1024. now, i play @ 1024x768 on my 6800GT, so i'm wondering if a dualcore would help at that resolution. also, if you could investigate some of the supposed tweaks for dualcores and whether they truly work, that'd be great too. thanks. Reply
  • Eris23007 - Wednesday, April 26, 2006 - link


    A friend of mine is playing it on a 3.4GHz Northwood; he told me that when he enabled HyperThreading he got an immediate ~10% (or so) improvement.

    That's a pretty good indication that dual cores will help a *lot*, in my view...
    Reply
  • mpeavid - Thursday, April 27, 2006 - link

    10% is VERY poor multithreading performance. A decent multithreaded app should give 40-60% and higher for highly efficient code.

    Reply
  • nullpointerus - Thursday, April 27, 2006 - link

    HT isn't the same as having dual cores. IIRC, ~10% improvement from HT is rather typical in certain areas where multiple cores have significantly better returns. Reply
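For what it's worth, a ~10% HT gain and a much larger dual-core gain can coexist under Amdahl's law; a rough sketch with made-up numbers (the 50% parallel fraction and the 1.25x HT factor are assumptions, not measurements):

```python
# Amdahl's law: overall speedup when only part of the work parallelizes.
def speedup(parallel_fraction, factor):
    """Whole-program speedup when the parallel part runs `factor`x faster."""
    p = parallel_fraction
    return 1 / ((1 - p) + p / factor)

p = 0.5                  # assume half the frame time is parallelizable
ht = speedup(p, 1.25)    # HT behaving like a ~1.25x boost -> ~1.11x overall
dual = speedup(p, 2.0)   # a real second core -> ~1.33x overall
print(f"HT: {ht:.2f}x, dual core: {dual:.2f}x")
```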
  • Akaz1976 - Wednesday, April 26, 2006 - link

    Anyone have any idea how 9800PRO compares to x800? Reply
  • hoppa - Friday, April 28, 2006 - link

    What this test fails to mention is that I'm running a 9800 pro, Athlon XP 3000+, 1.5 gigs of ram, at 1280x768, and the game runs quite well even at medium settings. This game is very stressful at maximum everything but still manages to run incredibly well on older rigs and lower settings. Had I not played this game, after seeing this article I would've thought that it'd be impossible on my rig, but the truth is I've got plenty of computing power to spare. Reply
  • xsilver - Wednesday, April 26, 2006 - link

    9800pro is considered midrange/lowend now -- i guess that article is coming later

    my guess is approx 10% less than the lowest card on each graph besides the 7300gs (also you don't have HDR)
    Reply
  • Yawgm0th - Wednesday, April 26, 2006 - link

    ...or Oblivion is playable with an average of 20 FPS. I did a benchmark of my own (at a big Oblivion gate with 6-10 enemies and several allies fighting) with settings completely maxed (everything at its highest except AA) at 1280x960, and my system pulled framerates slightly better than the 7900GT according to the FRAPS results. More importantly, the game is completely playable in all areas. Framerates are low for about four seconds anytime I enter a new area through a door of some kind, but that's not unusual for most games. After those first couple of seconds, things pick up, and I see no reason for the game to have such abysmal performance as the article would indicate. My system consists of the following:
    2x1GB of Patriot 2-3-2 at 205MHz in dual-channel
    Venice: 274x9 (about 2.47GHz)
    7800GT with a slight overclock
    Audigy 2
    XP Pro x64 with latest nVidia drivers

    Furthermore, that RAM was a recent upgrade. I had the game maxed with 1GB of the same stuff in single-channel.

    At this point, I'm convinced that either there's something wrong with FRAPS (and there's certainly something different that caused the low frames in this article, because I shouldn't be outperforming the Anandtech test system when it's better than mine) or that the game is completely playable with mid-20s framerates. I don't think I've ever played a 3D game and found anything less than high-30s to be playable.
    Reply
  • ueadian - Thursday, April 27, 2006 - link

    Yep, you nailed it - that's my exact feeling. I played the game with my X800XL and it was very playable on high settings; Oblivion gates killed my computer, but not enough to drive me insane. Other than that, I didn't see any lag except after entering a new area. Benchmarks are overrated - I played Counter-Strike: Source at 20-30 fps for a year just fine, and when I got a card to do 50+ fps minimum I really didn't notice that much of a difference. Reply
  • TejTrescent - Thursday, April 27, 2006 - link

    Crazy.

    Testing just now, I got 20-30 on my system, no matter where I was, with a bit higher than those medium settings.

    The game's ENTIRELY playable at even 18.

    Dunno how, but it doesn't feel choppy when it falls, as long as it's above 15. Weird.
    Reply
  • dhei - Wednesday, April 26, 2006 - link

    Laugh, real excitement comes from online play. Might as well pay $15 a month for a game with just as good graphics that is updated constantly. Plus you can play missions and fight monsters yourself just like a single player game, if that's your bag, or slay other people that are actual people online.

    Looking at screenshots, i've seen 4 year old MMO games that look better after they got free graphics updates. /shrug

    I never understood why people pay for single player games like this. :D
    Reply
  • kmmatney - Thursday, April 27, 2006 - link

    You really can't judge the graphics of Oblivion by screenshots. The actual look and feel is much more impressive than the screen shots show. Reply
  • ueadian - Thursday, April 27, 2006 - link

    Agreed. Screenshots do not to ANY game justice. HL2 didnt really impress me visualy with screenshots, but then I played the game all the way through and was blown away by the graphics. Reply
  • poohbear - Wednesday, April 26, 2006 - link

    "i've seen 4 year old MMO games that look better after they got free graphics updates. /shrug "

    name one
    Reply
  • bobsmith1492 - Wednesday, April 26, 2006 - link

    What's "mmo" ?? Reply
  • xsilver - Thursday, April 27, 2006 - link

    when a cow is on crack it cant say "moo" properly :P

    Massively Multiplayer Online

    as mentioned before, it's difficult to quit a game that has no end and gets addictive, with added pressure if you join a guild/faction

    eg. most deaths that have resulted from gaming have been from players of such games; most recently from WoW i think
    Reply
  • dhei - Wednesday, April 26, 2006 - link

    Dark Age of Camelot. Reply
  • JarredWalton - Thursday, April 27, 2006 - link

    I can't say I'd even begin to consider DAoC as having better looking graphics. But if that's what you like, more power to you. Reply
  • dhei - Thursday, April 27, 2006 - link

    quote:

    I can't say I'd even begin to consider DAoC as having better looking graphics. But if that's what you like, more power to you.


    That's easy to say when you probably don't play it currently. It has all the graphics features you see in Oblivion, minus maybe HDR.
    Reply
  • ueadian - Thursday, April 27, 2006 - link

    Lol, are you serious? DAoC might look better while you are smoking the reefer, bud, but I've played it many times and Oblivion blows it away graphically. Reply
  • dhei - Thursday, April 27, 2006 - link

    Well, you're doing drugs if you think Oblivion has good graphics. I consider them mediocre compared to other games of the same kind, as I have seen people play the game.

    DAoC graphics look a ton better to me imo... drug free..
    Reply
  • hondaman - Wednesday, April 26, 2006 - link

    Why play oblivion instead of MMO?

    1. No monthly fees
    2. Oblivion has an end. MMOs don't. That's a good thing for those of us with lives, but little self control.
    3. No annoying kids to deal with.
    4. No annoying cliques
    5. No annoying server downtimes.
    6. Not having to answer "a/s/l" every 30 seconds.

    There is a pleasant serenity about single player RPGs that is impossible with an MMO.
    Reply
  • TejTrescent - Wednesday, April 26, 2006 - link

    But hmm.

    My rig is nowhere close to the rig that you guys used for comparison, and I don't know my exact framerates because I've not yet run FRAPS with Oblivion, but..

    My 3500+ Newcastle, not overclocked, with 2GB of Corsair/Mushkin running dual channel at 2.5-3-3-7, with my AGPx8 6800OC from BFGTech (not overclocked any further either).. I pull highly playable framerates (aka no choppiness unless I'm getting jumped by 6+ Daedric mages, that lightning is killer) at settings MILDLY better than the medium GPU ones (though still 1024x768, just higher fade rates), no tweaks on either. I can even run MediaMonkey for music in the background without any choppy feelings.

    I guess Oblivion isn't very CPU dependent, or doesn't really gain anything from multithreading or something, because huh. I mean, I can't believe my crappy 3500+ is keeping decent pace with an FX60. o_o And better RAM. Just huh. I can generally tell if a framerate falls below 24 thanks to FPS games being painful at anything lower than that.. and just huh. Weiird.
    Reply
  • JarredWalton - Wednesday, April 26, 2006 - link

    I would say there's a good chance your framerates are below 20 on a regular basis in the wilderness. FX-60, 7900 GT SLI, 2GB RAM, 2x150GB Raptors, and at Fort Bloodford looking towards the closest Oblivion Gate, I pull 13-15 FPS. (1920x1200, most detail settings at high.) I've also found that lowering a lot of settings doesn't have much of an impact. The various "fade" settings don't do much for me.

    Open the console and type "TDT" to see your frame rates. I personally find anything above 15 FPS to be acceptable for Oblivion, but opinions vary. :)
    Reply
  • TejTrescent - Wednesday, April 26, 2006 - link

    Well.. I'll check it out later tonight, but if they are, it's absolutely the smoothest sub 20fps I've ever seen.

    Lot more playable than UT2004, Painkiller, FarCry, or... pretty much anything else on this comp. XD
    Reply
  • TejTrescent - Thursday, April 27, 2006 - link

    Wish there was an edit because I feel stupid replying to myself, but. Huh.

    18-30 outside commonly, 18-35 in the city, and consistent 25-35 in dungeon areas.

    I am so so confused right now. The 18 isn't even noticeable. How did they.. what did they.. wha.. Guess it's just the slower pace making it less noticeable..
    Reply
  • nullpointerus - Thursday, April 27, 2006 - link

    Uh...I'm not a graphics guru, but is it possible that the dips in fps are smoother? If we draw a graph with "1" indicating a frame draw and "x" indicating stalling of some kind - such as processing sound, physics, or waiting on the GPU hardware - then I can illustrate what I'm talking about.

    1xx1xx1xx1xx1xx
    1xx1xx1xx1xx1xx
    1xx1xx1xx1xx1xx
    1xx1xx1xx1xx1xx (20 fps on a 60Hz display - balanced)

    1xxxx11xx1xxx1x
    1xxxx11xx1xxx1x
    1xxxx11xx1xxx1x
    1xxxx11xx1xxx1x (20 fps on a 60Hz display - choppy)

    Or do 3D games not work like this?
    Reply
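The diagram above is essentially about frame pacing: the same average fps with a different frame-time spread. A small sketch using the same '1'/'x' notation (the timelines are the hypothetical ones from the comment):

```python
# '1' = frame drawn, 'x' = stall; measure gaps between consecutive frames.
def frame_intervals(timeline):
    """Slots between consecutive frame draws in a '1'/'x' string."""
    positions = [i for i, c in enumerate(timeline) if c == "1"]
    return [b - a for a, b in zip(positions, positions[1:])]

balanced = "1xx1xx1xx1xx1xx"  # 5 frames per 15 slots, evenly spaced
choppy   = "1xxxx11xx1xxx1x"  # same 5 frames, uneven spacing

print(frame_intervals(balanced))  # [3, 3, 3, 3] -> steady
print(frame_intervals(choppy))    # [5, 1, 3, 4] -> jerky
```

So yes, broadly speaking games can work like this: two runs can report identical average framerates while one feels much smoother, which is one plausible explanation for a smooth-feeling sub-20 fps.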
  • Crassus - Wednesday, April 26, 2006 - link

    quote:

    This first part will focus on high-end and mid-range PCIe GPU performance and future articles will look at CPU performance as well as low-end GPU and AGP platform performance if there is enough demand for the latter two.


    Here, demand *wave* (at least concerning an AGP platform review). I've got a trusty K8N Neo2 with a 939 3000+ and a 6800 GT that was sort of standard about 1 1/2 years ago, and I'm trying to figure out whether I would gain an FPS increase worth talking about by switching over to PCIe while staying in about the same price range, or getting a 7800 AGP, or just turning down settings and saving the money.

    Cheers and thanks for an article that I was very anxious to read.
    Reply
  • Ozenmacher - Wednesday, April 26, 2006 - link

    Everyone argues about how bad the performance is outside. I am running a 7900 GT and I was getting around 20 fps outside; I turned the grass off completely and now I get over 30. And honestly, I don't even notice it. I think the game looks better without the rather lame-looking grass. Reply
  • Madellga - Wednesday, April 26, 2006 - link

    I can't see any pictures in this article. I've tried both IE and Firefox, at 2 different computers.

    Is it only me, or does anyone else have the same problem?
    Reply
  • JarredWalton - Thursday, April 27, 2006 - link

    Can you see the graphs even? Reply
  • nicolasb - Thursday, April 27, 2006 - link

    I can't see any of the pictures either, including the graphs.

    I'm getting really fed up with this happening in Anandtech articles: it's been happening on and off for months now. Could you guys please <b>sort it out</b>? It's really very unprofessional to have allowed the problem to go on for this long.
    Reply
  • JarredWalton - Thursday, April 27, 2006 - link

    Nothing has changed on the images in a long time. Are you surfing from work or home? Do you have any ad blocking software? I suppose it could also be browser settings. I know you can configure Internet Explorer and Firefox to not load images by default, or to only load images from the originating web site. The article content comes from www.AnandTech.com, while the images come from images.AnandTech.com. There's really nothing for us to sort out if the problem is on your end, but hopefully my comments will help you solve your issues. Reply
  • Madellga - Thursday, April 27, 2006 - link

    I found it. The feature causing the problem is Privacy Control. I just turned off and now the page works fine. Reply
  • JarredWalton - Friday, April 28, 2006 - link

    I'm glad to hear you got it working. If it's any consolation, I've felt that every release of Norton Internet Security since 2002 has actually been a step backwards. The latest release seems bulkier, slower, and more prone to causing errors. There was a time when I thought Norton Utilities was the greatest thing ever in terms of useful software. Windows 95 started to reduce its usefulness, and as far as I'm concerned Windows XP pretty much killed it off. These days, I feel like I can get better overall quality from free software in many cases.

    I think the only reason Symantec is still in business is due to all of their bundling agreements with various computer manufacturers. So many systems come with Symantec software preinstalled as a 90 day trial, but lucky for me I've found that 90 days gives me more than enough time to uninstall the junk! I've found that a hardware firewall is generally much more useful, in that it's less likely to cause problems, either with web sites or with system performance.
    Reply
  • Madellga - Thursday, April 27, 2006 - link

    It was definitely not the browser. All options are default.

    Ad blocking is turned off. I have to find which Norton feature is causing the issue. Interestingly enough, the problem seems to happen only at Anandtech.
    Reply
  • Madellga - Thursday, April 27, 2006 - link

    Thanks for the reply. I am surfing from home, and I have had this problem for a couple of weeks. Based on the comments above, the only thing I have running that could be the problem is Norton Internet Security......

    I just turned off the Norton Firewall and reloaded the page. Now it works!

    Jeez, it sounds silly but I expected more from Symantec. And my copy is an original one, legally purchased - no hacking.

    That feels awful....

    Reply
  • nullpointerus - Thursday, April 27, 2006 - link

    Really? I follow most of this site's articles, and I've never run into this problem. Reply
  • cgrecu77 - Wednesday, April 26, 2006 - link

    this is the article I've been waiting for, I was leaning towards upgrading to x1800xt but wasn't sure.

    The game is easily one of the best I've played (and it's my first RPG; I'm a TBS fan). While it doesn't have the depth and replay value of series like Civ or HOMM, it's still far better than any FPS I've ever played (can't compare with other RPGs).

    Whoever says that the graphics in Oblivion are not the best is just full of b..t or lacks the hardware to turn everything on; looking down from the top of the mountain at the Imperial City on a clear moonlit night is simply breathtaking.

    The gameplay and interface are also among the best I've seen and there are few occasions where I think: "this should have been improved". The inventory system is probably a weaker point, but even that is debatable (it's quite obvious that much thought was put into it but maybe the decisions taken there are not the greatest).

    However, performance is a big issue. My system is middle to upper range (a64 3200, 2gb ram, x850XT) and I can barely play at 1280x1024 with all sliders to the top and without shadows or AA.

    Outdoors I get ~20fps, which is OK - actually even excellent considering the huge number of objects rendered (especially grass and trees) - and acceptable since most battles are inside. What I don't get is why I have such poor performance indoors; there are moments in a heated dungeon battle (especially where there are many fires, like inside sigil towers) where the frame rate drops to the low teens (from the mid 50s). Graphics are average in those buildings, I only battle 2-3 opponents or fewer, and the map is quite small (since any door leads to a loading screen) - so I don't get it, how come the game slows to a crawl there? I would consider this an obvious place where optimizations are lacking. Another thing that's missing is a way to alter the grass length from the game menu (most people only look there to alter settings), along with a few other things that have been proven to greatly improve performance.

    Reply
  • oneils - Wednesday, April 26, 2006 - link

    The hit to performance in dungeons may be due to a mix of having the shadow detail and specular lighting (or filtering?) set too high. I have the same problem with my 6800gt. Especially when I am fighting spell casters. If we are both casting spells, the system crawls. Reply
  • DigitalFreak - Wednesday, April 26, 2006 - link

    Opteron 165 @ 2.51Ghz
    2GB RAM
    Geforce 7900GTX SLI

    Check the tweak guides. There is A LOT you can do to make it run smoother without lowering graphic settings.
    Reply
  • bollwerk - Wednesday, April 26, 2006 - link

    I also have 7900GTX SLI and it also runs fine for me with maxed settings at 1920x1200. (Athlon 64 X2 3800+, 2GB ram, A8N32SLI) It's obviously not high FPS, but it's also not choppy at all as far as I can tell. Totally playable for an RPG. I'm loving it and I'm glad I didn't get the 360 version. The PC mods are soooo worth it. Reply
  • JarredWalton - Wednesday, April 26, 2006 - link

    But the tweak guides ARE lowering the graphics settings, just in a different way. I'm okay with 1920x1200 at modified details on 7900GT, but there are still times when frame rates drop into the single digits. Reply
  • Yawgm0th - Wednesday, April 26, 2006 - link

    No, they're not. I mean, you can do tweaks that involve lowering settings, but there are tons you can do that improve graphics and graphics performance at the same time. Reply
  • JarredWalton - Thursday, April 27, 2006 - link

    For example? I'd really like to get more graphics quality for less graphics work. Reply
  • kmmatney - Wednesday, April 26, 2006 - link

    I would also like to see the performance without Bloom and HDR. A lot of the time I prefer games without this effect (it's often not implemented very well). Plays well on my setup without AA and Bloom, and with AF, at 1280x1024: Sempron 2800+ @ 2.4GHz and a modded X800GTO2 running at X850 XT speeds. I'd rather play at a higher resolution than a lower resolution + Bloom. Reply
  • OrSin - Wednesday, April 26, 2006 - link

    My problem is that half the hype of this game is that you need a monster system to run it.
    Does anyone remember when a game bragged about the fact that it didn't need a powerful card to play it? Now it's just the opposite. No wonder games won't be made for hardcore gamers soon. I just can't understand having to pay $400 for a card that only 1-2 games will actually need, when in 6 months that same card is $250 and maybe 6 games out might need it. Programmers in this age are just lazy, or the products are being rushed. Can we get some optimization and have people talk about great graphics on $150 cards?
    As much as I hate consoles, I'm leaning toward them more and more. They will always play the game well, and multiplayer/internet support is there (though not in this game).
    I just hate the UI of most of them.
    Reply
  • cgrecu77 - Wednesday, April 26, 2006 - link

    Consoles are not great and they're a waste of money. Think about Oblivion: the game barely runs on a 6-month-old box (Xbox 360). Do you honestly think that in 5 years a top-of-the-line PC game will run on an Xbox? I bought my X850 XT last year for less than $300 on eBay and I can play this game at my LCD's native resolution (1280x1024). I'm pretty sure that with my current video card I'll be able to play games for a year or so, and then I can probably buy the X1900 XT for ~$300 and that will give me another 2 years.
    Lower the resolution a little and you can extend it even further.
    The biggest danger is buying either the lower mid-range or the extreme upper range; either way you'd be screwed. If I had bought a 6600GT last year it would have been a bad decision. If I bought an X1800 GTO now it would be a poor decision too, because it can barely play the new games and in one year it'll be obsolete. The trick is to buy things that last 2 years at a decent price.

    The problem with consoles is that if you buy one beyond the first year or two you are actually buying an obsolete piece of hardware (think about the original Xbox, still selling: it's a P3 with a GeForce 2 or 3). In the past consoles had a big advantage by running at low res, but once HDTV becomes mainstream they will have to support PC resolutions. If you have an HDTV you know that low-res content looks like crap with or without upconverting to a higher res (take a picture at 400x300 and enlarge it to 1600x1200 in Photoshop and you'll see what I'm talking about).
    Reply
  • pnyffeler - Wednesday, April 26, 2006 - link

    Is there any way anyone can look at these numbers and not just buy an Xbox 360? I had a feeling my 6800GT wouldn't cut it, but wow! It's games like this that make upgrading your PC to play the latest game look ridiculously stupid. Spend the $470 to get the console and the game.

    If you really feel the need to spend more money than that, I'd recommend investing in a nice HDTV. Oblivion in hi-def from your couch is about as good as it gets.

    Now if I could just get the Mrs. to stop hogging the TV...
    Reply
  • mesyn191 - Wednesday, April 26, 2006 - link

    In every bench instance shown they're testing at a higher resolution than the X360 will ever display...

    Also the X360 version only looks good if you've got the HDTV to go with it, otherwise it really does look like crap on a SDTV.

    Gameplay wise this game is very very good, not perfect, but better than any other game I've played.
    Reply
  • erwos - Thursday, April 27, 2006 - link

    That's flatly untrue. The X360 has a 1280x1024 resolution with the VGA adapter. Also, the PC version would also look like crap on an SDTV, so that's not really a valid point against the 360.

    -Erwos
    Reply
  • mesyn191 - Thursday, April 27, 2006 - link

    That is the X360's built-in video scaler doing the work, though; the max that any game renders in real time ATM on the X360 is 720p, due to the limited amount of EDRAM available. Real, unscaled 1080i may be possible, but only if the developer specifically designs the game for it by cutting out effects, reducing textures, etc., and does not use FSAA.

    http://www.beyond3d.com/articles/xenos/

    My point about how X360 games look on an SDTV is valid, as most people don't use their PCs with an SDTV or any TV at all; they use a CRT or LCD monitor... The opposite is almost never true in the X360's case.
    Reply
  • hondaman - Wednesday, April 26, 2006 - link

    I would have upgraded my "crappy" 7800gtx had I known how bad my FPS really is.

    I love this game. Easily the best game I've ever played in my life. It's not perfect, and I won't defend it as being so, but overall it's magnificent. I have over 270 game hours and still haven't done everything there is to do.
    Reply
  • mpeavid - Wednesday, April 26, 2006 - link

    The engine is extremely inefficient. The quests are not really that interesting.
    The NPCs are stiffs.

    Overall? 8/10.

    The game engine needs major tweaking.

    FYI: eliminate the grass totally and you can increase your frame rate by as much as 50%. There is also a tweak to increase threads (another possible 50% increase).
    I went from 20fps outdoors to nearly 30fps with higher visual settings (except the grass part).
    Reply
  • Spoonbender - Wednesday, April 26, 2006 - link

    You know, it seems they've compiled the game in debug mode... :)
    That's how much they've bothered tweaking performance. ;)

    When the game crashes (which happens every time I exit the game or alt-tab too much), I get an access violation, always at special debug locations like 0xCDCDCDCD or 0xFEEEFEEE.
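    For context, those two values are well-known Microsoft debug fill patterns: the debug CRT fills freshly allocated heap memory with 0xCD bytes, and the Windows heap fills freed blocks with 0xFEEEFEEE, so faulting at those addresses means a pointer was read from uninitialized or already-freed memory. A minimal sketch that names them (the helper function itself is hypothetical, not from any real tool):

    ```c
    #include <stdint.h>

    /* Hypothetical helper: map a faulting address onto the documented
       Microsoft debug fill patterns. These fills appear only in debug
       builds (debug CRT) or under the debug heap, which is why seeing
       them in a shipped game supports the "debug mode" suspicion. */
    const char *classify_debug_fill(uint32_t addr) {
        switch (addr) {
        case 0xCDCDCDCDu: return "uninitialized heap memory (debug CRT malloc fill)";
        case 0xDDDDDDDDu: return "freed heap memory (debug CRT free fill)";
        case 0xFEEEFEEEu: return "freed heap memory (Windows HeapFree fill)";
        default:          return "not a known debug fill pattern";
        }
    }
    ```

    So an access violation at 0xCDCDCDCD suggests a pointer field that was allocated but never initialized, and 0xFEEEFEEE suggests a dangling pointer into a freed block.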

    So, yay for high performance tweaking.
    Reply
  • JarredWalton - Wednesday, April 26, 2006 - link

    Lowering grass height and detail is probably the best tweak you can make. I set mine for "60" in terms of density and half as tall (I think?) and performance went WAY up without seriously ruining (IMO) the appearance.

    As for the game, it depends on what you're after. I really like the game, but the UI has some major issues. I think the quests are relatively well done overall. I love the huge game world (huge in content, not in square miles). Eventually, though, a lot of the game becomes repetitive. I've been doing too many side quests; time to hit the main quest in earnest.... :)
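    The grass tweak lives in Oblivion.ini. A rough sketch of the relevant [Grass] section, as described in community tweak guides; treat the exact values here as illustrative assumptions rather than recommendations:

    ```ini
    ; Oblivion.ini excerpt -- values are illustrative, taken from community
    ; tweak guides rather than official documentation.
    [Grass]
    iMinGrassSize=120        ; higher = sparser grass (stock value is around 80)
    fGrassEndDistance=3000.0 ; draw grass over a shorter distance
    fGrassStartFadeDistance=2000.0
    ```

    Sparser, shorter-range grass is the single biggest outdoor frame-rate win reported in those guides, which matches the experience described above.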
    Reply
  • kmmatney - Wednesday, April 26, 2006 - link

    I installed a mod for the UI (BTmod) to make the UI windows larger and it's a huge improvement. I don't bother with Bloom, and performance is acceptable on my modded X800GTO2 and Sempron @ 2.4GHz. Reply
  • blackbrrd - Wednesday, April 26, 2006 - link

    Using BTmod myself; it works for me ;)

    I would like a few more shortcut keys, but other than that, it took me about 2 minutes to figure out how to use the interface. The 8 free shortcut keys that you can assign to weapons/spells/potions etc. work well; you just want more of them :P

    I am playing the game on a laptop with a Radeon 9600. It obviously doesn't look as good as in the pictures, but it runs OK, so I would say that the graphics engine scales nicely for any graphics card bought in the last 2-3 years*

    *A friend of mine has a GeForce FX 5900 and he gets horrible performance; there should have been a separate Shader 1.x path for those cards.

    I do agree that the game is only nearly finished; for instance, the textures for 256MB and 512MB graphics cards could be much larger. There are several mods for this available as it is, but it should have been in the game.

    All in all I think it was a good compromise between launching the game as early as possible and performance. Personally I haven't had any problems with the game except for multitasking, which won't work properly if you don't pull down the console first :P **

    **The game has quirks, but it's a good game, and there are workarounds. :) It's also the first game that has made me actually consider upgrading to/buying a proper gaming machine.
    Reply
  • kmmatney - Wednesday, April 26, 2006 - link

    There is a Shader 1.x path: look up Oldblivion. It allows the game to run on the 5900 quite well, from what I've heard. Reply
  • Ryan Norton - Wednesday, April 26, 2006 - link

    the elderscrolls.com/forums do crack me the fuck up... there is literally no aspect of the game no matter how glaringly mis-implemented that the fanboys will not defend to their last gasp.

    I don't have a link for it, but the website/guy that does "tweak guides" for 3D games put up a super-lengthy one for Oblivion. I'd already stumbled onto some of the things but it was still good for making the game seem a little smoother outdoors.

    I love the line about outdoors performance making users contemplate $1200 on video cards... until I started playing Obliv I'd always thought SLI a waste of money, but now I catch myself thinking "hmm another 7800GTX is 'only' another $450"... must restrain self.
    Reply
  • Powermoloch - Wednesday, April 26, 2006 - link

    I had been waiting quite a while for AnandTech's take on Oblivion, and I'm very surprised that you got a lot of GPUs tested for us. Especially as an X850 XT AGP owner, I'm very pleased that it has enough juice to play at 1280x1024 at almost med-high settings lol.

    Kudos for the great job guys, great benchmark results ;).
    Reply
  • Frallan - Thursday, April 27, 2006 - link

    I agree!!!

    Excellent review!!!

    But as an owner of older hardware I'd love to know where my 6800GT falls on the list. Usually I run it @ 425/1150, which is almost Ultra speeds, but....

    Please, anyone in the know???
    Reply
  • bob661 - Thursday, April 27, 2006 - link

    You can compare it to the 6800GS. They're the same card. Reply
  • michal1980 - Wednesday, April 26, 2006 - link

    Playing the Xbox 360 version, and really, it does not look much better than HL2.

    I'm sorry, but anyone who says (not that anyone here has) that this is a great engine with great graphics needs to take a break.

    There can be a lot going on sometimes, but the draw distance sucks, it loads every 2 minutes, and the controls are a little wishy-washy.

    It's an OK game, but at times it seems way too unfocused, with a storyline that is weak at best.
    Reply
  • Jackyl - Wednesday, April 26, 2006 - link

    Correct. The graphics are not "next gen" as was hyped. The problem with the performance of the Gamebryo engine is that it doesn't support culling/hidden-surface removal. It draws everything, which causes a lot of slowdown. If you are outside, standing behind a building, it still calculates whatever is on the other side, even though you can't see it. Bad design IMO for a "next gen" engine.
    Reply
  • JarredWalton - Wednesday, April 26, 2006 - link

    Okay, I'm not going to dispute your claims, but how on earth do you know that the engine isn't doing HSR? Damn, that was one of the first things that was discussed in terms of 3D engine optimization in my Graphics class. I'm not sure how you prove what they are or aren't doing without seeing the code, though.

    I also have to say that I don't think the Gamebryo engine is as bad as you're making it out to be. I see very little in the way of load times (the "loading" screens are mostly there for Xbox360), large outdoor areas, relatively nice effects (HDR, reflections, etc.), and generally interesting gameplay mechanics. You're certainly not going to get all of these things from other engines on the market. Doom3 would choke outdoors, for example.

    What we need is an engine that offers:
    Doom3 indoor areas
    Far Cry outdoors
    HL2/FEAR shaders
    Dungeon Siege load times


    Any UI that doesn't have console roots! UGH! Sell... Are you sure? Buy... Are you sure? Heaven forbid that we actually sell more than one type of item at a time. How about something like Fallout's barter interface, with a few tweaks to bring it into 2006 era? Also, what the hell is the point of "maximum gold" for a shop. "I can only buy $500 worth of stuff at a time, but if you sell things to me one at a time, I can effectively buy out your whole inventory!" Thank you Bethesda for dumbing down the economic system. Maybe they should have more magical weapons readily available, and then allow you to trade equipment to get them recharged? Naw, real bartering would make too much sense....
    Reply
  • nts - Wednesday, April 26, 2006 - link

    Hidden surface removal is obviously there; every game has it lol :p

    What this game needs and is missing is some sort of occlusion culling (not sending down geometry that won't be visible in the final frame, e.g. terrain/trees/grass behind city walls).
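    To make the distinction concrete, here is a toy top-down 2D sketch of the idea; this is purely an illustrative assumption, nothing like a real engine's pipeline. An object is treated as occluded when the sight line from the camera to its center crosses an occluder segment such as a city wall:

    ```c
    typedef struct { double x, y; } Vec2;

    /* 2D cross product of (p - o) and (q - o); its sign says which side of
       the line through o and p the point q lies on. */
    double cross2(Vec2 o, Vec2 p, Vec2 q) {
        return (p.x - o.x) * (q.y - o.y) - (p.y - o.y) * (q.x - o.x);
    }

    /* 1 if segment a-b properly crosses segment c-d: the endpoints of each
       segment lie on opposite sides of the other segment's line. */
    int segments_cross(Vec2 a, Vec2 b, Vec2 c, Vec2 d) {
        return cross2(c, d, a) * cross2(c, d, b) < 0.0 &&
               cross2(a, b, c) * cross2(a, b, d) < 0.0;
    }

    /* Toy occlusion test: skip drawing the object if the line of sight from
       the camera to its center is blocked by the wall segment. A real
       engine would test conservative bounding volumes, fuse occluders,
       use a depth/occlusion buffer, etc. */
    int is_occluded(Vec2 camera, Vec2 object, Vec2 wall_a, Vec2 wall_b) {
        return segments_cross(camera, object, wall_a, wall_b);
    }
    ```

    With the camera at the origin and a wall running from (1,-1) to (1,1), an object at (2,0) sits behind the wall and would be culled, while one at (0,2) stays visible. Frustum culling and hidden-surface removal still happen in the Oblivion case; the complaint is that geometry like this never gets rejected before being sent to the GPU.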
    Reply
  • MrCoyote - Wednesday, April 26, 2006 - link

    What good does it do to benchmark this game? The engine they licensed for this game (Gamebryo) is very limited in performance and unoptimized. Please wait until a patch comes out, then benchmark the game. This performance limit can be seen in various reviews across the web. A 7900GT or X1900 XT should be getting more than 20-30fps in this game. Reply
  • munky - Wednesday, April 26, 2006 - link

    People are not gonna wait for a patch to buy the game, and the whole premise of improved performance with a patch is uncertain. This review is useful for those who have the game, and would like to know if a certain upgrade would give them a significant improvement, or would just like to see how their performance compares to those with other video cards. Reply
  • Jackyl - Wednesday, April 26, 2006 - link

    It's very true that the game is in an "unfinished" state and was released too soon. Any professional developer knows that the Gamebryo engine has its share of performance problems. It is not in the same league as other engines. Why they licensed this engine is beyond me.

    Warning: don't post anything even remotely negative about the game on the official forums. Fanboys will be waiting to lash back at you. Yet people are complaining left and right on those forums about the performance problems, bugs, and interface problems. The PC game looks like nothing but a console port: bad inventory screen system and GUI. Hopefully a 3rd-party modder can change these things.
    Reply
  • poohbear - Wednesday, April 26, 2006 - link

    The bad inventory has already been resolved with a user mod; look for BTmod 2.20. Reply
  • nts - Wednesday, April 26, 2006 - link

    The X1800 XT can be had for less than the 7900GT and performs better...

    Why was it not included in the mid-range graphs?
    Reply
  • SiliconDoc - Friday, July 17, 2009 - link

    Wow, how quickly the complaining masses forget. $1,200.00 for two ati cards - LOL
    And here I've been told by all the disgruntled red fans that nvidia is the greedy scalping horror....
    BWHAHAHAHAHAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
    ---
    It's so nice that anandtech doesn't delete their old reviews.
    TWELVE HUNDRED BUCKAROOS FOR 2 ATI CARDS.
    --
    Once again I find out the red fans have been deceiving me all along.
    This is great - another myth of the web, exploded thanks to the web.
    Reply
  • Spoelie - Wednesday, April 26, 2006 - link

    So very true
    http://www.alternate.de/html/productDetails.html?a...
    http://www.alternate.de/html/productDetails.html?a...

    For the money, the x1800xt 512mb seems a better deal.
    Reply
  • bob661 - Thursday, April 27, 2006 - link

    Just checked on Newegg and the prices are the same unless you go with the 256MB version of the X1800XT. Reply
  • bob661 - Thursday, April 27, 2006 - link

    On Zipzoomfly, the 7900GT is a bit cheaper. Reply
  • Anand Lal Shimpi - Wednesday, April 26, 2006 - link

    We inadvertently left it out of the midrange tests; I just updated the graphs and conclusion to reflect its inclusion.

    Take care,
    Anand
    Reply
