Lower Quality Performance

Our editorial comments about Oblivion aside, some gamers will attempt to play games like this and Rise of Legends on budget 128 MB cards. This is their God-given right, so we've included some special tests of the slowest of our cards, the Sparkle 7300 GS Ultra 2, at even lower quality settings in Oblivion and Rise of Legends. This gives us an idea of the performance we might expect in games like these with only the most basic graphical elements enabled.

The quality settings in Oblivion for these tests were the same as those listed for the normal performance tests, except with HDR effects turned off. In Rise of Legends, we set the performance slider exactly halfway between the two extremes, "run faster" and "look better," which turns down settings like shadow and special effect detail.

Low Quality Results
| Benchmark                      | 800x600 (fps) | % Improvement | 1024x768 (fps) | % Improvement |
|--------------------------------|---------------|---------------|----------------|---------------|
| Oblivion Town Low Quality      | 42.8          | 67.84%        | 33.6           | 70.56%        |
| Oblivion Gate Low Quality      | 23.9          | 47.53%        | 17.4           | 51.30%        |
| Rise of Legends Medium Quality | -             | -             | 22.1           | 92.17%        |

For reference, in the town benchmark the Sparkle card got 25.5 fps at 800x600 and 19.7 fps at 1024x768 in the normal performance tests (with HDR on). In the Oblivion gate benchmark, the card got 16.2 fps at 800x600 and 11.5 fps at 1024x768. This shows the kind of performance demand that a single feature like HDR lighting adds to the game. In Rise of Legends, the Sparkle 7300 GS Ultra 2 managed 11.5 fps at 1024x768 in the normal tests.
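For clarity, the % Improvement column is simply the low quality result measured against the corresponding normal quality baseline. A quick Python sketch of that arithmetic, using the frame rates quoted above:

```python
# Percentage improvement of the low quality runs over the normal quality baselines.
# Values are the frame rates quoted in the text and table above:
# (baseline fps, low quality fps).
results = {
    "Oblivion Town 800x600":    (25.5, 42.8),
    "Oblivion Town 1024x768":   (19.7, 33.6),
    "Oblivion Gate 800x600":    (16.2, 23.9),
    "Oblivion Gate 1024x768":   (11.5, 17.4),
    "Rise of Legends 1024x768": (11.5, 22.1),
}

for name, (baseline, low_quality) in results.items():
    improvement = (low_quality - baseline) / baseline * 100
    print(f"{name}: {improvement:.2f}% improvement")
```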

As the table shows, both of these games see large improvements in performance at these lower settings. While Rise of Legends becomes playable by turning down the graphics settings, this card still struggles with Oblivion even at 800x600. For those determined to run the game on a card like this, it is technically playable, but again we recommend waiting until you've saved up at least enough money for a card like the 7600 GT or X1900 GT.

Comments

  • yyrkoon - Thursday, August 31, 2006 - link

    If it's silly, why even bother replying... No need to go out of your way to be a jerk.
  • nullpointerus - Friday, September 1, 2006 - link

    Jerks don't take the time to apologize. As for why I apologized, I felt badly for responding in kind. I was belittling people who felt the need to belittle the site without taking the trouble to think their arguments through. Apparently that put some kind of chip on your shoulder such that you felt the need to attack me after I'd already apologized.
  • DerekWilson - Friday, September 1, 2006 - link

    maybe we can take a different angle as the standard reasoning has been rolled out already ...

    if we decide to test with a system that "matches" the graphics card, we are making a decision about what is reasonable for either a specific level of performance or price point. By making such a decision, we limit ourselves -- for instance, in this review we may have chosen a system to match a 7600 GS. But maybe it's too underpowered for a 7600 GT, or perhaps it's too overpriced for a 7300 GS.

    we absolutely can't test every card with every processor and every memory configuration on every chipset for every review.

    in lieu of choosing one system that is supposed to be "one size fits all", we can remove the system from consideration by choosing the highest end configuration possible.

    when a graphics card performs better in our system, we know it is capable of better performance in any system. this is true in almost every case.

    this does put a burden on the reader to understand the limitations of his or her own system -- i.e., will the fact that the 7600 GT performs higher than the 7600 GS expose a CPU limitation on the system the reader is building/upgrading?

    this question can be answered in a couple ways.

    with game tests, if you can borrow a high end graphics card and see where the cpu limitation falls at something like 800x600 without aa and af, you'll know where the upper limit on framerate is based on the CPU. thus a decision can be made about the best fit for a card.

    if you can't borrow a higher end card, you can turn all the graphics settings down as far as possible and run at 640x480 or lower if possible (does anything aside from the chronicles of riddick still support 320x240?). this isn't ideal, but even on a low end card you can get a pretty good idea of whether or not there will be a cpu limitation entering into the mix.

    when you know what the cpu limit of your system is, pick the resolution you want to run, and find a card that gives you a number just over this limit. this card is the ideal fit for your system at your resolution. it will deliver the performance your cpu will ask for.

    I know it's complicated, but it's much better than the can of worms we'd open if we went in another direction.

    In GPU reviews meant to demonstrate the capabilities of a graphics card, we will not add unnecessary bottlenecks to the system.
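
To make the card-picking rule described in the comment above a bit more concrete, here is a minimal sketch of the same logic in Python. The card names, prices, and frame rates are hypothetical placeholders, not benchmark results from this review:

```python
# Sketch of the procedure described above: measure your CPU-limited frame rate
# (e.g., with a borrowed high-end card at 800x600, or with all settings minimized),
# then pick the cheapest card whose fps at your target resolution just clears that limit.
# All cards, prices, and frame rates below are hypothetical placeholders.
cpu_limit_fps = 55.0  # assumed CPU-limited frame rate measured on your own system

candidate_cards = [
    # (name, price in USD, fps at your chosen resolution and settings)
    ("Card A", 90, 38.0),
    ("Card B", 160, 58.0),
    ("Card C", 300, 95.0),
]

# Cards that can deliver at least what the CPU will ask for, cheapest first.
good_fits = sorted((c for c in candidate_cards if c[2] >= cpu_limit_fps),
                   key=lambda c: c[1])

if good_fits:
    name, price, fps = good_fits[0]
    print(f"Best fit: {name} (${price}, {fps} fps vs. a CPU limit of {cpu_limit_fps} fps)")
else:
    print("No candidate clears the CPU limit; the GPU will be the bottleneck with any of these cards.")
```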
  • nullpointerus - Friday, September 1, 2006 - link

    You need a form letter, or something. Maybe you could put up a short page entitled Why We Test this Way and link to it on the front page of each article.
  • nullpointerus - Thursday, August 31, 2006 - link

    Hmm...that last paragraph came out a little too harsh. I apologize in advance if I've offended anyone. I still think the points are valid, though.
  • JarredWalton - Thursday, August 31, 2006 - link

    If you look at the performance difference between an E6400 stock and 3.0 GHz OC in our PC Club system review (http://www.anandtech.com/systems/showdoc.aspx?i=28...), you will see that it makes virtually no difference in performance even with a 7900 GT. All of these GPUs are the bottleneck in gaming, but we use a higher-end (relatively speaking) CPU just to make sure.
  • imaheadcase - Thursday, August 31, 2006 - link

    I disagree; 800x600 is great for sniping. I play on a 9700 Pro and normally switch between 800x600 and 1024x768, and I like 800x600 better on large maps. It makes the objects "bigger" to me and lets me get better accuracy.

    Even if I had a 7900 GT I would probably not go higher than 1024x768. Don't know why people play at higher rez, makes everything so tiny. Squinting to play a game is annoying and distracting from gameplay :D
  • Josh7289 - Thursday, August 31, 2006 - link

    People who have larger monitors have to use higher resolutions to keep things from getting too large, and to make good use of all that real estate, especially when it's an LCD (native resolution).

    For example, a 17" CRT is best run at 1024 x 768 for games, while a 21" or so LCD is best run at 1600 x 1200 or 1680 x 1050, depending on its native resolution.
  • Olaf van der Spek - Thursday, August 31, 2006 - link

    What do you mean by "too large"?
    In games it's not like in Windows, where objects get smaller as you increase the resolution.
  • DerekWilson - Thursday, August 31, 2006 - link

    this is correct (except with user interfaces for some reason -- and there the exception is warcraft 3). thanks Olaf.

    lower resolution will give you much less accuracy -- larger pixels in the same screen area decrease detail.

    the extreme example is if you have a 4x3 grid and you need to snipe someone -- his head has to be in the center of one of the 12 blocks you have to aim through to even be able to hit him. The smaller these blocks are, the more pixels fit into the head, the more capable you will be of sniping.
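
As a toy illustration of the grid argument in the comment above (using an arbitrary assumed target size, not measurements from any game), the same on-screen target simply covers more pixels as resolution increases:

```python
# Toy illustration: a target spanning a fixed fraction of the screen covers more pixels
# at higher resolutions, so fine aiming becomes easier.
# The 1% figure is an arbitrary assumption for illustration only.
target_fraction_of_screen_width = 0.01  # assume the head spans 1% of the screen width

for width, height in [(800, 600), (1024, 768), (1600, 1200)]:
    pixels_across_target = width * target_fraction_of_screen_width
    print(f"{width}x{height}: target is about {pixels_across_target:.0f} pixels wide")
```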
