Does Size Matter?

To answer our question: it depends. Different games are impacted in dramatically different ways, and resolution plays a large role in how much memory size matters. To understand the differences, we have taken all of our 8800 GTS 640MB and 8800 GTS 320MB numbers and calculated how much faster the 640MB part performs as a percent increase.
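
For reference, the percent increase we chart is simply the relative speedup of the 640MB card over the 320MB card. A quick Python sketch of the calculation (the frame rates below are made-up placeholders, not our measured results):

```python
# Percent increase of the 640MB part over the 320MB part:
# (fps_640 - fps_320) / fps_320 * 100

def percent_increase(fps_640: float, fps_320: float) -> float:
    """Return how much faster the 640MB card is, as a percentage."""
    return (fps_640 - fps_320) / fps_320 * 100.0

# Hypothetical example values only -- not actual benchmark numbers.
print(f"{percent_increase(60.0, 40.0):.1f}%")  # prints: 50.0%
```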

The graphs below are organized by resolution. Unfortunately, the scale couldn't be kept the same between graphs, as the variation in the data was much too high. We should also remember that each of our tests carries some run-to-run variance. We try to keep this under 3%, but that means the percent differences shown here have a somewhat larger margin of error: if each card's score can be off by up to 3%, a reported difference can be off by roughly six percentage points in the worst case. First up is 1600x1200.



Quake 4 jumps out as a huge beneficiary of more memory. We test with Ultra quality, which means uncompressed textures and uncompressed normal maps. This seems to have a huge impact on memory requirements, affording the 640MB card a 50% performance advantage over its new little brother.
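
To get a sense of why uncompressed assets are so costly, here is a rough back-of-the-envelope calculation in Python. The 2048x2048 texture size is an arbitrary example, not taken from Quake 4's actual assets:

```python
# Rough memory footprint of a single 2048x2048 texture.
# Uncompressed RGBA8 stores 4 bytes per texel; DXT1 is a fixed-rate
# block format at roughly 8:1 versus RGBA8, and DXT5 roughly 4:1.

width, height = 2048, 2048
uncompressed = width * height * 4   # bytes, RGBA8 (4 bytes/texel)
dxt1 = uncompressed // 8            # ~8:1 compression
dxt5 = uncompressed // 4            # ~4:1 compression

for name, size in (("RGBA8", uncompressed), ("DXT1", dxt1), ("DXT5", dxt5)):
    print(f"{name}: {size / 2**20:.1f} MB")
# RGBA8: 16.0 MB, DXT1: 2.0 MB, DXT5: 4.0 MB
```

A handful of large uncompressed textures and normal maps can thus consume tens of megabytes that compressed formats would not, which is exactly where a 320MB card starts to hurt.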

In most of the other cases where size matters, the big performance hit comes with enabling 4xAA, as the extra memory required for antialiased render targets can be quite high. Quake 4 is the exception here: memory size seems to have less of an impact on it with AA enabled, though keep in mind that both cards perform much lower with 4xAA in the first place.
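
To put some rough numbers on the AA memory cost, the sketch below estimates multisampled framebuffer sizes at our three test resolutions. This is a simplified lower bound; real drivers also allocate a resolve buffer and add alignment padding and compression metadata on top of it:

```python
# Simplified framebuffer memory estimate for multisample AA.
# Each MSAA sample stores a color value (4 bytes, RGBA8) and a
# depth/stencil value (4 bytes, D24S8), so memory scales linearly
# with the sample count. Treat these figures as a lower bound.

def msaa_buffer_mb(width: int, height: int, samples: int) -> float:
    bytes_per_sample = 4 + 4  # RGBA8 color + D24S8 depth/stencil
    return width * height * samples * bytes_per_sample / 2**20

for w, h in ((1600, 1200), (1920, 1200), (2560, 1600)):
    print(f"{w}x{h}: no AA {msaa_buffer_mb(w, h, 1):5.1f} MB, "
          f"4xAA {msaa_buffer_mb(w, h, 4):5.1f} MB")
# 1600x1200: no AA  14.6 MB, 4xAA  58.6 MB
# 2560x1600: no AA  31.2 MB, 4xAA 125.0 MB
```

Even by this conservative estimate, 4xAA at 2560x1600 eats well over 100MB in framebuffer alone, more than a third of the 320MB card's total memory before a single texture is loaded.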



Looking at 1920x1200, most of the numbers are very similar to what we saw at 1600x1200. This isn't surprising, as the pixel counts of the two resolutions are close: 2,304,000 versus 1,920,000, only a 20% increase. This time around, the odd man out is Battlefield 2: with 4xAA enabled at 19x12, the 320MB 8800 GTS takes a much larger performance hit relative to the 640MB part.



The trend continues at 2560x1600, with BF2's performance difference jumping way up. F.E.A.R. and Battlefield 2 both see a larger performance drop at this resolution even with AA disabled. Also of interest, this is the only resolution at which Half-Life 2: Episode One shows an impact from enabling 4xAA.

It is very important to note that Oblivion and Rainbow Six: Vegas don't see much of a performance loss with the decreased memory size. Granted, we can't test these applications with AA enabled, but it is still interesting that there remains so little difference between these numbers. This is especially compelling, as Oblivion and Vegas are the two best looking games in our test suite. Rainbow Six: Vegas even uses Epic's Unreal Engine 3, which is capable of producing some incredible visuals.

Does that mean size won't matter in the future or with other UE3 titles? We can't say that with any real certainty, as developers can always find ways to push memory usage. But that does mean that right now, gamers who play a lot of Oblivion and Rainbow Six: Vegas will find a better value in the 8800 GTS 320MB than the 640MB version.

When looking at other titles, especially with AA enabled at high resolutions, the 640MB card does offer much more than the 320MB part. But is it compelling enough to warrant spending an extra $100? Let's take a look at the individual performance numbers and find out.

Comments

  • Marlin1975 - Monday, February 12, 2007 - link

    What's up with all the super high resolutions? Most people are running 19-inch LCDs, which means 1280x1024. How about some comparisons at that resolution?
  • poohbear - Tuesday, February 13, 2007 - link

    Have to agree here; most people game at 12x10, especially people looking at this price segment. Once I saw the graphs only covered 16x12 and up, I didn't think the results applied to me. CPU influence isn't really a factor except at 1024x768 and below; I've seen plenty of graphs that demonstrate that with Oblivion. A faster CPU didn't show any difference until you tested at 1024x768, and 12x10 didn't show much of a difference between an AMD 3500+ and an FX-60 at that resolution (maybe 5-10 fps). Please try to include at least 12x10 for most of us gamers at that res. :) Thanks for a good review nonetheless. :)
  • DigitalFreak - Monday, February 12, 2007 - link

    WHY didn't you test at 640x480? Waaahhh
  • DerekWilson - Monday, February 12, 2007 - link

    Performance at 1280x1024 is pretty easily extrapolated in most cases. Not to mention that it's CPU limited in more than one of these games.

    The reason we tested at 1600x1200 and up is because that's where you start to see real differences. Yes, there are games that are taxing on cards at 12x10, but both Oblivion and R6:Vegas show no difference in performance in any of our tests.

    12x10 with huge levels of AA could be interesting in some of these cases, but we've also only had the card since late last week. Even though we'd love to test absolutely everything, if we don't narrow down tests we would never get reviews up.
  • aka1nas - Monday, February 12, 2007 - link

    I completely understand your position, Derek. However, as this is a mid-range card, wouldn't it make sense not to assume that everyone looking at it will be using a monitor capable of 16x12 or higher? Realistically, people who are willing to drop that much on a display would probably be looking at the GTX (or two of them) rather than this card. The lower widescreen resolutions are pretty reasonable to show now, as those are starting to become more common and affordable, but 16x12 and 19x12 capable displays are still pretty expensive.
  • JarredWalton - Monday, February 12, 2007 - link

    1680x1050 displays are about $300, and I think they represent the best fit for a $300 (or less) GPU. 1680x1050 performance is also going to be very close to 1600x1200 results (within 10%). For some reason, quite a few games run a bit slower in widescreen modes, so the net result is that 1680x1050 is usually within 3-5% of 1600x1200 performance. Lower than that, and I think people are going to be looking at midrange CPUs costing $200 or so, and at that point the CPU is definitely going to limit performance unless you want to crank up AA.
  • maevinj - Monday, February 12, 2007 - link

    Exactly. I also want to know how this card compares to a 6800GT. I run 1280x1024 on a 19" monitor with a 6800GT and need to know if it's worth spending 300 bucks to upgrade or just waiting.
  • DerekWilson - Monday, February 12, 2007 - link

    The 8800 GTS is much more powerful than the 6800 GT ... but at 12x10, you'll probably be so CPU limited that you won't get as much benefit out of the card as you would like.

    This is especially true if you're also running a much slower processor than our X6800; performance of the card will be very limited.

    If DX10 and all the 8 series features are what you want, you're best off waiting. There aren't any games or apps that take any real advantage of these features yet, and NVIDIA will be coming out with DX10 parts suitable for slower systems or people on more of a budget.
  • aka1nas - Monday, February 12, 2007 - link

    It would be nice to actually have quantifiable proof of that, though. 1280x1024 is going to be the most common gamer resolution for at least another year or two, until larger panels come down in price a bit more. I for one would like to know if I should bother upgrading to an 8800 GTS from my X1900XT 512MB, but it's already getting hard to find direct comparisons.
  • Souka - Monday, February 12, 2007 - link

    It's faster than a 6800GT... what else do you want to know?

    :P
