Memory Size Scaling

We were very interested in how the additional RAM affects game performance, so we underclocked our 7800 GTX 512 to 430MHz core / 1.2GHz memory, the same clocks as the original 7800 GTX, in order to see what (if any) difference framebuffer size alone makes between the original 7800 GTX and the new model. We will look at 2048x1536 both with and without AA, as this is the resolution where any difference was most pronounced.
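
As a quick sanity check on that underclock, here is a rough back-of-the-envelope sketch (assuming the 256-bit memory bus both cards share and the usual effective DDR data rates) showing that at 1.2GHz the 512MB card should offer the same theoretical memory bandwidth as the original 7800 GTX:

    # Theoretical memory bandwidth, assuming a shared 256-bit bus and
    # effective (DDR) data rates; real-world throughput will be lower.
    BUS_WIDTH_BYTES = 256 // 8  # 256-bit bus

    def bandwidth_gb_s(effective_clock_mhz):
        """Theoretical bandwidth in GB/s for a given effective memory data rate."""
        return effective_clock_mhz * 1e6 * BUS_WIDTH_BYTES / 1e9

    print(f"7800 GTX (1.2GHz):           {bandwidth_gb_s(1200):.1f} GB/s")  # 38.4
    print(f"7800 GTX 512 stock (1.7GHz): {bandwidth_gb_s(1700):.1f} GB/s")  # 54.4
    print(f"7800 GTX 512 underclocked:   {bandwidth_gb_s(1200):.1f} GB/s")  # 38.4

With bandwidth equalized, any remaining performance gap should come down to the extra 256MB of framebuffer.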

As we can see from the tests, without AA the added RAM had no real impact on performance in any game (and even a slightly negative impact in Doom 3 and Quake 4).

Looking at the numbers after we enable AA, only two games see any slight benefit from the extra RAM alone: Black and White 2 and Day of Defeat: Source. Battlefield 2 also sees a tiny boost, but only at this extreme resolution. Clearly, the majority of the 7800 GTX 512's advantage comes from its core and memory clock speeds.
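
To put some rough numbers on that, here is a simple sketch of the framebuffer footprint at 2048x1536 (assuming 32-bit color, 32-bit Z/stencil, and uncompressed 4x multisample storage; actual driver allocation and compression will differ), which illustrates why the extra 256MB only starts to matter once AA is turned on:

    # Approximate framebuffer footprint at 2048x1536; estimates only.
    WIDTH, HEIGHT, BPP = 2048, 1536, 4  # 32-bit color and Z/stencil

    def buffer_mb(samples=1):
        return WIDTH * HEIGHT * BPP * samples / (1024 ** 2)

    no_aa = 2 * buffer_mb() + buffer_mb()  # front + back color buffers, plus Z
    aa_4x = no_aa + 2 * buffer_mb(4)       # add 4x multisample color and Z buffers

    print(f"No AA: ~{no_aa:.0f}MB of framebuffer")  # ~36MB
    print(f"4xAA:  ~{aa_4x:.0f}MB of framebuffer")  # ~132MB

Whatever is left of a 256MB card after those buffers is all that remains for textures and geometry, which is where the larger framebuffer gives the 512MB card its headroom with AA enabled.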

We do want to mention that the added RAM could provide slightly more benefit than shown here, as we have not been able to confirm that dropping the clock speeds of the 7800 GTX 512 results in the same clocks across the board as the 7800 GTX. If you recall from earlier articles, the 7800 GTX has multiple internal clock domains, which aren't always all adjusted when overclocking or underclocking. It's possible that dropping the core clock to 430MHz pushed some of the internal clocks lower than they are on the original 7800 GTX. This would have a minimal impact, but an impact nonetheless.

Comments

  • ViRGE - Monday, November 14, 2005 - link

    AFAIK, 4xAA is the highest level of AA that's common between ATI and NV. The X850 tops out at 6xAA (which NV doesn't have), then there's 8xS, and the list goes on...
  • Griswold - Monday, November 14, 2005 - link

    That's a beast, no less. The only thing ATI can do now is kick off that mysterious R580, and it better have a few more pipes than the R520 at the same or even higher clock speeds - and no paper launch this time. Or just give up and get the launch right for the next generation...

    Is there any particular reason for only showing nvidia SLI results and no crossfire numbers at all?
  • Ryan Smith - Monday, November 14, 2005 - link

    This is something we discussed when working on this article, and there's really no purpose in testing a Crossfire setup at this point. The X1800 Crossfire master cards are not available yet to test an X1800 setup, and as we noted in our X850 Crossfire review, an X850 setup isn't really viable (not to mention it tops out at 1600x1200 when we test 2 higher resolutions).
  • Griswold - Monday, November 14, 2005 - link

    Ah well, woulda thought AT has a few master cards in their closet. Guess not. :)
  • Kyanzes - Monday, November 14, 2005 - link

    ONE WORD: DOMINATION
  • yacoub - Monday, November 14, 2005 - link

    Very interesting to see that 512MB has little to no impact on the performance - it is instead almost entirely the clock speed of the GPU and the RAM that makes the difference.

    Also, I think this is the first time in PC gaming history that I've seen testing where video cards more than ~9 months old are all essentially 'obsolete' as far as performance goes. Even the 7800 GT, which only came out maybe six months ago, is already near the bottom of the stack in these 1600x1200 tests, and considering that's what anyone with a 19" or greater LCD ideally wants to play at, that's a bit scary. Then you realize that the 7800 GT is around $330 for that bottom-end performance and it just goes up from there. It's really $450-550 for solid performance at that resolution these days. That's disappointing.
  • ElFenix - Monday, November 14, 2005 - link

    No one with a 19" desktop LCD is playing a game at any higher than 1280x1024, in which case this card is basically a waste of money. I have a 20" widescreen LCD and I find myself playing at 1280x1024 a lot because games often don't expand the field of view; rather, they just narrow the screen vertically.
  • tfranzese - Monday, November 14, 2005 - link

    SLI/XFire skews the graphs. You need to take that into account when looking at the results.
  • Cygni - Monday, November 14, 2005 - link

    We have seen this in every successive generation of video cards. Unless you're running AA at high res (i.e. over 1280x1024), RAM size has little impact on performance. Heck, 64MB is probably enough for the textures in most games.
  • cw42 - Monday, November 14, 2005 - link

    You really should have included COD2 in the tests. I remember seeing a test on another site that showed COD2 benefited GREATLY from 512MB vs 256MB of RAM.
