The 8800 GTS 320MB and The Test

Normally, when a new part is introduced, we would spend some time talking about the number of pipelines, compute power, bandwidth, and all the other juicy bits of hardware goodness. But this time around, all we need to do is point back to our original review of the G80. Absolutely the only difference between the original 8800 GTS and the new 8800 GTS 320MB is the amount of RAM on board.

The GeForce 8800 GTS 320MB uses the same number of 32-bit wide memory modules as the 640MB version (grouped in pairs to form 5 64-bit wide channels). The difference is in density: the 640MB version uses 10 64MB modules, whereas the 320MB uses 10 32MB modules. That makes it a little easier for us, as all the processing power, features, theoretical peak numbers, and the like stay the same. It also makes it very interesting, as we have a direct comparison point through which to learn just how much impact that extra 320MB of RAM has on performance.
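To make the arithmetic concrete, here is a quick sanity check of the memory configuration described above. The module count and clocks come from this article; the double-data-rate behavior of GDDR3 is the one outside assumption.

```python
# Back-of-the-envelope check of the 8800 GTS memory configuration.
# Module counts are from the article; GDDR3 transferring data on
# both clock edges is an assumption standard for this generation.

modules = 10                      # 32-bit wide modules, paired into 5 x 64-bit channels
bus_width_bits = modules * 32     # total bus width, identical on both GTS variants

capacity_640mb = modules * 64     # MB, using 64MB-density modules
capacity_320mb = modules * 32     # MB, using 32MB-density modules

mem_clock_mhz = 800               # stock GTS memory clock (per the article)
# Double data rate doubles the effective transfer rate:
peak_bandwidth_gb_s = mem_clock_mhz * 2 * bus_width_bits / 8 / 1000

print(bus_width_bits)             # 320
print(capacity_640mb, capacity_320mb)
print(peak_bandwidth_gb_s)        # 64.0
```

Since bus width, clocks, and channel count are all unchanged, peak bandwidth is identical for both cards; only the capacity differs.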

Here's a look at the card itself. There really aren't any visible differences in the layout or design of the hardware. The only major difference is the use of the traditional green PCB rather than the black of the recent 8800 parts we've seen.

Interestingly, our EVGA sample came heavily overclocked. Core and shader speeds were at 8800 GTX levels, and memory weighed in at 850MHz. In order to test the stock speeds of the 8800 GTS 320MB, we used software to edit and flash the BIOS on the card. The 576MHz core and 1350MHz shader clocks were set down to 500MHz and 1200MHz respectively, and memory was adjusted down to 800MHz as well. This isn't something we recommend people run out and try, as we almost trashed our card a couple of times, but it got the job done.

The test system is the same as we have used in our recent graphics hardware reviews:

System Test Configuration
CPU: Intel Core 2 Extreme X6800 (2.93GHz/4MB)
Motherboard: EVGA nForce 680i SLI
Chipset: NVIDIA nForce 680i SLI
Chipset Drivers: NVIDIA nForce 9.35
Hard Disk: Seagate 7200.7 160GB SATA
Memory: Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2)
Video Card: Various
Video Drivers: ATI Catalyst 7.1
NVIDIA ForceWare 93.71 (G7x)
NVIDIA ForceWare 97.92 (G80)
Desktop Resolution: 2560 x 1600 - 32-bit @ 60Hz
OS: Windows XP Professional SP2


Comments

  • nicolasb - Monday, February 12, 2007 - link

    The conclusion to this article:

    quote:

    Based on the games and settings we tested, we feel very confident in recommending the NVIDIA GeForce 8800 GTS 320MB to gamers who run at 1920x1200 or less. With or without AA, at these resolutions games look good and play well on the new part.


    This conclusion does not seem to bear much resemblance to the actual observations. In virtually every case the card performed well without AA, but dismally as soon as 4xAA was switched on. A fair conclusion would be to recommend the card for resolutions up to 1920x1200 without AA, but definitely not with.
  • DerekWilson - Monday, February 12, 2007 - link

    The GTS 320MB still performs well if taken on its own at 19x12 with 4xAA ... But I will modify the comment to better reflect what I mean.
  • nicolasb - Tuesday, February 13, 2007 - link

    The way the conclusion now reads is a big improvement, IMNSHO. :-)
  • munky - Monday, February 12, 2007 - link

I was expecting better performance with AA enabled, and the article just glossed over the fact that in half the games with AA the card performed on par with or worse than last-gen cards that cost less.
  • Bob Markinson - Monday, February 12, 2007 - link

    For the base Oblivion install, yes, it's not much of a memory hog. In-game texture use usually doesn't exceed 256 MB with HDR and 4xAA on @ 1152x864. (Also, please test AA performance with HDR as well; both ATI and NVIDIA support the two simultaneously on their current-gen cards.)
    Most popular texture mods will bring memory usage north of 500 MB. I've seen it hit over 700 MB. Thus, there's a good chance that any 256 MB card would be crippled with texture swapping. I should know, mine is.
  • DerekWilson - Monday, February 12, 2007 - link

    What texture mod would you recommend we test with?
  • Bob Markinson - Monday, February 12, 2007 - link

    Qarl's Texture Pack 2 and 3 are quite popular world texture packs. Please check this site for more details:
    http://devnull.devakm.googlepages.com/totoworld

    Note that version 3 really does need a lot of texture memory. Also, check out Qarl's 4096 compressed landscape LOD normal map texture pack, it'll add far more depth than the plain, overly filtered Oblivion LOD textures.
  • DerekWilson - Monday, February 12, 2007 - link

    We will take a look at those texture packs and do some testing ...

    Hopefully we can provide a follow up further exploring the impact of memory on the 8800 architecture.
  • blackbrrd - Monday, February 12, 2007 - link

    I looked at the Oblivion scores, and the first thing that hit me was: they are using the standard crappy-looking textures!

    No Oblivion fan running an 8800 GTS would run with the standard texture pack. It is, at times, really really bad.

    Running a texture pack like the one above is quite normal. If you have enough video card memory there isn't much of a slowdown - except when the data is loaded into memory - which happens all the time... It does make the game look nicer though!
