The Card, The Test, and Power

There are a few key changes from the original 7800 GTX silicon that allow the 7800 GTX 512 to perform so well. The 7800 GTX 512 is still built on a 110nm process like the original 7800 GTX, but NVIDIA has tuned its fab process to speed up key data paths in the chip. These enhancements, while not altering the feature set in any way, allow the chip to reach clock speeds of 550MHz (120MHz faster than the original 7800 GTX). On top of the changes in silicon, the 7800 GTX 512 has received a PCB revision. And in case anyone is wondering, the huge HSF solution is actually very quiet: NVIDIA is using a fan with larger blades that moves a good volume of air without needing to run at very high RPMs. While it may look like an industrial-sized leaf blower, its bark is certainly nothing compared to the bite this thing takes out of our performance tests.

Current 7800 GTX cards feature 8Mx32 GDDR3 with four chips on each side of the PCB. Most cards have a heat spreader on the back of the board, while some vendors have attached heatsinks. NVIDIA needed a better way to cool its RAM in order to hit the memory clock speeds it wanted. To this end, the 7800 GTX 512 has all of its RAM on the front of the PCB, cooled by the very large heatsink previously employed on the Quadro FX 4500. Moving all of the RAM to one side of the PCB may also have improved the routing to certain memory modules, which would further help increase attainable stable memory clock speeds. There are still only 8 modules in total, as NVIDIA has also moved to higher density 16Mx32 GDDR3. The RAM is rated at 900MHz (1800MHz data rate), giving the stock memory clock speed of 1700MHz a little headroom for vendors who like to overclock the cards they sell.
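As a quick sanity check, the chip organizations above multiply out to the two framebuffer sizes. A minimal sketch (the chip counts and "depth x width" organizations are those given in the text):

```python
# An "8Mx32" GDDR3 chip holds 8M addresses of 32 bits each.

def chip_capacity_mb(depth_m, width_bits):
    """Capacity of one DRAM chip in megabytes."""
    return depth_m * width_bits / 8  # megabits -> megabytes

# Original 7800 GTX: eight 8Mx32 chips, four per side of the PCB
gtx_256 = 8 * chip_capacity_mb(8, 32)

# 7800 GTX 512: eight denser 16Mx32 chips, all on the front
gtx_512 = 8 * chip_capacity_mb(16, 32)

print(gtx_256, gtx_512)  # 256.0 512.0
```

So doubling the per-chip density, rather than the chip count, is what takes the card from 256MB to 512MB while leaving the back of the PCB bare.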

Here is a quick comparison of NVIDIA's 7800 series lineup with the new GTX 512:

We can expect some applications to scale with either core clock speed or memory clock speed, depending on where they are limited, in which case we could see anywhere from a 25% to 40% boost in performance. Of course, we will run into things like CPU and architectural limitations that can lessen the impact of the improved clock speeds. As Intel found out with the Pentium 4, it doesn't matter how fast your clock spins if the chip spends a significant amount of time waiting on other hardware. This is where Hyper-Threading came into play, and it is likely also the reason that ATI put so much development time into keeping a huge number of contexts open at a time.
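For a rough sense of where those upper bounds come from, the clock ratios can be worked out directly. This sketch assumes the original 7800 GTX's stock clocks of 430MHz core and 1200MHz effective memory, consistent with the 120MHz core delta quoted earlier:

```python
# Clock-for-clock speedup ceilings, assuming stock clocks for both cards.
gtx_core, gtx_mem = 430, 1200        # original 7800 GTX (MHz, effective)
gtx512_core, gtx512_mem = 550, 1700  # 7800 GTX 512

core_gain = gtx512_core / gtx_core - 1   # purely core-limited workloads
mem_gain = gtx512_mem / gtx_mem - 1      # purely memory-limited workloads

print(f"core: +{core_gain:.0%}, memory: +{mem_gain:.0%}")
# core: +28%, memory: +42%
```

Real applications sit somewhere between the two limits, which is roughly where the 25% to 40% range comes from before CPU and architectural bottlenecks shave it down further.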

Slower architectures have the advantage of being less affected by latency, as a longer clock cycle allows data to travel further per cycle. At the same time, to compete with high-frequency processors, much more work has to get done per clock cycle than on a faster chip. Graphics workloads tend to lend themselves to this type of architecture, which is partly why we haven't seen multi-GHz graphics chips.

In any case, increasing the core and memory clocks, doubling the framebuffer size, and adding a gigantic HSF will certainly require a little more power than the standard 7800 GTX. NVIDIA currently still recommends the same types of power supplies for the 7800 GTX 512 as it does for the 7800 GTX, but, as we can see from our tests, the 7800 GTX 512 does result in a much higher power draw at the wall. In fact, the outlets in our lab had some trouble delivering consistent power to our PSU during SLI testing. Most people won't run into a problem like this unless they run quite a few PCs off the same circuit breaker at home. We actually had to solve our problem by running one of the 7800 GTX 512 cards off of a second power supply plugged into an extension cord on a different circuit. If nothing else, this setup could help people test for wiring problems in their homes.

These power numbers are measured at the wall, before the PSU.

Power Consumption


The 7800 GTX 512 is certainly a power hog, to say the least. Unfortunately, we didn't have a reliable way to test power draw for a 7800 GTX 512 SLI setup, but if we ever get around to rewiring the lab ...

Let's take a look at the hardware we will use.

Test Hardware

CPU: AMD Athlon 64 FX-57 (2.8GHz)
Motherboard: ASUS A8N32-SLI Deluxe
Chipset: NVIDIA nForce4 SLI X16
Chipset Drivers: nForce4 6.82
Memory: OCZ PC3500 DDR 2-2-2-7
Video Card: ATI Radeon X800 XL
ATI Radeon X1800 XT
ATI Radeon X850 XT
NVIDIA GeForce 6800 GS
NVIDIA GeForce 7800 GT
NVIDIA GeForce 7800 GTX
NVIDIA GeForce 7800 GTX 512
Video Drivers: ATI Catalyst 5.11 (WHQL)
NVIDIA ForceWare 81.89 (Beta)
Desktop Resolution: 1280x960 - 32-bit @ 60Hz
OS: Windows XP Professional SP2
Power Supply: OCZ PowerStream 600W PSU


This is definitely a powerful system that we will be examining today. For our benchmarks, we test with sound disabled. Most of these tests are timedemos played back using in-game functionality, but the Black & White 2 benchmark is a FRAPS test using an in-game cut scene. We will provide bar graphs for the very popular 1600x1200 resolution while reporting data from 1280x960 to 2048x1536 in resolution scaling graphs.

97 Comments

  • ViRGE - Monday, November 14, 2005 - link

    AFAIK 4xAA is the last level of AA that's consistent between ATI and NV. The X850 tops out at 6xAA (which NV doesn't have), then there's 8xS, and the list goes on...
  • Griswold - Monday, November 14, 2005 - link

    That's a beast, no less. The only thing ATI can do now is kick off that mysterious R580, and it better have a few more pipes than the 520 at the same or even higher clock speeds - and no paper launch this time. Or just give up and get the launch right for the next generation...

    Is there any particular reason for only showing NVIDIA SLI results and no CrossFire numbers at all?
  • Ryan Smith - Monday, November 14, 2005 - link

    This is something we discussed when working on this article, and there's really no purpose in testing a CrossFire setup at this point. The X1800 CrossFire master cards are not yet available to test an X1800 setup, and as we noted in our X850 CrossFire review, an X850 setup isn't really viable (not to mention that it tops out at 1600x1200, while we test two higher resolutions).
  • Griswold - Monday, November 14, 2005 - link

    Ah well, woulda thought AT has a few master cards in their closet. Guess not. :)
  • Kyanzes - Monday, November 14, 2005 - link

    ONE WORD: DOMINATION
  • yacoub - Monday, November 14, 2005 - link

    Very interesting to see that 512MB has little to no impact on the performance - it is instead almost entirely the clock speed of the GPU and the RAM that makes the difference.

    Also, I think this is the first time in PC gaming history that I've seen testing where video cards more than ~9 months old are all essentially 'obsolete' as far as performance goes. Even the 7800 GT, which came out maybe six months ago, is already near the bottom of the stack in these 1600x1200 tests, and considering that's what anyone with a 19" or larger LCD ideally wants to play at, that's a bit scary. Then you realize that the 7800 GT is around $330 for that bottom-end performance and it just goes up from there. It's really $450-550 for solid performance at that resolution these days. That's disappointing.
  • ElFenix - Monday, November 14, 2005 - link

    No one with a 19" desktop LCD is playing a game at any higher than 1280x1024, in which case this card is basically a waste of money. I have a 20" widescreen LCD and I find myself playing at 1280x1024 a lot, because games often don't expand the field of view; rather, they just narrow the screen vertically.
  • tfranzese - Monday, November 14, 2005 - link

    SLI/XFire skews the graphs. You need to take that into account when looking at the results.
  • Cygni - Monday, November 14, 2005 - link

    We have seen this in every successive generation of video cards. Unless you're running AA at high res (i.e. over 1280x1024), RAM size has little impact on performance. Heck, 64MB is probably enough for the textures in most games.
  • cw42 - Monday, November 14, 2005 - link

    You really should have included COD2 in the tests. I remember seeing a test on another site that showed COD2 benefited GREATLY from 512MB vs. 256MB of RAM.
