The Card, The Test, and Power

There are a few key changes from the original 7800 GTX silicon that allow the 7800 GTX 512 to perform so well. The 7800 GTX 512 still uses a 110nm process like the original 7800 GTX, but NVIDIA has tuned its fab process to speed up key data paths in the chip. These enhancements, while not altering the feature set in any way, allow the chip to reach clock speeds of 550MHz (120MHz faster than the original 7800 GTX). On top of the changes in the silicon, the 7800 GTX 512 has received a PCB revision. And just in case anyone is wondering, the huge HSF solution is actually very quiet. NVIDIA is using a fan with larger blades that moves a good volume of air without needing to run at super high RPMs. While it may look like an industrial-sized leaf blower, its bark is certainly nothing compared to the bite this thing takes out of our performance tests.

Current 7800 GTX cards feature 8Mx32 GDDR3 with four chips on each side of the PCB. Most cards have a heat spreader on the back of the board, while some vendors have attached heatsinks. NVIDIA needed a better way to cool its RAM in order to hit the memory clock speeds it wanted. To this end, the 7800 GTX 512 has all of its RAM on the front of the PCB, cooled by the very large heatsink previously employed on the Quadro FX 4500. Moving all the RAM to one side of the PCB may also have improved the routing to certain memory modules, which would further help increase attainable stable memory clock speeds. There are still only 8 modules total, as NVIDIA has also moved to higher density 16Mx32 GDDR3. The RAM used is also rated at 900MHz (1800MHz data rate), giving the stock memory clock speed of 1700MHz a little headroom for vendors who like to overclock the cards they sell.
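To put those memory clocks in perspective, here is a rough peak-bandwidth sketch. The clock figures come from the article; the 256-bit bus width is the original 7800 GTX's and is assumed unchanged here:

```python
# Rough peak memory bandwidth for a GDDR3 card: bytes moved per transfer
# times effective transfers per second.  The 256-bit bus is an assumption
# carried over from the original 7800 GTX.
def bandwidth_gb_s(effective_clock_mhz, bus_width_bits=256):
    """Peak bandwidth in GB/s at the given effective (data-rate) clock."""
    return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

print(bandwidth_gb_s(1200))  # original 7800 GTX (600MHz GDDR3): ~38.4 GB/s
print(bandwidth_gb_s(1700))  # 7800 GTX 512 at stock: ~54.4 GB/s
print(bandwidth_gb_s(1800))  # rated limit of the 900MHz modules: ~57.6 GB/s
```

The gap between the 1700MHz stock clock and the modules' 1800MHz rating is the overclocking headroom mentioned above, worth roughly 3 GB/s of additional peak bandwidth.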

Here is a quick comparison of NVIDIA's 7800 series lineup with the new GTX 512:

We can expect some applications to scale with either core clock speed or memory clock speed depending on where they are limited, in which case we could see anywhere from a 25% to a 40% boost in performance. Of course, we will run into things like CPU and architectural limitations that could decrease the impact of the improved clock speeds. As Intel found out with the Pentium 4, it doesn't matter how fast your clock spins if the chip spends a significant amount of time waiting on other hardware. This is where Hyper-Threading came into play, and it is likely also the reason ATI put so much development time into keeping a huge number of contexts open at a time.
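That 25-40% range lines up with simple clock ratios. Taking the original 7800 GTX's stock clocks as the baseline (the 430MHz core clock follows from the article's "120MHz faster" figure; the 1200MHz effective memory clock is assumed from its 600MHz GDDR3), a quick sketch:

```python
# Best-case scaling from clock ratios alone, ignoring CPU and
# architectural limits.  Baseline clocks for the original 7800 GTX
# are assumptions stated in the lead-in, not measured results.
def pct_gain(new_clock, old_clock):
    """Percentage speedup if performance scales linearly with clock."""
    return (new_clock / old_clock - 1) * 100

core_gain = pct_gain(550, 430)     # purely core-limited: ~28%
memory_gain = pct_gain(1700, 1200) # purely bandwidth-limited: ~42%
print(f"core-limited: {core_gain:.1f}%, memory-limited: {memory_gain:.1f}%")
```

Real workloads sit somewhere between the two bounds, and usually below both once other bottlenecks come into play.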

Slower architectures have the advantage of being less affected by latency, as a long clock cycle allows data to move further per cycle. At the same time, to compete with high frequency processors, much more work has to get done per clock cycle than on a faster chip. Graphics tends to lend itself to this type of architecture, which is partly why we haven't seen multiple-GHz graphics chips.

In any case, increasing core and memory clocks, framebuffer size, and adding a gigantic HSF will certainly require a little more power than the standard 7800 GTX. NVIDIA currently still recommends the same types of power supplies for the 7800 GTX 512 as it does for the 7800 GTX, but, as we can see from our tests, the 7800 GTX 512 does result in a much higher power draw from the wall. In fact, the outlets in our lab had some trouble getting consistent power to our PSU during SLI testing. Most people won't run into a problem like this unless they run quite a few PCs off the same circuit breaker at home. We actually had to solve our problem by running one of the 7800 GTX 512 cards off of a second power supply plugged into an extension cord running off of a different circuit. If nothing else, this setup could help people test for wiring problems in their homes.

These power numbers are measured at the wall before the PSU.

Power Consumption



Certainly the 7800 GTX 512 is a power hog to say the least. Unfortunately, we didn't have a reliable way to test power draw for the 7800 GTX 512 SLI setup, but if we ever get around to rewiring the lab ...

Let's take a look at the hardware we will use.

Test Hardware

CPU: AMD Athlon 64 FX-57 (2.8GHz)
Motherboard: ASUS A8N32-SLI Deluxe
Chipset: NVIDIA nForce4 SLI X16
Chipset Drivers: nForce4 6.82
Memory: OCZ PC3500 DDR 2-2-2-7
Video Card: ATI Radeon X800 XL
ATI Radeon X1800 XT
ATI Radeon X850 XT
NVIDIA GeForce 6800 GS
NVIDIA GeForce 7800 GT
NVIDIA GeForce 7800 GTX
NVIDIA GeForce 7800 GTX 512
Video Drivers: ATI Catalyst 5.11 (WHQL)
NVIDIA ForceWare 81.89 (Beta)
Desktop Resolution: 1280x960 - 32-bit @ 60Hz
OS: Windows XP Professional SP2
Power Supply: OCZ PowerStream 600W PSU


This is definitely a powerful system we will be examining today. For our benchmarks, we test with sound disabled. Most of these tests are timedemos played back using in-game functionality, but the Black & White 2 benchmark is a FRAPS test using an in-game cut scene. We will provide bar graphs for the very popular 1600x1200 resolution while reporting data from 1280x960 to 2048x1536 in resolution scaling graphs.


97 Comments


  • nourdmrolNMT1 - Monday, November 14, 2005 - link

    i still need to figure out what to get for my Computer so i can run CSS at native res (1680*1050)

    its hard having to always scale the games.
  • ElFenix - Monday, November 14, 2005 - link

    i would assume so, seeing as how it would be a very good use of the second slot. but two slot designs don't always do that.
  • Fluppeteer - Tuesday, November 15, 2005 - link

    At least one picture I've seen (Ars) shows vents in the second slot backplane. The
    air seems to blow in both directions. Which is better than nothing, but I might
    still have to make some kind of ducting to stop the fans at the front of my
    case blowing into the open end of the shroud. This is one reason I prefer water
    cooling, but I'm too wary of cooking the RAM if it's not fully cooled.

    (Saving up...)
  • Fluppeteer - Monday, November 14, 2005 - link

    Good review, good to see some high(er) resolutions being benchmarked.
    Thanks for the efforts, people.

    Just wondering, have any of the cards other specs changed? Is it still
    one dual-link and one single link DVI (the latter run from the chip,
    the former from external SiI parts)? (Since the 512MB 6800Ultra was dual
    link and the Quadro FX4500 is dual dual link, I thought I'd check.) I'm
    still hoping someone will get around to testing the G70's DVI quality on
    the single link output for me, since the issue with the 6800.

    I don't suppose nVidia took the opportunity to stick some of SiI's
    HDCP-capable TMDS transmitters on it, did they? They're playing
    catch-up with the X1800, and it would be a good time for them to
    spend the extra few dollars on fixing it.

    I'd be quite interested in some audio measurements of the fan, too.

    Speaking of which, is the airflow actually useful with the Quadro
    fan? I've got a lot of air blowing from the front of my case to the
    back, and I've suspected that the overheating issues I've seen with
    my 6800 are because the card's fan is fighting the case airflow
    (for some reason nVidia's fans seem to blow the wrong way round).

    --
    Fluppeteer
  • Sunbird - Monday, November 14, 2005 - link

    So how bad does this spank my 5900XT? :P
  • bob661 - Tuesday, November 15, 2005 - link

    I had one of those. You might be able to dig up an early benchmark on the 6600GT that will show how it compares to the 5900XT.
  • bob661 - Tuesday, November 15, 2005 - link

    quote:

    benchmark on the 6600GT that will show how it compares to the 5900XT
    http://tinyurl.com/77v66">Here you go. :)
  • Griswold - Monday, November 14, 2005 - link

    Ok, if this peanut represents the 5900XT, the GTX 512 would be the size of a melon. ;)
  • viciousvee - Monday, November 14, 2005 - link

    For the price this is really for people with a "I don't care what it cost" (big) budget! Get 2 GT's (7800 ones) and call it a day. N.E ways Good article but I would like to see more benches with WOW (World of warcraft, even though they don't support SLI setup) and with 2 setups rather than one, one with the AMD 3500+ and the 57!
  • Spoonbender - Monday, November 14, 2005 - link

    You mean, get two GTs (which would cost about the same as one of these, while offering far less performance)? No thanks, if I were to spend $6-700, I'd go for the faster solution. Which means this card.

    As for the rest, well, why is it relevant? AT is a hardware site, reviewing hardware. They're not benchmarking games to find "the best WoW card", they're benchmarking to find the best card overall. As for the CPU's, what would it add to a review of a card like this? Again, the purpose isn't to tell you "how many fps would you gain if you upgraded your CPU to a FX57?". It's to test this card versus the competition.
