Overclocked Performance

Once we achieved our overclocks on these cards, we wanted to see what kind of performance gains the higher clock speeds would deliver. Our test system was the same one we used for the 8800 launch tests:

CPU: Intel Core 2 Extreme X6800 (2.93GHz/4MB)
Motherboard: EVGA nForce 680i SLI / Intel BadAxe
Chipset: NVIDIA nForce 680i SLI / Intel 975X
Chipset Drivers: Intel 7.2.2.1007 / NVIDIA nForce 9.35
Hard Disk: Seagate 7200.7 160GB SATA
Memory: Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2)
Video Card: Various
Video Drivers: ATI Catalyst 6.10 / NVIDIA ForceWare 96.97 / NVIDIA ForceWare 91.47 (G70 SLI)
Desktop Resolution: 2560 x 1600 - 32-bit @ 60Hz
OS: Windows XP Professional SP2

We tested the performance of these cards with two games at the same settings as our performance tests in the 8800 launch article to keep things consistent. The two games we chose for performance testing are F.E.A.R. and Oblivion. We tested both games at 1920x1440, with 4xAA for F.E.A.R. and no AA/16xAF for Oblivion. Our settings for Oblivion were the same as those in our 8800 launch tests, with all quality settings on or at their highest and with HDR enabled. In F.E.A.R. we put all quality settings at their highest, with the exception of the "soft shadows" option, which we feel incurs too much of a performance hit for the modest visual benefit it provides.

F.E.A.R. Performance

The Elder Scrolls IV: Oblivion Performance

In F.E.A.R., the overclock on the XFX 8800 GTS gives us about a 16% boost in performance at 1920x1440. In Oblivion we see almost a 40% increase in performance with the overclocked XFX 8800 GTS. Among the 8800 GTX cards, as we would expect, the MSI and EVGA samples saw some of the best performance gains from their overclocks.

In F.E.A.R. we see increases in performance of about 14%, and in Oblivion increases of around 26%, between the factory-clocked and user-overclocked cards. Keep in mind that with Oblivion there is some variance (< 5%) in the frame rate between benchmark runs, because it is a manual run-through of a saved game using FRAPS.
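
For those who want to sanity-check figures like these, the arithmetic is simple. Below is a minimal sketch in Python (not part of our actual test process) of how a percentage gain and run-to-run variance can be computed; the frame rates in it are hypothetical placeholders rather than our measured results, which appear in the graphs above.

    # Back-of-envelope math for percentage gains and run-to-run variance.
    # The frame rates below are hypothetical placeholders, NOT measured results;
    # substitute values from the graphs above to reproduce the quoted percentages.

    def percent_gain(stock_fps, overclocked_fps):
        """Performance gain of the overclocked card relative to stock, in percent."""
        return (overclocked_fps - stock_fps) / stock_fps * 100.0

    def run_to_run_variance(fps_runs):
        """Spread between best and worst runs, as a percent of the mean frame rate."""
        mean = sum(fps_runs) / len(fps_runs)
        return (max(fps_runs) - min(fps_runs)) / mean * 100.0

    if __name__ == "__main__":
        # Hypothetical example: 50 fps stock vs. 58 fps overclocked -> 16% gain.
        print("Gain: %.1f%%" % percent_gain(50.0, 58.0))
        # Hypothetical Oblivion runs; anything under a ~5% spread is within the
        # noise we describe for the manual FRAPS run-through.
        print("Variance: %.1f%%" % run_to_run_variance([31.0, 32.2, 31.6]))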

Comments

  • JarredWalton - Monday, November 13, 2006 - link

    Derek already addressed the major problem with measuring GPU power draw on its own. However, given similar performance we can say that the cards are the primary difference in the power testing, so you get GTX cards using 1-6W more power at idle, and the Calibre uses up to 15W more. At load, the power differences cover a 10W spread, with the Calibre using up to 24W more.

    If we were to compare idle power with IGP and an 8800 card, we could reasonably compare how much power the card requires at idle. However, doing so at full load is impossible without some customized hardware, and such a measurement isn't really all that meaningful if the card is going to make the rest of the system draw more power anyway. To that end, we feel the system power draw numbers are about the most useful representation of power requirements. If all other components are kept constant on a testbed, the power differences we see should stay consistent as well. How much less power would an E6400 with one of these cards require? Probably somewhere in the range of 10-15W at most.
  • IKeelU - Monday, November 13, 2006 - link

    Nice roundup. One comment about the first page, last paragraph:

    "HDMI outputs are still not very common on PC graphics cards and thus HDCP is supported on each card."

    Maybe I'm misinterpreting, but it sounds like you are saying that HDCP is present *instead* of HDMI. The two are independent of each other. HDMI is the electrical/physical interface, whereas HDCP is the type of DRM with which the information will be encrypted.
  • Josh Venning - Monday, November 13, 2006 - link

    The sentence has been reworked. We meant to say HDCP is supported through DVI on each card. Thanks.
  • TigerFlash - Monday, November 13, 2006 - link

    Does anyone know if the EVGA WITH ACS3 is what is at retail right now? EVGA's website seems to be the only place that distinguishes the difference. Everyone else is just selling an "8800 GTX."

    Thanks.
  • Josh Venning - Monday, November 13, 2006 - link

    The ACS3 version of the EVGA 8800 GTX we had for this review is apparently not available yet anywhere, and we couldn't find any info on their website about it. Right now we are only seeing the reference design 8800 GTX for sale from EVGA, but the ACS3 should be out soon. The price for this part may be a bit higher, but our sample has the same clock speeds as the reference part.
  • SithSolo1 - Monday, November 13, 2006 - link

    They have two different heat sinks, so I assume one could tell by looking at the product picture. I know a lot of sites use the product picture from the manufacturer's site, but I think they would use the one of the ACS3 if that's the one they had. I also assume they would charge a little more for it.
  • imaheadcase - Monday, November 13, 2006 - link

    Would love to see how these cards perform in Vista; even RC2 would be great.

    I know the NVIDIA graphics drivers are terrible, I mean terrible, in Vista atm, but once they at least get a final driver out for the 8800, an RC2 or Vista final roundup vs. WinXP SP2 would be great :D
  • DerekWilson - Monday, November 13, 2006 - link

    Vista won't be that interesting until we see a DX10 driver from NVIDIA -- which we haven't yet and don't expect for a while. We'll certainly test it and see what happens though.
  • imaheadcase - Monday, November 13, 2006 - link

    Oh, the current beta drivers for the 8800 do not support DX10? Is that what the new DetX drivers I read about NVIDIA working on are for?
  • peternelson - Saturday, November 25, 2006 - link


    I'd be interested to know if the 8800 drivers even support SLI yet? The initial ones I heard of did not.
