Overclocking

Let's talk about overclocking for a second. A few factors affect how a card overclocks, the most important being the type of heat sink on it. Overclocking boosts the card's performance but, in turn, causes the processor to generate more heat. The heat sink is designed to dissipate the heat produced by the card's graphics processing unit, keeping it running smoothly and stably. The risk with overclocking is the possibility of damaging the card through excessive heat. Generally, every card has a "sweet spot" where the clock speed is set high enough to get good performance, yet low enough that the heat sink can still handle the temperature.

Every card overclocks differently, so the data that you see here won't necessarily be the same for every EVGA e-GeForce 7800 GTX. We found our fastest clock speed through trial and error, basically bumping it up more and more until it wouldn't run.
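The bump-and-retest loop can be sketched in a few lines of Python. This is only an illustration of the search, not our actual procedure: `is_stable()` stands in for the real test (looping a game demo and watching for artifacts or a crash), and the 475MHz limit is a placeholder for the card's true ceiling, which isn't known in advance.

```python
# Minimal sketch of the bump-and-retest approach described above.
# The hard-coded limit is a placeholder; a real run discovers it
# empirically by stress-testing at each step.

def is_stable(core_mhz, true_limit=475):
    """Stand-in stability check for a given core clock."""
    return core_mhz <= true_limit

def find_max_core(start_mhz, step=5):
    """Raise the core clock step by step until the card stops being
    stable, then report the last clock that passed."""
    clock = start_mhz
    while is_stable(clock + step):
        clock += step
    return clock

print(find_max_core(430))  # 475 with the placeholder limit above
```

In practice each step of this loop costs a lengthy stress run, which is why the sweet spot takes real time to find.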

We used Coolbits to detect the optimal settings for overclocking the card and came up with 487MHz for the core clock and 1.27GHz for the memory. We then tested these settings by looping high-res Battlefield 2 demos for about 45 minutes. Although the card ran, some graphical tearing and artifacts showed up on screen, so we tried a few lower settings and eventually got it to run cleanly with the clocks set at 475MHz and 1.25GHz. This compares to 430MHz/1.2GHz for a stock 7800 GTX.
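For context, the stable settings above work out to roughly a 10% core and 4% memory overclock over stock. The arithmetic, using only the clocks reported in this article:

```python
# Percentage headroom of the stable overclock over stock 7800 GTX clocks.

def pct_gain(stock, overclocked):
    """Percentage increase of the overclocked speed over stock."""
    return (overclocked - stock) / stock * 100

core = pct_gain(430, 475)     # stock 430MHz -> stable 475MHz
mem = pct_gain(1200, 1250)    # stock 1.2GHz -> stable 1.25GHz
print(f"core +{core:.1f}%, memory +{mem:.1f}%")  # core +10.5%, memory +4.2%
```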

After overclocking, we ran some benchmarks to see how well three choice games performed. We'll look at those next.

The Card Performance Tests

26 Comments

  • Fluppeteer - Friday, July 22, 2005 - link

    Isn't that an old ATi card?

    I don't know about a 7800Ultra, but it looks like the Quadro 4500 (based on the 7800) might be on for a SIGGRAPH launch. Since the 4400's a 512MB card, I doubt the 4500 will be a 256MB one. And hopefully *that* will bode well for a 512MB consumer card.

    Fingers crossed.

    Mind you, if a 4500 is a 7800GTX-based card (as the 3400 is a 6800GT card), perhaps there'll be a 5500 (7800Ultra-based) in the manner of the 4400. By which point, presumably, people will have stopped selling GeForce 5500 cards, or it's going to get confusing (other than a factor of a hundred in the price).
  • araczynski - Thursday, July 21, 2005 - link

    i think i'll wait for the 8800xyz
  • Fluppeteer - Wednesday, July 20, 2005 - link

    I understand the eVGA 7800GTX card (unusually) has a dual-link DVI connection. Since this was a feature that seemed to cause a lot of confusion among 6800-series card manufacturers, I just wondered if the reviewers (or anyone else) had a chance to test it? If it *is* dual link, is an external TMDS transmitter used? And what's the quality of nVidia's TMDS transmitter implementation this time round (reports on the 6800 series were critical)?

    The 7800GTX is probably the best card out there for trying to render at the resolutions supported by an IBM T221 or Apple's 30" cinema display - although I'm inclined to wait for a 512MB version for my T221; it would be good to know whether it's capable of driving one.

    Cheers
  • smn198 - Tuesday, July 19, 2005 - link

    A suggestion:

    Regarding the way you measured the card's noise output:
    "We had to do this because we were unable to turn on the graphics card's fan without turning on the system."

    Would it be possible to try to measure the voltages going to the fan when the card is idle and under full load? Then supply the fan with these voltages when the system is off, using a different power supply such as a battery (which is silent) and a variable resistor.

    It would also be interesting to see a graph of how the noise increases when going from idle to full load over 10 minutes (or however long it takes to reach maximum speed) on cards which have variable-speed fans. Instead of trying to measure the noise with the system on, again measure the voltage over time and then, using your battery, variable resistor and voltage meter, recreate the voltages and use this in conjunction with the voltage/time data to produce noise/time data.

    Thanks
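The voltage-substitution idea in the comment above comes down to basic Ohm's-law arithmetic: model the fan as a resistive load, then size the series variable resistor needed to recreate a measured fan voltage from a silent external supply. All values below are illustrative (a real fan is not a purely resistive load, so the resistor would still need tuning against the voltmeter):

```python
# Sketch of sizing the series resistor for the fan-voltage substitution
# suggested above. Illustrative numbers, not measurements.

def fan_resistance(v_fan, i_fan):
    """Approximate fan resistance (ohms) from a measured voltage and current."""
    return v_fan / i_fan

def series_resistor(v_supply, v_target, r_fan):
    """Series resistance (ohms) that drops v_supply to v_target across r_fan."""
    return r_fan * (v_supply - v_target) / v_target

r_fan = fan_resistance(7.0, 0.1)          # 7 V at 100 mA -> 70 ohms
print(series_resistor(12.0, 7.0, r_fan))  # 50.0 ohms from a 12 V battery
```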
  • PrinceGaz - Sunday, July 17, 2005 - link

    Just look at the original review to see how the 7800GTX compares with older cards; it looked at a lot more games and a wider range of settings.

    This series of articles is a comparison of 7800GTX cards and is meant to focus on the differences between them. We all know a 7800GTX is faster than a 6800 Ultra, so there is no point including that on the graphs.
  • Zak - Sunday, July 17, 2005 - link

    I agree, without any comparison to older cards this is pretty useless. Z.
  • PrinceGaz - Sunday, July 17, 2005 - link

    Any word from Derek or Josh(?) as to why AA was not enabled in the tests? It would certainly be a lot more meaningful than the current set of results at resolutions as high as 2048x1536 without AA, where the lowest average framerate in any game is over 70fps. The argument that 4fps more is worthwhile because it is an extra 240 frames per minute is one of the daftest things I've read in a gfx card review.

    Unless you include minimum framerates and ideally a framerate graph like [H] do, and comment on playability at different resolutions and AA settings, remarks like an overclocked card getting 76fps being worthwhile over the non-overclocked one only managing 72fps are ludicrous. I bet you couldn't even tell the two apart in a test where you weren't told which was which. Turn on 4x AA and let's see how they stand up. It may come down more to memory bandwidth, but that's okay. I'm sure some manufacturer will use Samsung's 1.4ns (1400MHz) chips, or at least their 1.5ns (1333MHz) chips, sooner or later, assuming the core and circuit board are up to handling those speeds.
  • stephenbrooks - Sunday, July 17, 2005 - link

    On page 5 (Heat, Power and Noise) it says under the first graph:
    -----
    As you can see in the graph, there's no difference in the temperature of the reference card and the EVGA e-GeForce 7800 GTX in normal mode, and it only went up by one degree when we overclocked it.
    -----
    ...that's not quite right; in fact both the e-GeForces were at 81C (overclocked or not), whereas the reference card was at 80C.
  • overclockingoodness - Sunday, July 17, 2005 - link

    #9 and #17: If you want to see the numbers, maybe you should go read the original 7800GTX review. These are just vendor cards, and they are comparing the vendors' performance, which is always only going to differ by a couple of frames here and there. It's useless to include 6800 and ATI cards in there.
  • z0mb1e - Sunday, July 17, 2005 - link

    I agree with #9, it would be nice if it had some numbers from the 6800 and an ATI card