Performance Tests

Again, we're sticking with the same three games that we tested in the last article: Battlefield 2, Doom 3, and Half-Life 2. (Half-Life 2 and Doom 3 are tested at 1920x1440, and Battlefield 2 at 2048x1536.) We tested the card on the same system that we used for the EVGA card:

MSI K8N Neo4 Platinum/SLI motherboard
AMD Athlon 64 FX-55 Processor
1 GB OCZ 2:2:2:6 DDR400 RAM
Seagate 7200.7 120 GB Hard Drive
OCZ 600 W PowerStream Power Supply

As we mentioned earlier, we added a set of benchmarks with 4xAA enabled. This will help us get a better idea of the subtle differences in each card's performance, as the purpose of these benchmarks is to see what happens when extra stress is placed on the memory bandwidth of these parts. One of the first things to look at is how the numbers compare between the EVGA and MSI cards out of the box, without any overclocking. Since our tests quickly showed no difference between the MSI card's performance and that of our reference card, that question was already answered in our previous article, and we did not add a separate MSI NX7800 GTX entry to our graphs. Out-of-the-box NX7800GTX performance is highlighted in green.
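
To put a rough number on the memory bandwidth in question, here is a quick back-of-the-envelope sketch. The 256-bit bus and 1.2GHz effective GDDR3 data rate used below are assumptions based on the reference 7800 GTX design rather than measured figures from our test cards:

    # Back-of-the-envelope peak memory bandwidth for a reference-clocked
    # 7800 GTX. The 256-bit bus and 1.2GHz effective GDDR3 data rate are
    # assumptions based on the reference design, not measured values.

    bus_width_bits = 256
    effective_data_rate = 1.2e9   # transfers per second (600MHz GDDR3, double data rate)

    bytes_per_transfer = bus_width_bits / 8                # 32 bytes cross the bus per transfer
    peak_bandwidth_gbs = bytes_per_transfer * effective_data_rate / 1e9

    print(f"Theoretical peak memory bandwidth: {peak_bandwidth_gbs:.1f} GB/s")  # 38.4 GB/s

Enabling 4xAA multiplies the color and Z traffic that has to fit within that same budget each frame, which is why the AA results are the ones most likely to expose memory bandwidth limits rather than core clock scaling.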

Battlefield 2 Performance

[Graph: Battlefield 2 Performance]

As you can see, the EVGA card slightly outperforms the MSI across the board at stock speeds. This was predictable, given that our EVGA e-GeForce 7800 GTX came to us with the core clocked at 450MHz, as opposed to MSI's standard 430MHz. When it comes to the maximum overclock, however, our MSI card was able to surpass what we saw with the EVGA part. In Battlefield 2, the percentage gain is more pronounced without 4xAA enabled; overclocking the NX7800GTX gave us a frame rate increase of 10.4%.

Doom 3 Performance

[Graph: Doom 3 Performance]

Doom 3 saw roughly similar percentage gains from overclocking with and without AA. Without AA, overclocking the MSI card returned a 5.3% gain, and with AA enabled, we see slightly more at 7.5%.

Half-Life 2 Performance

[Graph: Half-Life 2 Performance]

Half-Life 2 is the reverse of Battlefield 2: we see a higher increase in performance from overclocking with AA enabled than without. This could be because we are bumping into a CPU limitation with AA turned off. With AA enabled, we see an 8.8% increase in performance when overclocked, as opposed to only a 5.3% increase with no AA.

All the gains that we see here from overclocking are fairly significant and on par with what we would expect from a 12.8% increase in core clock speed based on our analysis of clock speeds in the 7800 GTX. Of course, we know that core speed is not as straightforward a measure as we would like it to be, but we will continue to press NVIDIA on the matter.
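
As a quick sanity check on that 12.8% figure, here is a minimal sketch comparing the relative clock increases quoted in this review (MSI's 430MHz stock core, EVGA's 450MHz stock core, and a maximum overclock of roughly 485MHz, rounded for illustration) against the frame rate gains reported in the graphs above:

    # Relative core clock increases vs. the observed frame rate gains.
    # Clock values are the ones quoted in this review; the ~485MHz maximum
    # overclock is rounded and should be treated as approximate.

    def pct_increase(new, old):
        # Percentage increase of `new` over `old`
        return (new - old) / old * 100.0

    msi_stock, evga_stock, max_oc = 430, 450, 485

    print(f"EVGA stock core vs. MSI stock core: {pct_increase(evga_stock, msi_stock):.1f}%")   # ~4.7%
    print(f"Maximum overclock vs. MSI stock core: {pct_increase(max_oc, msi_stock):.1f}%")     # ~12.8%

    # Frame rate gains reported above for overclocking the NX7800GTX:
    observed_gains = {
        "Battlefield 2 (no AA)": 10.4,
        "Doom 3 (no AA)": 5.3,
        "Doom 3 (4xAA)": 7.5,
        "Half-Life 2 (no AA)": 5.3,
        "Half-Life 2 (4xAA)": 8.8,
    }
    for test, gain in observed_gains.items():
        print(f"{test}: {gain}% observed vs. ~12.8% core clock increase")

As expected, none of the games scales by the full 12.8%, since factors such as CPU limitations and memory bandwidth cap the real-world gains.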

In comparing the EVGA and MSI maximum core clock numbers, remember that every card is different and may not achieve the same results that we've seen here. Since these cards use the same HSF, we would expect similar overclocking headroom, and hopefully, the more cards we test, the more we'll learn about how variable the 7800 GTX is in terms of maximum clock speed. It is also clear from the numbers that there is essentially no performance difference between a G70 clocked at 475MHz and one clocked at 485MHz.

Comments

  • Fluppeteer - Friday, July 29, 2005 - link

    I'm not sure how board-specific this would be (although the BIOS could easily get
    in on the act), but I notice nVidia are claiming a big readback speed increase on
    the Quadro FX4500 over the FX4400 (2.4GB/s vs 1GB/s). This doesn't seem to apply
    to the 7800GTX in the GPUbench report I managed to find, but it's the kind of thing
    which could be massively driver and BIOS-dependent.

    I know this is a more artificial figure than the games which have been run, but
    significant jumps like this (along with the increased vector dot rate) make these
    cards much more attractive than the 6800 series for non-graphical work. Would it
    be possible to try to confirm whether this speed-up is specific to the Quadro
    board, or whether it applies to the consumer cards too? (Either by a little bit
    of test code, or by running some artificial benchmarks.)

    Just curious. Not that I'll be able to afford a 4500 anyway...
  • tmehanna - Thursday, July 28, 2005 - link

    ALL 7800GTX cards at this point are manufactured by nvidia and sold as is by the "vendors". The ONLY physical difference is the logo on the cooler. If some vendors screen and OC their cards before selling, clock speeds would be the only difference. ANY performance or heat dissipation differences at similar clock speeds are MERELY manufacturing variances.
  • DerekWilson - Thursday, July 28, 2005 - link

    Not true. Vendors have some BIOS control over aspects of the cards that are not exposed to users. We have not been able to confirm any details from any vendor or NVIDIA (as they like to keep this stuff under wraps), but temp, heat, and noise (and even overclockability) could be affected by video BIOS settings.

    We don't know the details; we need more clarification. In the meantime, these are the numbers we are seeing so we will report them. If we are able to get the information we need to really say why we see these differences then we will definitely publish our findings.
  • lambchops3344 - Wednesday, July 27, 2005 - link

    No matter how much better a card does, I'm always going to buy EVGA... I've saved more time and money with the Step-Up program. Their customer support is so good, too.
  • NullSubroutine - Tuesday, July 26, 2005 - link

    I read an article about how CPU performance is tapering off (Murphy's law or Moore's law, I forget which one) while GPU performance has continued to increase and shows signs that it will keep doing so. I also remember an article where NVIDIA or ATI (I can't remember which) was asked about any "dual core" GPUs that might be developed. They answered that if you really look at the hardware, GPUs are already like multiprocessors, or something to that nature. Perhaps this could be the reason for the clock speed questions? It would seem logical to me that their technology doesn't run like a typical CPU, because each "processor" runs at a different speed. I think you might understand what I'm trying to say; at least I hope so, because I'm failing miserably at... what was I saying?
  • Gamingphreek - Monday, July 25, 2005 - link

    Not sure if this has already been discussed in earlier articles, but the 7800GTX, as everyone (including myself) seems to agree, is bottlenecked at every resolution except 16x12, and even then, with AA and AF enabled, the X850XT seems to catch up. While the averages might be the same, has AnandTech ever thought of including minimum and maximum framerates on its graphs?

    Thanks,
    -Kevin Boyd
  • Fluppeteer - Monday, July 25, 2005 - link

    Just wanted to thank Derek and Josh for clarifying the dual link situation. MSI don't mention anything about dual link, but after the debacle with their 6800"GT" I'm not sure I'd have trusted their publications anyway... If *all* the 7800GTXs are dual link, I'm more confident (although if there's actually a chance to try one with a 30" ACD or - preferably - a T221 DG5 in a future review I'd be even happier!)

    Good review, even if we can expect most cards to be pretty much clones of the reference design for now.
  • DerekWilson - Monday, July 25, 2005 - link

    We'll have some tests with a Cinema Display at some point ...

    But for now, we can actually see the Silicon Image TMDS used for Dual-Link DVI under the HSF. :-)
  • Fluppeteer - Monday, July 25, 2005 - link

    Cool; it'd reassure me before I splash out! (Although I'm still hoping for the extra RAM pads to get filled out - got to hate 36MB frame buffers - but with the Quadro 4500 allegedly due at SIGGRAPH it shouldn't be long now.)

    Sounds like the same solution as the Quadro 3400/6800GTo, with the internal transmitter used for one link and the SiI part for the other. I don't suppose you've pulled the fan off to find out the part number?

    I'd also be interested in knowing whether the signal quality has improved on the internal transmitter; nVidia have a bad record with this, and the T221 pushes the single link close to the 165MHz limit (and the dual link, for that matter). People have struggled with the 6800 series, even in Quadro form, where the internal transmitters have been in use. It'd be nice to find out if they're learning, although asking you to stick an oscilloscope on the output is a bit optimistic. :-) These days this probably affects people with (two) 1920x1200 panels as well as oddballs like me with DG5s, though.

    On the subject of DVI, I don't suppose nVidia have HDCP support yet, do they? (Silicon Image do a part which can help out, or I believe it can be done in the driver.) It's really a Longhorn thing, but you never know...

    Now, if only nVidia would produce an SLi SFR mode with horizontal spanning which didn't try to merge data down the SLi link, I'd be able to get two cards and actually play games on two inputs to the T221 (or two monitors); the way the 7800 benchmarks are going, 3840x2400 is going to be necessary to make anything fill rate limited in SLi. (Or have they done this already? There was talk about Quadros having dual-card OpenGL support, but I'm behind on nVidia drivers while my machine's in bits.)

    Thanks for the info!

    (Starts saving up...)
  • meolsen - Wednesday, July 27, 2005 - link

    Neither EVGA nor MSI advertises that their card is capable of driving the resolutions that would suggest that dual-link DVI is enabled.

    E.g., MSI:

    Advanced Display Functionality
    • Dual integrated 400MHz RAMDACs for display resolutions up to and including 2048x1536 at 85Hz
    • Dual DVO ports for interfacing to external TMDS transmitters and external TV encoders
    • Full NVIDIA nView multi-display technology capability

    Why would they conceal this feature?
