GT200 vs. G80: A Clock for Clock Comparison

The GT200 architecture isn't tremendously different from G80 or G92; it simply has a lot more processing power. The comparison below highlights the clock-for-clock difference between GT200 and its true predecessor, NVIDIA's G80. We clocked both GPUs at 575MHz core, 900MHz memory and 1350MHz shader, so this is a look at the hardware's architectural enhancements combined with the pipeline and bus width increases. The graph below shows the performance advantage of GT200 over G80 at the same clock speeds:

Clock for clock, just from the width increases, GT200 should be at the very worst 25% faster than G80. That worst case is the one where we are purely texture bound, since the texture filtering units grew from 64 to 80. It is unlikely an entire game will be blend rate bound to the point where we see greater than 2x speedups; synthetic test cases could show this, but real-world apps just aren't blend bound. More realistically, the 87.5% increase in SPs (240 versus 128) sets the upper limit on performance improvement at the same clock rate. Our tests behave within these predicted ranges.
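
As a quick back-of-the-envelope check on where those bounds come from, here is a minimal sketch (our illustration, not part of the original benchmark data) that derives the per-clock scaling ratios from the published unit counts; it assumes G80 has 128 SPs, 64 texture filtering units and 24 ROPs, and GT200 has 240, 80 and 32 respectively.

    # Rough per-clock scaling limits for GT200 vs. G80, from unit counts alone.
    # Assumed specs: G80 = 128 SPs, 64 texture filtering units, 24 ROPs;
    #                GT200 = 240 SPs, 80 texture filtering units, 32 ROPs.
    g80 = {"SPs": 128, "texture": 64, "ROPs": 24}
    gt200 = {"SPs": 240, "texture": 80, "ROPs": 32}

    for unit in g80:
        ratio = gt200[unit] / g80[unit]
        print(f"{unit}: {ratio:.3f}x ({(ratio - 1) * 100:+.1f}%)")

    # Prints roughly:
    #   SPs:     1.875x (+87.5%)  -> upper bound when purely shader bound
    #   texture: 1.250x (+25.0%)  -> lower bound when purely texture bound
    #   ROPs:    1.333x (+33.3%)  -> blend rate can exceed this (and 2x) only
    #                                if GT200's ROPs also blend faster per clock

The greater-than-2x blend case mentioned above can only come from the ROP count increase combined with a per-ROP blend rate improvement, which is why it can exceed the raw 33% increase in ROPs.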

Based on this, it appears that BioShock is quite compute bound and doesn't run into many other bottlenecks when the compute burden is eased. Crysis, on the other hand, seems to be limited by more than just compute, as it didn't benefit quite as much.

The way compute has been rebalanced does affect the conditions under which performance will benefit from the additional units. The biggest gains come when a game didn't just need more compute, but needed more compute per texture fetch. The converse case, a game that could benefit from more compute but only if there were more texture hardware to feed it, sees smaller gains.
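
To put a number on that rebalancing, here is the same sort of rough sketch (again our illustration, using the unit counts assumed above and ignoring per-unit improvements) for the ratio of shader units to texture filtering units:

    # Shader-to-texture-unit ratio at equal clocks; a higher ratio rewards
    # games that need more compute per texture fetch.
    print("G80   ALU:TEX =", 128 / 64)   # 2.0
    print("GT200 ALU:TEX =", 240 / 80)   # 3.0

Moving from roughly 2:1 to 3:1 shader-to-texture hardware is why a compute-heavy title stands to gain more than one that also needs proportionally more texturing to keep the shaders fed.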

108 Comments

  • Anand Lal Shimpi - Monday, June 16, 2008

    Thanks for the heads up, you're right about G92 only having 4 ROPs, I've corrected the image and references in the article. I also clarified the GeForce FX statement, it definitely fell behind for more reasons than just memory bandwidth, but the point was that NVIDIA has been trying to go down this path for a while now.

    Take care,
    Anand
  • mczak - Monday, June 16, 2008

    Thanks for correcting. Still, the paragraph about the FX is a bit odd imho. Lack of bandwidth really was the least of its problems; it was an overly complicated core with actually lots of texturing power, and it sacrificed raw compute power for more programmability in the compute core (which was its biggest problem).
  • Arbie - Monday, June 16, 2008

    I appreciate the in-depth look at the architecture, but what really matters to me are graphics performance, heat, and noise. You addressed the card's idle power dissipation but only in full-system terms, which masks a lot. Will it really draw 25W in idle under WinXP?

    And this highly detailed review does not even mention noise! That's very disappointing. I'm ready to buy this card, but Tom's finds their samples terribly noisy. I was hoping and expecting Anandtech to talk about this.

    Arbie
  • Anand Lal Shimpi - Monday, June 16, 2008

    I've updated the article with some thoughts on noise. It's definitely loud under load, not GeForce FX loud but the fan does move a lot of air. It's the loudest thing in my office by far once you get the GPU temps high enough.

    From the updated article:

    "Cooling NVIDIA's hottest card isn't easy and you can definitely hear the beast moving air. At idle, the GPU is as quiet as any other high-end NVIDIA GPU. Under load, as the GTX 280 heats up the fan spins faster and moves much more air, which quickly becomes audible. It's not GeForce FX annoying, but it's not as quiet as other high-end NVIDIA GPUs; then again, there are 1.4 billion transistors switching in there. If you have a silent PC, the GTX 280 will definitely un-silence it and put out enough heat to make the rest of your fans work harder. If you're used to a GeForce 8800 GTX, GTS or GT, the noise will bother you. The problem is that returning to idle from gaming for a couple of hours results in a fan that doesn't want to spin down as low as when you first turned your machine on.

    While it's impressive that NVIDIA built this chip on a 65nm process, it desperately needs to move to 55nm."
  • Mr Roboto - Monday, June 16, 2008

    I agree with what Darkryft said about wanting a card that absolutely, without a doubt, stomps the 8800GTX. So far that hasn't happened; neither the GX2 nor the GT200 really does. The only thing they proved with the G90 and G92 is that they know how to cut costs.

    Well thanks for making me feel like such a smart consumer as it's going on 2 years with my 8800GTX and it still owns 90% of the games I play.

    P.S. It looks like Nvidia has quietly discontinued the 8800GTX as it's no longer on major retail sites.
  • Rev1 - Monday, June 16, 2008

    Ya, the 640MB 8800 GTS too. No SLI for me lol.
  • wiper - Monday, June 16, 2008

    What about noise? Other reviews show mixed data. One says it's another dustblower; others say the noise level is OK.
  • Zak - Monday, June 16, 2008

    First thing though, don't rely entirely on the spell checker :)) Page 4 "Derek Gets Technical": "borrowing terminology from weaving was cleaver" I believe you meant "clever"?

    As darkryft pointed out:

    "In my opinion, for $650, I want to see some f-ing God-like performance."

    Why would anyone pay $650 for this? Ugh? This is probably THE disappointment of the year:(((

    Z.
  • js01 - Monday, June 16, 2008

    On TechPowerUp's review it seemed to pull much bigger numbers, but they were using XP SP2.
    http://www.techpowerup.com/reviews/Point_Of_View/G...
  • NickelPlate - Monday, June 16, 2008

    Pfft, the title says it all. Let's hope that driver updates widen the gap over previous high-end products. Otherwise, I'll pass on this one.
