Power Consumption

As expected, overall power consumption is significantly lower than that of the G80-based 8800 Ultra. The 65nm 8800 GTS 512 offers much better performance per watt than its predecessor, thanks to the die shrink Moore's Law affords:

[Chart: Power Consumption - Idle]

[Chart: Power Consumption - Crysis 1920 x 1200 Benchmark]

[Charts: 8800 GT 256 vs. Radeon HD 3870/3850 bar charts for all tests]
56 Comments

  • Griswold - Thursday, December 13, 2007 - link

    Also (partly) wrong. It's a good price/performance part and it's in short supply. That is why it's priced higher. And I'm willing to bet the supply shortage is artificial. Look at how the availability of the GTS 512 is - it seems to be much better than that of the GT. It's no surprise. Nvidia's margins with the GT must be abysmal compared to those of higher-priced units (that's a given, but they also rendered almost their complete lineup obsolete for several weeks prior to the launch of the GTS 512), but they needed that horse to compete at the 3850/3870 price point.

    And you really need to stop talking out of your ass about the 3850. It's selling well and it's selling at MSRP because supply is decent (and you lecture him about fundamentals...). I think The Register claimed 150k units in 3 weeks. Well, that's three times the number of 8800 GT units available in the same timeframe. Speaks for itself.
    Reply
  • neogodless - Tuesday, December 11, 2007 - link

    Whew... just bought an 8800 GT and would like to feel like it was a good buy for a *little while*! Hope it has enough supply to help drive prices down in general though...
    Reply
  • R3MF - Tuesday, December 11, 2007 - link

    Where are the G92 GTS cards with memory over 2.0GHz?
    Does this presage the arrival of a G92 GTX with memory at 2.4GHz and a higher core clock?

    It isn't rocket science to put some decent-speed memory on a midrange card. Witness the 3870 with 2.35GHz memory. So why haven't any of the so-called "OC" versions of the G92 GTS got overclocked memory?

    At the same time, we all want a card that can play Crysis at 1920x1200 at High details and still get around 30FPS. The GTS can get ~30FPS at Medium details... whoop-dee-doo!

    So, we know it's possible to economically provide more bandwidth and we know it's necessary, but nobody has done so, including the OC'ed versions.

    Is this because there is a G92 GTX product around the corner?

    Yes, I know there is rumoured to be a G92 GX2 dual-card sometime in January, but how about a non-cack single-card version.

    A card with:
    720MHz core clock
    2000MHz shaders
    2400MHz memory
    1GB of memory

    would absolutely rock, so why haven't we got one?
    Reply
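The bandwidth those wished-for specs imply can be worked out directly (effective data rate × bus width); a quick sketch, assuming a 256-bit bus like the other G92 cards and the stock 8800 GTS 512 memory clock (1940MHz effective) for comparison:

```python
# Peak memory bandwidth in GB/s: effective data rate (MHz) x bus width (bits) / 8
def mem_bandwidth_gbs(effective_mhz, bus_width_bits):
    return effective_mhz * 1e6 * bus_width_bits / 8 / 1e9

# Wished-for card: 2400MHz effective memory, 256-bit bus (bus width assumed)
print(mem_bandwidth_gbs(2400, 256))  # 76.8 GB/s
# Stock 8800 GTS 512: 1940MHz effective, 256-bit
print(mem_bandwidth_gbs(1940, 256))  # 62.08 GB/s
```

So the proposed 2400MHz memory would be roughly a 24% bandwidth bump over the stock GTS 512, not a new class of card.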
  • kilkennycat - Tuesday, December 11, 2007 - link

    Memory tweaking of the current series is a marginal benefit at best, with a huge increase in power dissipation. The G92 represents the last gasp of the current G8x/G9x architecture. The shrink was absolutely essential to nVidia's GPU business to get away from the huge, power-hungry and low-yield G80 GPU.

    The true high-end replacement family for the 8xxx series is coming around Q2 of 2008. It has been in design for at least the past year and is NOT just a tweak of the G8x/G9x architecture. If you really HAVE TO upgrade your system right now, just get a SINGLE 8800 GT 512. At this point in time, do not invest in SLI. Keep your hands in your pockets and wait for the next gen. A single copy of the high-end version of the next-gen GPU family from nVidia is likely to have more GPU horsepower than dual 8800 GTXs.
    Reply
  • Griswold - Thursday, December 13, 2007 - link

    "The true high-end replacement family for the 8xxx-series is coming around Q2 of 2008. It has been in design for at least the past year and is NOT just a tweak of the G8x/G9x architecture."

    It's going to be an evolved (note: that's a fair bit more than just tweaked) G80/G92. You don't design a completely new architecture in a year. Remember what nVidia claimed at the launch of the G80? It had been in the works for several years. They will squeeze every bit of revenue out of this architecture before they launch their true next-generation architecture (on which at least one team must have been working since the launch of the G80).
    Reply
  • retrospooty - Tuesday, December 11, 2007 - link

    A card with:
    720MHz core clock
    2000MHz shaders
    2400MHz memory
    1GB of memory

    would absolutely rock, so why haven't we got one?

    Ummm.... Wait until the high-end card is released in January and then see what the specs are. It's supposed to be a dual-GPU version, like the 7950 GX2 was. So think 2x 8800 GT SLI performance. The memory won't likely be 2400MHz, but each GPU will have its own bus, for an aggregate 512-bit memory interface.
    Reply
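To unpack that aggregate-bus claim: on a dual-GPU board each G92 keeps its own 256-bit interface, so "512-bit" is only the combined total; neither GPU can read the other's memory. A quick sketch, with the 1940MHz effective clock assumed from the stock 8800 GTS 512:

```python
# Dual-GPU board: two independent 256-bit memory interfaces (512-bit aggregate)
GPUS = 2
BUS_BITS = 256           # per-GPU bus width
EFFECTIVE_MHZ = 1940     # assumed: stock 8800 GTS 512 effective memory clock

per_gpu_gbs = EFFECTIVE_MHZ * 1e6 * BUS_BITS / 8 / 1e9
print(per_gpu_gbs)           # 62.08 GB/s available to each GPU
print(GPUS * per_gpu_gbs)    # 124.16 GB/s aggregate across the board
```

The practical bandwidth per frame is still the per-GPU figure, which is why a dual-GPU part doesn't help the single-card bandwidth complaint above.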
