Power Consumption

As expected, overall power consumption is significantly reduced compared to the G80-based 8800 Ultra. The 65nm 8800 GTS 512 offers much better performance per watt than its predecessor, thanks to the process shrink that Moore's Law delivers:
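
To make the performance-per-watt comparison concrete, here is a minimal sketch of the arithmetic. The frame rates and wattages below are hypothetical placeholders for illustration, not numbers taken from these charts:

```python
# Hypothetical sketch: comparing performance per watt between two cards.
# All numbers are illustrative placeholders, not measured results.

def perf_per_watt(avg_fps: float, load_watts: float) -> float:
    """Frames per second delivered for each watt drawn under load."""
    return avg_fps / load_watts

old_card = perf_per_watt(avg_fps=40.0, load_watts=380.0)  # e.g. a 90nm G80-class part
new_card = perf_per_watt(avg_fps=42.0, load_watts=300.0)  # e.g. a 65nm shrink

improvement = (new_card / old_card - 1) * 100
print(f"perf/watt gain: {improvement:.1f}%")  # prints "perf/watt gain: 33.0%"
```

Even a modest framerate gain turns into a large efficiency gain once total system draw comes down.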

Power Consumption - Idle

Power Consumption - Crysis 1920 x 1200 Benchmark

56 Comments

  • sliblue - Friday, December 14, 2007 - link

    I'm beginning to wonder. I built a new machine based on the QX9650, an Asus P5E3 Deluxe, one 8800 GTX (PCIe x16 on PCIe 2.0), 4GB of RAM, and Vista 64. I loaded up Crysis and told it to auto-detect my settings, and lo and behold it spit out a recommendation of Very High for everything. I launched the game and couldn't believe how smooth it was with one card on Very High. I am not overclocking anything, and I can see a huge difference between the QX9650 and the AMD Black Edition 6400 X2.
  • DLeRium - Friday, December 14, 2007 - link

    The 8800GT review was solid, but given the comparisons you made with the 8800GT, don't you think you should include it here? You did 8800GT vs. GTX in the last article, so shouldn't this one cover 8800GTS vs. GT vs. GTX? Instead you jump straight to the Ultra. I guess it's great that we can go back to the 8800GT article and then sort of interpolate how the GTS will do against the GTX, but that's what I hate about reviews that don't include more info for our benefit.

    I don't see why a lot of these graphs can't be combined.

    Another issue for me is why the ATI cards now use so little power. In the ATI review you showed the 3870 gobbling more power than the 8800GT under load, but now it's a clear winner in power consumption. What's the deal here?
  • DLeRium - Friday, December 14, 2007 - link

    I think my other gripe with this review is that this is a NEW revision of the GTS. Don't you think it would be wise to compare the old GTSes against this new revision? That's something I really wanted to see in the GT review too: how do the 320MB/640MB GTSes stack up against the GT? And what about in this review?
  • afrost - Thursday, December 13, 2007 - link

    Except that to get a decent cooler on the 8800GT you have to spend at least another $40 on an aftermarket cooler.

    I personally prefer the GTS because I can just stick it straight into my box without ripping the stock cooler off, and it's a little bit faster on top of that. I also didn't have Crysis, and mine comes with it in the box, so overall it's a good buy in my particular situation.
  • nubie - Thursday, December 13, 2007 - link

    I am sorry you feel that way; my EVGA 256-P3-N791-AR 8800GT 256 comes with the redesigned heatsink and fan, and all I had to do was pull a slider to get 710/1720/1000 clocks, and it didn't overheat.

    Most of the newer GTs (possibly all the 256MB ones) are coming with a better cooling solution. As for the GTS: yes, it's great if you have two slots free for cooling, but not everyone does. And a spare $100-150 for only 16 more stream processors? My GT has the same memory bandwidth (64GB/s) when I pull the slider to 1GHz (2GHz effective DDR).

    In a perfect world, of course, I'd choose the absolute best, but on a budget an 8800GT is just fine.
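
The 64GB/s figure quoted above follows from simple arithmetic: effective memory clock (in MT/s) times bus width, divided by eight bits per byte. A minimal sketch of that calculation; the 256-bit bus width is the 8800 GT's stock configuration, and the clocks are the ones mentioned in the comments:

```python
# Sketch of peak memory bandwidth arithmetic: transfers/sec x bytes per transfer.
# Clocks taken from the comments above; bus width is the 8800 GT's 256-bit bus.

def bandwidth_gbs(effective_mt_s: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s (decimal gigabytes)."""
    return effective_mt_s * 1e6 * (bus_width_bits / 8) / 1e9

# GDDR3 pushed to 1GHz actual = 2000MT/s effective
print(bandwidth_gbs(2000, 256))  # prints 64.0

# The 1800MT/s effective clock of the stock GT lands a bit lower
print(bandwidth_gbs(1800, 256))  # prints 57.6
```

This is why a 256MB GT overclocked to a 1GHz memory clock matches the quoted 64GB/s, while cards at 1800MT/s effective sit at 57.6GB/s.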
  • ashegam - Thursday, December 13, 2007 - link

    Did I miss it in the article or in the comments, or has no one mentioned that this new-generation card won't support DirectX 10.1?
    Doesn't that bother anyone interested in purchasing this card?
    Shouldn't it be a concern for a potential buyer?
  • Distant - Wednesday, December 12, 2007 - link

    I'll apologize if they acknowledge their mistake and include 8xAA in their tests from now on. I think cards are now powerful enough that it should be standard in mid/high-range reviews anyway.

    Why does this matter when most of those frame rates aren't playable? Well, not quite: as you saw, Oblivion and Prey were, and furthermore that site in particular only tested the very newest games.

    Don't you play any older games? How about any of Valve's games? TF2, maybe? I take it you do, and I would think most people would want to know that their 8800GT is going to get obliterated when they try higher than 4xAA.

    And what about the implications for SLI/CrossFire? Surely if you have a CrossFire/SLI setup you're going to want to run 8xAA, and in some cases even 16xAA, in games not named Crysis.
  • Distant - Wednesday, December 12, 2007 - link

    In case you guys are wondering what NVIDIA paid AnandTech not to show you, take a look at

    http://www.computerbase.de/artikel/hardware/grafik...

    You can clearly see the eight tests they did run at 8xAA:

    Anno 1701
    Clive Barker's Jericho
    F.E.A.R.
    Oblivion
    Prey
    Company of Heroes (DX9 and DX10)
    Lost Planet

    In every single one of those, with the exception of F.E.A.R. and Company of Heroes in DX9 mode, the 8800GT's framerates drop like a rock, in some cases with performance cut by more than half, while the 3870 barely takes a hit at all. Because of this, the 3870 overtakes the 8800GT at this level of AA.

    Now call me crazy, but I think most of us don't have a 24+ inch monitor, and if you do, you really need two cards anyway. My point is that, looking at the frame rates, in most games you're going to want to game at 8xAA, and if you only read this article you wouldn't have known that the 8800GT appears to be garbage at high levels of AA.

  • Zak - Wednesday, December 12, 2007 - link

    What exactly are we supposed to be looking at? Besides the fact that it's in German, I see no graphs and no tables. And actually a lot of gamers play at resolutions higher than 1280x1024, so 8xAA is mostly unrealistic for any game on any card today. I'm happy when I can get playable framerates with 4x or even 2x AA on an overclocked 8800GTX in modern games. Accusing AnandTech of being paid by NVIDIA (not "payed", by the way) is baseless and, I think, out of order. I'd apologize if I were you...

    Z.
  • nubie - Wednesday, December 12, 2007 - link

    I bought the EVGA 8800GT 256MB from Newegg on Sunday for $215; of course, they are long gone by now.

    This card has 1800MHz RAM; where does that fall in the charts? It means it has identical memory bandwidth to the GT 512's.

    I hope that prices fall and the EVGA Step-Up program will let me get something nice within the 90 days, after this madness is over.

    Either way my monitor is only 1280x1024, and I am not married to AA, so this seems the best choice, for now.

    I am still waiting for some good drivers; I barely beat my 3DMark01 score (+200) against my old 7900GS at 650MHz. That is pretty sad; I don't think these drivers are properly functional yet.

    Stereoscopic 3D gaming is out the window too; I have already had a spontaneous reboot. Whether that is the fault of the GT, all 8-series cards, the dual-core processor, the non-existent stereo drivers, or some wacky combination, I don't know.

    When you guys get the EVGA 1800MHz card to bench, let us know ASAP; they don't list it as an overclocked card in the line-up, or mention the RAM speed.

    I got 43/19/115 framerates in World in Conflict at Very High settings, 1280x1024, on an X2 4600 / 2GB DDR2-800 / DFI Infinity II/M2; not bad for the money.

    I am still curious about the dual-core CPU effect. I can only hit 2.6GHz on this proc (65W Windsor), and I don't run it there. I am getting a single-core Windsor soon and hope to have it clocked to 3GHz like my 939 was; that one seemed much faster in game than the X2 4600 ever was (obviously with no background apps).
