Thanks to AMD’s aggressive power optimizations, the idle power of the 5970 is rated at 42W. In practice this puts it within spitting distance of the 5800 series in Crossfire, and below a number of other cards including the GTX 295, 4870X2, and even the 4870 itself.

Once we start looking at load power, the story gets more interesting. Remember that the 5970 is specifically built and binned to meet the 300W cap. As a result it offers 5850CF performance at 21W lower power usage, and the gap only widens as you move up the chart to more powerful cards in SLI/CF mode. The flip side is that it flirts with the cap more than the GTX 295 does, and as a result comes in 69W higher. Since we’re using OCCT, however, any driver throttling needs to be taken into consideration.

Looking at the 5970 when it’s overclocked, it becomes readily apparent why a good power supply is necessary. For that 15% increase in core speed and 20% increase in memory speed, we pay a penalty of 113W! This puts it in league with the GTX series in SLI and the 5870CF, except that it’s drawing all of this power over half as many plugs. We’re only going to say this one more time: if you’re going to overclock the 5970, you must have a very good power supply.

Moving on, the vapor chamber cooler makes itself felt in our temperature testing. The 5970 is the coolest high-end card we’ve tested (yes, you read that right), coming in at 38C, below even the GTS 250. This is in stark contrast to previous dual-GPU cards, which have inhabited the top of the chart. Even the 5850 isn’t quite as cool as the 5970 at idle.

At load, we see a similar but slightly different story. The 5970 is no longer the coolest card, losing out to the likes of the 5850 and GTX 285, but at 85C it hangs with the GTX 275 and stays below other single and dual-GPU cards such as the 5870 and GTX 295. This is a combination of the vapor chamber cooler and the fact that AMD put an oversized cooler on this card for overclocking purposes. Although Anand’s card failed OCCT when overclocked, my own card hit 93C here, so assume that this cooling advantage erodes under overclocking.

Finally we have our look at noise. Realistically, every card runs up against the noise floor, and the 5970 is no different. At 38C idle, it can keep its fan at very low speeds.

It’s at load that we find another interesting story. At 63.8dB the 5970 is plenty loud, but it’s still quieter than either the GTX 295 or the 5870CF, the former of which it significantly outperforms. Given the power numbers we saw earlier, we had been expecting something louder, so this was a pleasant surprise.

We will add that on a subjective basis, AMD seems to have done something to keep the whine down. The GTX 295 (and 4870X2) aren’t just loud, they have a slight whine to them; the 5970 does not. This means that it’s not just a bit quieter to sound meters, it really comes across that way to human ears too. But by the same token, I would consider the 5850CF to be quieter still, more so than 2dB would imply.


114 Comments


  • Paladin1211 - Saturday, November 21, 2009 - link

    To be precise, anything above the monitor refresh rate is not going to be recognizable. Mine maxed out at 60Hz 1920x1200. Correct me if I'm wrong.

    Thanks :)
  • noquarter - Saturday, November 21, 2009 - link

    If you read AnandTech's 'Triple Buffering: Why We Love It' article, there is a very slight advantage to more than 60fps even though the display is only running at ~60Hz. If the GPU finishes rendering a frame immediately after the display refresh, that frame will be 16ms stale by the time the display shows it, since a fresher one won't be ready in time. If someone started coming around the corner while that frame was stale, it'd be 32ms (stale frame, then fresh frame) before the first indicator showed up. This is simplified, as with v-sync off you'll just get torn frames, but the idea is still there.

    To me, it's not a big deal, but if you're looking at a person with quick reaction speed of 180ms, 32ms of waiting for frames to catch up could be significant I guess. If you increase the fps past 60 you're more likely to have a fresh frame rendered right before each display refresh.
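The 16ms/32ms arithmetic in this comment can be sketched numerically. This is a simplified model (assuming a 60Hz display, instantaneous scan-out, and v-sync; the function name is ours, not from the article or the comment):

```python
def worst_case_latency_ms(fps, refresh_hz=60):
    """Worst-case time from an in-game event to it appearing on screen.

    Worst case: the event happens just after a frame finishes rendering,
    so it waits one full frame interval to be drawn into the next frame,
    plus one refresh interval for that frame to reach the display.
    """
    frame_interval = 1000.0 / fps
    refresh_interval = 1000.0 / refresh_hz
    return frame_interval + refresh_interval

# At 60fps on a 60Hz display the worst case is ~33ms (the "32ms" above);
# pushing the GPU to 120fps trims that to ~25ms.
for fps in (60, 120, 300):
    print(f"{fps:4d} fps -> {worst_case_latency_ms(fps):.1f} ms worst case")
```

Against a human reaction time on the order of 180ms, the ~8ms saved by doubling the frame rate is small but real, which matches the commenter's "could be significant I guess."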
  • T2k - Friday, November 20, 2009 - link

    Seriously: is he no more...? :D
  • XeroG1 - Thursday, November 19, 2009 - link

    OK, so seriously, did you really take a $600 video card and benchmark Crysis Warhead without turning it all the way up? The chart says "Gamer Quality + Enthusiast Shaders". I'm wondering if that's really how you guys benchmarked it, or if the chart is just off. But if not, the claim "Crysis hasn’t quite fallen yet, but it’s very close" seems a little odd, given that you still don't have all the settings turned all the way up.

    Incidentally, I'm running a GeForce 9800 GTX (not plus) and a Core2Duo E8550, and I play Warhead at all settings enthusiast, no AA, at 1600x900. At those settings, it's playable for me. People constantly complain about performance on that title, but really if you just turn down the resolution, it scales pretty well and still looks better than anything else on the market IMHO.
  • XeroG1 - Thursday, November 19, 2009 - link

    Er, oops - that was supposed to say "E8500", not "E8550", since there is no 8550.
  • mapesdhs - Thursday, November 19, 2009 - link


    Carnildo writes:
    > ... I was the administrator for a CAVE system. ...

    Ditto! :D


    > ... ported a number of 3D shooters to the platform. You haven't
    > lived until you've seen a life-sized opponent come around the
    > corner and start blasting away at you.

    Indeed, Quake2 is amazing in a CAVE, especially with both the player
    and the gun separately motion tracking - crouch behind a wall and be
    able to stick your arm up to fire over the wall - awesome! But more
    than anything as you say, it's the 3D effect which makes the experience.

    As for surround-vision in general... Eyefinity? Ha! THIS is what
    you want:

    http://www.sgidepot.co.uk/misc/lockheed_cave.jpg

    270 degree wraparound, 6-channel CAVE (Lockheed flight sim).

    I have an SGI VHS demo of it somewhere, must dig it out sometime.


    Oh, YouTube has some movies of people playing Quake2 in CAVE
    systems. The only movie I have of me in the CAVE I ran was
    a piece taken of my using COVISE visualisation software:

    http://www.sgidepot.co.uk/misc/iancovise.avi

    Naturally, filming a CAVE in this way merely shows a double-image.


    Re people commenting on GPU power now exceeding the demands for
    a single display...

    What I've long wanted to see in games is proper modelling of
    volumetric effects such as water, snow, ice, fire, mud, rain, etc.
    Couldn't all this excess GPU power be channeled into ways of better
    representing such things? It would be so cool to be able to have
    genuinely new effects in games such as naturally flowing lava, or
    an avalanche, or a flood, tidal wave, storm, landslide, etc. By this
    I mean it being done so that how the substance behaves is governed
    by the environment in a natural way (physics), not hard coded. So far,
    anything like this is just simulated - objects involved are not
    physically modelled and don't interact in any real way. Rain is
    a good example - it never accumulates, flows, etc. Snow has weight,
    flowing water can make things move, knock you over, etc.

    One other thing occurs to me: perhaps we're approaching a point
    where a single CPU is just not enough to handle what is now possible
    at the top-end of gaming. To move them beyond just having ever higher
    resolutions, maybe one CPU with more & more cores isn't going to
    work that well. Could there ever be a market for high-end PC
    gaming with 2-socket mbds? I do not mean XEON mbds as used for
    servers though. Just thoughts...

    Ian.

  • gorgid - Thursday, November 19, 2009 - link

    With their cards, ASUS provides software where you can adjust core and memory voltages. You can adjust core voltage up to 1.4V.

    Read that:
    http://www.xtremesystems.org/forums/showthread.php...

    I ordered one from here:

    http://www.provantage.com/asus-eah5970g2dis2gd5a~7...


  • K1rkl4nd - Wednesday, November 18, 2009 - link

    Am I the only one waiting for TI to come out with a 3x3 grid of 1080p DLPs? You'd think if they can wedge ~2.2 million mini-mirrors on a chip, they should be able to scale that up to a native 5760x3240. Then they could buddy up with Dell and sell it as an Alienware premium package of display + computer capable of using it.
  • skrewler2 - Wednesday, November 18, 2009 - link

    When can we see benchmarks of 2x 5970 in CF?
  • Mr Perfect - Wednesday, November 18, 2009 - link

    "This means that it’s not just a bit quieter to sound meters, but it really comes across that way to human ears too"

    Have you considered using the dBA filter rather than just raw dB? dBA is weighted to measure the tones that the human ear is most sensitive to, so noise-oriented sites like SPCR use dBA instead.
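For reference, the dBA figure this comment refers to comes from applying the standard A-weighting curve (defined in IEC 61672) to the measured spectrum before summing. A quick sketch of that curve (our own code, not from the article or from SPCR):

```python
import math

def a_weight_db(freq_hz):
    """A-weighting gain in dB at a given frequency (IEC 61672 curve).

    Roughly tracks the ear's sensitivity: ~0dB at 1kHz, heavy
    attenuation of low frequencies, a slight boost around 2-4kHz.
    """
    f2 = freq_hz ** 2
    ra = (12194.0 ** 2 * f2 ** 2) / (
        (f2 + 20.6 ** 2)
        * math.sqrt((f2 + 107.7 ** 2) * (f2 + 737.9 ** 2))
        * (f2 + 12194.0 ** 2)
    )
    return 20.0 * math.log10(ra) + 2.00

# Low-frequency fan rumble is discounted heavily; 1kHz passes unchanged.
for f in (100, 1000, 3000):
    print(f"{f:5d} Hz -> {a_weight_db(f):+.1f} dB")
```

This is why a card dominated by low-frequency fan rumble can measure lower in dBA than one with a high-pitched whine, which lines up with the subjective whine observation in the article.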
