Power to the People

The major power hog of this generation is the X1900 XTX, as we have made clear in past articles. Almost disturbingly, a single X1900 XTX draws more power than a 7950 GX2, and X1900 XTX CrossFire is more power hungry than 7950 Quad SLI. While ATI already had the slightly lower-clocked X1900 XT available for those who wanted something that acted a little less like a space heater, they needed a part that performed better while fitting into the same (or a better) power envelope to round out this generation of GPUs. What they latched onto has given graphics cards sporting the R580+ a much-needed drop in power: GDDR4.

As we explained in the GDDR4 section, the optimizations made to this generation of graphics memory technology have been designed with both power savings and potential speed in mind. We've already seen how the higher speed memory pulls through in our performance tests, but how does it hold up on the power front?

For this test, we used our Kill-A-Watt to measure system power at the wall. Our load numbers are recorded as the maximum power draw during a run of 3DMark06's fill rate and pixel shader feature tests.

System Power Consumption - Idle

System Power Consumption - Load

Apparently, JEDEC and ATI did their jobs well when deciding on the features of GDDR4 and making the decision to adopt it so quickly. Not only has ATI been able to improve performance with the X1950 XTX, but they've been able to do so using significantly less power. While the X1950 XTX is still nowhere near the power envelope of the 7900 GTX, drawing the same amount of power as the X1900 XT and 7950 GX2 is a great start.

It will certainly be interesting to see what graphics makers can do with this RAM when focusing on low power implementations like silent or budget products.

Comments

  • Vigile - Wednesday, August 23, 2006 - link

    My thought exactly on this one Anand...
  • Anand Lal Shimpi - Wednesday, August 23, 2006 - link

    You can run dual monitors with a CrossFire card as well, the CrossFire dongle that comes with the card has your 2nd DVI output on it :)

    Take care,
    Anand
  • kneecap - Wednesday, August 23, 2006 - link

    What about VIVO? The Crossfire Edition does not support that.
  • JarredWalton - Wednesday, August 23, 2006 - link

    For high-end video out, the DVI port is generally more useful anyway. It's also required if you want to hook up to a display using HDCP - I think that will work with a DVI-to-HDMI adapter, but maybe not? S-Video and composite out are becoming seldom-used outputs in my experience, though the loss of component out is a bit more of a concern.
  • JNo - Thursday, August 24, 2006 - link

    So if I use DVI out and attach a DVI to HDMI adaptor before attaching to a projector or HDTV, will I get a properly encrypted signal to fully display future blu-ray/hd-dvd encrypted content?

    The loss of component is a bit of a concern, as many HDTVs and projectors still produce amazing images over component; in fact, I gather that some very high resolution/refresh rate combinations are possible over component but not DVI due to DVI's bandwidth limitations. But please correct me if I am wrong. I take Anandtech's point about the CrossFire card offering more, but with a couple of admittedly small question marks, I see no reason not to get the standard card now and add a CrossFire card later if you decide to go that route...
  • JarredWalton - Thursday, August 24, 2006 - link

    I suppose theoretically component could run higher resolutions than DVI, with dual-link being required for 2048x1536 and higher. Not sure what displays support such resolutions with component inputs, though. Even 1080p can run off of single-link DVI.

    I think the idea with CF cards over standard is that they will have a higher resale value if you want to get rid of them in the future, and they are also more versatile -- TV out capability being the one exception. There are going to be a lot of people that get systems with a standard X1950 card, so if they want to upgrade to CrossFire in the future they will need to buy the CrossFire edition. We all know that at some point ATI is no longer going to make any of the R5xx cards, so if people wait to upgrade to CrossFire they might be forced to look for used cards in a year or two.

    Obviously, this whole scenario falls apart if street prices on CrossFire edition cards end up being higher than the regular cards. Given the supply/demand economics involved, that wouldn't be too surprising, but of course we won't know for another three or four weeks.
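(Editor's note: the single-link vs. dual-link question in the comment above comes down to simple pixel-clock arithmetic, sketched below. The 165 MHz single-link pixel clock ceiling comes from the DVI specification; the 20% blanking overhead is an assumed approximation standing in for real CVT timings, not an exact calculation.)

```python
# Rough check of which display modes fit within single-link DVI bandwidth.
# Assumptions: single-link DVI tops out at a 165 MHz pixel clock (per the
# DVI 1.0 spec); blanking overhead is approximated at 20%, a rough stand-in
# for real CVT timing formulas.

SINGLE_LINK_MHZ = 165.0
BLANKING_OVERHEAD = 1.20  # assumed ~20% extra for blanking intervals

def required_pixel_clock_mhz(width, height, refresh_hz):
    """Approximate pixel clock needed for a mode, in MHz."""
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

def fits_single_link(width, height, refresh_hz):
    """True if the mode fits within a single DVI link's pixel clock limit."""
    return required_pixel_clock_mhz(width, height, refresh_hz) <= SINGLE_LINK_MHZ

# 1080p at 60 Hz needs roughly 149 MHz and fits on a single link;
# 2048x1536 at 60 Hz needs roughly 226 MHz and requires dual-link.
print(fits_single_link(1920, 1080, 60))   # True
print(fits_single_link(2048, 1536, 60))   # False
```

This matches the comment's claim: even 1080p runs comfortably on single-link DVI, while 2048x1536 and above push past the single-link ceiling.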
  • UNESC0 - Wednesday, August 23, 2006 - link

    thanks for clearing that up Anand, news to me!
  • TigerFlash - Wednesday, August 23, 2006 - link

    I was wondering if anyone thinks it's wise to get an Intel Core 2 Duo motherboard with CrossFire support now that AMD is buying out ATI. Do you think ATI would stop supporting Intel motherboards?
  • johnsonx - Wednesday, August 23, 2006 - link

    quote:

    Do you think ATI would stop supporting Intel motherboards?

    Of course not. AMD/ATI isn't stupid. Even if their cross-licensing agreement with Intel didn't prevent them from blocking Crossfire on Intel boards (which it almost surely does), cutting out that part of the market would be foolish.
  • dderidex - Wednesday, August 23, 2006 - link

    What's with the $99 -> $249 gap?

    Weren't we supposed to see an X1650XT, too? Based on RV570? ...or RV560? Something?
