Power Consumption

The 4870 X2 is an incredible power hog. Yes, it performs well. Yes, it costs a bit less than two 4870 cards. But will we be able to afford the energy bill?

Idle Power

Load Power

In both idle and load power, the 4870 X2 sits near the bottom of our chart. The only thing more insane than one 4870 X2 is plugging two of them into one system. Keep in mind that this measurement is taken while running a 3DMark pixel shader test that uses almost no other system resources. In actual gameplay, power draw is much higher, as the CPU, memory, and hard drive can come under load at the same time.

Welcome to the kilowatt era. We've been telling you that these huge power supplies weren't necessary, and until this generation we absolutely meant it. If you are thinking about a multi-GPU setup built on the latest hardware, you are going to want something upwards of 1kW. If you want GTX 280 SLI or 4870 X2 CrossFire, you'll want to head on up to the 1200W department.
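If you want to sanity-check that recommendation yourself, here's a minimal back-of-the-envelope sketch in Python. The per-component wattages and the 30% headroom factor are illustrative assumptions for the example, not figures from our test bench.

# Rough PSU sizing sketch: sum worst-case component draw, then add headroom
# so the supply stays comfortably inside its rated (and most efficient) range.
def recommended_psu_watts(gpu_load_w, cpu_load_w, other_w=100, headroom=0.30):
    total_draw = gpu_load_w + cpu_load_w + other_w
    return total_draw * (1 + headroom)

# Hypothetical dual-card rig: two ~275W boards, a ~130W CPU, ~100W for everything else.
print(recommended_psu_watts(gpu_load_w=2 * 275, cpu_load_w=130))  # 1014.0 -> shop in the 1kW class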

Oh, and don't forget to turn it off when you aren't gaming.

Comments

  • Spoelie - Tuesday, August 12, 2008 - link

    How come 3dfx was able to have a transparent multi-GPU solution back in the '90s - granted, memory still was not shared - when it seems impossible for everyone else these days?

    Shader functionality problems? Too much integration (a single-card Voodoo2 was a three-chip solution to begin with)?
  • Calin - Tuesday, August 12, 2008 - link

    The SLI from 3dfx used scan line interleaving (or Scan Line Interleaving, to be exact). The new SLI still offers scan line interleaving, amongst other modes.
    The reason 3dfx was able to use this is that the graphics library was their own, and it was built specifically for the task. Now, Microsoft's DirectX is not built for this kind of SLI, and it shows (see the CrossFire profiles, which are selected per game for the best performance).

    Also, 3dfx's SLI had a dongle feeding the video signal from the second card (the slave) into the first card (the master), and the video from the two cards was interleaved. Now, this uses a lot of bandwidth, and I don't think DirectX is able to generate scenes as "only even/odd lines", so much of the geometry work must be done by both cards (if your game engine is geometry-bound, SLI doesn't help you).
  • mlambert890 - Friday, August 15, 2008 - link

    Great post... Odd that people seem to remember 3DFX but don't remember GLIDE or how it worked. I'm guessing they're too young to have actually owned the original 3D cards (I still have my dedicated 12MB Voodoo cards in a closet), and they just hear something on the web about how "great" 3DFX was.

    It was a different era, and there was no real unified 3D API. Back then we used to argue about OpenGL vs. GLIDE, and the same types of malcontents would rant and rave about how "evil" MSFT was for daring to even think of creating DirectX.

    Today a new generation of ill-informed malcontents continues to rant and rave about Direct3D and slam NVidia for "screwing up" 3DFX, when the reality is that time moves on and NVidia used the IP from 3DFX that made sense to use (OBVIOUSLY - sometimes the people spending hundreds of millions and billions have SOME clue what they're buying/doing, and actually have CS PhDs rather than just "forum posting cred").
  • Zoomer - Wednesday, August 13, 2008 - link

    Ah, I remember wanting to get a Voodoo5 5000, but ultimately decided on the Radeon 32MB DDR instead.

    Yes, 32MB DDR framebuffer!
  • JarredWalton - Tuesday, August 12, 2008 - link

    Actually, current SLI stands for "Scalable Link Interface" and has nothing to do with the original SLI other than the name. Note also that 3dfx didn't support anti-aliasing with SLI, and they had issues going beyond the Voodoo2... which is why they're gone.
  • CyberHawk - Tuesday, August 12, 2008 - link

    nVidia bought them... and is now incapable of taking advantage of the technology :D
  • StevoLincolnite - Tuesday, August 12, 2008 - link

    They could have at least included support for 3dfx Glide so all those Glide-only games would continue to function.

    Also, ATI had a "dual GPU" card (the Rage Fury MAXX) many years before nVidia released one.
  • TonyB - Tuesday, August 12, 2008 - link

    Can it play Crysis, though?

    Two of my friends' computers died while playing it.
  • Spoelie - Tuesday, August 12, 2008 - link

    No, it can't; the Crysis benchmarks are just made up.

    Stop with the bearded comments already.
  • MamiyaOtaru - Wednesday, August 13, 2008 - link

    Dude was joking. And it was funny.

    It's apparently pretty dangerous to joke around here. Two of my friends died from it.
