These Aren't the Sideports You're Looking For

Remember this diagram from the Radeon HD 4850/4870 review?

I do. It was one of the last block diagrams I drew for that article, and I did it at the very last minute and wasn't really happy with the final outcome. But it was necessary because of that little red box labeled CrossFire Sideport.

AMD made a huge deal out of making sure we knew about the CrossFire Sideport, promising that it meant something special for single-card, multi-GPU configurations. It also made sense that AMD would do something like this; after all, the whole point of AMD's small-die strategy is to exploit the benefits of pairing multiple small GPUs. It's supposed to be more efficient than designing a single large GPU, and if you're going to build your entire GPU strategy around it, you had better design your chips from the start to be used in multi-GPU environments - even more so than your competitors.

AMD initially wouldn't tell us much about the CrossFire Sideport other than that it meant some very special things for CrossFire performance. We were intrigued, but before we could ever get excited AMD let us know that its beloved Sideport wouldn't be enabled. Here's how it would work if it were:

The CrossFire Sideport is simply another high bandwidth link between the GPUs. Data can be sent between them via a PCIe switch on the board, or via the Sideport. The two aren't mutually exclusive: using both doubles the amount of GPU-to-GPU bandwidth on a single Radeon HD 4870 X2.
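Back-of-the-envelope math shows what that second link buys. A minimal sketch, with the caveat that both per-link figures below are assumptions rather than AMD-published specs; the point is simply that a second link of comparable width doubles the total:

```python
# GPU-to-GPU bandwidth on a single 4870 X2 board (illustrative sketch;
# both per-link figures are assumptions, not AMD-published specs).
pcie_link_gbps = 8.0   # assumed: PCIe 2.0 x16 through the on-board switch
sideport_gbps = 8.0    # assumed: Sideport link of comparable width

without_sideport = pcie_link_gbps
with_sideport = pcie_link_gbps + sideport_gbps

print(f"Sideport disabled: {without_sideport:.0f} GB/s")
print(f"Sideport enabled:  {with_sideport:.0f} GB/s "
      f"({with_sideport / without_sideport:.1f}x)")
```

So why disable it?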

According to AMD the performance impact is negligible: average frame rates don't see a gain, though every now and then you'll see a boost in minimum frame rates. There's also an issue where power consumption could go up enough that you'd run out of power on the board's two PCIe power connectors. Board manufacturers also have to lay out the additional lanes connecting the two GPUs, which increases board costs (although ever so slightly).
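The power concern is easy to quantify. A minimal sketch of the board's power budget, using standard PCIe connector ratings; the reference board's TDP is approximate, and the extra Sideport draw is purely an assumed figure for illustration:

```python
# Power budget for a reference Radeon HD 4870 X2 (6-pin + 8-pin aux power).
# Connector ratings are from the PCIe spec; draw figures are assumptions.
SLOT_W = 75        # PCIe x16 slot delivers up to 75 W
SIX_PIN_W = 75     # 6-pin auxiliary connector, 75 W
EIGHT_PIN_W = 150  # 8-pin auxiliary connector, 150 W

budget_w = SLOT_W + SIX_PIN_W + EIGHT_PIN_W   # 300 W ceiling for the board

board_tdp_w = 286       # approximate reference-board TDP
sideport_extra_w = 20   # assumed additional draw with the Sideport active

draw_w = board_tdp_w + sideport_extra_w
print(f"Budget: {budget_w} W, projected draw: {draw_w} W, "
      f"headroom: {budget_w - draw_w} W")
```

The board already runs close to its 300 W ceiling, so even a modest bump from an active Sideport could push it past what the slot and connectors are rated to deliver.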

AMD decided that, with essentially no performance increase to offset the added power consumption and board cost, it made more sense to leave the feature disabled.

The reference 4870 X2 design includes hardware support for the CrossFire Sideport, in case AMD ever wants to enable it via a software update. However, there's no requirement that the GPU-to-GPU connection be included on partner designs. My concern is that, in an effort to reduce costs, we'll see some X2s ship without the Sideport traces laid out on the PCB; if AMD then enables the feature in its drivers later on, those X2 owners will be left out in the cold.

I pushed AMD for a firm commitment on how it was going to handle future support for the Sideport and honestly, right now, it's looking like the feature will never be enabled. AMD should never have mentioned that it existed, especially if there was a good chance it wouldn't be enabled. AMD (or more specifically ATI) does have a history of making a big deal of GPU features that never get used (TruForm, anyone?), so it's not too unexpected, but it's still annoying.

The lack of anything special on the 4870 X2 to make the two GPUs work better together is bothersome. You would expect a company that has built its GPU philosophy on attacking the high end with multi-GPU configurations to have done something more than NVIDIA when it comes to actually shipping a multi-GPU card. AMD insists that a unified frame buffer is coming; it just needs to make economic sense first. The concern here is that NVIDIA could just as easily adopt AMD's small-die strategy going forward if AMD isn't investing more R&D dollars than NVIDIA into multi-GPU-specific features.

The lack of CrossFire Sideport support, or any other AMD-only multi-GPU features, reaffirms what we said in our Radeon HD 4800 launch article: AMD and NVIDIA don't really have different GPU strategies, they simply target different markets with their baseline GPU designs. NVIDIA aims at the $400 - $600 market while AMD shoots for the $200 - $300 market. And both companies have similar multi-GPU strategies; AMD simply needs to rely on it more.

Comments

  • Greene - Wednesday, August 13, 2008 - link

    Wow. Lots of this and that in here :-)

    No Hardware Info...
    No Driver Info...

    Did we lose a page?

    I'm also curious why Assassin's Creed wasn't tested with the different versions?
    There was such a big stink back in 99/2000 when ATI fudged drivers to get better FPS scores, as well as the stink back when NVIDIA did the same with 3DMark (what was it, 05?).
    And here the "Creed" developers drop some sort of support for ATI
    and the authors skip over it, and leave the different versions out of the test.

    Did you guys draft this article two weeks ago and forget to revise it?

    Did you hire Fox News editors?

    I've really trusted and valued Anandtech's articles in the past.

    This just seems sloppy, incomplete and rushed... and I dropped out of college! :-)
  • Arbie - Wednesday, August 13, 2008 - link

    Every bar graph has the cards in a different order. This makes it impossible to scan the graphs and see how a card does overall, across a range of games. And there is no compensating benefit. If I want to know which card is fastest in Crysis, I can clearly see which bar is longer! It DOESN'T HAVE TO BE THE TOP BAR ON THE GRAPH.

    So... you won't do that again.

    Next: everyone should just go out and buy a 4850. It will do all you want for now. Let all these X2 kludges and 65nm dinosaurs pound each other into landfill. Check back again in 6-8 months.

    Arbie
  • hooflung - Wednesday, August 13, 2008 - link

    The numbers were not bad. They speak for themselves. However, the tone of this review was horrible. It is the fastest card in your review and has exactly what people want out of a multi-GPU setup: one slot, a full gig of RAM, it smashes the competition's closest competitor that costs more, it only costs $100 above the best single-GPU solution, and it doesn't require a new motherboard.

    Yet, Nvidia can't do any wrong. ATI decides its Sideport isn't needed and disables it, which is a cardinal sin it seems. It still cost $100 LESS than Nvidia's GTX 280 when it first came out.

    The mixed signals coming from this review could make a cake if baked.
  • drank12quartsstrohsbeer - Wednesday, August 13, 2008 - link

    This article felt like the authors were annoyed that they had to write it. I certainly feel annoyed after reading it...
  • just4U - Wednesday, August 13, 2008 - link

    From my perspective this was a very valid and honest review that zeroes in on key issues that affect the majority of our GPU buying decisions. Yeah, they're getting some tough love feedback from it, but that's to be expected as well.
  • Keldor314 - Wednesday, August 13, 2008 - link

    750 watts for the X2 in CrossFire?! You'd better think of having an electrician come by and upgrade your home's power grid! Seriously, though, for my house, I can't run a single 8800 GTX at the same time as a space heater without tripping the circuit breakers in the garage. True, the heater in question is rated at 1500 watts. The total wattage to trip the circuit breaker is thus probably less than 2000 watts, since I've also seen the heater trip it when only accompanied by a lamp (no computer on). Given that the X2 CF will probably, after counting the rest of the computer, send energy usage to over 1000W at load, there's a very real chance that such a computer would periodically cause your power to go out, especially if, god forbid, someone tried to turn on the room's lights.

    Upgrading a power supply is cheap. Rewiring your house to handle the higher wattage is not.
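    For what it's worth, the breaker math above checks out. A quick sketch, assuming a standard 15 A / 120 V North American circuit (the actual rating of the circuit in question is unknown):

    ```python
    # Circuit-breaker headroom (the 15 A / 120 V circuit is an assumption;
    # the heater and PC figures come from the comment above).
    breaker_amps = 15
    line_volts = 120
    circuit_w = breaker_amps * line_volts   # 1800 W before the breaker trips

    heater_w = 1500     # space heater rating
    pc_load_w = 1000    # assumed X2 CrossFire system at load

    total_w = heater_w + pc_load_w
    print(f"Circuit capacity: {circuit_w} W; combined load: {total_w} W; "
          f"over by {total_w - circuit_w} W")
    ```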
  • CK804 - Sunday, August 17, 2008 - link

    Actually, the power consumption numbers are for the entire system, not just the graphics cards alone. Still, it's amazing how much power these cards draw. My jaw dropped when I saw that the power consumption of a system with these cards under load exceeded 700 watts. When X-bit labs did a roundup of 1000 watt power supplies, the first thing they concluded was that there was no need for power supplies over 600-700 watts for any setup unless some sort of exotic cooling was to be used. I can attest to that statement: I had 4 first-gen 74GB Raptors in RAID 0 coupled with 2 7900GTs in SLI and an AMD X2 4800+ running on a Zalman 460 watt PSU.
  • animaniac2k8 - Wednesday, August 13, 2008 - link

    I've been a reader of AnandTech's articles for many years and I have owned exclusively Nvidia cards since 2001.

    This is easily one of the worst and most biased articles I've ever read on AnandTech. Very disappointed to have wasted my time reading this. I'll be looking elsewhere for quality reviews from now on.
  • CyberHawk - Wednesday, August 13, 2008 - link

    Same here. Reader since 2001, registered later.

    I always liked the articles here. English is my second language and I liked that from time to time I found a new word that made me look into the dictionary.

    But this article is a bunch of bull. One more like this and I am out of here. Not that this means the end of AnandTech, but anyway.
  • helldrell666 - Wednesday, August 13, 2008 - link

    Where's the system setup?
    Why does the author hate AMD that much?
    This is the worst review of the 4870 X2 I've read yet.

    The review at techreport.com is much better.

