These Aren't the Sideports You're Looking For

Remember this diagram from the Radeon HD 4850/4870 review?

I do. It was one of the last block diagrams I drew for that article, and I did it at the very last minute and wasn't really happy with the final outcome. But it was necessary because of that little red box labeled CrossFire Sideport.

AMD made a huge deal out of making sure we knew about the CrossFire Sideport, promising that it meant something special for single-card, multi-GPU configurations. It also made sense that AMD would do something like this; after all, the whole point of AMD's small-die strategy is to exploit the benefits of pairing multiple small GPUs. It's supposed to be more efficient than designing a single large GPU, and if you're going to build your entire GPU strategy around it, you had better design your chips from the start to be used in multi-GPU environments - even more so than your competitors.

AMD wouldn't tell us much initially about the CrossFire Sideport other than that it meant some very special things for CrossFire performance. We were intrigued, but before we could ever get excited AMD let us know that its beloved Sideport didn't work. Here's how it would work if it were enabled:

The CrossFire Sideport is simply another high bandwidth link between the GPUs. Data can be sent between them via a PCIe switch on the board, or via the Sideport. The two aren't mutually exclusive; using the Sideport doubles the amount of GPU-to-GPU bandwidth on a single Radeon HD 4870 X2. So why disable it?

According to AMD, the performance impact is negligible: average frame rates don't see a gain, though every now and then you'll see a boost in minimum frame rates. There's also an issue where power consumption could go up enough that you'd run out of power on the two PCIe power connectors on the board. Board manufacturers also have to lay out the additional lanes connecting the two GPUs on the graphics card, which does increase board costs (although ever so slightly).

AMD decided that since there's virtually no performance increase, yet there is an increase in power consumption and board costs, it would make more sense to leave the feature disabled.

The reference 4870 X2 design includes hardware support for the CrossFire Sideport, in case AMD ever wants to enable it via a software update. However, there's no hardware requirement that the GPU-to-GPU connection be included on partner designs. My concern is that in an effort to reduce costs we'll see some X2s ship without the Sideport traces laid out on the PCB, and then if AMD happens to enable the feature in its drivers later on, some X2 users will be left in the dark.

I pushed AMD for a firm commitment on how it was going to handle future support for Sideport, and honestly, right now, it's looking like the feature will never get enabled. AMD should never have mentioned that it existed, especially if there was a good chance that it wouldn't be enabled. AMD (or more specifically ATI) does have a history of making a big deal of GPU features that never get used (TruForm, anyone?), so it's not too unexpected, but still annoying.

The lack of anything special on the 4870 X2 to make the two GPUs work better together is bothersome. You would expect a company that has built its GPU philosophy on going after the high end market with multi-GPU configurations to have done something more than NVIDIA when it comes to actually shipping a multi-GPU card. AMD insists that a unified frame buffer is coming; it just needs to make economic sense first. The concern here is that NVIDIA could just as easily adopt AMD's small-die strategy going forward if AMD isn't investing more R&D dollars into enabling multi-GPU specific features than NVIDIA is.

The lack of CrossFire Sideport support or any other AMD-only multi-GPU specific features reaffirms what we said in our Radeon HD 4800 launch article: AMD and NVIDIA don't really have different GPU strategies, they simply target different markets with their baseline GPU designs. NVIDIA aims at the $400 - $600 market while AMD shoots for the $200 - $300 market. And both companies have similar multi-GPU strategies; AMD simply needs to rely on it more.



View All Comments

  • MamiyaOtaru - Wednesday, August 13, 2008 - link

    Dude was joking. And it was funny.

    It's apparently pretty dangerous to joke around here. Two of my friends died from it.
  • CyberHawk - Tuesday, August 12, 2008 - link

    ... but I find a response a bit cold.

It's the fastest card, for God's sake!
  • Samus - Wednesday, August 13, 2008 - link

it was pretty negative. there really isn't anything negative about this card. price and power consumption (the only arguably negative things about this card) are in line with anything nVidia would have had they made a product to compete against this.
  • Finally - Tuesday, August 12, 2008 - link

    And what is this?!

    "When you start pushing up over $450 and into multi-GPU solutions, you do have to be prepared for even more diminished returns on your investment, and the 4870 X2 is no exception."

    Man! This is a bullshit card for bullshit buyers, sry I meant: ENTHUSIASTS... What the heck do you expect? Low power consumption and reasonable price-to-power-relations? I totally don't get it...
    Isn't this the "We like power supplies only if they can assure us that they will kill the rain forest single-handedly" site?
    Where is the bullshit, sry again: enthusiasm?
  • Finally - Tuesday, August 12, 2008 - link

    Is it just my eyes or did they actually read the following heading on page2?

    "NVIDIA Strikes Back"

    *sound of a Vegas-style gambling automat turning out big coin*
    If there was a prize for a totally out-of-order title... this would take rank 1 to 3...
  • Finally - Tuesday, August 12, 2008 - link

    You are right; got that perception, too...

    Although I would never buy a dual-chip card monster like this one (save SLI or CF...) I actually love how they manage to take an article about an AMD product and turn it around till you don't know whether it was about the new HD4870X2 or the lackluster 280...
  • drisie - Tuesday, August 12, 2008 - link

    In my opinion this review is way too negative. It is solutions like this that have caused Nvidia to drop prices and increase competition between the competitors. It's the best card money can buy, ffs.
  • formulav8 - Tuesday, August 12, 2008 - link

    Yeps, this is one of the worst reviews Anand himself has ever done. He continues to praise NVIDIA, who just a month or two ago was charging $600 for their cards.

    Give credit where credit is due. He even harps on a Sideport feature that doesn't mean much now, and AMD says it didn't provide any real benefit even when it was enabled.

    I've been a member of this site since 2000 and am disappointed at how bad the reviews here are getting, especially when they have a biased tone to them.

    Of course, this is only my opinion.

  • BikeDude - Wednesday, August 13, 2008 - link

    I think Anand's initial comments have to be viewed in the light of his conclusion:

    "I keep getting the impression that multi-GPU is great for marketing but not particularly important when it comes to actually investing R&D dollars into design. With every generation, especially from AMD, I expect to see a much more seamless use of multiple GPUs, but instead we're given the same old solution - we rely on software profiles to ensure that multiple GPUs work well in a system rather than having a hardware solution where two GPUs truly appear, behave and act as one to the software."

    I wholeheartedly agree. The software profile solution has baffled me for years. Why are they messing about with this? It was supposed to be a temporary thing. Creating unique profiles for every game title is not feasible. At the very least give the developers an API that will help them do this themselves.

    Instead of messing about with the power hungry sideport nonsense, AMD should have invested some R&D time on how to get rid of software profiles.
  • Locutus465 - Thursday, August 14, 2008 - link

    Probably because from a design perspective it works... And based on the benchmark results it works very well indeed. Additionally, we know from the early days of SLI that not all games will respond well to all modes, so it seems to me that at a HW level the task of getting the card to automatically perform amazingly with multiple GPUs is going to be difficult to say the least (perhaps futile?).
