These Aren't the Sideports You're Looking For

Remember this diagram from the Radeon HD 4850/4870 review?

I do. It was one of the last block diagrams I drew for that article, and I did it at the very last minute and wasn't really happy with the final outcome. But it was necessary because of that little red box labeled CrossFire Sideport.

AMD made a huge deal out of making sure we knew about the CrossFire Sideport, promising that it meant something special for single-card, multi-GPU configurations. It also made sense that AMD would do something like this; after all, the whole point of AMD's small-die strategy is to exploit the benefits of pairing multiple small GPUs. That approach is supposed to be more efficient than designing a single large GPU, and if you're going to build your entire GPU strategy around it, you had better design your chips from the start to be used in multi-GPU environments - even more so than your competitors do.

Initially AMD wouldn't tell us much about the CrossFire Sideport, other than that it meant some very special things for CrossFire performance. We were intrigued, but before we could ever get excited, AMD let us know that its beloved Sideport would ship disabled. Here's how it would work if it were enabled:

The CrossFire Sideport is simply another high-bandwidth link between the GPUs. Data can be sent between them via a PCIe switch on the board, or via the Sideport. The two aren't mutually exclusive; using the Sideport doubles the amount of GPU-to-GPU bandwidth on a single Radeon HD 4870 X2.
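As a quick back-of-the-envelope sketch (the bandwidth figures below are assumptions for illustration, not AMD's published specs), here's how the two links add up:

    # Back-of-the-envelope GPU-to-GPU bandwidth on a dual-GPU board.
    # The figures here are illustrative assumptions, not official AMD numbers.
    PCIE_SWITCH_GBPS = 8.0  # assumed bandwidth through the on-board PCIe switch
    SIDEPORT_GBPS = 8.0     # assumed CrossFire Sideport bandwidth

    def gpu_to_gpu_bandwidth(sideport_enabled):
        # The two links are additive, not mutually exclusive.
        total = PCIE_SWITCH_GBPS
        if sideport_enabled:
            total += SIDEPORT_GBPS
        return total

    print(gpu_to_gpu_bandwidth(False))  # 8.0 GB/s - shipping configuration
    print(gpu_to_gpu_bandwidth(True))   # 16.0 GB/s - double, with the Sideport enabled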

So why disable it? According to AMD, the performance impact is negligible: average frame rates don't see a gain, though every now and then you'll see a boost in minimum frame rates. There's also an issue where power consumption could go up enough that you'd run out of power on the two PCIe power connectors on the board. Board manufacturers also have to lay out the additional lanes on the graphics card connecting the two GPUs, which increases board costs (although ever so slightly).
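To put the power concern in perspective, here's the arithmetic against the PCIe spec limits (the one 6-pin plus one 8-pin connector layout is an assumption for this sketch):

    # Power budget for a dual-GPU board under PCIe spec limits.
    # The connector layout (one 6-pin + one 8-pin) is assumed for illustration.
    PCIE_SLOT_W = 75    # max draw through the x16 slot, per the PCIe spec
    SIX_PIN_W = 75      # max draw per 6-pin auxiliary connector
    EIGHT_PIN_W = 150   # max draw per 8-pin auxiliary connector

    budget = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W
    print("Total board power budget:", budget, "W")  # 300 W

    # A card already drawing close to that ceiling has little headroom left
    # for the extra power an active Sideport would consume.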

AMD decided that since there's essentially no performance increase, yet there is an increase in power consumption and board cost, it would make more sense to leave the feature disabled.

The reference 4870 X2 design includes hardware support for the CrossFire Sideport, in case AMD ever wants to enable it via a software update. However, there's no requirement that the GPU-to-GPU connection be included on partner designs. My concern is that in an effort to reduce costs we'll see some X2s ship without the Sideport traces laid out on the PCB; if AMD then happens to enable the feature in its drivers later on, those X2 users will be left in the dark.

I pushed AMD for a firm commitment on how it was going to handle future support for the Sideport, and honestly, right now it's looking like the feature will never get enabled. AMD should never have mentioned that it existed, especially if there was a good chance it wouldn't be enabled. AMD (or, more specifically, ATI) does have a history of making a big deal of GPU features that never get used (Truform, anyone?), so it's not too unexpected - but it's still annoying.

The lack of anything special on the 4870 X2 to make the two GPUs work better together is bothersome. You would expect a company that has built its GPU philosophy on going after the high-end market with multi-GPU configurations to have done something more than NVIDIA when it comes to actually shipping a multi-GPU card. AMD insists that a unified frame buffer is coming; it just needs to make economic sense first. The concern here is that NVIDIA could just as easily adopt AMD's small-die strategy going forward if AMD isn't investing more R&D dollars than NVIDIA into multi-GPU-specific features.

The lack of CrossFire Sideport support, or of any other AMD-only multi-GPU features, reaffirms what we said in our Radeon HD 4800 launch article: AMD and NVIDIA don't really have different GPU strategies; they simply target different markets with their baseline GPU designs. NVIDIA aims at the $400 - $600 market while AMD shoots for the $200 - $300 market. Both companies have similar multi-GPU strategies as well; AMD simply needs to rely on multi-GPU more than NVIDIA does.

Comments

  • Spoelie - Tuesday, August 12, 2008 - link

    How come 3dfx was able to have a transparent multi-GPU solution back in the '90s - granted, memory still was not shared - when it seems impossible for everyone else these days?

    Shader functionality problems? Too much integration (a single-card Voodoo2 was a 3-chip solution to begin with)?
  • Calin - Tuesday, August 12, 2008 - link

    The SLI from 3dfx used scan line interleaving (that's literally what the acronym stood for). The new SLI still has scan line interleaving amongst its other modes.
    The reason 3dfx was able to use this is that the graphics library was their own, built specifically for the task. Microsoft's DirectX is not built for this sort of SLI, and it shows (see the CrossFire profiles, which pick the best-performing mode on a per-game basis).

    Also, 3dfx's SLI had a dongle feeding the video signal from the second card (slave) into the first card (master), and the video from the two cards was interleaved. This uses lots of bandwidth, and I don't think DirectX is able to generate scenes in "only even/odd lines", so much of the geometry work must be done by both cards (meaning if your game engine is geometry-bound, SLI doesn't help you).
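    To make the even/odd split concrete, a minimal sketch of scan line interleaving (the frame height and names are made up for illustration):

        # Sketch of 3dfx-style scan line interleaving: each card renders
        # every other horizontal line of the frame. Purely illustrative.
        def assign_scanlines(height):
            even = [y for y in range(height) if y % 2 == 0]  # master card
            odd = [y for y in range(height) if y % 2 == 1]   # slave card
            return even, odd

        master_lines, slave_lines = assign_scanlines(480)
        # Rasterization is split in half, but both cards still process the
        # full scene geometry, which is why geometry-bound engines see
        # little benefit.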
  • mlambert890 - Friday, August 15, 2008 - link

    Great post... Odd that people seem to remember 3DFX and don't remember GLIDE or how it worked. I'm guessing they're too young to have actually owned the original 3D cards (I still have my dedicated 12MB Voodoo cards in a closet), and they just hear something on the web about how "great" 3DFX was.

    It was a different era and there was no real unified 3D API. Back then we used to argue about OpenGL vs. GLIDE, and the same types of malcontents would rant and rave about how "evil" MSFT was for daring to even think of creating DirectX.

    Today a new generation of ill-informed malcontents continues to rant and rave about Direct3D and slam NVidia for "screwing up" 3DFX, when the reality is that time moves on and NVidia used the IP from 3DFX that made sense to use (OBVIOUSLY - sometimes the people spending hundreds of millions and billions have SOME clue what they're buying/doing, and actually have CS PhDs rather than just "forum posting cred").
  • Zoomer - Wednesday, August 13, 2008 - link

    Ah, I remember wanting to get a Voodoo5 5000, but ultimately decided on the Radeon 32MB DDR instead.

    Yes, 32MB DDR framebuffer!
  • JarredWalton - Tuesday, August 12, 2008 - link

    Actually, current SLI stands for "Scalable Link Interface" and has nothing to do with the original SLI other than the name. Note also that 3dfx didn't support anti-aliasing with SLI, and they had issues going beyond the Voodoo2... which is why they're gone.
  • CyberHawk - Tuesday, August 12, 2008 - link

    nVidia bought them... and is now incapable of taking advantage of the technology :D
  • StevoLincolnite - Tuesday, August 12, 2008 - link

    They could have at least included support for 3dfx GLIDE so all those GLIDE-only games would continue to function.

    Also, ATI had a "dual GPU" card (the Rage Fury MAXX) years before nVidia released one.
  • TonyB - Tuesday, August 12, 2008 - link

    can it play Crysis though?

    two of my friends' computers died while playing it.
  • Spoelie - Tuesday, August 12, 2008 - link

    no it can't, the crysis benchmarks are just made up

    stop with the bearded comments already
  • MamiyaOtaru - Wednesday, August 13, 2008 - link

    Dude was joking. And it was funny.

    It's apparently pretty dangerous to joke around here. Two of my friends died from it.
