A year ago Lucid announced the Hydra 100: a physical chip that could enable hardware multi-GPU without any pesky SLI/Crossfire software, game profiles or anything like that.

At a high level, what Lucid's technology does is intercept OpenGL/DirectX commands on their way from the CPU to the GPU and load balance them across any number of GPUs. The final buffers are read back by the Lucid chip and sent to the primary GPU for display.
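As a rough illustration of the load-balancing idea (not Lucid's actual, proprietary algorithm), here is a minimal sketch that dispatches each intercepted draw batch to whichever GPU currently has the least queued work; all names and the cost model are hypothetical:

```python
# Toy sketch of Hydra-style load balancing: intercepted draw batches are
# dispatched to whichever GPU currently has the least queued work.
# Everything here is hypothetical; the real Hydra logic is proprietary.

class FakeGPU:
    def __init__(self, name, relative_speed):
        self.name = name
        self.relative_speed = relative_speed  # faster card -> higher value
        self.queued_cost = 0.0                # pending work, in time units

    def submit(self, draw_cost):
        # Time this GPU will need for the batch, scaled by its speed.
        self.queued_cost += draw_cost / self.relative_speed

def dispatch(gpus, draw_costs):
    """Send each draw batch to the GPU with the least pending work."""
    for cost in draw_costs:
        target = min(gpus, key=lambda g: g.queued_cost)
        target.submit(cost)
    return {g.name: round(g.queued_cost, 2) for g in gpus}

gpus = [FakeGPU("GTX 285", 2.0), FakeGPU("9800 GTX", 1.0)]
print(dispatch(gpus, [1.0] * 12))
```

With one GPU twice as fast as its partner, the balancer naturally routes it twice as many batches, which is how mismatched cards like a GTX 285 and a 9800 GTX could both stay busy instead of the faster card idling.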

The technology sounds flawless. You don't need to worry about game profiles or driver support, you just add more GPUs and they should be perfectly load balanced. Even more impressive is Lucid's claim that you can mix and match GPUs of different performance levels. For example you could put a GeForce GTX 285 and a GeForce 9800 GTX in parallel and the two would be perfectly load balanced by Lucid's hardware; you'd get a real speedup. Eventually, Lucid will also enable multi-GPU configurations from different vendors (e.g. one NVIDIA GPU + one AMD GPU).

At least on paper, Lucid's technology has the potential to completely eliminate all of the multi-GPU silliness we've been dealing with for the past several years. Today, Lucid is announcing the final set of hardware that will be shipping within the next ~30 days.


The MSI Big Bang, a P55 motherboard with Lucid's Hydra 200

It's called the Hydra 200 and it will first be featured on MSI's Big Bang P55 motherboard. Unlike the Hydra 100 we talked about last year, the Hydra 200 is built on a 65nm process instead of 130nm. The architecture is also vastly improved, thanks to Lucid's much greater experience with the chip.

There are three versions of the Hydra 200: the LT22114, the LT22102 and the LT24102. The only difference between the chips is the number of PCIe lanes. The lowest end chip, the LT22114, has a x8 connection to the CPU/PCIe controller and two x8 connections to GPUs. The midrange LT22102 has a x16 connection to the CPU and two x16 connections for GPUs. And the highest end solution, the LT24102 being used on the MSI board, has a x16 to the CPU and then a configurable 32 lanes to the GPUs: 4 x8, 1 x16 + 2 x8, or 2 x16. It's all auto sensing and auto configuring. The high end product will be launching in October, with the other two versions shipping into mainstream and potentially mobile systems some time later.
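The configurable downstream lanes on the high-end part can be thought of as a simple lookup from the detected GPU count to a lane allocation. This is a hypothetical sketch of that auto-configuration logic, not Lucid's firmware:

```python
# Hypothetical sketch of the high-end Hydra 200's lane auto-configuration:
# 32 downstream lanes are split based on how many GPUs are detected.
def pick_lane_split(num_gpus):
    """Return the per-slot lane widths for a given number of GPUs."""
    splits = {
        2: (16, 16),       # 2 x16
        3: (16, 8, 8),     # 1 x16 + 2 x8
        4: (8, 8, 8, 8),   # 4 x8
    }
    if num_gpus not in splits:
        raise ValueError("unsupported GPU count")
    return splits[num_gpus]

print(pick_lane_split(3))  # a three-card configuration
```

In every case the split sums to the same 32 downstream lanes; only the per-slot width changes.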

Lucid wouldn't tell us the exact added cost for a motherboard, but did offer guidance of around $1.50 per PCIe lane. The high end chip has 48 total PCIe lanes, which puts the premium at $72; the low end chip has 24 lanes, translating into a $36 cost for the Hydra 200 chip. Note that since the Hydra 200 has an integrated PCIe switch, there's no need for extra bridge chips on the motherboard (and of course no SLI licensing fees). The first implementation of the Hydra 200 will be on MSI's high end P55 motherboard, so we can expect prices to be at the upper end of the spectrum. With enough support, we could see the technology fall into the upper mainstream segment.
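The pricing guidance is simple arithmetic; a quick sketch makes the two data points explicit (the $1.50-per-lane figure is Lucid's guidance, the rest is illustration):

```python
# Lucid's guidance: roughly $1.50 of added board cost per PCIe lane.
COST_PER_LANE = 1.50

def hydra_premium(total_lanes):
    """Estimated board-cost premium for a Hydra 200 with this many lanes."""
    return COST_PER_LANE * total_lanes

print(hydra_premium(48))  # high-end chip, 48 lanes
print(hydra_premium(24))  # low-end chip, 24 lanes
```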

Lucid specs the Hydra 200 at a 6W TDP.

Also unlike last year, we actually got real seat time with the Hydra 200 and MSI's Big Bang. Even better: we got to play on a GeForce GTX 260 + ATI Radeon HD 4890 running in multi-GPU mode.

Of course with two different GPU vendors, we need Windows 7 to allow both drivers to work at the same time. Lucid's software runs in the background and lets you enable/disable multi-GPU mode:

If for any reason Lucid can't run a game in multi-GPU mode, it will always fall back to working on a single GPU without any interaction from the end user. Lucid claims to be able to accelerate all DX9 and DX10 games, although things like AA become easier in DX10 since all hardware should resolve the same way.
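The fallback behavior Lucid describes can be sketched as a simple guard: attempt the multi-GPU path, and on any failure silently continue on one GPU. Function and variable names here are hypothetical:

```python
# Hypothetical sketch of Lucid's transparent fallback: if a game can't
# be load balanced, rendering silently continues on a single GPU.
def render_frame(game, gpus, can_balance):
    """Return which GPUs will render this frame."""
    try:
        if not can_balance(game):
            raise RuntimeError("multi-GPU path unavailable")
        return {"mode": "multi", "gpus": list(gpus)}
    except RuntimeError:
        # No dialog, no error; the end user never has to intervene.
        return {"mode": "single", "gpus": list(gpus[:1])}

print(render_frame("SomeDX9Game", ["GTX 260", "HD 4890"], lambda g: False))
```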


NVIDIA and ATI running in multi-GPU mode on a single system

There are a lot of questions about performance and compatibility, but honestly we can't say much on that until we get the hardware ourselves. We were given some time to play with the system and can say that it at least works.

Lucid only had two games installed on the cross-vendor GPU setup: BioShock and F.E.A.R. 2. There are apparently more demos on the show floor; we'll try to bring more impressions from IDF later this week.

Comments

  • etriky - Saturday, October 31, 2009 - link

    10/29 has come and gone and still no board....
  • Canadian87 - Tuesday, October 6, 2009 - link

    is on lock down.... I'm not spending a dime, waiting for a new christmas build, I'll just survive with my E5300, GTS250, 2gb ddr2 800 for now.

    Come chrimuh *inside joke* I'm gonna have enough dough to drop on core i7, 8gb ddr3, a multi gpu board, and any 2 graphics cards around. it's go time, hope this isht works.
  • YOMO - Thursday, October 1, 2009 - link

it looks to me like there are 2 cables coming out of the back of the monitor, one for each video card, so what's the point of having it with one monitor
  • MonicaS - Tuesday, September 29, 2009 - link

I hate to knock a great idea, but this is something that should have been invented 3-4 years ago and not now. As far as I know, there's quite literally no reason to SLI anything. Single cards with multiple cores do the job more than well enough and nothing out there is going to require some crazy multiple card set up.

Beyond that, I can't see any use for this. It's a great idea, but not very useful.

    Monica S
    Computer Repair Los Angeles
http://www.sebecomputercare.com/?p=10
  • shin0bi272 - Wednesday, September 30, 2009 - link

What this is supposed to do is bring up the fps from what you get on SLI or Crossfire. Since there are dual GPU cards running in SLI (4 GPUs total) to run games like Crysis at a better frame rate than they could with a single dual GPU card, there is definitely a reason for this chip... IF (and that's a BIG if) it performs as it's claimed to, or even close to it. Your average SLI or Crossfire bump is 40% or so, and that's IF the game supports the solution you have (if not, you have up to 600 dollars worth of paperweight in your rig). This is supposed to get you double (or close to it) the fps if you put a second card in, rather than 40% more. So yeah, there's definitely a need for this technology. Now if we can just get games that have more replay value....
  • ValiumMm - Wednesday, September 30, 2009 - link

    crysis, kthnxbye
  • ValiumMm - Tuesday, September 29, 2009 - link

    Whats the point having two x16 lanes with 2 gfx cards when the upstream goes to x16, wouldnt it be exactly the same if u had x8 on two lanes cos the upstream is exactly the same ???
  • Sunagwa - Saturday, September 26, 2009 - link

    With this setup are both cards memory going to be utilized?

    Honestly the only thing that kept me from going multi-GPU so far is that I'm paying for 1Gig of video memory that isn't going to be used.
  • shin0bi272 - Sunday, September 27, 2009 - link

If I'm reading the schematics correctly, the threads are sent to the video card as they would be if the Hydra wasn't there, except the Hydra only sends a thread to a card if it's ready for one. So as far as I can tell it would use the memory of both cards (which would really be an improvement and probably where they get the scalability claim). Think of the Hydra chip as a traffic cop: it sends data down the appropriate channel when that channel is ready for more data and lets the card handle the rendering using all of its tools.
