A year ago Lucid announced the Hydra 100: a physical chip that could enable hardware multi-GPU without any pesky SLI/Crossfire software, game profiles or anything like that.

At a high level, Lucid's technology intercepts OpenGL/DirectX commands on their way from the CPU to the GPU and load balances them across any number of GPUs. The final buffers are read back by the Lucid chip and sent to the primary GPU for display.
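
To make the idea concrete, here is a minimal sketch of that kind of load balancing, assuming a simple greedy scheduler and made-up performance weights; none of the names or numbers below come from Lucid's actual implementation.

```cpp
// A minimal, hypothetical sketch of this kind of load balancing; none of
// these types, names or performance figures come from Lucid's software.
// It just shows how draw calls might be spread across unequal GPUs.
#include <cstdio>
#include <string>
#include <vector>

struct Gpu {
    std::string name;
    double relativeSpeed;  // rough throughput vs. the fastest card (made up)
    double queuedCost;     // estimated work already assigned this frame
};

struct DrawCall {
    int id;
    double estimatedCost;  // e.g. derived from triangle count and state
};

// Send each draw call to whichever GPU would finish its queue soonest,
// normalizing by relative speed so the faster card receives more work.
void balance(std::vector<Gpu>& gpus, const std::vector<DrawCall>& calls) {
    for (const DrawCall& dc : calls) {
        Gpu* best = &gpus[0];
        for (Gpu& g : gpus) {
            double tBest = (best->queuedCost + dc.estimatedCost) / best->relativeSpeed;
            double tThis = (g.queuedCost + dc.estimatedCost) / g.relativeSpeed;
            if (tThis < tBest) best = &g;
        }
        best->queuedCost += dc.estimatedCost;
        std::printf("draw %d -> %s\n", dc.id, best->name.c_str());
    }
    // In the real product the rendered buffers would now be read back
    // and composited on the primary GPU for display.
}

int main() {
    std::vector<Gpu> gpus = {{"GeForce GTX 285", 1.00, 0.0},
                             {"GeForce 9800 GTX", 0.55, 0.0}};
    std::vector<DrawCall> frame = {{0, 4.0}, {1, 1.0}, {2, 2.5}, {3, 1.5}};
    balance(gpus, frame);
}
```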

The technology sounds flawless. You don't need to worry about game profiles or driver support, you just add more GPUs and they should be perfectly load balanced. Even more impressive is Lucid's claim that you can mix and match GPUs of different performance levels. For example, you could put a GeForce GTX 285 and a GeForce 9800 GTX in parallel and the two would be perfectly load balanced by Lucid's hardware; you'd get a real speedup. Eventually, Lucid will also enable multi-GPU configurations from different vendors (e.g. one NVIDIA GPU + one AMD GPU).

At least on paper, Lucid's technology has the potential to completely eliminate all of the multi-GPU silliness we've been dealing with for the past several years. Today, Lucid is announcing the final set of hardware that will be shipping within the next ~30 days.


The MSI Big Bang, a P55 motherboard with Lucid's Hydra 200

It's called the Hydra 200 and it will first be featured on MSI's Big Bang P55 motherboard. Unlike the Hydra 100 we talked about last year, the Hydra 200 is built on a 65nm process instead of 130nm, and the architecture is much improved thanks to the additional experience Lucid has gained with the chip.

There are three versions of the Hydra 200: the LT22114, the LT22102 and the LT24102. The only difference between the chips is the number of PCIe lanes. The lowest end chip has a x8 connection to the CPU/PCIe controller and two x8 connections to GPUs. The midrange LT22102 has a x16 connection to the CPU and two x16 connections for GPUs. And the highest end solution, the one being used on the MSI board, has a x16 to the CPU and then a configurable pair of x16s to GPUs. You can operate this controller in 4 x8 mode, 1 x16 + 2 x8, or 2 x16 mode; it's all auto-sensing and auto-configurable. The high end product will be launching in October, with the other two versions shipping into mainstream and potentially mobile systems sometime later.

Lucid wouldn't tell us the exact cost it adds to a motherboard, but gave us guidance of around $1.50 per PCIe lane. The high end chip has 48 total PCIe lanes, which puts the premium at $72; the low end chip has 24 lanes, translating into a $36 cost for the Hydra 200 chip. Note that since the Hydra 200 has an integrated PCIe switch, there's no need for extra switch chips on the motherboard (and of course no SLI licensing fees). The first implementation of the Hydra 200 will be on MSI's high end P55 motherboard, so we can expect prices to be at the upper end of the spectrum. With enough support, we could see the technology fall into the upper mainstream segment.
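
The premium works out to simple arithmetic: lane count times the ~$1.50 guidance. A quick sketch, using only the lane counts quoted above:

```cpp
// Sanity check of the pricing guidance above: premium = PCIe lanes * ~$1.50.
// Lane counts follow the article; everything else here is just arithmetic.
#include <cstdio>

int main() {
    struct Sku { const char* name; int lanes; };
    const Sku skus[] = {
        {"low end  (x8 to CPU + 2 x8 to GPUs)",   24},
        {"high end (x16 to CPU + 2 x16 to GPUs)", 48},
    };
    const double perLane = 1.50;  // Lucid's rough guidance, in USD
    for (const Sku& s : skus)
        std::printf("%-38s %d lanes -> ~$%.0f premium\n", s.name, s.lanes,
                    s.lanes * perLane);
    // Prints ~$36 for the 24-lane chip and ~$72 for the 48-lane chip,
    // matching the figures in the article.
}
```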

Lucid specs the Hydra 200 at a 6W TDP.

Also unlike last year, we actually got real seat time with the Hydra 200 and MSI's Big Bang. Even better: we got to play on a GeForce GTX 260 + ATI Radeon HD 4890 running in multi-GPU mode.

Of course with two different GPU vendors, we need Windows 7 to allow both drivers to work at the same time. Lucid's software runs in the background and lets you enable/disable multi-GPU mode.

If for any reason Lucid can't run a game in multi-GPU mode, it will always fall back to working on a single GPU without any interaction from the end user. Lucid claims to be able to accelerate all DX9 and DX10 games, although things like AA become easier in DX10 since all hardware should resolve the same way.
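
The fallback behavior, as Lucid describes it, amounts to a per-title check with a silent single-GPU path. A hypothetical sketch (invented function names, not Lucid's driver API):

```cpp
// A hypothetical illustration of the silent fallback described above: try
// the multi-GPU path, otherwise render on one GPU with no user interaction.
// All function names here are invented, not Lucid's actual driver API.
#include <cstdio>

// Stand-in for whatever per-title compatibility check Lucid performs.
bool canLoadBalance(const char* title) {
    (void)title;
    return false;  // pretend this particular title can't be split up
}

void renderFrameMultiGpu()  { std::puts("frame split across all GPUs"); }
void renderFrameSingleGpu() { std::puts("frame rendered on primary GPU"); }

void renderFrame(const char* title) {
    if (canLoadBalance(title)) {
        renderFrameMultiGpu();
    } else {
        renderFrameSingleGpu();  // the user never sees an error dialog
    }
}

int main() { renderFrame("SomeUnsupportedGame"); }
```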


NVIDIA and ATI running in multi-GPU mode on a single system

There are a lot of questions about performance and compatibility, but honestly we can't say much on that until we get the hardware ourselves. We were given some time to play with the system and can say that it at least works.

Lucid only had two games installed on the cross-vendor GPU setup: Bioshock and FEAR 2. There are apparently more demos on the show floor; we'll try to bring more impressions from IDF later this week.

Comments

  • silverblue - Tuesday, September 29, 2009 - link

    Unfortunately, there's no chance of that...

    http://www.tomshardware.co.uk/nvidia-physx-ati-gpu...

    Although Toms didn't touch on it, I expect it's partly in response to Lucid.
  • ValiumMm - Tuesday, September 22, 2009 - link

    So if you have two 4870's for example, would you not put it as crossfired in ATI settings and just let the hydra chip do its work, or would you still say to put it in crossfire?
  • petteyg359 - Tuesday, September 22, 2009 - link

    [quote]There are three versions of the Hydra 200: the LT22114, the LT22102 and the LT22114[/quote]

    I count two plus an evil twin.
  • chizow - Tuesday, September 22, 2009 - link

    Anand, how were you able to verify Bioshock was running in mixed-GPU mode? From this bit from the article:
    quote:

    If for any reason Lucid can't run a game in multi-GPU mode, it will always fall back to working on a single GPU without any interaction from the end user.

    It seems it would've been difficult to determine if you were running in single-GPU or mixed-mode without comparing to single-GPU performance for either Nvidia part. Not to mention Bioshock does run quite well on any single GT200 or RV670 part. Just seems VERY misleading to claim you saw Bioshock running in mixed-mode without expanding on how you came to that conclusion.

    quote:

    Lucid claims to be able to accelerate all DX9 and DX10 games, although things like AA become easier in DX10 since all hardware should resolve the same way.

    Beyond vendor specific custom AA modes, they also handle texture filtering differently. Big question marks here imo.

    Which brings us to price....$72 premium for what is provided already for free or for a very small premium is a lot to ask. My main concern besides compatibility would be of course latency and input lag. I'd love to see the comparisons there, especially given many LCDs already suffer 1-2 frame input lag.
  • AnnonymousCoward - Friday, September 25, 2009 - link

    All good points, chizow. Lag is my main concern, and if this adds 1 frame I won't even consider it.
  • LTG - Tuesday, September 22, 2009 - link

    You said Anand was "VERY misleading to claim you saw Bioshock running in mixed-mode".

    For all the losers who think they detect moments of misleading journalism on Anand's part, here's a clue:

    If you ever did actually find "VERY misleading" journalism here, then many other people would echo your sentiment. The fact that no one else is agreeing with your charges most likely means you are wrong.

    How old are you?
  • youjinbou - Thursday, September 24, 2009 - link

    What an ugly and overused troll.
    Someone detected an issue, but since he's the only one, this has to be an error on his part.
  • chizow - Tuesday, September 22, 2009 - link

    No, it just means they were misled to believe the configuration was properly running in mixed-GPU mode, which is my point.

    I'm not saying Anand was purposefully misleading; it's quite possible he was also misled to believe multi-GPU was functioning properly when there's really no way he could've known otherwise without doing some validation of his own.

    Now grow up and stop worrying about MY age. Heh.
  • glennpratt - Tuesday, September 22, 2009 - link

    This isn't a review, it's a preview of unreleased hardware. At some level, Anand can accept their word for it. If they are lying, they'll be found out soon enough.
  • chizow - Tuesday, September 22, 2009 - link

    I never claimed it was a review or anything comprehensive, but if a product is highly anticipated for a few features, say:

    1) Vendor Agnostic Multi-GPU

    and

    2) Close to 100% scaling

    And the "preview" directly implies they've observed one of those major selling points functioning properly without actually verifying that's the case, that'd be misleading, imo, especially given the myriad questions regarding the differences in vendor render outputs.

    But getting back to the earlier fella's question, I guess I'm old enough to engage in critical thinking and know better than to take everything I read on the internet at face value, even on a reputable site like Anandtech. As people who seem genuinely interested in the technology I'd think you'd want these questions answered as well, or am I wrong again? ;)
