Lots of people were asking really good questions about Lucid and their Hydra engine after we posted the initial story on it. We had the opportunity to sit down with the company and ask those questions, along with a few of our own, and they had quite a lot of interesting things to say.

From a mile-high view, and with hardware promised not this year but next, it's tough to get a good understanding of exactly what's going on and what the implications of this hardware could be if Lucid can deliver on its claims. We'll do our best to explain what we know and where the pitfalls could lie.

First, let's address the issue of the box we showed off in the previous coverage. No, it will not require an external device. Lucid has designed this as a solution that can be dropped onto a motherboard or a graphics card, so integration and the user experience should be seamless.

This would be even more transparent than SLI and CrossFire because not even an internal bridge would be needed. Just plug in any two cards from the same vendor (they also appear to need the same driver version, though this is less than clear) and performance should scale linearly with the capabilities of each card.
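Lucid hasn't published the details of how the Hydra chip actually divides work, but the basic idea behind capability-proportional scaling can be illustrated with a small sketch. Everything below — the function name, the task counts, the capability ratings — is hypothetical and only meant to show the arithmetic, not Lucid's actual algorithm:

```python
# Hypothetical sketch of capability-proportional load balancing, the kind of
# splitting Lucid describes. Names and numbers are illustrative only.

def split_workload(total_tasks, gpu_capabilities):
    """Divide rendering tasks among GPUs in proportion to relative speed."""
    total_capability = sum(gpu_capabilities)
    shares = [round(total_tasks * c / total_capability) for c in gpu_capabilities]
    # Correct any rounding drift so every task is assigned exactly once.
    shares[-1] += total_tasks - sum(shares)
    return shares

# A fast card rated at 3x a slower one gets ~75% of the work.
print(split_workload(100, [3.0, 1.0]))  # -> [75, 25]
```

If this is roughly what the hardware does per frame, the interesting part isn't the arithmetic — it's doing the split dynamically, per scene, without the application knowing.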

They did mention that they can implement a solution in an external box for notebooks. For those who need something portable but want high-end graphics at home, they could simply attach graphics cards linked by a Hydra 100 (via PCIe over cables) to the notebook. Not ideal, but it still offers some advantages over high-end internal cards (especially in the area of heat), since you might not need that much graphics power when you're on the road.

Sound too good to be true? Yes. Did we see it working? Sure. Do we have performance numbers? Not yet. And there's the rub for us: we really want to put this thing through its paces before we sign off on it. Seeing it run both UT3 and Crysis (DX9 only for now -- DX10 before the product ships) is cool, but claiming application-agnostic linear scaling across an arbitrary number of GPUs of differing capability is a tough pill to swallow without independent confirmation.

We asked them for hardware, and we really hope they'll get us some sooner rather than later; they seemed interested in letting us test it as well. Even if we couldn't publish numbers, being able to run our own benchmarks just to see for ourselves would go a long way toward making us more confident in the product.


  • GTVic - Friday, August 22, 2008 - link

    This company is not making graphics cards, and to use their product you have to buy more graphics cards. Seems like a win-win situation. AMD and nVidia can dump development on crossfire/sli and sales go up.
  • DerekWilson - Saturday, August 23, 2008 - link

    if nvidia dumps sli then there is zero reason for them to be in the chipset business right now.

    they are no longer needed for AMD because AMD isn't making horrid chipsets anymore. they aren't needed for Intel because Intel builds awesome motherboards.

    the only value add nvidia has on the platform side is sli. period.

    they do not want to see it become irrelevant.
  • shin0bi272 - Friday, August 22, 2008 - link

    This is a gamers dream (assuming it works as advertised) and a video card makers nightmare.

    If they really wanted to demo it they probably should have been running 2 systems side by side, one with 1 card and one with the hydra running 2 cards to show the actual difference. Maybe also not run Crysis since Crysis has issues with framerate on any system... maybe run 3DMark Vantage (I know it's not an actual game but it's a standardized program) especially if it's transparent to the game and hardware.

    Personally, if AMD and Nvidia have a problem with this technology and they disable it (or force me to so I can play any game), there's still Intel's Larrabee on the horizon, and I'm sure Intel wouldn't disable the hydra, so I'd just dump AMD and Nvidia altogether to get linear performance increases (again, assuming it works).

    On top of that, AMD and Nvidia have their own performance issues and competition to worry about, especially now that the physics war has begun (AMD hooking up with Havok and Nvidia buying Ageia).

    I think both AMD and Nvidia should embrace this technology and abandon their own approaches so that they can concentrate more on individual card performance, since the performance gains with both SLI and CrossFire aren't linear and this promises to be. Even if it's not 100% linear, a 90% speed gain is still better than either of the other solutions.

    The game designers would also love this technology because they wouldn't have to worry about enabling SLI or CrossFire in their games; they could concentrate on the actual gameplay and making the game fun and cool looking.
  • shin0bi272 - Friday, August 22, 2008 - link

    Oh, also I forgot to mention that the article did say that you would have to have two of the same brand of card, so you'd still be locked into one manufacturer. It's not like you'd be mixing an Nvidia 280 with an AMD 4870 X2. So AMD and Nvidia really shouldn't have a huge problem with it.
  • Diesel Donkey - Friday, August 22, 2008 - link

    That is false. The article states that any combination of two, three, or four cards from either AMD or Nvidia can be used. That's one reason this technology would be so amazing if it actually works and is implemented successfully.
  • The Preacher - Saturday, August 23, 2008 - link

    I don't think you would like some portions of the same screen rendered by nvidia and others by ATI since they will look different and could create some discontinuities in the final image.
  • DerekWilson - Saturday, August 23, 2008 - link

    they try really hard to render nearly the same image ... but if you played half-life 2 then this would be an issue.

    also, to enable this they would have to wait for vista to allow it (i think) ... thing is they are building a wddm driver ... so ... nvidia's display driver wouldn't be "running" either? I don't really know how that works.
  • jordanclock - Friday, August 22, 2008 - link

    No, he is right. You can't have an nVidia card with an AMD card. As it stands, Windows won't allow two graphics drivers to run in 3D mode. This was addressed in the first article featuring this technology.
  • prophet001 - Friday, August 22, 2008 - link

    how amazing would this be. nice article with what you were given.
  • MrHanson - Friday, August 22, 2008 - link

    I think having a separate box with its own power supply (or supplies) is ideal for something like this. That way, if you want to add two or more GPUs to your Hydra system, you don't have to rip apart your computer and put in a different motherboard and power supply. I imagine this system will probably come with its own mainboard and power supply with several separate PCIe x16 slots for scalability. Also, if you were to upgrade your motherboard and CPU, you wouldn't have to worry about getting a motherboard with enough PCIe x16 slots or whether the motherboard supports the Hydra engine. Any ol' motherboard with one PCI Express slot will do.


