The Test & Our Results

With the limited game support for the Hydra and our limited time with the new drivers, we’ve had to throw out most of our usual video card testing suite. What we have here are games that are on the approved list and are at least somewhat graphically challenging, but by no means is it a complete list. As FRAPS and the Hydra software do not currently get along, we’re limited to games with built-in frame counters. This is also the reason that we will be doing an in-depth look at the image quality of the Hydra after CES, when we have more time to come up with ways to take screenshots.

Unfortunately our testing today will only paint a limited picture. The Fuzion board does not have SLI support, which means we cannot test a pair of NVIDIA cards in SLI and use that as a baseline for N-Mode performance. We’re hoping to track someone down from MSI here at CES to explain why the Fuzion doesn’t have SLI support, as this is a major oversight for a high-end board. Whether or not NVIDIA was willing to license SLI to MSI for the Fuzion is the single biggest question hanging over our heads. Update: We have since received an official response from MSI. They did not implement SLI support because they felt the Hydra’s performance was close enough to make SLI support unnecessary. We still find this questionable.

For our performance testing, we used the following games: Call of Juarez (DX10 benchmark), HAWX, Resident Evil 5, Company of Heroes, and Batman: Arkham Asylum. Batman is not supported under X-Mode, so we’ll leave it out of X-Mode testing.

We’ll start with Call of Juarez, which is one of the Hydra’s better titles. With our 5850s in Crossfire we get 94fps, which is just short of double the performance of a single 5850 (49.5fps). Right off the bat the Hydra takes a noticeable performance dive when using a pair of 5850s, splitting the difference between one and two cards at 75.6fps. It certainly doesn’t look like Hydra can beat SLI/Crossfire, so it shouldn’t come as a surprise that Lucid and MSI aren’t claiming otherwise.
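
For reference, here is the simple arithmetic behind the scaling percentages we cite throughout this article, written out as a minimal Python sketch (the helper function and script are ours, purely for illustration):

```python
def scaling_pct(single_fps: float, multi_fps: float) -> float:
    """Percent improvement of a multi-GPU result over a single card."""
    return (multi_fps / single_fps - 1.0) * 100.0

# Call of Juarez results from above, in fps
single_5850 = 49.5
crossfire_5850s = 94.0   # two 5850s in Crossfire
hydra_5850s = 75.6       # two 5850s under the Hydra

print(f"Crossfire: +{scaling_pct(single_5850, crossfire_5850s):.0f}%")  # +90%
print(f"Hydra:     +{scaling_pct(single_5850, hydra_5850s):.0f}%")      # +53%
```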

Moving to A-Mode with dissimilar video cards, we get some more definitive good news. Combining a 5850 and a 4890 gets us 61.9fps, a 25% performance improvement. This is less than the 4890’s raw power would suggest (in a perfect situation the 4890 would add well more than 25%, since on its own it’s well more than 25% as fast as a 5850), but it’s solid proof that this works.
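
To put that “perfect situation” in concrete terms: if the Hydra split work in perfect proportion to each card’s speed, the best-case gain from adding a slower card would simply be the slower card’s frame rate as a fraction of the faster card’s. Here is a quick sketch of that ceiling, using a hypothetical standalone 4890 result since we don’t have one for this game:

```python
def ideal_mixed_gain_pct(fast_fps: float, slow_fps: float) -> float:
    """Best-case gain from adding a slower card to a faster one, assuming
    work is split in perfect proportion to each card's speed."""
    return (slow_fps / fast_fps) * 100.0

# Hypothetical: we didn't benchmark a lone 4890 here. If it ran this game
# at, say, 35fps against the 5850's measured 49.5fps, the ideal ceiling
# would be about a 71% gain, far above the 25% the Hydra delivered.
print(f"Ideal ceiling: +{ideal_mixed_gain_pct(49.5, 35.0):.0f}%")  # ~+71%
```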

As for the NVIDIA side, since we don’t have SLI results we’ll have to compare single-card and Hydra results. While a single GTX280 gives us 30.8fps, a pair of them in N-Mode gives us 53.9fps, a 75% increase. This is much better than the scaling we saw on the 5850s, and much closer to the theoretical 100% maximum that we’d expect SLI to approach.

Switching to a GTX280 + 9800GTX+, things aren’t so rosy. The performance gain is practically absent at 0.2fps. While the Hydra technology scales worse the more mismatched a pair of cards is, this is particularly awful.

Finally we have X-Mode using a 5850 + GTX280, with the AMD card as the master card. Here we get 43.5fps, which is slower than a single 5850. We also saw significant microstuttering and striping in the image. Our best guess is that the Hydra was having trouble in the compositing phase, as the striping is a product of that. In A-Mode and N-Mode, however, we encountered no such issues: to the naked eye, Call of Juarez looked just as good in those modes as it does on a single video card.

Moving to our old standby of HAWX, the picture is similar. Hydra performance on the 5850s is still below Crossfire, but it’s still a 50% improvement, and the 5850 + 4890 sees a 20% gain. The GTX280s weren’t quite a success story, however: HAWX crashed with a Lucid driver error when attempting to load the main menu. Strangely, the GTX280 + 9800GTX+ had no launch issues; instead the combination significantly underperformed the single GTX280, at -56%. Finally, the 5850 + GTX280 came in at 58fps, which, at only a few fps faster than the lone 5850, is less than flattering.

Besides the single crash, at no point did we see evidence of graphical corruption. HAWX always rendered correctly, even in X-Mode.

Resident Evil 5, in spite of being on the approved game list, was problematic for all Hydra modes. Every mode suffered from the same flickering problem (we suspect it’s lighting related), which doesn’t make the game unplayable but does make it annoying to play.

The performance however was very, very good. The 5850s got within 4% of CF mode, and the 5850 + 4890 still offered a solid 33% performance boost. The GTX280s did 57% better than a single GTX280, and the GTX280 + 9800GTX+ was the odd man out by losing performance. In X-Mode, the 5850 + GTX280 managed to squeak ahead of the single 5850 by 3.5%.

Company of Heroes is another game on the supported list that we didn’t have much luck with. There was no graphical corruption this time around, but instead the game crashed in every mode involving an AMD card.

As for the NVIDIA cards, the GTX280s picked up about 30%, and the GTX280 + 9800GTX+ lost nearly the same 30%. There was no graphical corruption to report.

Finally we have Batman: Arkham Asylum, one of the most popular games of 2009. It doesn’t work in X-Mode, but does in N-Mode and A-Mode.

Unfortunately for the AMD cards, Lucid is having some scaling issues in A-Mode, something they note in their release notes. A pair of 5850s is barely better than a single 5850, and a 5850 + 4890 flat-out isn’t better. The GTX280s pick up 15%, and the GTX280 + 9800GTX+ loses over 20%.

Frankly, our testing experience was a mix of failures and successes, and that was while sticking to the approved games list. Not a single game we picked to test worked perfectly in all cases, and the 1.4 driver release notes are a minefield of known issues. And unlike NVIDIA’s and AMD’s known issues, which tend to affect esoteric hardware and games, these are issues on what would be reasonably common high-end hardware configurations with popular games.

The following is an incomplete list of games with known graphical corruption issues: Call of Duty 4, World in Conflict (DX10), Rainbow Six Vegas 2, Operation Flashpoint: Dragon Rising, and Resident Evil 5. And as we’ve seen, this doesn’t include crashes or undocumented graphical corruption.

We held off on doing significant testing on our Fuzion board until we had the 1.4 drivers, in order to give Lucid time to get out the driver set that would be available when the hardware shipped. It was our hope that in moving from the 1.3 to 1.4 drivers we’d see a reduction in graphical corruption compared to our tests in December, when Lucid and MSI sent reps out, but that has not been the case.

This is by no means a complete set of results, and we aren’t going to claim otherwise. But in our testing thus far, the Hydra drivers clearly need much more work. Frankly, we hesitate to think about what might have happened if the Fuzion had shipped in November as originally announced.

Comments

  • shin0bi272 - Thursday, January 7, 2010

    I know, that's what I was saying. The technology was supposed to be more worthwhile than this. Plus you can't mix GPUs with a regular motherboard, so I'd have to get another 8800GTX to match mine on my SLI-supported motherboard. Or if I wanted to go with ATI's new card I'd have to get 2x 5870s ($400 ea) and a new Crossfire mobo ($150) to go Crossfire instead. That's more expensive than getting this $350 mobo and adding a 5870 to my 8800GTX. Even if I went with two 5850s at $300 each it's still more expensive than buying this $350 mobo and one 5850. So you see why I really was hoping this tech would work better than it does.

    This would really do well in the budget mobo market IMO, so that people who didn't want to pay $300+ for a motherboard and then buy two video cards could use an old card and get better performance than they would have by just getting the low-end mobo and using their old GPU.

    If they can get it to be truly scalable (or close to it) like they originally claimed, then maybe some other motherboard makers will pick it up, but if they don't get it fixed it will end up falling by the wayside as a tech that missed its time (sort of like the hardware PhysX cards).

    Then again, the Crossfire 5850s in the Call of Juarez test got a nearly linear performance increase themselves, which is sort of new, isn't it? Isn't it the norm for Crossfire and SLI setups to do 40-50% better than single cards, not 94%? Could just be my erroneous recollection, but I don't recall near-perfect doubling of fps with SLI or Crossfire before.
  • GourdFreeMan - Thursday, January 7, 2010

    It is an amazing technological feat that they got this working at all, but in the few games in which it does work properly the performance is frankly terrible. Look at what happens when you pair a 5850 and a GTX280 -- performance equal to or worse than a 5850 by itself, when theoretically you should get a ~75% gain over a single card.
  • FATCamaro - Thursday, January 7, 2010

    This technology had fail written all over it. They unleashed a big sack of fail...
  • danger22 - Thursday, January 7, 2010

    maybe the AMD 5000 cards are too new to have support for Hydra? what about trying some older lower-end cards? just for interest... i know you wouldn't put them in a $350 mobo
  • chizow - Thursday, January 7, 2010

    Sadly, I think Lucid missed their window of opportunity as the need for Hydra largely evaporated with X58 and certainly P55's mainstream launch, offering support for both CF and SLI on the same platform. The only real hope for Hydra was the prospect of vendor-agnostic multi-GPU with better-than-AFR scaling.

    Those lofty goals seem to be unrealistic now that we've seen the tech, and with its current slew of problems and its incredibly high price tag, I just don't see the technology gaining any significant traction over the established, supported multi-GPU AFR methods.

    The article touched on many of the key problems early on but never really drilled down on them; hopefully we'll see more on this in the next installment, especially IQ and latency:

    quote:

    What you won’t see them focusing on is the performance versus AFR, the difference in latency, or game compatibility for that matter.


    Guru3D did an extensive review as well and found CF scaled significantly better than Hydra without fail. Add in the various vendor-specific feature compatibility questions and an additional layer of driver profiles that need to be supported and synchronized between potentially three parties (NVIDIA and ATI sync'd to each Lucid release) and you've got yourself a real nightmare from an end-user perspective.

    I'm impressed they got their core technology to work (I was highly skeptical in that regard), but I don't think we'll be hearing too much about this technology going forward. It's too expensive, overly complicated, and suffers from poor performance and compatibility. I don't see the situation improving any time soon, and it's clearly going to be an uphill struggle to get their drivers/profiles in order with both older titles and new releases.
  • sheh - Thursday, January 7, 2010

    I agree. It's interesting from a technical standpoint, but not many would want to go through all the fuss of SLI/CF (noise, heat, power) plus having to worry about the compatibility of two or three sets of drivers at the same time. And that's assuming costs weren't high and performance was better.

    I suspect in 1-2 years NV or ATI will be buying this company.

    (I'm somewhat surprised even SLI/CF exists, but maybe the development costs aren't too high or it's the bragging rights. :))
  • chizow - Thursday, January 7, 2010

    quote:

    I suspect in 1-2 years NV or ATI will be buying this company.


    Not so sure about that; Intel has actually been pumping venture capital into Lucid for years, so I'm sure they're significantly invested in its future at this point. I actually felt Lucid's Hydra was going to serve as Intel's CF/SLI answer, not so much as a straight performance alternative but rather as a vessel to make Larrabee look not so... underwhelming.

    Think about it: sell a Larrabee for $200-$300 that, on its own, is a terrible 3D rasterizer, pair it up with an established, competent GPU from NVIDIA or ATI, and you'd actually get respectable gaming results. Now that Larrabee has been scrapped for the foreseeable future, I'd say Intel's financial backing and plans for Hydra are also in flux. As it is now, Hydra is in direct competition with the PCIe controllers Intel provides for little added cost, which support both SLI and CF natively (with a licensing fee needed for SLI). In comparison, the Hydra 200 chip reportedly costs an additional $80!
  • TemjinGold - Thursday, January 7, 2010

    I think the issue I see is that X-Mode will most commonly be used by people looking to save a few bucks when upgrading, combining the card they already have with a new one they buy. Unfortunately, I seriously doubt that this is the same crowd that would shell out $350 for a mobo. That just leaves A and N modes, in which the Hydra currently loses horribly to CF/SLI.

    If the Hydra was put on a cheap mobo, I might see where it could be appealing. But someone who spends $350 on a mobo will most likely just shell out for 2-3 new gfx cards at the same time rather than going "gee, I could put this $50 to use if I reuse my old video card."
  • AznBoi36 - Thursday, January 7, 2010

    Right on. If I were to spend that much on a mobo, I wouldn't be thinking about saving some money by using an old video card, and in no way would I be mismatching cards anyway. Seeing all these performance issues, I wonder what 3-way would be like...
  • ExarKun333 - Thursday, January 7, 2010

    3-way would be ideal. ;)
