Gaming Performance

We're dealing with four top-end GPUs in the AVADirect Compact Gaming PC we have for review; nobody's buying this thing for Minesweeper. What I was personally interested in was getting AMD Radeon HD 6990 quad-CrossFire numbers in house, since I've already reviewed the GeForce GTX 590 on its own and in SLI. Both of these dual-GPU cards are as rare as hen's teeth in the marketplace right now, and after speaking with different boutiques and OEMs, the bottleneck appears to be supply.

A pair of Radeon HD 6990s is going to be complete overkill at 1080p, just like the GTX 590s in SLI were, but this is where we ran into a real problem with AMD's kit. I've personally chided NVIDIA's multi-monitor solution as feeling grafted on, an "oh crap" addition to their graphics hardware to compete with AMD's Eyefinity. Unfortunately, with the Radeons AMD seems to be leaning too heavily on DisplayPort: while NVIDIA's solution may feel like a last-minute addition, it's a last-minute addition that supports triple-DVI (and variations including HDMI). Theoretically AMD's solution should as well, using the included active Mini-DisplayPort-to-DVI/HDMI adaptors, except with our review unit it doesn't. I tried multiple adaptors, multiple drivers, and multiple monitor combinations, and I just couldn't get the card to see more than two screens. While it turned out that one of the Mini-DisplayPort ports on the HD 6990 we were testing wasn't functioning properly, I still wound up having to order a Mini-DisplayPort-to-DisplayPort cable to test the pair of HD 6990s in surround.

Part of the problem really does seem to stem from the overreliance on DisplayPort, but my biggest beef is actually the use of Mini-DisplayPort. Inexplicably, the HD 6990 does not include Mini-DisplayPort-to-full-size-DisplayPort adaptors, and on top of that, you'll find those adaptors surprisingly hard to come by at retail. My past issues getting the quad-GPU GeForce GTX 590 SLI solution running in Surround are only repeated here with the HD 6990 quad-CrossFire solution. There's a reason boutiques aren't going out of their way to sell you surround gaming rigs from either vendor.

But let's start with performance at our standard 1080p resolution in both High and Ultra configurations first. Then we'll get to the triple-head results.

It's clear that even with a 4.4GHz Intel Core i7-990X, the quad-GPU solution is CPU-limited in some of our tests. Also worth pointing out is the unusually poor showing in Mafia II; 100+ fps is still very good, but even a pair of GeForce GTX 560 Ti cards in SLI beats the pair of HD 6990s, a result confirmed through multiple retests. Four GPUs for 1080p "High" is silly, though, so let's move on to our maxed-out settings.

We still run into a few CPU-limited situations at 1080p even with antialiasing in the mix, which really throws into relief just how grossly overkill any multi-GPU solution is for this common resolution, much less four GPUs. On my own desktop (gaming at 1920x1200) I've almost never experienced a situation where I felt like a single GeForce GTX 580 (or, comparably, a single Radeon HD 6970) just wasn't enough. So to fully tax the GPUs, we have three 1920x1200 LCDs in a triple-head configuration; now we'll see if we can separate the men from the boys...

Comparatively, AMD's quad-GPU scaling looks a bit shaky against NVIDIA's, but this is really academic anyhow. You'll notice that even our "slowest" solution, a pair of GTX 560 Tis, is still able to run consistently above 30fps (though admittedly that ignores the hiccups inherent to multi-GPU setups: micro-stutter, game profiles, driver updates, and so on).

Honestly, I have a hard time justifying any quad-GPU configuration. Power consumption isn't commensurate with the gaming experience these setups deliver, and the added complication is often just not worth it. Cooling four GPUs is also obtrusively loud unless you opt for a custom water-cooling rig, but either way it still feels like buying a Lamborghini to drive around suburbia. A pair of GTX 580s or a single 590 (or a pair of 6970s or a single 6990) should really be as fast as any sensible gamer ought to go.


  • TinyRK - Wednesday, October 5, 2011 - link

    That's right, because I am an ACTUAL engineer, with a degree in Electronics. English is not my mother tongue, so my apologies that I did not know what you consider an Engineer.
    I didn't want to piss on your leg as a Sanitation Engineer. Somebody has to clean up the trash, and I appreciate that you're doing that.

    Keep up the good work!
  • Death666Angel - Tuesday, October 4, 2011 - link

    I honestly don't understand these systems.
    Maybe .001% can make use of such a system in a reasonable manner (GPU-computing, while taking advantage of the 6 cores). For most other people, even SLI/CF configurations are too much for gaming and of course, SNB would have been better as well for gaming.
    This particular unit should have gone with water cooling in my opinion. Anything else is just..... As it stands now, this build is insane and nothing that the average person can't build themselves (at least I don't see specially made components). But I like to get something substantial for the money I spend, so I doubt I'm the audience for this unit. :P
  • ph0masta - Tuesday, October 4, 2011 - link

    If they're sending Anand a copy to review, why not send him the best build possible? I'm sure they expect the average customer to go with a more modest build.
  • Thermalzeal - Tuesday, October 4, 2011 - link

    It's pink? (Perhaps the colors are off on my display.) I guess this is the one computer that won't get stolen at a LAN party...
  • s1175290 - Tuesday, October 4, 2011 - link

    Maybe I missed it in the article, but what power supply did this ship with? Looks to be 100% modular.
  • benrico - Wednesday, October 5, 2011 - link

    There was a Lian Li m-ATX case review up a week or so ago that cast it in a negative light; I can't remember exactly why, but one reason was a non-standard optical bay or something. Any thoughts on a comparison of the two? Also, any thoughts on the look/feel of this case?
  • KamikaZeeFu - Thursday, October 6, 2011 - link

    The spec table says that the front USB3 ports are wired to USB2 headers. I would like to know how this was done, as my online searches didn't bring up anything useful.

    I'm in the market for a new case atm, but my board only has USB2 (won't upgrade until Ivy) and all the cases that I like only have front USB3 ports.
