Compute Performance

Moving on from our look at gaming performance, we have our customary look at compute performance. With GCN, AMD significantly overhauled their architecture to improve compute performance, as their long-run initiatives depend on GPU compute becoming far more important than it is today.

Our first compute benchmark comes from Civilization V, which uses DirectCompute to decompress textures on the fly. Civ V includes a sub-benchmark that exclusively tests the speed of their texture decompression algorithm by repeatedly decompressing the textures required for one of the game’s leader scenes. Note that this is a DX11 DirectCompute benchmark.

The Civ V compute shader benchmark once again shows off just how much the compute shader performance of the 7800 series has improved relative to the 6900 series, with both 7800 cards coming in well, well ahead of any previous generation AMD cards. Compared to NVIDIA’s lineup the 7800 series does fairly well for itself too, although not quite as well as the commanding lead the 7900 series took.

Our next benchmark is SmallLuxGPU, the GPU ray tracing branch of the open source LuxRender renderer. We’re now using a development build from the version 2.0 branch, and we’ve moved on to a more complex scene that hopefully will provide a greater challenge to our GPUs.

SmallLuxGPU continues to showcase the 7800 series’ improvements over past AMD architectures, and while it’s not the same kind of massive leap we saw with Civ V, it’s still enough to bring the 7850 up to near the performance of the 6970 and to push the 7870 well beyond that. The only real competition here for AMD is AMD.

For our next benchmark we’re looking at AESEncryptDecrypt, an OpenCL AES encryption routine that AES encrypts/decrypts an 8K x 8K pixel square image file. The results of this benchmark are the average time to encrypt the image over a number of iterations of the AES cipher.
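The measurement methodology here is simple: run the same encryption pass repeatedly and report the mean time per pass. A minimal CPU-side sketch of that loop in Python is below; note the "cipher" is a trivial XOR stand-in of our own, since the actual benchmark dispatches an AES kernel over an 8192 x 8192 image in OpenCL.

```python
import time

def timed_average(workload, data, iterations=10):
    """Run `workload` over `data` repeatedly; return mean seconds per run."""
    start = time.perf_counter()
    for _ in range(iterations):
        workload(data)
    return (time.perf_counter() - start) / iterations

# Stand-in "cipher": XOR every byte with a constant key byte. Only the
# timing methodology is reproduced here, not the real AES kernel.
def xor_cipher(buf):
    return bytes(b ^ 0x5A for b in buf)

image = bytes(256 * 1024)  # scaled-down placeholder buffer
avg = timed_average(xor_cipher, image, iterations=3)
print(f"average time per pass: {avg * 1000:.2f} ms")
```

Averaging over multiple iterations is what makes the result stable; a single GPU dispatch is short enough that launch overhead and clock ramping would otherwise dominate.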

On the one hand, the 7870 gets quite close to the 7950 here in our AESEncryptDecrypt benchmark, in spite of the latter’s higher number of shaders. On the other hand, it’s still not enough to dethrone the GTX 570; the only NVIDIA cards the 7800 series can beat start at the GTX 560 Ti.

Finally, our last benchmark is once again looking at compute shader performance, this time through the Fluid simulation sample in the DirectX SDK. This program simulates the motion and interactions of a 16k particle fluid using a compute shader, with a choice of several different algorithms. In this case we’re using an O(n^2) nearest neighbor method that is optimized by using shared memory to cache data.
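The brute-force method described above is easy to sketch: every particle tests its distance against every other particle, which is what makes it O(n^2). A simplified 2D version in Python:

```python
# Brute-force O(n^2) neighbor search: every particle tests every other
# particle. This is the same basic approach the SDK sample runs on the
# GPU, where each thread group stages a tile of particle positions in
# shared memory so neighbor reads hit fast on-chip storage instead of VRAM.
def neighbors_within(particles, radius):
    r2 = radius * radius
    result = []
    for i, (xi, yi) in enumerate(particles):
        near = [j for j, (xj, yj) in enumerate(particles)
                if i != j and (xi - xj) ** 2 + (yi - yj) ** 2 <= r2]
        result.append(near)
    return result

pts = [(0.0, 0.0), (0.5, 0.0), (3.0, 3.0)]
print(neighbors_within(pts, 1.0))  # -> [[1], [0], []]
```

With 16k particles this is roughly 256 million pair tests per step, which is why the shared-memory caching the sample employs matters: it turns most of those reads into on-chip accesses rather than round trips to memory.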

In our final compute test the 7800 series once again makes a run at the top, with both cards rising past the GTX 570, although they can’t quite match the GTX 580. In an interesting turn of events the 7870 ends up being some 6% faster than the 7950, in spite of the fact that in a compute benchmark the 7950 should have a solid lead. This just goes to show that core clockspeeds do matter, and that adding more shaders alone can’t conquer all benchmarks.
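The clockspeed point is easy to put numbers on. Using AMD's reference specifications (1280 stream processors at 1000MHz for the 7870, 1792 at 800MHz for the 7950, and 2 FLOPs per shader per clock for a fused multiply-add), a quick back-of-the-envelope comparison:

```python
def theoretical_gflops(shaders, clock_mhz, flops_per_clock=2):
    # Peak single-precision throughput: shaders x clock x FLOPs/clock (FMA = 2)
    return shaders * clock_mhz * flops_per_clock / 1000.0

hd7870 = theoretical_gflops(1280, 1000)  # reference 7870: 1280 SPs @ 1000MHz
hd7950 = theoretical_gflops(1792, 800)   # reference 7950: 1792 SPs @ 800MHz
print(hd7870, hd7950)  # -> 2560.0 2867.2
```

On paper the 7950 still holds a roughly 12% peak throughput advantage, so the 7870's 6% win implies this workload isn't purely shader-bound; the parts of the pipeline that execute once per clock benefit directly from the 7870's 25% higher core clock.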

173 Comments

  • mak360 - Monday, March 5, 2012 - link

    Enjoy, now go and buy
  • ImSpartacus - Monday, March 5, 2012 - link

    Yeah, I'm trying to figure out if a 7850 could go in an Alienware X51. It looks like it uses a 6 pin power connector and puts out 150W of heat.

    While we would lose Optimus, would it work?
  • taltamir - Monday, March 5, 2012 - link

    optimus is laptops only. You do not have optimus with your desktop.
  • ImSpartacus - Monday, March 5, 2012 - link

    The X51 has desktop Optimus.

    "The icing on the graphics cake is that the X51 is the first instance of desktop Optimus we've seen. That's right: you can actually plug your monitor into the IGP's HDMI port and the tower will power down the GPU when it's not in use. This implementation functions just like the notebook version does, and it's a welcome addition."

    http://www.anandtech.com/show/5543/alienware-x51-t...

    In reality, if I owned an X51, I would wait so I could shove the biggest 150W Kepler GPU in there for some real gaming.

    But I'm sure the X51 will be updated for Kepler and Ivy Bridge, so now wouldn't be the best time to get an X51.

    Waiting games are lame...
  • scook9 - Monday, March 5, 2012 - link

Wrong. Read a review... The bigger issue will be the orientation of the PCIe power connector, I expect. I have a tightly spaced HTPC that currently uses a GTX 570 HD from EVGA because it was the best card I could fit in the Antec Fusion enclosure. If the PCIe power plugs were facing out the side of the card and not the back, I would not have been able to use it. I expect the same consideration will apply to the even smaller X51.
  • kungfujedis - Monday, March 5, 2012 - link

    he does. x51 is a desktop with optimus.

    http://www.theverge.com/2012/2/3/2768359/alienware...
  • Samus - Monday, March 5, 2012 - link

    EA really screwed AMD over with Battlefield 3. There's basically no reason to consider a Radeon card if you plan on heavily playing BF3, especially since most other games like Skyrim, Star Wars, Rage, etc, all run excellent on any $200+ card, with anything $300+ being simply overkill.

    The obvious best card for Battlefield 3 is a Geforce GTX 560 TI 448 Cores for $250-$280, basically identical in performance to the GTX570 in BF3. Even those on a budget would be better served with a low-end GTX560 series card unless you run resolutions above 1920x1200.

    If I were AMD, I'd concentrate on increasing Battlefield 3 performance with driver tweaks, because it's obvious their architecture is superior to nVidia's, but these 'exclusive' titles are tainted.
  • kn00tcn - Monday, March 5, 2012 - link

    screwed how? only the 7850 is slightly lagging behind, & historically BC2 was consistently a little faster on nv

    also BF3 has a large consistent boost since feb14 drivers (there was another boost sometime in december, benchmark3d should have the info for both)
  • chizow - Tuesday, March 6, 2012 - link

    @ Samus

    BF3 isn't an Nvidia "exclusive", they made sure to remain vendor agnostic and participate in both IHV's vendor programs. No pointing the finger and crying foul on this game, it just runs better on Nvidia hardware but I do agree it should be running better than it does on this new gen of AMD hardware.

    http://www.amd4u.com/freebattlefield3/
    http://sites.amd.com/us/game/games/Pages/battlefie...
  • CeriseCogburn - Monday, March 26, 2012 - link

    In the reviews here SHOGUN 2 total war is said to be the very hardest on hardware, and Nvidia wins that - all the way to the top.
    --
    So on the most difficult game, Nvidia wins.
    Certainly other factors are at play on these amd favored games like C1 and M2033 and other amd optimized games.
    --
    Once again, on the MOST DIFFICULT to render Nvidia has won.
