Compute Performance

Our look at compute performance is going to be brief. Our OpenGL AES and DirectCompute Fluid Simulation benchmarks simply don’t scale with multiple GPUs, so we’ll skip them here (though the data is still available in Bench).

Our first compute benchmark comes from Civilization V, which uses DirectCompute to decompress textures on the fly. Civ V includes a sub-benchmark that exclusively tests the speed of their texture decompression algorithm by repeatedly decompressing the textures required for one of the game’s leader scenes. Note that this is a DX11 DirectCompute benchmark.
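
Since the interesting part here is the dispatch mechanics rather than the results, a minimal sketch may help illustrate what such a benchmark is doing under the hood. Civilization V's actual decompression kernel isn't public, so the shader below is a placeholder; everything here is an illustrative assumption about the general D3D11 DirectCompute pattern (compile a compute shader, bind resources, dispatch one thread per texel), not the game's code.

```cpp
// Hedged sketch of a DirectCompute dispatch. The HLSL kernel is a
// stand-in: Civ V's real decoder unpacks compressed texture blocks,
// while this placeholder just copies data so the plumbing is visible.
// Error handling is omitted for brevity.
#include <d3d11.h>
#include <d3dcompiler.h>
#include <cstring>
#pragma comment(lib, "d3d11.lib")
#pragma comment(lib, "d3dcompiler.lib")

static const char* kShader = R"(
Texture2D<uint4>    gBlocks : register(t0);  // compressed 4x4 blocks
RWTexture2D<float4> gOut    : register(u0);  // decoded RGBA output
[numthreads(8, 8, 1)]
void main(uint3 id : SV_DispatchThreadID)
{
    uint4 block = gBlocks[id.xy / 4];      // one block covers 16 texels
    gOut[id.xy] = float4(block) / 255.0f;  // placeholder "decode"
}
)";

void DispatchDecompress(ID3D11Device* dev, ID3D11DeviceContext* ctx,
                        ID3D11ShaderResourceView* blocksSRV,
                        ID3D11UnorderedAccessView* outUAV,
                        UINT width, UINT height)
{
    ID3DBlob* bytecode = nullptr;
    D3DCompile(kShader, strlen(kShader), nullptr, nullptr, nullptr,
               "main", "cs_5_0", 0, 0, &bytecode, nullptr);

    ID3D11ComputeShader* shader = nullptr;
    dev->CreateComputeShader(bytecode->GetBufferPointer(),
                             bytecode->GetBufferSize(), nullptr, &shader);

    ctx->CSSetShader(shader, nullptr, 0);
    ctx->CSSetShaderResources(0, 1, &blocksSRV);
    ctx->CSSetUnorderedAccessViews(0, 1, &outUAV, nullptr);

    // 8x8 threads per group, one thread per output texel.
    ctx->Dispatch((width + 7) / 8, (height + 7) / 8, 1);

    shader->Release();
    bytecode->Release();
}
```

The key property for multi-GPU purposes is that the entire dispatch runs on a single device; there is no obvious way to split one decompression pass across two GPUs, which helps explain the regression discussed next.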

Given the nature of the benchmark, it’s not surprising that we see a performance regression here with some setups. This workload doesn’t split across multiple GPUs well, though that doesn’t stop AMD and NVIDIA from tying. As we’ve seen, this doesn’t impact real game performance, but it’s a good reminder of the potential pitfalls of multi-GPU configurations. AMD does deserve some credit here, though, for improving on their single-GPU performance and pushing their lead even higher.

Our other compute benchmark is SmallLuxGPU, the GPU ray tracing branch of the open source LuxRender renderer. We’re now using a development build from the version 2.0 branch, and we’ve moved on to a more complex scene that hopefully will provide a greater challenge to our GPUs.

Unlike the Civ V compute benchmark, SLG scales very well with multiple GPUs, nearly doubling in performance. Unfortunately for NVIDIA, GK104 shows its true colors here as a compute-weak GPU; even with two of them we’re nowhere close to a single 7970, let alone the monster that is two. If you’re looking at doing serious GPGPU compute work, you should be looking at Fermi, Tahiti, or the future Big Kepler.
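
Ray tracing, by contrast, is embarrassingly parallel, which is why SLG scales so well. As a rough sketch (not SmallLuxGPU's actual code, and assuming a hypothetical per-device kernel launch), here is how an OpenCL renderer can enumerate every GPU in the system and hand each one an independent slice of the image:

```cpp
// Hedged sketch: enumerate all OpenCL GPUs and statically assign each
// an equal range of image rows. Real renderers like SLG load-balance
// dynamically, since devices (e.g. GK104 vs. Tahiti) differ in speed.
#include <CL/cl.h>
#include <cstdio>
#include <vector>

int main()
{
    // Collect every GPU device across all OpenCL platforms.
    cl_uint numPlatforms = 0;
    clGetPlatformIDs(0, nullptr, &numPlatforms);
    std::vector<cl_platform_id> platforms(numPlatforms);
    clGetPlatformIDs(numPlatforms, platforms.data(), nullptr);

    std::vector<cl_device_id> gpus;
    for (cl_platform_id p : platforms) {
        cl_uint n = 0;
        if (clGetDeviceIDs(p, CL_DEVICE_TYPE_GPU, 0, nullptr, &n) != CL_SUCCESS)
            continue;
        std::vector<cl_device_id> devs(n);
        clGetDeviceIDs(p, CL_DEVICE_TYPE_GPU, n, devs.data(), nullptr);
        gpus.insert(gpus.end(), devs.begin(), devs.end());
    }
    if (gpus.empty()) { std::printf("no GPUs found\n"); return 1; }

    // Static split: device i traces rows [i*rowsPer, (i+1)*rowsPer).
    const size_t imageHeight = 1200;
    const size_t rowsPer = imageHeight / gpus.size();
    for (size_t i = 0; i < gpus.size(); ++i) {
        char name[256] = {};
        clGetDeviceInfo(gpus[i], CL_DEVICE_NAME, sizeof(name), name, nullptr);
        std::printf("GPU %zu (%s): rows %zu..%zu\n", i, name,
                    i * rowsPer, (i + 1) * rowsPer - 1);
        // A real renderer would now enqueue its ray tracing kernel for
        // this row range on a per-device command queue.
    }
    return 0;
}
```

Because every row's rays are traced independently, two GPUs really can do close to twice the work, unlike the serialized decompression pass in the Civ V test.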

Comments

  • JPForums - Thursday, May 3, 2012 - link

    "Sadly, it is a very uncommon resolution for new monitors. Almost every 22-24" monitor you buy today is 1080p instead of 1200p. :("


    Not mine. I'm running a 1920x1200 IPS.
    1920x1200 is more common in the higher end monitor market.
    A quick glance at Newegg shows 16 1920x1200 models at 24" alone (starting at $230).
    Besides, I can't imagine many people buying a $1000 video card and pairing it with a single $200 display.

    It makes more sense to me to check 1920x1200 performance than 1920x1080 for several reasons:
    1) 1920x1200 splits the difference between 16x10 and 25x14 or 25x16 better than 1920x1080 (a quick check of the math appears after this comment):
    1680x1050 = ~1.7MP
    1920x1080 = ~2MP
    1920x1200 = ~2.3MP
    2560x1440 = ~3.7MP
    2560x1600 = ~4MP

    2) People willing to spend $1000 for a video card are generally in a better position to get a nicer monitor. 1920x1200 monitors are more common at higher prices.

    3) They already have three of them around to run 5760x1200. Why go get another monitor?

    Opinionated Side Points:
    Movies transitioned to resolutions much wider than 1080p long ago. A little extra black space really makes no difference.
    1920x1200 is a perfectly valid resolution. If Nvidia is having trouble with it, I want to know. When a particular resolution doesn't scale properly, it is probable that there is either a bug or shenanigans at work in the more common resolutions.
    I prefer using 1920x1200 as a starting point for moving to triple-screen setups. I already think 1920x1080 looks squashed, so 5760x1080 looks downright flattened. Also, 3240x1920 just doesn't look very surround to me (3600x1920 seems borderline surround).
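
(JPForums' megapixel figures above are easy to verify; a trivial check, with the resolutions taken straight from the comment:)

```cpp
// Print megapixel counts for the resolutions quoted in the comment.
#include <cstdio>

int main()
{
    const int res[][2] = {
        {1680, 1050}, {1920, 1080}, {1920, 1200}, {2560, 1440}, {2560, 1600}
    };
    for (const auto& r : res)
        std::printf("%dx%d = %.2f MP\n", r[0], r[1], r[0] * r[1] / 1e6);
    return 0;  // prints 1.76, 2.07, 2.30, 3.69, 4.10 MP respectively
}
```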
  • CeriseCogburn - Saturday, May 5, 2012 - link

    There are only 18 models available in all of Newegg with 1920x1200 resolution; only 6 of those are under $400, and they are all over $300.
    +
    There are 242 models available in 1920x1080, with nearly 150 models under $300.
    You people are literally a bad joke when it comes to even a tiny shred of honesty.
  • Lerianis - Sunday, May 6, 2012 - link

    I don't know about the 'sadly' there, in all honesty. I personally like 1920x1080 better than x1200, because nearly everything is done in the former resolution.
  • Stuka87 - Thursday, May 3, 2012 - link

    Who buys a GTX 690 to play on a 1080p display? Even a 680 is overkill for 1080p. You can save a lot of money with a 7870 and still run everything out there.
  • vladanandtechy - Thursday, May 3, 2012 - link

    Stuka i agree with you.....but when you buy such a card....you think in the future....5 maybe 6 years....and i can't guarantee that we will do gaming in 1080p then:)....
  • retrospooty - Thursday, May 3, 2012 - link

    "Stuka i agree with you.....but when you buy such a card....you think in the future....5 maybe 6 years....and i can't gurantee that we will do gaming in 1080p then:)...."

    I have to totally disagree with that. Anyone that pays $500+ for a video card is a certain "type" of buyer. That type of buyer will NEVER wait 5-6 years for an upgrade. That guy is getting the latest and greatest of every other generation, if not every generation of cards.
  • vladanandtechy - Thursday, May 3, 2012 - link

    You shouldn't "totally disagree".......meet me...."the exception"....i am the type of buyer who is looking for the "long run"....but i must confess....if i could....i would be the type of buyer you describe....cya
  • orionismud - Thursday, May 3, 2012 - link

    retrospooty, and I mean you no disrespect, but if you're spending $500 and buying for the "long run," you're doing it wrong.

    If you had spent $250, you could have 80% of the performance for 2.5 years, then spend another $250 and have 200% of the performance for the remaining 2.5 years.
  • von Krupp - Thursday, May 3, 2012 - link

    Don't say that.

    I bought two (2) HD 7970s on the premise that I'm not going to upgrade them for a good long while. At least four years, probably closer to six. I ran from 2005 to 2012 with a GeForce 7800GT just fine and my single core AMD CPU was actually the larger reason why I needed to move on.

    Now granted, I also purchased a snazzy U2711 just so the power of these cards wouldn't go to waste (though I'm quite CPU-bound by this i7-3820), but I don't consider dropping AA in future titles to maintain performance to be that big of a loss; I already run with only 8x AF because, frankly, I'm too busy killing things to notice otherwise. I intend to drive this rig for the same mileage. It costs less for me to buy the best of the best for $1000 at the time of purchase and play it into the ground than to keep buying $350 cards every two years to barely keep up, all over a seven-year duration. Since I now have this fancy 2560x1440 resolution and want to use it, the $250-$300 offerings don't cut it. And that's before adjusting for inflation year over year.

    So yes, I'm going to be waiting between 4 and 6 years to upgrade. Under certain conditions, buying the really expensive stuff is as much of an economical move as it is a power grab. Not all of us who build $3000 computers do it on a regular basis.

    P.S. Thank you consoles for extending PC hardware life cycles. Makes it easier to make purchases.
  • Makaveli - Thursday, May 3, 2012 - link

    lol agree, let's put a $500 video card with a $200 TN panel at 1920x1080... umm ya no!
