DiRT: Showdown

Racing to the front of our 2013 list will be our racing benchmark, DiRT: Showdown. DiRT: Showdown is based on the latest iteration of Codemasters’ EGO engine, which has continually evolved over the years to add more advanced rendering features. It was one of the first games to implement tessellation, and also one of the first to implement a DirectCompute-based, forward-rendering-compatible lighting system. At the same time, as Codemasters is by far the most prolific PC racing developer, it’s also a good proxy for some of the other racing games on the market like F1 and GRID.

DiRT: Showdown is something of a divisive game for benchmarking. The game’s advanced lighting system, while not developed by AMD, implements many of the key concepts AMD popularized with their Leo forward lighting tech demo. As a result, performance with that lighting system enabled has been known to greatly favor AMD cards. With that said, since we’re looking at high-end cards there’s really little reason not to test with it turned on, as even a slower card can keep up. This is also why we test DiRT with advanced lighting both on and off, starting at 1920x1080 Ultra.

The end result is perhaps unsurprising: NVIDIA already starts with a large deficit with the GTX 680 versus AMD’s Radeon cards. Titan closes the gap and manages to surpass the 7970GE at every resolution except 5760, but just barely. DiRT is the only game in our suite that behaves like this, so I don’t put a ton of stock in these results on a global level, but I thought it would make for an interesting look nonetheless.

This also settles some speculation about whether DiRT and its compute-heavy lighting system would benefit from the compute performance improvements Titan brings to the table. The answer is yes, but only by roughly as much as the increase in theoretical compute performance over the GTX 680. We’re not seeing any kind of performance increase that could be attributed to improved compute efficiency, which is why Titan can only just beat the 7970GE at 2560. However, the jury is still out on whether this means that DiRT’s lighting algorithm doesn’t map well to Kepler, period, or whether it’s an implementation issue. We also saw some unexpectedly weak DirectCompute performance out of Titan in our SystemCompute benchmark, so this may be further evidence that DirectCompute isn’t currently taking full advantage of everything Titan offers.
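As a rough sanity check on that scaling argument, the theoretical shader throughput advantage can be computed from the public specs of both cards (CUDA core count times base clock, with two FLOPs per core per clock for fused multiply-add). A minimal sketch:

```python
def theoretical_flops(cores, clock_mhz, flops_per_core_per_clock=2):
    """Peak single-precision FLOPS: cores * clock * 2 (FMA counts as 2 ops)."""
    return cores * clock_mhz * 1e6 * flops_per_core_per_clock

# Public specs: GTX 680 = 1536 cores @ 1006MHz; Titan = 2688 cores @ 837MHz
gtx680 = theoretical_flops(1536, 1006)  # ~3.09 TFLOPS
titan = theoretical_flops(2688, 837)    # ~4.50 TFLOPS

ratio = titan / gtx680
print(f"Theoretical compute advantage: {ratio - 1:.0%}")  # ~46%
```

That ~46% theoretical advantage lines up closely with the ~47% lead Titan shows over the GTX 680 at 2560, which is why the gain looks like raw throughput rather than improved compute efficiency.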

In any case, at 2560 Titan is roughly 47% faster than the GTX 680 and all of 3% faster than the 7970GE. It’s enough to get Titan above the 60fps mark here, but at 5760 no single GPU, not even GK110, can get you 60fps. On the other hand, the equivalent AMD dual-GPU products, the 7970GECF and the 7990, have no such trouble. Dual-GPU cards will consistently win, but generally not like this.


  • CeriseCogburn - Tuesday, March 12, 2013 - link

    ROFL another amd fanboy having a blowout. Mommie will be down to the basement with the bar of soap, don't wet your pants.
    When amd dies your drivers will still suck, badly.
  • trajan2448 - Saturday, March 16, 2013 - link

    Until you guys start showing latencies, these reviews based primarily on fps numbers don't tell the whole story. Titan is 4x faster than multi-GPU solutions in real rendering.
  • IUU - Wednesday, March 20, 2013 - link

    Just a thought: if they price titan say at 700 or 500 (that was the old price point for flagship cards), how on earth will they market game consoles, and the brave "new" world of the mobile "revolution"?
    Like it or not, high tech companies have found a convenient way to escape the cutthroat competition of PC-land (hence their hatred and slogans like "post-PC" and the rest) and get a breath of fresh (money) air!

    Whether this is also good for the consumer in the long run, remains to be seen, but the fact is, we will pay more to get less, unless something unexpected happens.
  • paul_59 - Saturday, June 15, 2013 - link

    I would appreciate any intelligent opinions on the merits of buying a 690 card versus a Titan, considering they retail for the same price.
  • bravegag - Tuesday, August 13, 2013 - link

    I have bought the EVGA nVidia GTX Titan, actually two of them instead of the Tesla K20, thanks to the benchmark results posted in this article. However, the performance results I got are nowhere close to the ones shown here. Running DGEMM from CUDA 5.5 and the CUBLAS example matrixMulCUBLAS, my EVGA nVidia GTX Titan reaches no more than 220 GFlop/s, which is nowhere close to 1 TFlop/s. My question is then: are the results presented here a total fake?

    I created the following project where some additional HPC benchmarks of the nVidia GTX Titan are included, the benchmark computing environment is also detailed there:
    https://github.com/bravegag/eigen-magma-benchmark
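    [Editor's note: for context on the figures in this comment, DGEMM throughput is conventionally computed as 2·M·N·K floating-point operations divided by wall-clock time. A minimal sketch; the matrix size and timing below are hypothetical, chosen only to illustrate how a figure in the ~220 GFlop/s range arises:]

    ```python
    def dgemm_gflops(n, seconds):
        """GFLOP/s for an n x n x n matrix multiply:
        2*n^3 operations (one multiply + one add per inner-product step)."""
        return 2.0 * n**3 / seconds / 1e9

    # Hypothetical example: a 4096^3 DGEMM finishing in 0.6s
    # works out to ~229 GFLOP/s.
    print(dgemm_gflops(4096, 0.6))
    ```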
  • bravegag - Wednesday, August 14, 2013 - link

    Has anyone tried replicating the benchmark results shown here? How did it go?
  • Tunnah - Wednesday, March 18, 2015 - link

    It feels like nVidia are just taking the pee out of us now. I was semi-miffed at the 970 controversy; I know for business reasons etc. it doesn't make sense to truly trounce the competition (and your own products) when you can instead hold something back and keep it tighter, and have something to release in case they surprise you.

    And I was semi-miffed when I heard it would be more like a 33% improvement over the current cream of the crop, instead of the closer to 50% increase the Titan was over the 680, because they have to worry about the 390x, and leave room for a Titan X White Y Grey SuperHappyTime version.

    But to still charge $1000 even though they are keeping the DP performance low, this is just too far. The whole reasoning for the high price tag was you were getting a card that was not only a beast of a gaming card, but it would hold its own as a workstation card too, as long as you didn't need the full Quadro service. Now it is nothing more than a high end card, a halo product...that isn't actually that good!

    When it comes down to it, you're paying 250% the cost for 33% more performance, and that is disgusting. Don't even bring RAM into it, it's not only super cheap and in no way a justification for the cost, but in fact is useless, because NO GAMER WILL EVER NEED THAT MUCH, IT WAS THE FLIM FLAMMING WORKSTATION CROWD WHO NEEDING THAT FLIM FLAMMING AMOUNT OF FLOOMING RAM YOU FLUPPERS!

    This feels like a big juicy gob of spit in our faces. I know most people bought these purely for the gaming option and didn't use the DP capability, but that's not the point - it was WORTH the $999 price tag. This simply is not, not in the slightest. $650, $750 tops because it's the best, after all..but $999 ? Not in this lifetime.

    I've not had an AMD card since way back in the days of ATi, I am well and truly part of the nVidia crowd, even when they had a better card I'd wait for the green team reply. But this is actually insulting to consumers.

    I was never gonna buy one of these; I was waiting on the 980Ti for the 384-bit bus and the bumps that come along with it... but now I'm not only hoping the 390x is better than people say, because then nVidia will have to make it extra good... I'm hoping it's better than they say so I can actually buy it.

    For shame, nVidia. What you're doing with this card is unforgivable.
