DiRT: Showdown

Racing to the front of our 2013 list is our racing benchmark, DiRT: Showdown. DiRT: Showdown is based on the latest iteration of Codemasters’ EGO engine, which has continually evolved over the years to add more advanced rendering features. It was one of the first games to implement tessellation, and also one of the first games to implement a DirectCompute-based, forward-rendering-compatible lighting system. And since Codemasters is by far the most prolific developer of PC racing games, DiRT also serves as a good proxy for other racing games on the market such as F1 and GRID.

DiRT: Showdown is something of a divisive game for benchmarking. The game’s advanced lighting system, while not developed by AMD, does implement a lot of the key concepts AMD popularized with their Leo forward lighting tech demo. As a result, performance with that lighting system turned on has been known to greatly favor AMD cards. With that said, since we’re looking at high-end cards there’s little reason not to test with it turned on, as even a slow card can keep up. This is also why we test DiRT with advanced lighting both on and off, starting at 1920x1080 Ultra.
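For those unfamiliar with the technique, the core idea behind this style of lighting is that a compute pass bins the scene’s lights into small screen-space tiles, so the subsequent forward shading pass only has to evaluate the handful of lights that can actually reach each pixel. The snippet below is a loose, CPU-side Python sketch of that tiled light-culling step, purely for illustration; the real thing runs as a DirectCompute shader and also culls against each tile’s depth bounds, and this is not Codemasters’ actual implementation.

```python
# Conceptual sketch of tiled light culling (the "forward+" idea from AMD's Leo
# demo): bin lights into screen tiles so forward shading only touches the
# lights that can affect each tile. Screen-space only; no depth bounds here.

import random
from dataclasses import dataclass

TILE = 16                 # tile size in pixels (a common choice)
WIDTH, HEIGHT = 1920, 1080

@dataclass
class Light:
    x: float              # screen-space position in pixels (already projected)
    y: float
    radius: float         # screen-space radius of influence in pixels

def cull_lights(lights):
    """Build a per-tile list of the lights whose radius overlaps that tile."""
    tiles_x = (WIDTH + TILE - 1) // TILE
    tiles_y = (HEIGHT + TILE - 1) // TILE
    tile_lights = [[[] for _ in range(tiles_x)] for _ in range(tiles_y)]
    for idx, light in enumerate(lights):
        # Only visit the tiles the light's bounding box can overlap.
        x0 = max(int((light.x - light.radius) // TILE), 0)
        x1 = min(int((light.x + light.radius) // TILE), tiles_x - 1)
        y0 = max(int((light.y - light.radius) // TILE), 0)
        y1 = min(int((light.y + light.radius) // TILE), tiles_y - 1)
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                tile_lights[ty][tx].append(idx)
    return tile_lights

if __name__ == "__main__":
    random.seed(0)
    lights = [Light(random.uniform(0, WIDTH), random.uniform(0, HEIGHT),
                    random.uniform(20, 120)) for _ in range(256)]
    tile_lights = cull_lights(lights)
    counts = [len(cell) for row in tile_lights for cell in row]
    print(f"{len(lights)} lights total, but each tile only shades "
          f"{sum(counts) / len(counts):.1f} on average (worst tile: {max(counts)})")
```

The forward pass then shades each pixel using only its tile’s light list, which is how this approach handles large numbers of dynamic lights without falling back to a deferred renderer.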

The end result is perhaps unsurprising: NVIDIA starts with a large deficit here, with the GTX 680 well behind AMD’s Radeon cards. Titan closes the gap and is enough to surpass the 7970GE at every resolution except 5760, but just barely. This is the only game in our suite that behaves like this, so I don’t put a ton of stock into these results on a global level, but I thought it would make for an interesting look nonetheless.

This also settles some speculation over whether DiRT and its compute-heavy lighting system would benefit from the compute performance improvements Titan brings to the table. The answer is yes, but only by roughly as much as the increase in theoretical compute performance over the GTX 680. We’re not seeing any kind of performance increase that could be attributed to improved compute efficiency, which is why Titan can only just beat the 7970GE at 2560 here. However, the jury is still out on whether this means that DiRT’s lighting algorithm doesn’t map well to Kepler, period, or whether it’s an implementation issue. We also saw some unexpectedly weak DirectCompute performance out of Titan in our SystemCompute benchmark, so this may be further evidence that DirectCompute isn’t currently taking full advantage of everything Titan offers.

In any case, at 2560 Titan is roughly 47% faster than the GTX 680 and all of 3% faster than the 7970GE. That’s enough to get Titan above the 60fps mark here, but at 5760 no single GPU, not even GK110, can get you to 60fps. On the other hand, the equivalent AMD dual-GPU setups, the 7970GE CF and the 7990, have no such trouble. Dual-GPU configurations will consistently win, but generally not by margins like this.
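As a quick sanity check on the compute-scaling point above, here is some back-of-the-envelope math comparing the two cards’ theoretical FP32 throughput. The core counts and base clocks are the published specifications, and the usual two-operations-per-core-per-clock (FMA) convention is assumed; treat it as a rough illustration only, since real clocks vary with GPU Boost.

```python
# Rough theoretical FP32 throughput comparison (published specs, base clocks).
# FLOPS = CUDA cores * 2 ops/clock (FMA) * clock speed.

def fp32_tflops(cuda_cores: int, clock_ghz: float) -> float:
    """Theoretical single precision throughput in TFLOPS."""
    return cuda_cores * 2 * clock_ghz / 1000.0

titan   = fp32_tflops(2688, 0.837)   # GTX Titan: 2688 cores @ 837MHz base
gtx_680 = fp32_tflops(1536, 1.006)   # GTX 680:  1536 cores @ 1006MHz base

print(f"GTX Titan: {titan:.2f} TFLOPS")
print(f"GTX 680:   {gtx_680:.2f} TFLOPS")
print(f"Theoretical advantage: {(titan / gtx_680 - 1) * 100:.0f}%")
# Prints a ~46% paper advantage, in line with the ~47% lead measured at 2560,
# i.e. the gain tracks raw throughput rather than any extra compute efficiency.
```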

Comments

  • CeriseCogburn - Saturday, February 23, 2013 - link

    Stop whining along with the rest of them, grow a set, get a job, and buy two of them.

    Might do you some good.
  • Alucard291 - Sunday, February 24, 2013 - link

    Unlike you, I have a job :)
  • chizow - Sunday, February 24, 2013 - link

Good point, I'd tend to agree with that assessment, as anyone who actually works for their money would not be so eager to part with it so quickly in $1K denominations for what amounts to a glorified pinball machine.

    He's probably a kid who has never had to work a day in his life or a basement dweller who has no hope of ever buying one of these anyways.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    And now with the pure troll, the lying idiot conspiracist nVidia hater takes on the pure personal attack for a big fat ZERO score.

Congratulations, you and your pure troll can high five each other and both be wrong anyway, for another year or two, or the rest of your whining crybaby posting PC herd idiot mentality lives.
  • Alucard291 - Friday, March 8, 2013 - link

No no kid. You're the "pure troll" here.

    So yeah go get a job and buy two of them. For yourself. Stop being angry at us for not being able to afford it

    ~lol~
  • wiyosaya - Thursday, February 21, 2013 - link

While I understand your frustrations, IMHO this is a card aimed at those wanting the compute performance of a Tesla at 1/3 the cost. As I see it, nVidia shot themselves in the foot for compute performance with the 680, and as such, I bet that 680 sales were less than expected primarily because of its crappy compute performance in comparison to, say, even a 580. This may have been their strategy, though, as they might have expected $3,500 Teslas to fly off the shelf.

I am also willing to bet that Teslas did not fly off the shelf, and that in order to maintain good sales, they have basically dropped the price of the first GK110s to something that is reasonable with this card. One can now buy 3.5 Titans for the price of the entry-level GK110 Tesla, and I fully expect nVidia to make a profit rather than the killing that they might have thought possible on the GK110 Teslas.

That said, I bet that nVidia gets a sht load of orders for this card from new HPC builders and serious CAD/CAE workstation suppliers. Many CAD/CAE software packages like SolidWorks and Maple support GPGPUs in their code, making this card a bargain for their builds.

My apologies to all the gamers here, but us compute nerds are drooling over this card. I only wish I could afford one to put in my i7-3820 build from July. It is more than 2x what I paid for a 580 back then, and the 580 buy was for its compute performance.
  • atlr - Thursday, February 21, 2013 - link

wiyosaya, I am trying to come up to speed on comparing compute performance between Nvidia and AMD options. Is the Titan drool-worthy only for software that uses CUDA and not OpenCL? This reminds me of the days of Glide versus OpenGL APIs.
  • trajan2448 - Friday, February 22, 2013 - link

AMD's fps numbers are overstated. They figured out a trick that produces runt frames, frames which are not actually rendered in any meaningful way but still trigger the fps monitor as real, fully rendered frames. This is a real problem for AMD, much worse than the latency problem. Crossfire is a disaster, which is why numerous reviewers including Tech Report have written that Crossfire produces higher fps but feels less smooth than Nvidia.
    Check this article out. http://www.pcper.com/reviews/Graphics-Cards/NVIDIA...
  • chizow - Saturday, February 23, 2013 - link

    That's an awesome analysis by PCPer, thanks for linking that. Might end up being the biggest driver cheat scandal in history. Runt framesgate lol.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    HUGE amd cheat.

    It's their standard operating procedure.

    The fanboys will tape their mouths, gouge out their eyes and stick fingers in their ears and chant: "I'm not listening".
