DiRT 3

For racing games, our racer of choice continues to be DiRT, now in its third iteration. Codemasters uses the same EGO engine across its DiRT, F1, and GRID series, so EGO's performance has been relevant to a number of racing games over the years.

[Charts: DiRT 3 - DX11 Ultra Quality + 4xAA, at 2560x1600, 1920x1200, and 1680x1050]

First it loses, then it ties, and then it starts to win.

After a very poor start in Crysis, NVIDIA has finally taken a clear lead in a game. DiRT 3 has historically favored NVIDIA's video cards, so this isn't wholly surprising, but it's our first proof that the GTX 680 can beat the 7970, with the GTX 680 taking a respectable 6% lead at 2560. Interestingly enough, the lead increases as we drop down in resolution, something we have also seen with past Radeon and GeForce cards. It looks like Fermi's trait of dropping off in performance with resolution more rapidly than GCN has carried over to the GTX 680.

In any case, compared to the GTX 580 this is another good showing for the GTX 680. The 680's lead over the 580 is a rather consistent 36-38%.
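For reference, these percentages are simple ratios of the two cards' average framerates. A minimal sketch of that arithmetic in Python, using hypothetical framerate values rather than the actual chart data:

```python
def lead_percent(fps_a: float, fps_b: float) -> float:
    """Percentage lead of card A over card B, given average framerates."""
    return (fps_a / fps_b - 1.0) * 100.0

# Hypothetical values for illustration only; the real numbers are in the charts above.
print(round(lead_percent(96.0, 70.0)))  # 37 -> a ~37% lead, inside the 36-38% band
```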

[Charts: DiRT 3 - Minimum Frame Rate, at 2560x1600, 1920x1200, and 1680x1050]

The minimum framerates reflect what we've seen with the averages: the GTX 680 has a slight lead over the 7970 at 2560, while it beats the GTX 580 by over 30%.

Comments

  • CeriseCogburn - Sunday, March 25, 2012 - link

    They get to show AMD "catching up," so they like it. They get to try to puke on Kepler's 2GB of RAM and make AMD's 3GB shine, so they "can't resist" - and when frame rates fall below playable, all of a sudden they "don't care," even when the attempt fails. They haven't been able to resist since the GTX 580 with 1.5GB vs. the 2GB 6950/6970; it was a great blame-the-low-RAM game for any changes.
    Then they checked the 1GB and 2GB 6950, and the 2GB was slower... but so what.
    Now 2GB Kepler has put the RAM lie to rest even in triple-monitor gaming... but any lesser win, loss, or slimming margin can still be blamed on it. It gets people buying the AMD card, and they get really frustrated here when they can't figure out why NVIDIA is winning when they don't believe it should be. It's always expressed in the article how shocked they are. So RAM is a convenient scapegoat. It's always used as a "future proofing" notion as well, though no evidence has ever surfaced for that.
  • _vor_ - Sunday, March 25, 2012 - link

    What's with all the nerdrage? Do you work for NVIDIA?
  • formulav8 - Sunday, March 25, 2012 - link

    Get over yourself already. NVIDIA doesn't even like you. I can't believe how people feel about a stinking stupid corporation.
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    It's not about a corporation, it's about facts, guy. Facts mean my friends and my readers get the best they can get for the buck they're paying.
    Just because AMD is behind and lies are told as a result does not mean the truth should not shine through!
    The truth shall shine through!
  • AnnonymousCoward - Sunday, March 25, 2012 - link

    Personally, I don't care if the card has 64kB of RAM. Or 8 million stream processors. Performance, cost, power, and noise are what matter.

    And back to my point: performance in the 20-50fps range at 2560x1600 with 4xAA is meaningless and not a criterion for judgment.
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    I never disagreed with that point; I merely explained why things are done in such-and-such a way while other things are ignored.
    It's not difficult at all.
  • Zephyr66z0r - Sunday, March 25, 2012 - link

    Well, I understand 'some' of the tech behind the GTX 680, and one thing stands out: the 256-bit bus width. When you see that from NVIDIA it's along the lines of the GTX 560... so does that mean there are 384-bit (mid-high) or 512-bit (high-enthusiast, 256-bit + 256-bit, 2 GPU) cards coming out?

    I can't wait. Anyone done SLI with it yet?
  • dmnwlv - Sunday, March 25, 2012 - link

    First off, I think NVIDIA has done a good job with the new GTX 680.

    However, I do not need a game that is already running at 100+ frames to be even faster.
    It needs to be fast where it counts - games that are still running at 60 fps and below.

    For this, of the 3 relevant games, NVIDIA is faster in just one of them. Experience (if you also remember) has shown that the results can be very different once frame rates for some settings/games drop below 60 fps.

    Hence I cannot agree with all the fuss about the GTX 680 being so much faster.
    You guys are led by the heart (much like the ATI fanboys you used to call out) rather than the brain.

    And all the other compute tests are irrelevant to me (and the majority of you, to be honest).
  • gramboh - Monday, March 26, 2012 - link

    What about a little game (that several million people play) called Battlefield 3? NV has a massive lead with the GTX 680 over the 7970/7950. AT only benches single player, but the game is even more punishing in 64-player multiplayer. Having a smooth framerate at max detail with 4x AA/16x AF is a big competitive advantage and makes the game significantly more enjoyable.

    Kind of disappointed the card isn't faster in The Witcher 2, which I think has the best graphics of any single player game.
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    Have all of you people repeating that FUD forgotten Total War: Shogun 2?
    It's the hardest game in the bench set, according to AnandTech...
    How is it that THE HARDEST GAME, the one NVIDIA swept top to bottom at every resolution, is suddenly and completely forgotten, while we hear these other FUD declarations?
    How does that work - just repeat what some other mistaken FUDder spewed?
