DiRT 3

For racing games our racer of choice continues to be DiRT, which is now in its 3rd iteration. Codemasters uses the same EGO engine between its DiRT, F1, and GRID series, so the performance of EGO has been relevant for a number of racing games over the years.

DiRT 3 - 2560x1600 - DX11 Ultra Quality + 4xAA

DiRT 3 - 1920x1200 - DX11 Ultra Quality + 4xAA

DiRT 3 - 1680x1050 - DX11 Ultra Quality + 4xAA

First it loses, then it ties, and then it starts to win.

After a very poor start in Crysis, NVIDIA has finally taken a clear lead in a game. DiRT 3 has historically favored NVIDIA’s video cards, so this isn’t wholly surprising, but it’s our first proof that the GTX 680 can beat the 7970, with the GTX 680 taking a respectable 6% lead at 2560. Interestingly enough, the lead increases as we drop down in resolution, which is something we have also seen with past Radeon and GeForce cards. It looks like Fermi’s trait of dropping off in performance more rapidly with resolution than GCN has carried over to the GTX 680.

In any case, compared to the GTX 580 this is another good showing for the GTX 680: its lead over the 580 is a fairly consistent 36-38%.
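For readers curious how we express these leads, a percentage lead is just the ratio of the two cards' average frame rates minus one. A minimal sketch in Python, using hypothetical frame-rate values rather than the actual benchmark numbers:

```python
def percent_lead(card_fps: float, rival_fps: float) -> float:
    """Return how far ahead card_fps is of rival_fps, as a percentage."""
    return (card_fps / rival_fps - 1.0) * 100.0

# Hypothetical example: 82.5 fps vs. 60 fps works out to a 37.5% lead,
# in line with the ~36-38% range quoted above.
print(round(percent_lead(82.5, 60.0), 1))
```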

DiRT 3 - Minimum Frame Rate - 2560x1600

DiRT 3 - Minimum Frame Rate - 1920x1200

DiRT 3 - Minimum Frame Rate - 1680x1050

The minimum frame rates reflect what we’ve seen with the averages: the GTX 680 has a slight lead on the 7970 at 2560, while it beats the GTX 580 by over 30%.
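A minimum frame rate like the ones charted above is set by the single slowest moment in a benchmark run, whereas the average is total frames over total time. A minimal sketch of both calculations over a frame-time trace (the millisecond values below are hypothetical, not data from this review):

```python
def min_fps(frame_times_ms):
    """Minimum instantaneous frame rate: determined by the longest frame."""
    return 1000.0 / max(frame_times_ms)

def avg_fps(frame_times_ms):
    """Average frame rate: total frames divided by total elapsed time."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# Hypothetical trace of per-frame render times in milliseconds.
trace = [16.0, 18.0, 25.0, 17.0, 20.0]
print(round(avg_fps(trace), 1), round(min_fps(trace), 1))
```

This is why minimum frame rates can diverge from averages: one long frame drags the minimum down without much affecting the mean.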

Comments Locked

404 Comments

  • _vor_ - Tuesday, March 27, 2012 - link

    All I read is blah blah blah NVIDIA blah blah nerdrage blah blah.
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    I'll translate for the special people that need more help.
    AMD's IQ has been bad since the 5000 series, with the 6000 series also screwy.
    You will have shimmering in game textures and lines in shading transitions on screen since their algorithm has been messed up for years, even though it is angle independent and a perfect circle, IT SUCKS in real life - aka gaming.
    Nvidia doesn't have this problem, and hasn't had it since before the 5000 series amd cards.
    AMD's 7000 series tries once again to fix the ongoing issues, but fails in at least 2 known places, having only Dx9 support, but may have the shimmering and shading finally tackled and up to Nvidia quality, at least in one synthetic check.
  • _vor_ - Tuesday, March 27, 2012 - link

    How much is NVIDIA paying you to babysit this discussion and zealously post?

    "It's better to keep quiet and people think you are a fool, than to open your mouth and prove them right."
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    Words right from anandtechs articles, and second attack.
    A normal person would be thankful for the information.
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    Did you notice the Nvidia card won Civ5 by more than the amd card did in Metro2033, but Civ5 is declared a tie, and well we know what everyone is claiming for Metro2033.
    I noticed that and thought it was quite interesting how that was accomplished.
  • BoFox - Monday, March 26, 2012 - link

    AMD's angle-independent AF is still flawed in that it's not fully trilinear when it comes to high-frequency textures (noisy moire). You'd be seeing lines of transition when everything suddenly becomes a bit blurry in a distance with these kinds of grainy textures.

    It's rather subjective, though.

    Nvidia does offer up to 32x CSAA with TRAA (transparent, or alpha textures) in DX10/11 games for superb IQ without having to use brute-force SSAA. AMD does not currently support "forced" AAA (Adaptive AA) on alpha textures in DX10/11 games, and the SSAA support in DX10/11 games was finally announced in beta driver support form with HD 7970 cards.

    Transparency AA has been around since 2005, and Nvidia actually maintained the quality IQ options for DX10/11 games compared to DX9 games all along.
  • ati666 - Monday, March 26, 2012 - link

    did AMD fix this problem in their HD7970 or not?
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    We will find out what's wrong with it a year from now when the next series big 8000 is launched, until then denials and claims it's as good as nvidia are standard operating procedure, and spinning useless theoretical notions that affect gameplay exactly zero and have amd IQ disadvantages will be spun in a good light for amd to get all the amd fans claiming the buzzwords are a win.
    That will work like it has for the last 3 releases, 4000, 5000, and 6000, and we just heard the 7000 series fixes that fix the 5000 and 6000 crud that was covered up until now in the 7970 release article.
    So amd users will suffer bad IQ in several ways while buzzing up words that are spun from this website as notional greatness and perfectness of amd till like, next release... then your question will be answered - just try to not notice anything until then, ok ?
  • blanarahul - Saturday, March 24, 2012 - link

    I was confused as to whether GPU Boost was necessary or not. Thanks for making the difference clear.
  • ammyt - Saturday, March 24, 2012 - link

    Dafuq y'all saying?
    The benchmarks are tight in front of your faces! The 680 is tied with the 7950, which surpasses it by a little, and the 7970 is the leader. The 7950 is cheaper by a little margin, but the 7970 is roughly $80 more expensive. What are y'all fighting for?

    If I were to choose between the 680, 7950, 7970, I will choose the 7950, cheaper, and a faster by a little margin than the 680. I don't care how or why (memory clock, architecture, bla bla bla) but the benchmarks are in front of you! Clearly, anandtech is biased towards Nvidia.

    (Perhaps they're getting paid from them more than AMD...)
