Final Words

Bringing things to a close, before writing up this article I spent some time going through our archives to look at past GPU reviews. While AMD has routinely retaken the performance crown for a time by beating NVIDIA to releasing next-generation GPUs – such was the case with the Radeon HD 5870 and Radeon HD 7970 – the typical pattern is for AMD’s flagship single-GPU card to trail NVIDIA’s flagship once NVIDIA has caught up. In a generational matchup AMD has not been able to beat or tie NVIDIA for the highest-performing single-GPU card in a very long time. As it turns out, the last time that happened was six years ago, with the Radeon X1950 XTX in 2006.

Six years is a long time to wait, but patience, perseverance, and more than a few jabs at NVIDIA have paid off for AMD. For the first time in six years we can say that AMD is truly competitive for the single-GPU performance crown. The Radeon HD 7970 GHz Edition isn’t quite fast enough to outright win, but it is unquestionably fast enough to tie the GeForce GTX 680 as the fastest single-GPU video card in the world today. With that said, there’s a lot of data behind that conclusion, so let’s break it down.

As far as pure gaming performance goes, the 7970GE and the GTX 680 are tied in our benchmarks at the top single-monitor resolution of 2560x1600. The 7970GE scores some impressive wins in Crysis and DiRT 3, while NVIDIA manages to hold on to their substantial leads in Battlefield 3 and Portal 2. Elsewhere the 7970GE wins some games while the GTX 680 wins others, and only very rarely do the two cards actually tie. Ultimately this is very much a repeat of what we saw with the GTX 670 versus the 7970, and the 6970 versus the GTX 570: the 7970GE and GTX 680 are tied on average, but they are anything but equal.

Our advice for prospective buyers, then, is to first look at benchmarks for the games they intend to play. If you’re going to be focused on only a couple of games for the near future, there’s a very good chance one card or the other is going to be the better fit. Otherwise, for gamers playing a wide selection of games or looking at future games whose performance is unknown, the 7970GE and GTX 680 are in fact tied, and from a performance perspective you couldn’t go wrong with either one.

As an addendum to that, however: while the 7970GE and GTX 680 are tied at 2560x1600 and other single-monitor resolutions, the same cannot be said for multi-monitor configurations. The 7970GE and GTX 680 still trade blows on a game-by-game basis with Eyefinity/NVIDIA Surround, but there’s a clear 6% average advantage for the 7970GE. Furthermore the 7970GE has 3GB of VRAM versus 2GB for the GTX 680, which makes it all the better suited for multi-monitor gaming in the future. AMD may be tied for single-monitor gaming, but they have a clear winner on their hands for multi-monitor gaming.

With that said, AMD has made a great sacrifice to get to this point, and it’s one that’s going to directly impact most users. AMD has had to push the 7970GE harder than ever to catch up to the GTX 680, and as a result the 7970GE’s power consumption and noise levels are significantly higher than the GTX 680’s. It’s unfortunate for AMD that NVIDIA managed to tie AMD’s best gaming performance with a GK104-based part, reaping the benefits of lower power consumption and less noise in the process. Simply put, the 7970GE is unquestionably hotter and uncomfortably louder than the GTX 680 for what amounts to the same performance. If power and noise are not a concern then this is not a problem, but many buyers are going to be unhappy with the 7970GE. It’s just too loud.

Of course this isn’t the first time we’ve had a hot and loud card on our hands – historically it happens to NVIDIA a lot, but when NVIDIA gets hot and loud they bring the performance necessary to match it. Such was the case with the GTX 480, a notably loud card that also had a 15% performance advantage over AMD’s flagship. AMD has no such performance advantage here, which makes the 7970GE’s power consumption and noise much harder to justify, even under a “performance at any cost” philosophy.

The end result is that while AMD has tied NVIDIA for the single-GPU performance crown with the Radeon HD 7970 GHz Edition, the GeForce GTX 680 is still the more desirable gaming card. There are a million exceptions to this statement of course (and it goes both ways), but as we said before, these cards may be tied but they're anything but equal.

Noise issues aside, we’re finally seeing something that we haven’t seen for a very long time: bona fide, cutthroat, brutal competition for the fastest single-GPU video card in the high-end segment. To call it refreshing is an understatement; it’s nothing short of fantastic. For the first time in six years AMD is truly performance-competitive with NVIDIA at the high end, and we couldn’t be happier.

Welcome back to the fight AMD; we’ve missed your presence.


109 Comments


  • piroroadkill - Friday, June 22, 2012 - link

    While the noise is bad - the manufacturers are going to spew out non-reference, quiet designs in moments, so I don't think it's an issue.
  • silverblue - Friday, June 22, 2012 - link

    Tom's added a custom cooler (Gelid Icy Vision-A) to theirs which reduced noise and heat noticeably (about 6 degrees C and 7-8 dB). Still, it would be cheaper to get the vanilla 7970, add the same cooling solution, and clock it to the same levels; that way, you'd end up with a GHz Edition-clocked card which is cooler and quieter for about the same price as the real thing, albeit lacking the new boost feature.
  • ZoZo - Friday, June 22, 2012 - link

    Would it be possible to drop the 1920x1200 resolution for tests? 16:10 is dead; 1080p has been the standard for high definition on PC monitors for at least 4 years now, and it's more than time to catch up with reality... Sorry for the rant, I'm probably nitpicking anyway...
  • Reikon - Friday, June 22, 2012 - link

    Uh, no. 16:10 at 1920x1200 is still the standard for high-quality IPS 24" monitors, which is a fairly typical choice for enthusiasts.
  • paraffin - Saturday, June 23, 2012 - link

    I haven't been seeing many 16:10 monitors around these days. Besides, since AT even tests iGPU performance at ANYTHING BUT 1080p, your "enthusiast choice" argument is invalid. 16:10 is simply a l33t factor in a market dominated by 16:9. I'll take my cheap 27" 1080p TN's spaciousness and HD content nativeness over your pricey 24" 1200p IPS' "quality" any day.
  • CeriseCogburn - Saturday, June 23, 2012 - link

    I went over this already with the amd fanboys.
    For literally YEARS they have had harpy fits on five and ten dollar card pricing differences, declaring amd the price perf queen.

    Then I pointed out nVidia wins in 1920x1080 by 17+% and only by 10+% in 1920x1200 - so all of a sudden they ALL had 1920x1200 monitors, they were not rare, and they have hundreds of extra dollars of cash to blow on it, and have done so, at no extra cost to themselves and everyone else (who also has those), who of course also chooses such monitors because they all love them the mostest...

    Then I gave them egg counts, might as well call it 100 to 1 on availability if we are to keep to their own hyperactive price perf harpying, and the lowest available higher rez was $50 more, which COST NOTHING because it helps amd, of course....

    I pointed out Anand pointed out in the then prior article it's an ~11% pixel difference, so they were told to calculate the frame rate difference... (that keeps amd up there in scores and winning a few they wouldn't otherwise).

    Dude, MKultra, Svengali, Jim Wand, and mass media, could not, combined, do a better job brainwashing the amd fan boy.

    Here's the link, since I know a thousand red-winged harpies are ready to descend en masse and caw loudly in protest...

    http://translate.google.pl/translate?hl=pl&sl=...

    1920x1080: "GeForce GTX680 is on average 17.61% more efficient than the Radeon 7970. Here, the performance differences in favor of the GTX680 are even greater."

    So they ALL have a 1920x1200, and they are easily available, the most common, cheap, and they look great, and most of them have like 2 or 3 of those, and it was no expense, or if it was, they are happy to pay it for the red harpy from hades card.
  • silverblue - Monday, June 25, 2012 - link

    Your comparison article is more than a bit flawed. The PCLab results, in particular, have been massively updated since that article. Looks like they've edited the original article, which is a bit odd. Still, AMD goes from losing badly in a few cases to not losing so badly after all, as the results in this article go to show. They don't displace the 680 as the best gaming card of the moment, but it certainly narrows the gap (even if the GHz Edition didn't exist).

    Also, without a clear idea of specs and settings, how can you just grab results for a given resolution from four or five different sites for each card, add them up and proclaim a winner? I could run a comparison between a 680 and 7970 in a given title with the former using FXAA and the latter using 8xMSAA, doesn't mean it's a good comparison. I could run Crysis 2 without any AA and AF at all at a given resolution on one card and then put every bell and whistle on for the other - without the playing field being even, it's simply invalid. Take each review at its own merits because at least then you can be sure of the test environment.

    As for 1200p monitors... sure, they're more expensive, but it doesn't mean people don't have them. You're just bitter because you got the wrong end of the stick by saying nobody owned 1200p monitors then got slapped down by a bunch of 1200p monitor owners. Regardless, if you're upset that NVIDIA suddenly loses performance as you ramp up the vertical resolution, how is that AMD's fault? Did it also occur to you that people with money to blow on $500 graphics cards might actually own good monitors as well? I bet there are some people here with 680s who are rocking on 1200p monitors - are you going to rag (or shall I say "rage"?) on them, too?

    If you play on a 1080p panel then that's your prerogative, but considering the power of the 670/680/7970, I'd consider that a waste.
  • FMinus - Friday, June 22, 2012 - link

    Simply put; No!

    1080p is the second worst thing that happened to the computer market in the recent years. The first worst thing being phasing out 4:3 monitors.
  • Tegeril - Friday, June 22, 2012 - link

    Yeah seriously, keep your 16:9, bad color reproduction away from these benchmarks.
  • kyuu - Friday, June 22, 2012 - link

    16:10 snobs are seriously getting out-of-touch when they start claiming that their aspect ratio gives better color reproduction. There are plenty of high-quality 1080p IPS monitors on the market -- I'm using one.

    That being said, it's not really important whether it's benchmarked at x1080 or x1200. There is a negligible difference in the number of pixels being drawn (one of the reasons I roll my eyes at 16:10 snobs). If you're using a 1080p monitor, just add anywhere from 0.5 to 2 FPS to the average FPS results from x1200.

    Disclaimer: I have nothing *against* 16:10. All other things being equal, I'd choose 16:10 over 16:9. However, with 16:9 monitors being so much cheaper, I can't justify paying a huge premium for a measly 120 lines of vertical resolution. If you're willing to pay for it, great, but kindly don't pretend that doing so somehow makes you superior.
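For reference, the pixel-count difference the commenters above are arguing over is straightforward arithmetic; a minimal sketch (the function name is illustrative, not from any site's methodology):

```python
# Pixel-count arithmetic behind the 1920x1080 vs 1920x1200 debate.
# The ~11% figure quoted earlier in the thread is simply the ratio
# of total pixels rendered per frame at each resolution.

def pixels(width: int, height: int) -> int:
    """Total pixels rendered per frame at a given resolution."""
    return width * height

p_1080 = pixels(1920, 1080)  # 2,073,600 pixels
p_1200 = pixels(1920, 1200)  # 2,304,000 pixels

extra = (p_1200 - p_1080) / p_1080
print(f"1920x1200 renders {extra:.1%} more pixels than 1920x1080")
# -> 1920x1200 renders 11.1% more pixels than 1920x1080
```

Note that GPU workloads rarely scale perfectly with pixel count, so the real-world frame-rate gap between the two resolutions is usually somewhat smaller than 11%.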
