Battlefield 3

Its popularity aside, Battlefield 3 may be the most interesting game in our benchmark suite for a single reason: it’s the first AAA DX10+ game. It’s been 5 years since the launch of the first DX10 GPUs, and 3 whole process node shrinks later we’re finally at the point where games are using DX10’s functionality as a baseline rather than an addition. Not surprisingly, BF3 is one of the best-looking games in our suite, but as with past Battlefield games that beauty comes with a high performance cost.

Battlefield 3 continues to be NVIDIA’s ace, but the 7970GE, combined with some minor driver performance improvements, has allowed AMD to erode that lead somewhat. At 2560 the GTX 680 enjoys an 11% lead over even the 7970GE, and even the GTX 670 can overtake the 7970GE, but at the very least AMD is on the threshold of 60fps here. More optimistically, AMD has improved performance over the 7970 by 11%, which is an above-average gain for the 7970GE. AMD will likely never close the gap on current hardware, but it will be in their best interest to keep it narrow for future Frostbite Engine 2 games.
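
For those keeping score, the percentage leads quoted above are simple relative frame-rate ratios. The short Python sketch below walks through that arithmetic; the fps values are placeholders for illustration, not our measured results.

    def relative_lead(fps_a, fps_b):
        """Percentage by which card A leads card B in average fps."""
        return (fps_a / fps_b - 1.0) * 100.0

    # Placeholder frame rates purely for illustration, not measured results.
    gtx680_fps = 66.0
    hd7970ge_fps = 59.5
    hd7970_fps = 53.6

    print("GTX 680 over 7970GE: %.1f%%" % relative_lead(gtx680_fps, hd7970ge_fps))
    print("7970GE over 7970:    %.1f%%" % relative_lead(hd7970ge_fps, hd7970_fps))

With those placeholder numbers the script reports roughly an 11% lead in both comparisons, which is all the quoted percentages amount to.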

Comments

  • Belard - Friday, June 22, 2012 - link

    Agreed.
  • Articuno - Friday, June 22, 2012 - link

    A whole new card launch, and yet another pair of similarly named but differently performing products, just because they changed a few numbers that anyone can change in several free, easily available programs.

    I suppose they can do this because you can actually buy their products though, unlike the 6XX series.
  • ExarKun333 - Friday, June 22, 2012 - link

    Yeah, tough to find 6xx products indeed. There is something called the 'internet' you could check out. Your buddy who posted for you might be able to help you out. ;)
  • Pantsu - Friday, June 22, 2012 - link

    I doubt any AIB will actually release GE cards with reference cooling. Most likely they will be custom cooled, so the loudness of the reference card is a bit of a moot point.

    It's good to see some decent driver improvements from AMD. I'm still quite happy with the 7970's performance at 5760x1080, and it's enough for most games when OC'd. It would be interesting to see, though, whether the GE has improved the max OC. Most likely it's no better, and you'll be better off buying an old custom 7970 for a good price and OC'ing it to the same levels as the GE.
  • dagamer34 - Friday, June 22, 2012 - link

    The GE chips are better-binned parts, so one would assume they have a bit more room for higher clocks than the normal 7970 parts. Certainly the average overclock will be higher.
  • CeriseCogburn - Saturday, June 23, 2012 - link

    So we can deduce that the prior 7970 overclocks were sucking down even more of that enormous electrical power, since those chips are of a lower bin.

    I guess we need an overclocked power suction chart with an extended table for the amd housefire 7970.

    Any savings on the card price, or a few extra frames at a resolution almost no one owns, will be gobbled up by your electric bill every month for years. Save 1 watt or 9 watts at extended idle, but when you game it's 100+ watts and beyond with the overclocked 7970. Maybe they should be $300 with 3 games.
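
    (As a back-of-the-envelope sketch of that electricity argument, the Python snippet below estimates the yearly cost of extra power draw while gaming. Every input, including the wattage delta, hours played, and price per kWh, is an assumption for illustration, not a figure from this review.)

        # Rough yearly electricity cost of extra GPU power draw while gaming.
        # Every figure here is an assumption for illustration, not a measurement.
        extra_watts_under_load = 100    # assumed extra draw vs. a more efficient card
        gaming_hours_per_day = 2        # assumed usage pattern
        price_per_kwh = 0.12            # assumed electricity price in USD per kWh

        extra_kwh_per_year = extra_watts_under_load / 1000 * gaming_hours_per_day * 365
        extra_cost_per_year = extra_kwh_per_year * price_per_kwh

        print("Extra energy per year: %.0f kWh" % extra_kwh_per_year)
        print("Extra cost per year:   $%.2f" % extra_cost_per_year)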
  • silverblue - Monday, June 25, 2012 - link

    Well, it works both ways. You won't always be gaming, and in addition there's all that compute hardware that, if properly harnessed, would save you money over competing solutions because you'd get the job done quicker. It used to be pointless to consider using anything for compute that wasn't a Quadro, Tesla or even FirePro; however, those days are coming to an end.

    Having a 7970 will make sense for compute if that's your bag (there's a reason for the die size plus the extra memory and bus width), but this time, NVIDIA enjoys a performance/watt advantage which might go unchallenged for a while. Unless, of course, that extra hardware on the 7970 is properly leveraged; future games, perhaps?
  • ltcommanderdata - Friday, June 22, 2012 - link

    So do we think this will encourage nVidia to release a GeForce GK110-based product in the next few months rather than restrict it to Tesla?
  • PsiAmp - Friday, June 22, 2012 - link

    Nvidia isn't holding GK110 up its sleeve waiting for something. It is unfinished in the first place, and there's no manufacturing capacity to produce such a large chip. Nvidia still struggles to fix the GK104 design to get good yields. GK110 would be impossible to produce since it is twice as big, and as such would have at least four times worse yields.
    The server market is not only much more profitable, it also operates on a contract basis. Nvidia will start to produce Tesla K20 in Q4 2012.

    If a desktop card based on GK110 does hit the market, it won't be sooner than Q1 2013. And that's not really something you can change.
  • silverblue - Friday, June 22, 2012 - link

    "Of course this isn’t the first time we’ve had a hot & loud card on our hands – historically it happens to NVIDIA a lot – but when NVIDIA gets hot & loud they bring the performance necessary to match it. Such was the case with the GTX 480, a notably loud card that also had a 15% performance advantage on AMD’s flagship. AMD has no such performance advantage here, and that makes the 7970GE’s power consumption and noise much harder to justify even with a “performance at any cost” philosophy."

    Very true; however, the power consumption and heat difference between the 5870 and the 480 was definitely more pronounced.

    The 680 is an incredible card, no doubt about it. It may not win in some titles, but it's hardly anywhere near unplayable either. AMD being right there at the very high end is fantastic but unless titles truly make use of GCN's compute ability, the extra power and noise are going to be hard to swallow. Still, I'd own either. :P
