Battlefield 3

Editor’s Note: Earlier today DICE released a patch that among other things is supposed to improve Radeon HD 7000 series performance in the game. We’ll update our numbers to include revised benchmarks as soon as we can.

Its popularity aside, Battlefield 3 may be the most interesting game in our benchmark suite for a single reason: it’s the first AAA DX10+ game. It’s been 5 years since the launch of the first DX10 GPUs, and 3 whole process node shrinks later we’re finally at the point where games are using DX10’s functionality as a baseline rather than an addition. Not surprisingly, BF3 is one of the best looking games in our suite, but as with past Battlefield games that beauty comes with a high performance cost.

BF3 is an all-around GPU killer, which in the case of the 7700 series doesn’t help matters. Keeping in mind that our benchmarks typically trend high, even at 1680x1050 with Medium settings we’re not cracking 60fps with anything less than a GTX 460 1GB. The 7770 should still be playable here, but intense firefights will definitely drop through the 30fps floor.

In any case the performance of the 7700 series is starting to show some consistency. Once again the 7770 underperforms the 6850, this time by 5%, while elsewhere the 7750 noticeably trails the 5770. Nothing on the AMD side is anywhere close to the GTX 560, however.

Looking at our data, I’m a bit worried about the amount of VRAM the 7770 has. 1GB is already not quite enough for some games at 1920 with high quality settings, but BF3 is especially punishing. If we see more games like BF3, I have to wonder if 1GB will be enough for even 1680 in a year’s time.

155 Comments

  • kallogan - Wednesday, February 15, 2012 - link

    HD 6850 is still the way to go.
  • zepi - Wednesday, February 15, 2012 - link

    So basically in a couple of generations we've gone
    4870 > 5770/6770 > 7770

    Chip size:
    260mm2 > 165mm2 > ~120mm2

    Performance is about:
    100 > 100 > 120

    Power consumption under gaming load according to TechPowerUp (just the graphics card):
    150W > 108W > 83W

    And soon we should have 1 inch thick laptops with these things inside. I'm not complaining.
  • silverblue - Wednesday, February 15, 2012 - link

    Good point. One thing I think people forget is that smaller process technologies will yield either better performance at the same power, or reduced consumption at the same performance... or a mix of the two. You could throw two cards into a dual-GPU config for similar power to one card from two years back, and still not have to worry too much if CrossFire or SLI doesn't work properly (well, if you forget the microstuttering, of course).
  • cactusdog - Wednesday, February 15, 2012 - link

    Why is the 6770 left out of the benchmarks? Isn't that odd considering the 7770 replaces the 6770? I really wish reviewers would be independent when reviewing cards, instead of following manufacturer guidelines.
  • Markstar - Wednesday, February 15, 2012 - link

    No, since the 6770 is EXACTLY the same card as the 5770 (just relabeled). So it makes sense to continue using the 5770 and remind AMD (and us) that we do not fall for their shenanigans (sadly, many do fall for it).
  • gnorgel - Wednesday, February 15, 2012 - link

    Good news for your 6850: it should sell a lot better now. Maybe they really have stopped producing it and need to clear out stock. But once it's sold out, almost anyone should go for a GTX 560: 7% more expensive and 30% faster.
    The only reason to buy a 7770 now is if your power supply can't support a GTX 560 and you would have to get a new one.
  • duploxxx - Wednesday, February 15, 2012 - link

    By the time the 6850 is out of stock, the 78xx series will have launched, which will knock out the 560.

    I don't understand what everyone is complaining about: it's faster than the 57xx/67xx series, with less power. Sure, it's not cheap, but neither were the 57xx/67xx at launch. Combined with old-gen cards still available and NV products being a bit too expensive, this is just a starting price....
  • akbo - Wednesday, February 15, 2012 - link

    Moore's law apparently doesn't apply to graphics cards; people's expectations do. People expect that every two years GPUs at the same price point will have double the transistors and thus be roughly that much faster. Obviously performance doesn't scale that way, since the 28nm shrink only brings about a 50% improvement over 40nm. That would mean a 50% improvement is expected; imperfect scaling would mean more like a 40% improvement.

    So people expect a card that is 20% faster than a card from two years ago to be 1.2/1.4 of the launch price, or ~85% of the 5770's launch price in this case. That would mean the 7770 should retail at around $130-140 or so, with sub-$100 pricing for the 7750, like $90 or so. I expect it to be that price too.
  • chizow - Wednesday, February 15, 2012 - link

    Moore's Law does actually hold true for GPUs in the direct context of the original law as you stated it: roughly doubled transistors every 2 years with a new process node. Performance has deviated for some time now, however, with imperfect scaling relative to transistors, but at least ~50% has been the benchmark for performance improvements over previous generations.

    Tahiti and the rest of Southern Islands aren't that much of a disappointment relative to Moore's Law, because Tahiti does offer a 40-50% improvement over AMD's previous flagship GPU. The problem is that it only offers a 15-25% improvement over the overall last-gen performance leader, the GTX 580, yet somewhat comically, AMD wants to price it in that light.

    So we end up with this situation: the worst price/performance metrics ever, where a new GPU architecture and process node offers only a 15-25% performance increase at the same price (actually 10% more in the 7970's case). This falls far short of even conservative Moore's Law-based expectations, which would call for at least +50% over the last-gen overall high-end in order to command that top pricing spot.
  • arjuna1 - Wednesday, February 15, 2012 - link

    DX11.1?? With only one true DX11 game on the market, BF3, there is literally no incentive to upgrade to this generation of cards (7xxx/Kepler).

    Unless nvidia comes out with something big, and I mean big as in out of this world, I'll just skip to the next gen, and if AMD insists on being an ass with pricing, I'll go Ngreen when the time comes.

    Now, the worrying thing is that it's becoming evident both parties are getting too cynical with price fixing. When is that antitrust lawsuit coming?
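The pricing arithmetic in akbo's comment above can be sanity-checked with a short sketch. The HD 5770's $159 launch price is my assumption here (the thread never states it); the 1.2x and 1.4x scaling factors come straight from the comment:

```python
# Sketch of akbo's expected-price reasoning from the comment above.
# Assumption: the HD 5770 launched at $159 (not stated in the thread).
launch_price_5770 = 159.0

actual_speedup = 1.20    # the 7770 is ~20% faster than the 5770
expected_speedup = 1.40  # what imperfect scaling on a ~50% node gain "should" give

# Holding performance-per-dollar expectations constant, the launch price
# should shrink by the ratio of actual to expected speedup (~85%).
expected_price = launch_price_5770 * (actual_speedup / expected_speedup)
print(round(expected_price))  # prints 136
```

Under those assumptions the sketch lands in the middle of the $130-140 window the comment arrives at.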
