Battlefield 3

Its popularity aside, Battlefield 3 may be the most interesting game in our benchmark suite for a single reason: it’s the first AAA DX10+ game. It’s been 5 years since the launch of the first DX10 GPUs, and 3 whole process node shrinks later we’re finally to the point where games are using DX10’s functionality as a baseline rather than an addition. Not surprisingly, BF3 is one of the best looking games in our suite, but as with past Battlefield games that beauty comes with a high performance cost.

Battlefield 3 was a game the 7970 struggled with at launch, and even with AMD’s driver optimizations since then they haven’t been able to improve matters a great deal. As a result the 7950 trails the GTX 580 throughout, by anywhere between 3% and 10%, and unfortunately for AMD BF3 is a very demanding game, making it one of the worst titles to fall behind in. With 2560 not going to be playable, we’re realistically looking at 1920, where the 7950 is fast enough to crack 60fps, but that’s also where the full 10% performance gap shows up.

Thankfully for Sapphire and XFX, overclocking is the great equalizer here. With their factory overclocks their cards generally erase that 10% performance gap, taking a very slight lead over the GTX 580 at both 2560 and 1920. There continues to be very little difference between the two cards themselves, though; XFX’s memory overclock is rarely worth more than a frame or two per second.

Comments

  • mak360 - Tuesday, January 31, 2012 - link

    I would easily buy the HD7950 over the old-tech, outdated, hot, power-hungry, loud GTX 580 junk. The HD7950 is the same price, new tech, uses 72 watts less, is cooler, is silent, is 28nm, is faster, has compute, has PCIe 3, drives 3 monitors, has audio over each output, and also slaps the 590 if that's what you want lol.

    It's a win-win; you would have to be an idiot to buy anything Nvidia currently has in the high end.
  • chizow - Wednesday, February 1, 2012 - link

    Anyone interested in high-end already owns Nvidia and is hitting the snooze button on this launch until Kepler.

    There's only a 15-25% reason to buy a 7970, 0-5% reason to buy a 7950.
  • Death666Angel - Wednesday, February 1, 2012 - link

    You keep repeating it, and you keep being wrong. There are a million reasons for someone to upgrade their system now. Maybe they got a better monitor for Christmas and need a graphics card upgrade but waited a month until AMD revealed their new tech. Maybe it's someone's birthday and he can get a big card. Or someone got a new job and wants a new card today. Not everyone who has the money and the need for such a card now also had it in the months before.
  • chizow - Wednesday, February 1, 2012 - link

    In that case, they should probably wait for the real next-gen, since that's what most anyone was doing prior to the disappointing Tahiti reveal.

    Or go ahead and pick up a 6970/570 for a much better price/performance return. Although we may actually see prices go back up now that it's obvious Tahiti did nothing to exert downward pricing pressure.
  • yankeeDDL - Wednesday, February 1, 2012 - link

    Just look at the review from Tom's Hardware.
    Based on performance, they were expecting the 7950 to be priced around $480. Then they were informed of the $450 MSRP and took it extremely well.
    Just sayin'
  • Spunjji - Wednesday, February 1, 2012 - link

    Unfortunately you gave yourself away as a bit of an idiot as soon as you failed to address anything about the product other than its raw performance.

    Precisely what makes you think that AMD /has/ to price their products at this level? They have a smaller chip that performs better for less power. As soon as nVidia releases competing products they'll drop trou on the price and everyone can be happy. Right now they're price-gouging the performance-obsessed, just like nVidia have been for as long as they've had the top product.

    Personally, I'm disappointed that they've abandoned the 3/4/5000 series approach of providing fantastic value for money, but apparently that wasn't earning them any money. Big shame, don't care, move on. I'll be waiting for Kepler to show before I make any buying decisions.
  • chizow - Wednesday, February 1, 2012 - link

    If the only thing AMD is able to bring to the table from a full node process shrink is a reduction in power consumption, they've already failed.

    What compounds their failure, however, is the fact that they're trying to sell a card that doesn't even significantly outperform last-gen parts at existing prices.

    If they actually priced this where it should be, ~$380-$400, it'd be a completely different story, because they'd actually be offering you all of those fringe benefits you listed as well as either high-end performance at a much lower price OR significantly higher performance at the same price.

    These are the kinds of metrics people look at when deciding whether or not to upgrade. Pricing a product the same as a part that's been available for 14 months already, when it performs no better, just doesn't make any sense, sorry.
  • ven - Thursday, February 2, 2012 - link

    After all this conversation I came to only one conclusion: you guys have created plenty of hype for Kepler. Nvidia will be delighted by this. After reading all of it, I would not be surprised if Nvidia printed a link to this page on their Kepler card boxes as part of their advertising.
  • chizow - Thursday, February 2, 2012 - link

    I don't think Nvidia cares about what's written here, tbh; I don't think it took more than looking at the benchmarks for them to get excited.

    What they care about:

    -AMD's top 28nm = only 15-25% faster than their last-gen top 40nm
    -AMD's 2nd 28nm = only 0-5% faster than their last-gen top 40nm

    The result is the rumors and indirect quotes attributed to Nvidia personnel at CES, which amount to:

    "We expected more from AMD's HD7900 series."

    But really, this quote could and should be attributed to anyone, especially at the asking price. It seems most people feel this way, which makes you wonder why AMD fans don't.

  • Galidou - Saturday, February 4, 2012 - link

    It's funny to see parts compared only by transistor size... the thing is, Nvidia's last-gen 40nm parts are BIG GPUs; you've got to compare transistor count to transistor count to understand the % increase in performance...

    AMD goes for smaller GPUs with a smaller power envelope that maximize performance per die size, vs. Nvidia's maximum die size for the maximum performance attainable with good yields...
