Battlefield 3

The major multiplayer action game of our benchmark suite is Battlefield 3, DICE’s 2011 multiplayer military shooter. Its ability to pose a significant challenge to GPUs has been dulled somewhat by time and driver improvements, but it’s still demanding if you want to hit the highest settings at the highest resolutions with the highest anti-aliasing levels. Furthermore, while we can crack 60fps in single-player mode, our rule of thumb here is that multiplayer framerates will dip to half our single-player framerates, so framerates that look high here may not be high enough.
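The rule of thumb above is simple arithmetic; a minimal sketch (using illustrative framerate numbers, not the review's measured values) shows how the single-player-to-multiplayer estimate works:

```python
# Rule of thumb from the review: multiplayer framerates can dip to
# roughly half the single-player framerate, so a single-player average
# needs to clear 60fps for multiplayer dips to stay above a 30fps floor.

def estimated_multiplayer_dip(single_player_fps: float) -> float:
    """Estimate the worst-case multiplayer framerate (half of single player)."""
    return single_player_fps / 2.0

def is_playable(single_player_fps: float, floor_fps: float = 30.0) -> bool:
    """Check whether the estimated multiplayer dip stays at or above the floor."""
    return estimated_multiplayer_dip(single_player_fps) >= floor_fps

# Illustrative numbers only:
print(estimated_multiplayer_dip(62.0))  # → 31.0
print(is_playable(62.0))                # → True
print(is_playable(55.0))                # → False
```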

With Battlefield 3 generally favoring NVIDIA GPUs, the 290X fell just short of the GTX 780, and consequently the 290 falls back a bit further. As such the 290 trails the GTX 780 by 7% while trailing the 290X by a narrower 5%. Furthermore, in this case the 290 just hits the cutoff for a 60fps average at 2560, which means the card should have no problem sustaining minimum framerates above 30fps in even the most hectic firefights.

Elsewhere the 290 doesn’t get to enjoy quite the massive performance advantages over the 280X and GTX 770 that it enjoyed earlier, but it’s still ahead of its cheaper competitors. Against the 280X the 290 is 23% faster, while against the GTX 770 it’s a narrower 12%.
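Percentage leads like these are just ratios of average framerates. A brief sketch (with hypothetical fps values chosen only to reproduce the quoted margins, not the review's actual data) shows the arithmetic:

```python
def percent_faster(card_fps: float, rival_fps: float) -> float:
    """Return how much faster card_fps is than rival_fps, as a percentage."""
    return (card_fps / rival_fps - 1.0) * 100.0

# Hypothetical averages picked to mirror the quoted margins:
r9_290, r9_280x, gtx_770 = 61.5, 50.0, 54.9
print(round(percent_faster(r9_290, r9_280x)))  # → 23
print(round(percent_faster(r9_290, gtx_770)))  # → 12
```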

Comments

  • HisDivineOrder - Tuesday, November 5, 2013 - link

    Haha, spoken like someone who's never heard a card this loud. I can't wait to see all these cards on sale on ebay and forums everywhere. "I tried it and it's not for me, sidegrading to a 780," they'll say.

    This card is so loud you're going to be shocked by it. It's going to blow people's minds and it may even convert a few fanboys.
  • Finally - Tuesday, November 5, 2013 - link

    If he buys one with a nice custom fan, there won't be anything left to complain about. Truly terrible outlook for an Nvidiot, isn't it?
  • TheJian - Tuesday, November 5, 2013 - link

You're forgetting they are using the reference NVIDIA cooler too, and you don't get that when you buy a retail NV card; partner cards come overclocked on top of quiet. Also, this thing will draw the same watts no matter what, so it remains to be seen how good a different cooler will actually be. Did AMD really choose two terrible fan settings for their product launch? Seriously? I'm wondering how much they can really fix this situation. AMD had to know this would cause bad reviews about noise nearly everywhere, even on AMD-loving sites. I can't believe they are completely dumb and chose a total piece of junk for the fan/heatsink here. I really think people are putting too much faith in a fix with a fan change. They are at 95C all day, basically; how much fan do you need to fix that?

If NV runs their GPUs at 95C tomorrow (and cranks them up even more to match the noise they're getting here) these cards will both be spanked. You also get a better cooler on NV cards that are not reference.
  • jnad32 - Tuesday, November 5, 2013 - link

The way I look at it, AMD is looking like an absolute genius. Everyone was ripping them on the 290X for being too hot and too loud anyway. So instead of keeping the sound levels down they just went for what they do best: price/performance. They are now blowing every other card out of the water. There isn't a card on the planet that can touch this card in price/performance. Yeah, it's loud as hell, but at least you have to think about it now just because of the price. What I really want to see is them unleash the 290X sound threshold and see what kind of raw numbers it can put up. Let's be honest, the only people who should buy reference cards are the ones who are putting water blocks on them.

People have been saying this about the temps since launch, and I still don't get it. If AMD designed the chip to run at those temps, what's the big deal as long as it's not damaging anything?
  • swing848 - Tuesday, November 5, 2013 - link

    It will only get loud for me when playing games or the occasional benchmark. During games I wear headphones, and during benchmarks I can leave the room. I have a room dedicated to computer use and the house has good sound proofing, so, it will not bother other people.

    If I want it quiet I will use a water cooler with a large radiator and fan.

    It is better than dumping all the hot air from the video card into my case, even if it is well cooled with 200mm fans. I overclock my CPU and I do not want it, RAM, or chips on the motherboard to get any hotter than necessary.
  • zeock9 - Tuesday, November 5, 2013 - link

    The burning question on my mind at this point is why AMD is restricting board partners from releasing their own custom designed and obviously better performing coolers on this otherwise fantastic card?
  • techkitsune - Tuesday, November 5, 2013 - link

    They likely don't want to look bad.

It's okay. It's tough doing thermal management. I cram 1,000W of LED into a 30mm x 30mm space. AMD doesn't have the cooling problems that I have. Nor does NVIDIA nor Intel. They should be grateful. :D
  • HisDivineOrder - Tuesday, November 5, 2013 - link

    They don't have them yet. That's why they haven't made custom boards. They're just getting them right now. They're going with what they have, which right now are just the reference boards. In a month or so, they'll have QA'ed some solutions with pre-existing cooling options, assuming said cooling options are good enough to benefit these cards.

    The thing is, you have to know these cards are running REALLY, REALLY hot to hit these levels at 95 degrees, so... custom coolers may have a hard time handling these cards without some tweaks. Perhaps to get faster fans on there.

Also, it takes time to redesign a board to add VRMs, and the 290 and 290X are still very, very new. You're not going to get an MSI Lightning version overnight.

    It's a solid deal in price, but man it's a shame AMD didn't offer a better custom cooler more attuned to the very special needs of the 290 series. It's also a shame their board is being pushed so hard and so much above what it seems capable of doing with reasonable power levels.

This is like the Bulldozer of GPUs.
  • techkitsune - Tuesday, November 5, 2013 - link

    AMD could have just spent a few more dollars and used copper instead of aluminum, I would think. They could have easily doubled or tripled thermal conductivity and thus not needed to run the reference cooler anywhere near as high, plus that would leave a LOT of extra overclocking room.

    I still would buy it for the extra $45 that would have likely entailed, though I do worry about weight at that point. My 9800 GTX+ was pretty hefty, to say the least.
  • TheinsanegamerN - Tuesday, November 5, 2013 - link

THIS. Why does AMD, or heck, any manufacturer, insist on using aluminum fins on a 250W+ GPU? My old AMD 2600XT had a full copper heatsink, and it was nowhere near as power hungry as this card (and it ran cool to boot, never over 47C).
Use the exact same heatsink, but make those fins copper. I wonder how much lower the temps would go.
