Battlefield 3

The multiplayer action game of our benchmark suite is Battlefield 3, DICE’s 2011 multiplayer military shooter. Its ability to pose a significant challenge to GPUs has been dulled some by time and drivers, but it’s still demanding if you want to run the highest settings at the highest resolutions with the highest anti-aliasing levels. Furthermore, while we can crack 60fps in single player mode, our rule of thumb here is that multiplayer framerates will dip to half of our single player framerates, so framerates that look high here may not be high enough.

[Chart: Battlefield 3 - 2560x1440 - Ultra Quality + 4x MSAA]

BF3 typically favors NVIDIA cards, so it comes as no great shock that the performance advantage once again flips to the GTX 690.

[Chart: Battlefield 3 - Delta Percentages - 2560x1440 - Ultra Quality + 4x MSAA]

Looking at our delta percentages, this is another game where AMD has made massive gains; previously their frame delivery was so unbalanced that every other frame for a long stretch of the benchmark would be a runt frame. 12.6% is the single best showing for the 7990, and much closer to where we would like AMD to be. They still have more than twice the variability of the GTX 690, but if every game were like this, AMD would be in a better position than where they’re going to be with this first phase of frame pacing.
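For reference, the delta percentage boils down to averaging how much each frame’s render time differs from the frame before it, relative to the average frame time. Here’s a minimal sketch in Python; the exact weighting behind the published numbers may differ.

```python
# Minimal sketch: delta percentage from a list of per-frame render
# times (in ms). Assumes the metric is the mean absolute
# frame-to-frame delta divided by the mean frame time.

def delta_percentage(frame_times_ms):
    if len(frame_times_ms) < 2:
        return 0.0
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    mean_delta = sum(deltas) / len(deltas)
    mean_frame_time = sum(frame_times_ms) / len(frame_times_ms)
    return 100.0 * mean_delta / mean_frame_time

# Alternating 10ms/30ms frames (classic microstutter) average a
# healthy-looking 50fps, yet score a delta percentage of 100%.
print(delta_percentage([10, 30, 10, 30, 10, 30]))  # 100.0
```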

The graphical representation of our FCAT data neatly matches our numeric analysis, once again showcasing AMD’s improved frame pacing, while showing how much farther they have to go to catch NVIDIA.
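For those curious how runt frames are actually counted: FCAT works by overlaying a colored bar on each rendered frame and capturing the card’s video output, so it can measure how many scanlines of the display each frame ultimately occupies, with frames below a small scanline threshold classified as runts. A minimal sketch, with the threshold (21 scanlines is the commonly cited default) and data layout as assumptions:

```python
# Minimal sketch: counting runt frames from FCAT-style data, where
# each entry is the number of display scanlines a rendered frame
# occupied. The 21-scanline threshold is the commonly cited default,
# not a value confirmed by this article.
RUNT_THRESHOLD_SCANLINES = 21

def count_runts(scanlines_per_frame):
    return sum(1 for lines in scanlines_per_frame
               if lines < RUNT_THRESHOLD_SCANLINES)

# A well-paced dual-GPU setup splits a 1440-line frame roughly evenly;
# a badly paced one alternates between full frames and runts.
print(count_runts([720, 718, 722, 719]))   # 0
print(count_runts([1430, 8, 1428, 10]))    # 2
```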

[Chart: Battlefield 3 - 95th Percentile FT - 2560x1440 - Ultra Quality + 4x MSAA]

Finally, this is another case where the large improvement in frame pacing has greatly improved 95th percentile frame times, though the 7990 still trails the GTX 690, as you’d expect given the latter’s general performance lead.
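As a refresher on what this chart measures, the 95th percentile frame time is the frame time that 95% of frames come in at or under. A minimal sketch using the nearest-rank method; the exact interpolation rule behind the published charts is an assumption.

```python
# Minimal sketch: Nth percentile frame time via the nearest-rank
# method. The published charts may use a different interpolation rule.
import math

def percentile_frame_time(frame_times_ms, pct=95.0):
    ordered = sorted(frame_times_ms)
    rank = math.ceil(pct / 100.0 * len(ordered))  # 1-based nearest rank
    return ordered[rank - 1]

# 94 fast frames plus a handful of slow outliers: the average looks
# fine, but the 95th percentile exposes the hitches.
times = [16] * 94 + [40, 45, 50, 55, 60, 65]
print(percentile_frame_time(times))  # 40 (ms)
```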

Comments

  • chizow - Wednesday, August 7, 2013 - link

    There were discussions of microstutter associated with multi-GPU on various forums, but PCGH was the first site to publish its findings in detail with both video evidence and hard data. From what I remember, they were the first to develop the methodology of using FRAPS frametimes and graphing the subsequent results to illustrate microstutter. [A minimal sketch of this approach appears after the comment thread.]
  • BrightCandle - Friday, August 2, 2013 - link

    One of the most shocking revelations to me is that AMD's quality assurance did not include checking the output of their cards frame by frame. I had always assumed that both NVidia and AMD had HDMI/DVI/VGA recorders that allowed them to capture the output of their cards so they could check them pixel by pixel, frame by frame, and presumably check that they were correct automatically.

    Such a technology would clearly have shown the problem immediately. I am stunned that these companies don't do that. Even FCAT is a blatantly blunt tool, as it doesn't say anything about the contents of the frames. We still don't have any way to measure end-to-end latency for comparison either. All in all there is much left to do, and I am not confident that either company is testing these products well; it's just that I couldn't believe AMD wasn't testing theirs for consistency at all (it was obvious when you played that something was wrong).
  • krutou - Friday, August 2, 2013 - link

    AMD is in the business of being the best performance per price entry in every market segment. Technology and quality come second.

    How often does AMD introduce and/or develop technologies for their graphics cards? The only two that come to mind are Eyefinity and TressFX (100 times more overhyped than PhysX).
  • Death666Angel - Saturday, August 3, 2013 - link

    I think ATI had tessellation in their old DX8 chips. nVidia bought PhysX, so that shouldn't count. But I don't really see how having exclusive technology usable by a single GPU vendor is a good thing. We need standardization and everybody having access to the same technologies (albeit with different performance deltas). Look at the gimmicky state of PhysX and imagine what it could be if nVidia allowed it to be fully utilized by CPUs and AMD GPUs.
  • krutou - Saturday, August 3, 2013 - link

    Because OpenCL and TressFX are doing so well, right?
  • bigboxes - Sunday, August 4, 2013 - link

    March on, fanboi.
  • JamesWoods - Sunday, August 4, 2013 - link

    If you think that is all AMD/ATI has ever done for graphics, then you, sir, are ignorant. I was going to use a more degrading word there and thought better of it.
  • Will Robinson - Friday, August 2, 2013 - link

    LOL...what a load of tosh.
    "NVDA had to take them by the hand"?
    You and Wreckage ought to post in green text.
  • chizow - Friday, August 2, 2013 - link

    Agree with pretty much all of this, although I would direct a lot of the blame at AMD's most loyal, enthusiastic supporters as well. Every time microstutter was mentioned and identified as being worst with AMD solutions, AMD's biggest fans would get hyperdefensive about it. If those most likely to have a problem were too busy denying any problem existed, it really should be no surprise it was never fixed.

    And this is the result: years of denial and broken CF, finally fixed as a result of the scrutiny from the press and the laughter of Nvidia fans, which brought this to a head and forced AMD to take a closer look and formulate a solution.
  • EJS1980 - Friday, August 2, 2013 - link

    "Truth favors not one side."

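As an editorial aside on the FRAPS-based methodology chizow describes above: it amounts to differencing the cumulative timestamps in a FRAPS frametimes log and plotting the per-frame times, which makes microstutter visible as a sawtooth pattern. A minimal sketch; the CSV layout and file name are assumptions.

```python
# Minimal sketch: plot per-frame render times from a FRAPS-style
# frametimes log. Assumes a CSV whose last column is a cumulative
# millisecond timestamp, one row per frame, with a header row.
import csv
import matplotlib.pyplot as plt

def load_frame_times(path):
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    timestamps = [float(row[-1]) for row in rows[1:]]  # skip header
    # Difference consecutive timestamps to get per-frame times.
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

frame_times = load_frame_times("frametimes.csv")  # hypothetical log file
plt.plot(frame_times)
plt.xlabel("Frame number")
plt.ylabel("Frame time (ms)")
plt.title("Per-frame render times (microstutter shows as a sawtooth)")
plt.show()
```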