Battlefield: Bad Company 2

The latest game in the Battlefield series, Bad Company 2, remains one of the cornerstone DX11 games in our benchmark suite. As BC2 doesn't have a built-in benchmark or recording mode, we instead take a FRAPS run of the jeep chase in the first act, which as an on-rails portion of the game provides very consistent results along with a spectacle of explosions, trees, and more.

Even more so than HAWX, Bad Company 2 marks the closest we've seen the GTX 560 and the 6950 1GB. At 1920 the 6950 leads by under a frame per second, and it's not until 1680 that the GTX 560 takes any kind of lead. In this case both cards just pass the all-important 60fps mark at 1920, the minimum necessary for (more or less) fully fluid gameplay.

While we're not generally interested in 2560 with the GTX 560, it is the only resolution we run our Waterfall benchmark at, so we'll quickly comment. NVIDIA normally does quite well here and the GTX 560 is no exception: even though it loses on average at this resolution, it's 30% faster when it comes to minimums. We've seen the minimums in Crysis go the other way, so all things considered, minimums seem just as game-dependent as averages.



View All Comments

  • auhgnist - Tuesday, January 25, 2011 - link

The 1920x1080 graph is wrong; it looks like the 2560x1600 graph was mistakenly used instead.
  • Ryan Smith - Tuesday, January 25, 2011 - link

    Fixed. Thanks.
  • Marlin1975 - Tuesday, January 25, 2011 - link

    The 6950 1GB looks good.

    I am guessing the 560 will either drop in price very quickly or the 6950 will sell better.
  • Lolimaster - Tuesday, January 25, 2011 - link

    Not impressed at all by the 560; the 6950 1GB is a good value over the 2GB 6950. If you only want 1GB, I think the 6870 offers more bang for the buck.
  • cactusdog - Tuesday, January 25, 2011 - link

    Wow, plenty of good options from AMD and NVIDIA. Since the introduction of Eyefinity and 3D Surround, we don't need to spend a fortune to play the latest games. For most users with one monitor, a $250 card gives excellent performance.
  • tech6 - Tuesday, January 25, 2011 - link

    Like top end desktop CPUs, the high end GPU really seems to be increasingly irrelevant for most gamers as the mid-range provides plenty of performance for a fraction of the cost.
  • Nimiz99 - Tuesday, January 25, 2011 - link

    I was just curious about the 2.8 FPS minimum on Crysis from the Radeon HD 5970 - is that reproducible/consistent?
    I ask because on the first graph of average framerates it leads the pack; if it fluctuates that badly I would definitely like a little more background on it.

    'Preciate the response,
  • Ryan Smith - Tuesday, January 25, 2011 - link

    No, it's highly variable. With only 1GB of effective VRAM, the Radeon cards are forced to swap textures; the minimum framerate is chaotic at best and generally marks how long the worst texture swap took. Since swapping is under the control of AMD's drivers, the resulting minimum framerate ends up being quite variable.
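As an editorial aside, the relationship described here, where one long texture swap dictates the minimum while barely moving the average, can be sketched with some hypothetical frame times (the numbers below are made up for illustration, not measured data):

```python
# Hypothetical frame times (ms) for roughly one second of a benchmark run:
# 57 ordinary frames plus one long stall caused by a texture swap.
frame_times_ms = [16.7] * 57 + [350.0]

total_seconds = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_seconds   # the stall barely moves the average
min_fps = 1000.0 / max(frame_times_ms)          # the stall entirely sets the minimum

print(f"average: {avg_fps:.1f} fps, minimum: {min_fps:.1f} fps")
```

With these made-up numbers the average stays in the mid-40s while the minimum collapses below 3 fps, much like the 5970's 2.8 fps minimum discussed above.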
  • Shadowmaster625 - Tuesday, January 25, 2011 - link

    Can somebody explain why 1GB is not enough, when 1GB is enough memory to store over 160 frames at 24 bits at 1920x1080? At 60fps, 1GB should be able to supply a constant uncompressed stream of frames for almost 3 whole seconds. Seems like more than enough memory to me. Sounds like somebody is just haphazardly wasting vast amounts of space for no reason at all. Sort of like Windows with its WinSXS folder. Let's just waste a bunch of space because we can!
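For what it's worth, the commenter's raw arithmetic checks out; a quick sanity check (assuming 24-bit uncompressed frames and counting nothing else a GPU has to store):

```python
# One uncompressed 24-bit (3 bytes/pixel) frame at 1920x1080
bytes_per_frame = 1920 * 1080 * 3           # 6,220,800 bytes (~5.9 MB)
frames_in_1gb = (1 << 30) // bytes_per_frame  # finished frames that fit in 1 GiB
seconds_at_60fps = frames_in_1gb / 60         # how long that stream lasts at 60fps

print(frames_in_1gb, round(seconds_at_60fps, 1))
```

The catch, as the following reply points out, is that VRAM doesn't hold finished frames: it holds the textures, geometry, and render targets needed to produce them.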
  • ciukacz - Tuesday, January 25, 2011 - link

    Are you streaming your benchmark video through YouTube?
    Because I am rendering mine in realtime, which requires loading all the textures, geometry, etc.
