Battlefield: Bad Company 2

The latest game in the Battlefield series, Bad Company 2, remains one of the cornerstone DX11 games in our benchmark suite. As BC2 doesn't have a built-in benchmark or recording mode, we instead take a FRAPS run of the jeep chase in the first act; as an on-rails portion of the game it provides very consistent results, along with a spectacle of explosions, trees, and more.

Bad Company 2 is another game where the GTX 570 takes a notable lead over the GTX 480, clearly exceeding its theoretical clockspeed advantage and drifting into architectural optimizations. Unfortunately this advantage plays out best at lower resolutions, with the gap disappearing by the time we hit 2560. Even at 1920 the advantage is barely worth mentioning, meaning we're once more at parity with the GTX 480.

Perhaps it would have been better for NVIDIA if that advantage had held, because it means the Radeon 5870 gets uncomfortably close at higher resolutions. The GTX 570's advantage is negligible at 2560 and only 10% at 1920. AMD's strong CF scaling also puts NVIDIA in a tough position here, as the 6850 CF is a whopping 32% faster than the GTX 570 in Bad Company 2.

NVIDIA does manage to turn the tables with our Waterfall benchmark however, which serves as a proxy for minimum framerates. The GTX 570 is still tied with the GTX 480 here, but it's also now at parity with the 6850 CF and over 50% faster than the 5870, easily demonstrating that if you're worried more about minimums than averages in Bad Company 2, the 5870 and GTX 570 aren't nearly as close as they were at first glance. Extra RAM would probably be of great benefit to AMD here.

Comments

  • ilkhan - Tuesday, December 7, 2010 - link

    I love my HDMI connection. It falls out of my monitor about once a month and I have to flip the screen around to plug it back in. Thanks TV industry!
  • Mr Perfect - Tuesday, December 7, 2010 - link

    It is somewhat disappointing. People with existing screens probably don't care, and the cheap TN screens still pimp the DVI interface, but all of the high end IPS panel displays include either HDMI, DP or both. Why wouldn't a high end video card have the matching outputs?
  • EnzoFX - Tuesday, December 7, 2010 - link

    High-End gaming card is probably for serious gamers, which should probably go with TN as they are the best against input lag =P.
  • Mr Perfect - Tuesday, December 7, 2010 - link

Input lag depends on the screen's controller; you're thinking of pixel response time. Yes, TN is certainly faster than IPS for that. I still wouldn't get a TN though, the IPS isn't far enough behind in response time to negate the picture quality improvement.
  • MrSpadge - Tuesday, December 7, 2010 - link

    Agreed. The pixel response time of my eIPS is certainly good enough to be of absolutely no factor. The image quality, on the other hand, is worth every cent.

    MrS
  • DanNeely - Tuesday, December 7, 2010 - link

Due to the rarity of HDMI 1.4 devices (needed to go above 1920x1200), replacing a DVI port with an HDMI port would result in a loss of capability. This is aggravated by the fact that, due to their sticker price, 30" monitors have a much longer lifetime than 1080p displays, with owners who would be even more outraged at being told they had to replace their screens to use a new GPU. MiniDVI isn't an option either because it's single-link and has the same 1920x1200 cap as HDMI 1.3.

Unfortunately there isn't room for anything except a single miniHDMI/miniDP port to the side of two DVIs; installing it on the top half of a double-height card like ATI has done cuts into the card's exhaust airflow and hurts cooling. With the 5xx series still limited to 2 outputs that's not a good tradeoff, and HDMI is much more ubiquitous.

    The fiasco with DP-DVI adapters and the 5xxx series cards doesn't exactly make them an appealing option either to consumers.
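The bandwidth argument in the comment above can be sketched with a quick back-of-the-envelope calculation. This is only a rough illustration, not from the article: single-link DVI (and HDMI up to 1.2) carries a TMDS link with a 165 MHz maximum pixel clock, and the blanking figures used here approximate CVT reduced-blanking timings.

```python
# Rough sketch of why single-link DVI tops out around 1920x1200@60Hz,
# while 2560x1600 needs dual-link. Blanking overheads are approximations
# of CVT reduced-blanking (about 160 extra pixels and 35 extra lines).

SINGLE_LINK_MHZ = 165.0  # single-link TMDS pixel clock limit

def pixel_clock_mhz(width, height, refresh_hz=60, h_blank=160, v_blank=35):
    """Approximate pixel clock (MHz) for a reduced-blanking timing."""
    total_pixels = (width + h_blank) * (height + v_blank)
    return total_pixels * refresh_hz / 1e6

for w, h in [(1920, 1200), (2560, 1600)]:
    clk = pixel_clock_mhz(w, h)
    verdict = "fits single-link" if clk <= SINGLE_LINK_MHZ else "needs dual-link"
    print(f"{w}x{h}@60Hz: ~{clk:.0f} MHz -> {verdict}")
```

Under these assumed timings, 1920x1200@60 needs roughly 154 MHz and squeaks under the 165 MHz single-link ceiling, while 2560x1600@60 needs well over 250 MHz, which is why 30" panels require dual-link DVI (or DisplayPort / HDMI 1.4).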
  • Mr Perfect - Wednesday, December 8, 2010 - link

That makes good sense too, you certainly wouldn't want to drop an existing port to add DP. I guess it really comes down to that cooling vs port selection problem.

I wonder why ATI stacked the DVI ports? Those are the largest ports of the three and so block the most ventilation. If you could stack a mini-DP over the mini-HDMI, it would be a pretty small penalty. It might even be possible to mount the mini ports on edge instead of horizontally to keep them all on one slot.
  • BathroomFeeling - Tuesday, December 7, 2010 - link

    "...Whereas the GTX 580 took a two-tiered approach on raising the bar on GPU performance while simultaneously reducing power consumption, the GeForce GTX 470 takes a much more single-tracked approach. It is for all intents and purposes the new GTX 480, offering gaming performance..."
  • Lonyo - Tuesday, December 7, 2010 - link

    Any comments on how many will be available? In the UK sites are expecting cards on the 9th~11th December, so not a hard launch there.
    Newegg seems to only have limited stock.

    Not to mention an almost complete lack of UK availability of GTX580s, and minimal models and quantities on offer from US sites (Newegg).
  • Kef71 - Tuesday, December 7, 2010 - link

    Or maybe they are a nvidia "feature" only?
