Battlefield 1 (DX11)

Battlefield 1 returns from the 2017 benchmark suite with a bang, as DICE brought gamers the long-awaited AAA World War 1 shooter a little over a year ago. With detailed maps, environmental effects, and pacy combat, Battlefield 1 provides a generally well-optimized yet demanding graphics workload. The next Battlefield game from DICE, Battlefield V, completes the nostalgia circuit with a return to World War 2, but more importantly for us, it is one of the flagship titles for GeForce RTX real-time ray tracing.

We use the Ultra preset with no alterations. As these benchmarks are from single player mode, our rule of thumb for multiplayer performance still applies: multiplayer framerates generally dip to half of our single player framerates. Battlefield 1 also supports HDR (HDR10, Dolby Vision).
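As a back-of-the-envelope illustration of that rule of thumb (a sketch for readers, not part of the actual testing; the function name and the 0.5 factor's framing are ours), the single-player-to-multiplayer estimate looks like:

```python
def estimate_multiplayer_fps(single_player_fps: float, dip_factor: float = 0.5) -> float:
    """Rough multiplayer estimate from a single-player benchmark result.

    dip_factor = 0.5 reflects the article's rule of thumb; real-world dips
    vary with map, player count, and CPU load, so treat this as a floor-ish
    guide rather than a measurement.
    """
    return single_player_fps * dip_factor

# A card averaging 90 fps in the single-player benchmark would land
# near 45 fps in heavy multiplayer by this rule of thumb.
print(estimate_multiplayer_fps(90.0))
```

In practice this is why a card that clears 60 fps in our single-player runs may still feel marginal in full 64-player matches.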

Battlefield 1 - 2560x1440 - Ultra Quality

Battlefield 1 - 1920x1080 - Ultra Quality

Battlefield 1 - 99th Percentile - 2560x1440 - Ultra Quality

Battlefield 1 - 99th Percentile - 1920x1080 - Ultra Quality

Right from the get-go, the GTX 1660 Ti stakes out its territory in between the RTX 2060 FE and RX 590, leaving the latter by the wayside. And as a result, it technically edges out the GTX 1070 FE, though for all intents and purposes it is a dead heat. The RX Vega 56, however, keeps ahead by a decent amount; Battlefield 1 runs well on many GPUs, but Vega cards have always had a strong showing in this title.

The mild +10W TDP of the EVGA XC Black makes an equally mild difference, more so with the 99th percentiles.

Comments

  • PeachNCream - Friday, February 22, 2019 - link

    This article reads a little like that infamous Steve Ballmer developers thing except it's not "developers, developers, developers, etc" but "traditional, traditional, traditionally, etc." instead. Please explore alternate expressions. The word in question implies long history which is something the computing industry lacks and the even shorter time periods referenced (a GPU generation or two) most certainly lack so the overuse stands out like a sore thumb in many of Anandtech's publications.
  • Oxford Guy - Saturday, February 23, 2019 - link

    How about the utterly asinine use of the word "kit" to describe a set of RAM sticks that simply snap into a motherboard?

    The Altair 8800 was a kit. The Heathkit H8 was a kit. Two sticks of RAM that snap into a board doth not a kit maketh.
  • futurepastnow - Friday, February 22, 2019 - link

    A triple-slot card? Really, EVGA?
  • PeachNCream - Friday, February 22, 2019 - link

    Yup, for 120W TDP of all things. But it's in the charts as a 2.75 slot width card so EVGA is probably hoping that no one understands how expansion slots actually would not permit the remaining .25 slot width to support anything.
  • darckhart - Friday, February 22, 2019 - link

    lol this was my first thought upon seeing the photo as well.
  • GreenReaper - Saturday, February 23, 2019 - link

    I suspect it was the cheapest way to get that level of cooling. A more compact heatsink-fan combo could have cost more.

    130W (which is the TDP here) is not a *trivial* amount to dissipate, and it's quite tightly packed.
  • Oxford Guy - Saturday, February 23, 2019 - link

    I think all performance GPUs should be triple slot. In fact, I think the GPU form factor is ridiculously obsolete.
  • Oxford Guy - Monday, February 25, 2019 - link

    Judging by techpowerup's reviews, though, the EVGA card's cooling is inefficient.
  • eastcoast_pete - Friday, February 22, 2019 - link

    @Ryan and Nate: What generation of HDMI and DP does the EVGA card have/support? Apologize if you had it listed and I missed it.
  • Ryan Smith - Friday, February 22, 2019 - link

    HDMI 2.0b, DisplayPort 1.4.
