Battlefield 1 (DX11)

Battlefield 1 returns from the 2017 benchmark suite with a bang, as DICE brought gamers the long-awaited AAA World War 1 shooter a little over a year ago. With detailed maps, environmental effects, and pacy combat, Battlefield 1 provides a generally well-optimized yet demanding graphics workload. The next Battlefield game from DICE, Battlefield V, completes the nostalgia circuit with a return to World War 2, but more importantly for us, it is one of the flagship titles for GeForce RTX real-time ray tracing, although that feature is not yet available at this time.

We use the Ultra preset with no alterations. As these benchmarks are taken from single-player mode, our usual rule of thumb for multiplayer performance applies: multiplayer framerates generally dip to around half of our single-player framerates. Battlefield 1 also supports HDR (HDR10, Dolby Vision).

[Charts: Battlefield 1 Average FPS and 99th Percentile framerates at 1920x1080, 2560x1440, and 3840x2160]

At 1080p, the RTX 2080 Ti is fast enough to touch the CPU bottleneck, but it keeps its substantial lead at 4K. Nowadays, Battlefield 1 runs rather well on a gamut of cards and settings, and in well-optimized high-profile games like this, the 2080 in particular needs to make sure that the veteran 1080 Ti doesn't edge too close. Here, the Founders Edition specs are enough to firmly plant the 2080 Founders Edition ahead of the 1080 Ti Founders Edition.

The outlying low 99th percentile reading for the 2080 Ti occurred on repeated testing, and we're looking into it further.

337 Comments

  • dustwalker13 - Sunday, September 23, 2018 - link

    Way too expensive. If those cards were the same price as the 1080 / TI it would be a generation change.

    This actually is a regression, the price has increased out of every proportion even if you completely were to ignore that newer generations are EXPECTED to be a lot faster than the older ones for essentially the same price (plus a bit of inflation max).

    paying 50% more for a 30% increase is a simple ripoff. no one should buy these cards ... sadly a lot of people will let themselves get ripped off once more by nvidia.

    and no: raytracing is not an argument here, this feature is not supported anywhere and by the time it will be adopted (if ever) years will have gone by and these cards will be old and obsolete. all of this is just marketing and hot air.
  • mapesdhs - Thursday, September 27, 2018 - link

    There are those who claim buying RTX to get the new features is sensible for future proofing; I don't understand why they ignore that the performance of said features on these cards is so poor. NVIDIA spent years pushing gamers into high frequency monitors, 4K and VR, now they're trying to flip everyone round the other way, pretend that sub-60Hz 1080p is ok.

    And btw, it's a lot more than 50%. Where I am (UK) the 2080 Ti is almost 100% more than the 1080 Ti launch price. It's also crazy that the 2080 Ti does not have more RAM, it should have had at least 16GB. My guess is NVIDIA knows that by pushing gamers back down to lower resolutions, they don't need so much VRAM. People though have gotten used to the newer display tech, and those who've adopted high refresh displays physically cannot go back.
  • milkod2001 - Monday, September 24, 2018 - link

    I wonder if NV did not hike prices so much on purpose so they don't have to lower existing GTX prices that much. There is still ton of GTX in stock.
  • poohbear - Friday, September 28, 2018 - link

    Why the tame conclusion? Do us a favor will u? Just come out and say you'd have to be mental to pay these prices for features that aren't even available yet and you'd be better off buying previous gen video cards that are currently heavily discounted.
  • Luke212 - Wednesday, October 24, 2018 - link

    Why does the 2080ti not use the tensor cores for GEMM? I have not seen any benchmark anywhere showing it working. It would be a good news story if Nvidia secretly gimped the tensor cores.
