Battlefield 1 (DX11)

Battlefield 1 returns from the 2017 benchmark suite with a bang, as DICE brought gamers the long-awaited AAA World War 1 shooter a little over a year ago. With detailed maps, environmental effects, and fast-paced combat, Battlefield 1 provides a generally well-optimized yet demanding graphics workload. The next Battlefield game from DICE, Battlefield V, completes the nostalgia circuit with a return to World War 2, but more importantly for us, it is one of the flagship titles for GeForce RTX real-time ray tracing.

We use the Ultra preset with no alterations. As these benchmarks are from single player mode, our rule of thumb for multiplayer performance still applies: multiplayer framerates generally dip to half of our single player framerates. Battlefield 1 also supports HDR (HDR10, Dolby Vision).

Battlefield 1 - 3840x2160 - Ultra Quality

Battlefield 1 - 2560x1440 - Ultra Quality

Battlefield 1 - 1920x1080 - Ultra Quality

Our previous experience with Battlefield 1 shows that AMD hardware tends to do relatively well here, and the Radeon VII is no exception. Battlefield 1 is actually one of only two games in our suite where the Radeon VII takes the lead over the RTX 2080, but it is nevertheless a feather in its cap. The uplift over the Vega 64 is an impressive 34% at 4K, more than enough to solidly mark its position in the tier above. In turn, Battlefield 1 sees the Radeon VII meaningfully faster than the GTX 1080 Ti FE, something that the RTX 2080 needed the Founders Edition tweaks to achieve.

Battlefield 1 - 99th Percentile - 3840x2160 - Ultra Quality

Battlefield 1 - 99th Percentile - 2560x1440 - Ultra Quality

Battlefield 1 - 99th Percentile - 1920x1080 - Ultra Quality

The 99th percentile results reflect the same story, and at 1080p the CPU bottleneck plays more of a role than the slight differences between the top three cards.
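For context on the metric itself, below is a minimal sketch of how a 99th-percentile framerate figure can be derived from a capture of per-frame times. The function name and the sample data are illustrative only and are not our actual benchmarking tooling.

```python
# Minimal sketch: deriving a 99th-percentile framerate from per-frame times.
# The sample list below is made-up data, not our capture output.

def percentile_framerate(frame_times_ms, pct=99):
    """Return the framerate (FPS) corresponding to the pct-th percentile
    frame time, i.e. the rate sustained by all but the slowest frames."""
    ordered = sorted(frame_times_ms)                 # slowest frames end up at the back
    index = int(round((pct / 100.0) * (len(ordered) - 1)))
    worst_case_ms = ordered[index]                   # pct% of frames render at least this fast
    return 1000.0 / worst_case_ms                    # convert ms-per-frame to frames-per-second

# Example: mostly ~16.7 ms frames (60 FPS average) with a few slow outliers
sample = [16.7] * 95 + [25.0, 30.0, 33.3, 40.0, 50.0]
print(round(percentile_framerate(sample), 1))        # the 99th-percentile figure is dragged down by the outliers
```

This is why the 99th-percentile numbers can diverge from the averages: a card with a high average framerate but occasional long frames will post a noticeably lower 99th-percentile result.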

Comments

  • PeachNCream - Thursday, February 7, 2019 - link

    Sorry about that. The Radeon VII is very much out of the range of prices I'm willing to pay for any single component, or even a whole system for that matter. I was zinging about the GPU being called high-end (which it rightfully is) because in another recent article, a $750 monitor was referred to as midrange. See:

    https://www.anandtech.com/show/13926/lg-launches-3...

    It was more to make a point about the inconsistency with which AT classifies products than an actual reflection of my own buying habits.

    As for my primary laptop, my daily driver is a Bay Trail HP Stream 11 running Linux so yeah, it's packing 2GB of RAM and 32GB of eMMC. I have a couple other laptops around which I use significantly less often that are older, but arguably more powerful. The Stream is just a lot easier to take from place to place.
  • Korguz - Friday, February 8, 2019 - link

    it could be.. that maybe the manufacturer refers to it ( the monitor ) as a mid-range product in their product stack.. and AT just calls it that because of that ?

    :-)
  • eva02langley - Friday, February 8, 2019 - link

    I follow you on that. I bought a 1080 Ti and told myself that this is the maximum I am willing to pay for a GPU.

    I needed something for 4K and it was the only option. If Navi is 15% faster than Vega 64 for $300, I am buying one at launch.
  • D. Lister - Saturday, February 9, 2019 - link

    But why would you want to spend $300 for a downgrade from your 1080Ti?
  • HollyDOL - Thursday, February 7, 2019 - link

    Purely on the gaming front this can't really compete with the RTX 2080 (unless some big enough perf change comes with new drivers soon)... it performs almost the same, but draws a bit more power, runs hotter, and is almost 10dB louder, which is quite a lot. Given that it won't be able to offer anything more (as opposed to possible adoption of DXR), I would expect it not to try to compete at the same price level the RTX 2080 does.

    If it can get $50-$100 lower otoh, you get what many people asked for... a kind of "GTX 2080"... classic performance without the ray tracing and DLSS extensions.

    With the current price, though, it only makes sense if they are betting they can get enough compute buyers.
  • Oxford Guy - Thursday, February 7, 2019 - link

    Yeah, because losing your hearing to tinnitus is definitely worth that $50-100.
  • HollyDOL - Friday, February 8, 2019 - link

    Well, it's "lab conditions", it can always get dampened with good chasis or chasis position to reasonable levels and hopefully noone should be playing with head stuck inside the chasis... For me subjectively it would be too loud, but I wanted to give the card advantage of doubt, non-reference designs should hopefully get to lower levels.
  • Oxford Guy - Friday, February 8, 2019 - link

    1) The Nvidia card will be quieter in a chassis. So, that excuse fails.

    2) I am not seeing significant room for doubt. The Fury X was a quiet product (except at idle, which some complained about, and, at least in some cases, coil whine). AMD has chosen to move backward, severely, in the noise department with this product.

    This card has a fancy copper vapor chamber with flattened heatpipes and three fans. It also runs hot. So, how is it, at all, rational to expect 3rd-party cards to fix the noise problem? 3rd-party makers typically use 3 slot designs to increase clocks and they typically cost even more.
  • HollyDOL - Friday, February 8, 2019 - link

    Well, not really. If the quieter chassis cuts off enough dB to get it out of the disturbing range, it will be enough. It also depends on the environment... If you play in a loud environment (daytime, loud speakers) the noise won't be perceived as badly as if you play at night with quieter speakers. I.e. what can be sufferable during the day can turn into complete hell at night.

    That being said, I am by no means advocating +10dB, because it is a lot, but in the end it doesn't have to present such a terrible obstacle.

    It is very early; there can always be a bug in the drivers or BIOS causing this temp/noise issue, or it can be a design problem that cannot be circumvented. But that will be seen only after some time. I remember a bug in ForceWare causing my old GTX 580 not to drop to 2D frequencies once it kicked into 3D (or was it on the 8800GT, I don't really remember)... You had to restart the machine. Such things simply can happen, which doesn't make them any better ofc.
  • Oxford Guy - Friday, February 8, 2019 - link

    "If the quieter chassis cuts of enough dB to get it out of disturbing level it will be enough."

    Nope. I've owned the Antec P180. I have extensively modified cases and worked hard with placement to reduce noise.

    Your argument that the noise can simply be eliminated by putting it into a case is completely bogus. In fact, Silent PC Review showed that more airflow, from less restriction (i.e. a less closed-in case design) can substantially reduce GPU noise — the opposite of the P180 philosophy that Silent PC Review once advocated (and helped to design).

    The other problem for your argument is that it is 100% logically true that there is zero reason to purchase an inferior product. Since this GPU is not faster than a 2080 and costs the same there is zero reason to buy a louder GPU, since, in actuality, noise doesn't just get absorbed and disappear when you put it into a case. In fact, this site wrote a review of a Seasonic PSU that could be heard "from rooms away" and I can hear noisy GPUs through walls, too.

    "It is very early, there can always be a bug in drivers or bios causing this temp/noise issue"

    Then it shouldn't be on the market and shouldn't have been sampled. Alpha quality designs shouldn't be review subjects, particularly when they're being passed off as the full product.
