Battlefield 1 (DX11)

Battlefield 1 leads off the 2017 benchmark suite with a bang as DICE brought gamers the long-awaited AAA World War 1 shooter a little over a year ago. With detailed maps, environmental effects, and pacy combat, Battlefield 1 provides a generally well-optimized yet demanding graphics workload. In light of DX12-related performance issues in this title, DX11 is utilized for all cards.

The Ultra preset is used with no alterations. As these benchmarks are taken from single player mode, our usual rule of thumb for multiplayer performance still applies: expect multiplayer framerates to dip to roughly half of our single player results.

[Charts: Battlefield 1 average and 99th percentile framerates at 3840x2160, 2560x1440, and 1920x1080 - Ultra Quality]

Battlefield 1 has shown itself to be rather favorable on Vega hardware, and against Vega 56 at 4K, the GTX 1070 Ti FE can only manage a draw. At lower resolutions, the Vega 56 loses its advantage, but the difference is slim.

Comments

  • Nfarce - Friday, November 3, 2017 - link

    "Vega 56 beats 1070 in almost all games."

    It beats the reference GTX 1070, but a factory overclocked 1070 pulls ahead again, especially when overclocked on top of the factory overclock. Vega 56 (or 64, for that matter) does not have that kind of overclocking headroom. This has long been an advantage for Nvidia; AMD GPUs have a history of being terrible overclockers. My old EVGA GTX 970 SSC ACX 2.0+, for example, could be overclocked to 980 performance without even touching the voltage.
  • LastQuark - Saturday, November 4, 2017 - link

    The Vega 56 BIOS can be flashed to the Vega 64 BIOS for another 20% speed boost; otherwise Vega can only do about 5% max. With Vega 56's price point lower by over $50, cheaper FreeSync monitor options, and twice the availability of G-Sync ones, Vega 56 still makes a lot of sense for new buyers.
  • B-Real - Thursday, November 2, 2017 - link

    According to this theory, you can't compare the 1070 to the Vega56, as the 1070 is 5 degrees Celsius hotter...
  • damonlynch - Thursday, November 2, 2017 - link

    It should be "nonetheless", not "none the less", in the introduction to the Final Words ;-)
  • jardows2 - Thursday, November 2, 2017 - link

    Good showing. At MSRP, a good argument (from these tests at least) can still be made for Vega 56. Not sure if the 1070 Ti is worth $50 more, but you do get a little better performance and, most important to me, a lower noise profile at load. Keeping it interesting for sure!
  • CaedenV - Thursday, November 2, 2017 - link

    So.... it is essentially a direct Vega 56 competitor except that it will be available on store shelves for purchase?
    Really hoping that this will cause the normal 1070 prices to drop a bit *fingers crossed*. I picked up a 4K monitor last year and my gaming has been quite limited on it with my GTX 960. A 1070 will fill in quite nicely for now, and next year when the new cards come out I'll pick up a 2nd 1070 for SLI to really make 4K gaming smooth.
  • BrokenCrayons - Thursday, November 2, 2017 - link

    SLI doesn't seem like a good solution these days given tepid support from the GPU manufacturers and very few modern titles that are optimized to take advantage of a 2nd graphics card. You might have a better experience if you set aside the first 1070's cost until next year and then use the funding from both to purchase a 1080 or just hang on to see what happens with Volta since there'll likely be consumer GPUs available sometime in 2018.
  • vladx - Thursday, November 2, 2017 - link

    AMD has no chance at all; an RX Vega 56 in my country is $150 more expensive than the newly released GTX 1070 Ti.
  • Sorjal - Thursday, November 2, 2017 - link

    The easiest check for potential mining impact is to toss a Ti and a non-Ti in a PC and test with a miner program. NiceHash Legacy is probably the best for comparative stats: run its benchmark on both cards on "precise" and compare the results. There will be some variance, but it should provide a decent reference. Nvidia cards are typically used for algorithms that are more GPU-core intensive, with Zcash probably the largest; AMD cards work off their memory bandwidth and are favored for corresponding currencies like Ethereum and Monero. NiceHash Legacy will test against most of the major ones, including Ethereum, Monero, and Zcash.
    Given the increased GPU performance, it is probably being tested for mining as we speak.
  • Sorjal - Thursday, November 2, 2017 - link

    Remember to overclock and undervolt; energy efficiency is a large factor. 1070s seem to be run at between a 65-70% power limit. A power meter at the outlet may be useful here, as you can watch the draw change for each test on each card.
