Gaming Tests: Borderlands 3

As a big Borderlands fan, I found the six-month wait for the Epic Games Store exclusivity to expire before the game arrived on Steam a long one. The fourth main title in the franchise, if you exclude the Telltale-style games, BL3 expands the universe beyond Pandora and its orbit, with the set of heroes (plus those from previous games) now cruising the galaxy looking for vaults and the treasures within. Popular characters like Tiny Tina, Claptrap, Lilith, Dr. Zed, Zer0, Tannis, and others all make appearances as the game continues its cel-shaded design, but with the graphical fidelity turned up. Borderlands 1 gave me my first taste of proper in-game second-order PhysX, and it’s a high standard that continues to this day.

BL3 works best with online access, so it is filed under our online games section. It is also one of our biggest downloads, requiring over 100 GB. As BL3 supports resolution scaling, we test the following four configurations (sketched in code after the list):

  • 360p Very Low, 1440p Very Low, 4K Very Low, 1080p Badass
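
For readers scripting similar runs, here is a minimal sketch of how these four test points might be encoded to drive an automated harness. The structure and names are illustrative assumptions for this example, not our actual test suite.

```python
# Hypothetical encoding of the four BL3 test points; the names and
# structure are illustrative assumptions, not AnandTech's automation.
BL3_TEST_MATRIX = [
    # (width, height, in-game quality preset)
    (640,  360,  "Very Low"),   # "Low Resolution Low Quality"
    (2560, 1440, "Very Low"),   # "Medium Resolution Low Quality"
    (3840, 2160, "Very Low"),   # "High Resolution Low Quality"
    (1920, 1080, "Badass"),     # "Medium Resolution Max Quality"
]

for width, height, preset in BL3_TEST_MATRIX:
    print(f"Running BL3 benchmark at {width}x{height}, preset: {preset}")
```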

BL3 has its own in-game benchmark, which recreates a set of on-rails scenes, each with a variety of activity going on, such as shootouts, explosions, and wildlife. The benchmark outputs its own results files, including frame times, which we parse for our average and percentile data.
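As a rough illustration of that last step, a sketch along these lines could turn a per-frame time dump into the two figures we report. The file name and its one-millisecond-value-per-line layout are assumptions for the example, not the benchmark's actual output format.

```python
# Minimal sketch: derive Average FPS and 95th Percentile FPS from a
# frame-time dump. "bl3_frametimes.txt" and its one-value-per-line
# layout are assumptions, not BL3's actual results format.
import numpy as np

def summarize(frametimes_ms):
    ft = np.asarray(frametimes_ms, dtype=float)
    # Average FPS is total frames over total time: 1000 / mean frame time.
    avg_fps = 1000.0 / ft.mean()
    # The percentile figure is the frame rate that 95% of frames meet or
    # exceed, i.e. the inverse of the 95th percentile of frame *times*.
    p95_fps = 1000.0 / np.percentile(ft, 95)
    return avg_fps, p95_fps

with open("bl3_frametimes.txt") as f:
    times = [float(line) for line in f if line.strip()]

avg, p95 = summarize(times)
print(f"Average: {avg:.1f} FPS, 95th percentile: {p95:.1f} FPS")
```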

[Charts: Borderlands 3 Average FPS and 95th Percentile results across the four AnandTech test configurations: Low Resolution Low Quality (360p Very Low), Medium Resolution Low Quality (1440p Very Low), High Resolution Low Quality (4K Very Low), and Medium Resolution Max Quality (1080p Badass).]

All of our benchmark results can also be found in our benchmark engine, Bench.

Comments

  • mitox0815 - Tuesday, April 13, 2021 - link

    Discounting the possibility of great design ideas just because past attempts failed to varying degrees is a bit premature, methinks. But it does seem odd that it's constantly P6-esque design philosophies - VERY broadly speaking here - that take the prize in the end when it comes to x86.
  • blppt - Tuesday, March 30, 2021 - link

    Even Jim Keller, the genius who designed AMD's original x86-64 chip AND bailed out AMD with the excellent Zen, didn't last very long at Intel.

    Might be an indicator of how messed up things are there.
  • BushLin - Tuesday, March 30, 2021 - link

    It's still possible that a yet-to-be-released, Jim Keller-designed Intel CPU finally delivers a meaningful performance uplift in the next few years... I wouldn't bet on it, but it isn't impossible either.
  • philehidiot - Tuesday, March 30, 2021 - link

    Indeed, it's a generation out. It's called "Intel Dynamic Breakfast Response". It goes "ding" when your bacon is ready for turning, rather than BSOD.
  • Hifihedgehog - Tuesday, March 30, 2021 - link

    Raja Koduri is a terrible human being and has wasted money on party buses and booze while “managing” his side of the house at Intel. I think Jim Keller knew the corporation was a big pander-fest of bureaucracy and was smart to leave when he did. The chiplet idea he brought to the table, while not an innovation, since AMD was already first to market, will help them stay in the game, which wouldn't have happened if he hadn't contributed it.
  • Oxford Guy - Saturday, April 3, 2021 - link

    Oh? Firstly, I doubt he was the exec at AMD who invented the FX 9000 series scam. Secondly, AMD didn't beat Nvidia on performance per watt, but the Fury X coming with an AIO was a great improvement in performance per decibel, an important metric that is frequently undervalued by the tech press.

    What he deserves the most credit for, though, is making GPUs that made miners happy. The fact is that AMD is a corporation, not a charity. And not only is it happy to sell its entire stock to miners, it is pleased to compete against PC gamers by propping up the console scam.
  • mitox0815 - Tuesday, April 13, 2021 - link

    The first to the x86 market, yes. Chiplets - or modules, whatever you want to call them - are MUCH older than that. Just as AMD64 wasn't the stroke of genius it's made out to be by AMD diehards... they just repeated the trick Intel pulled off with its jump to 32-bit on the 386. Not even multicore was AMD's invention... I think both multicore CPUs and chiplet designs were done by IBM before.

    The same goes for Intel though, really. Or Microsoft. Or Apple. Or most other big players. Adopting ideas and pushing them with your market weight seems to be much more of a success story than actually innovating on your own...innovation pressure is always on the underdogs, after all.
  • KAlmquist - Wednesday, April 7, 2021 - link

    The tick-tock model was designed to limit the impact of failures. For example, Broadwell was delayed because Intel couldn't get 14nm working, but that didn't matter too much because Broadwell was the same architecture as Haswell, just on a smaller node. By the time the Skylake design was completed, Intel had fixed the issues with 14nm and Skylake was released on schedule.

    What happened next indicates that people at Intel were still following the tick-tock model but had not internalized the reasoning that led Intel to adopt the tick-tock model in the first place. When Intel missed its target for 14nm, that meant it was likely that 10nm would be delayed as well. Intel did nothing. When the target date for 10nm came and went, Intel did nothing. When the target date for Sunny Cove arrived and it couldn't be released because the 10nm process wasn't there, Intel did nothing. Four years later, Intel has finally ported it to 14nm.

    If Intel had been following the philosophy behind tick-tock, they would have released Rocket Lake in 2017 or 2018 to compete with Zen 1. They would have designed a new architecture prior to the release of Zen 3. The only reason they'd be trying to pit a Sunny Cove variant against Zen 3 would be if their effort to design a new architecture failed.
  • Khenglish - Tuesday, March 30, 2021 - link

    I've said it before, but I'll say it again: software-render Crysis by setting the benchmark to use the GPU, but disabling the GPU driver in Device Manager. This causes Crysis to fall back to the built-in Windows 10 software rasterizer (WARP), which is much newer and should be better optimized than Crysis' own software renderer. It may even use AVX and AVX2, which Crysis is certainly too old to support.
  • Adonisds - Tuesday, March 30, 2021 - link

    Great! Keep doing those Dolphin emulator tests. I wish there were even more emulator tests.
