Gaming Tests: Strange Brigade

Strange Brigade is set in 1903 Egypt and follows a story very similar to that of the Mummy film franchise. This third-person shooter is developed by Rebellion Developments, more widely known for the Sniper Elite and Alien vs Predator series. The game follows the hunt for Seteki the Witch Queen, who has risen once again, and the only ‘troop’ that can ultimately stop her. Gameplay is cooperative-centric, with a wide variety of levels and many puzzles to be solved by the British colonial Secret Service agents sent to put an end to her barbaric and brutal reign.

The game supports both the DirectX 12 and Vulkan APIs, and houses its own built-in benchmark, an on-rails run through the game. For quality, the game offers various customization options, including textures, anti-aliasing, reflections, and draw distance, and allows users to enable or disable motion blur, ambient occlusion, and tessellation, among others. We test with both APIs.

  • 720p Low, 1440p Low, 4K Low, 1080p Ultra

The automation for Strange Brigade is one of the easiest in our suite: settings and quality are changed through pre-prepared .ini files, and the benchmark is called via the command line. The output includes all of the frame time data.
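The flow above can be sketched as a short script. This is a minimal illustration, not our actual harness: the game directory, .ini file name, and `-benchmark` flag are hypothetical stand-ins, and the percentile reduction uses the generic nearest-rank method.

```python
import math
import shutil
import subprocess


def run_benchmark(preset_ini, game_dir="C:/Games/StrangeBrigade"):
    # Hypothetical paths and flag: swap in the prepared settings file,
    # then launch the built-in benchmark from the command line.
    shutil.copy(preset_ini, f"{game_dir}/GraphicsOptions.ini")
    subprocess.run([f"{game_dir}/StrangeBrigade.exe", "-benchmark"], check=True)


def fps_stats(frame_times_ms):
    """Reduce per-frame times (ms) to (average FPS, 95th-percentile FPS)."""
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    # Nearest-rank 95th-percentile frame time, i.e. the slowest-5% boundary;
    # a high frame time maps to a low FPS, giving the 95th-percentile figure.
    ranked = sorted(frame_times_ms)
    p95_ms = ranked[math.ceil(0.95 * len(ranked)) - 1]
    return 1000.0 / avg_ms, 1000.0 / p95_ms
```

For example, a run that holds a steady 10 ms per frame reduces to 100 FPS for both metrics, while any slow frames pull the 95th-percentile figure below the average.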

[Results graphs: Average FPS and 95th Percentile at each setting — Low Res Low Qual (720p Low), Medium Res Low Qual (1440p Low), High Res Low Qual (4K Low), and Medium Res Max Qual (1080p Ultra)]

All of our benchmark results can also be found in our benchmark engine, Bench.

Comments

  • Billy Tallis - Wednesday, November 4, 2020 - link

    Ian already said he tests at JEDEC speeds, which includes the latency timings. Using modules that are capable of faster timings does not prevent running them at standard timings.
  • Quantumz0d - Tuesday, November 3, 2020 - link

    Don't even bother Ian with these people.
  • Nictron - Wednesday, November 4, 2020 - link

    I appreciate the review and context over a period of time. Having a baseline comparison is important and it is up to us the reader to determine the optimal environment we would like to invest in. As soon as we do the price starts to skyrocket and comparisons are difficult.

    Reviews like this also show that a well thought out ecosystem can deliver great value. Companies are here to make money and I appreciate reviewers that provide baseline compatible testing over time for us to make informed decisions.

    Thank you and kind regards,
  • GeoffreyA - Tuesday, November 3, 2020 - link

    Thanks, Ian. I thoroughly enjoyed the article and the historical perspective especially. And the technical detail: no other site can come close.
  • eastcoast_pete - Tuesday, November 3, 2020 - link

    Ian, could you comment on the current state of the art of eDRAM? How fast can it be, and how low can the latency go? Depending on those parameters and the difficulty of manufacturing, there might be a number of uses that make sense.
    One use that could make sense is allowing Xe graphics to use cheaper, lower-power LPDDR4 or LPDDR5 RAM without taking a large performance hit vs. GDDR6. 128 or 256 MB of eDRAM cache might just do that, and still keep costs lower. Pure speculation, of course.
  • DARK_BG - Tuesday, November 3, 2020 - link

    Hi, what I'm wondering is where the 30% gap between the 5775C and 4790K in games came from, compared to your original review and all the other reviews of the 5775C out there. Since I'm on a Z97 platform with a 4.0GHz Xeon, moving to a 4770K or 4790K doesn't make any sense given their second-hand prices, but the 5775C in this review makes a lot of sense.

    So is it the OS, the drivers, some BIOS settings, or were the systems in the older reviews just GPU limited, failing to expose the CPU performance?
  • jpnex - Friday, January 8, 2021 - link

    Lol, no, the i7 5775C is just stronger than an i7 4790K; this is a known fact. Other benchmarks show the same thing. Old benchmarks don't show it because back then people didn't know that deactivating the iGPU would give a performance boost.
  • DARK_BG - Wednesday, July 20, 2022 - link

    I forgot to reply back then. Based on this review I sourced a 5775C (for a little less than $100; these days it goes for $140-150) coupled with an Asus Z97 Pro, and after some tweaking (CPU at 4.1GHz, eDRAM at 2000MHz, and some other minor stuff that I've already forgotten), the difference compared to the 4.0GHz Xeon in games was mind-blowing. Later I was able to source 32GB of Corsair Dominator DDR3 2400MHz CL10, just for fun, to make it a top-spec config. :)

    It is a very capable machine, but these days I'll swap it for a Ryzen 5800X3D to catch the final train on the fastest Windows 7 capable gaming system. Yeah, I know it is an old OS, but everything I need has run flawlessly for more than a decade, with only one reinstall 7 years ago due to an SSD failure. It is my only personal Intel system of the past 22 years, since for once it was the best price/performance second-hand platform at the time; all the rest were AMD based, and I keep them all in working condition.

    BTW, I was able to run Windows XP 64-bit on the Z97 platform. I just need to swap the GTX 1070 for a GTX 980/980 Ti to be fully functional; everything else runs like a charm under XP. I was able to hack the driver to install the GTX 1070 as a GTX 960, so I have 2D hardware acceleration under XP on the GTX 1070, since Nvidia hasn't changed anything in regard to 2D compared to the previous generation.
  • dew111 - Tuesday, November 3, 2020 - link

    Rocket lake should have been the comet lake processor with eDRAM. Instead they'll be lucky to beat comet lake at all.
  • erotomania - Tuesday, November 3, 2020 - link

    Thanks, Ian. I enjoyed this article from a NUC8i7BEH that has 128MB of coffee-flavored eDRAM. Also, thanks Ganesh for the recent reminder that Bean > Frost.
