Wolfenstein II: The New Colossus (Vulkan)

id Software is popularly known for a few games involving shooting stuff until it dies, just with different 'stuff' for each one: Nazis, demons, or other players while scorning the laws of physics. Wolfenstein II is the latest of the first category, the sequel in a modern reboot series developed by MachineGames and built on id Tech 6. While the tone is significantly less pulpy nowadays, the game is still a frenetic FPS at heart, succeeding DOOM as a modern Vulkan flagship title and arriving as a pure Vulkan implementation, rather than the originally OpenGL-based DOOM.

Featuring a Nazi-occupied America of 1961, Wolfenstein II is lushly designed yet not oppressively intensive on the hardware, something that goes well with its pace of action, which emerges suddenly from level design flush with alternate-history details.

The highest quality preset, "Mein leben!", was used. Wolfenstein II also features Vega-centric GPU Culling and Rapid Packed Math, as well as Radeon-centric Deferred Rendering; in accordance with the preset, neither GPU Culling nor Deferred Rendering was enabled. NVIDIA Adaptive Shading was not enabled.

In summary, Wolfenstein II tends to scale well, enables high framerates with minimal CPU bottleneck, enjoys running on modern GPU architectures, and consumes VRAM like nothing else. For the Turing-based RTX 2060 (6GB), this results in outpacing the GTX 1080 as well as the RX Vega 56 at 1080p/1440p. The 4K results can be deceiving; looking closer at 99th percentile framerates shows a much steeper dropoff, more likely than not related to the limitations of the 6GB framebuffer. We've already seen the GTX 980 and 970 struggle at even 1080p, chained by 4GB of video memory.

Comments

  • sing_electric - Monday, January 7, 2019 - link

It's likely that Nvidia has actually done something to restrict the 2060s to 6GB - either through its agreements with board makers or by physically disabling some of the RAM channels on the chip (or both). I agree, it'd be interesting to see how it performs, since I'd suspect it'd be at a decent price/perf point compared to the 2070, but that's also exactly why we're not likely to see it happen.
  • CiccioB - Monday, January 7, 2019 - link

You can't add memory at will. You need to take into consideration the available bus, and as this is a 192-bit bus, you can install 3, 6, or 12 GB of memory unless you resort to a hybrid configuration through heavily optimized drivers (as Nvidia did with the 970).
  • nevcairiel - Monday, January 7, 2019 - link

Even if they wanted to increase it, just adding 2GB more is hard to impossible. The chip has a certain memory interface, in this case 192-bit. That's 6x 32-bit memory controllers, for six 1GB chips. You cannot just add 2 more without getting into trouble - like the 970, which had unbalanced memory speeds, which was terrible.
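The constraint described in the two comments above can be illustrated with a minimal sketch: each 32-bit memory channel carries one DRAM chip, so balanced capacities are multiples of the channel count times the per-chip density. The function name and the chip-density list (0.5/1/2 GB, common GDDR5/GDDR6 options) are illustrative assumptions, not from the article.

```python
# Illustrative sketch (not from the article): balanced VRAM configurations
# for a GPU memory bus, assuming one DRAM chip per 32-bit channel.
def balanced_vram_options(bus_width_bits, chip_sizes_gb=(0.5, 1, 2)):
    channels = bus_width_bits // 32  # one 32-bit controller per chip
    # Total VRAM = number of channels x density of each chip.
    return [channels * size for size in chip_sizes_gb]

# A 192-bit bus (RTX 2060) has 6 channels, so balanced options are
# 3, 6, or 12 GB - exactly the figures cited in the comment above.
print(balanced_vram_options(192))
```

Anything in between (like the 970's 3.5GB + 0.5GB arrangement) requires an unbalanced split across controllers, which is what the follow-up comments call out.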
  • mkaibear - Tuesday, January 8, 2019 - link

    "terrible" in this case defined as "unnoticeable to anyone not obsessed with benchmark scores" Reply
  • Retycint - Tuesday, January 8, 2019 - link

It was unnoticeable back then, because even the most intensive game/benchmark rarely utilized more than 3.5GB of VRAM. The issue, however, comes when newer games inevitably start to consume more and more VRAM - at which point the "terrible" 0.5GB of VRAM will become painfully apparent.
  • mkaibear - Wednesday, January 9, 2019 - link

    So, you agree with my original comment which was that it was not terrible at the time? Four years from launch and it's not yet "painfully apparent"?

    That's not a bad lifespan for a graphics card. Or if you disagree can you tell me which games, now, have noticeable performance issues from using a 970?

    FWIW my 970 has been great at 1440p for me for the last 4 years. No performance issues at all.
  • atragorn - Monday, January 7, 2019 - link

I am more interested in that comment: "yesterday's announcement of game bundles for RTX cards, as well as 'G-Sync Compatibility', where NVIDIA cards will support VESA Adaptive Sync. That driver is due on the same day of the RTX 2060 (6GB) launch, and it could mean the eventual negation of AMD's FreeSync ecosystem advantage." Will ALL Nvidia cards support FreeSync/FreeSync 2, or only the RTX series?
  • A5 - Monday, January 7, 2019 - link

    Important to remember that VESA ASync and FreeSync aren't exactly the same.

    I don't *think* it will be instant compatibility with the whole FreeSync range, but it would be nice. The G-sync hardware is too expensive for its marginal benefits - this capitulation has been a loooooong time coming.
  • Devo2007 - Monday, January 7, 2019 - link

Anandtech's article about this last night mentioned support will be limited to Pascal & Turing cards
  • Ryan Smith - Monday, January 7, 2019 - link

https://www.anandtech.com/show/13797/nvidia-to-sup...
