STALKER: Call of Pripyat – A Peek at More DX11

For the 5970 launch, AMD sent over a special benchmark version of STALKER: Call of Pripyat, which has since been made public at the GSC Game World STALKER website. STALKER is another one of AMD's big DX11 titles: it's technically the 2nd DX11 title to launch, and the first such title to implement most of the major DX11 features. The Russian version actually shipped back in October, and the German version shipped 2 weeks ago. However, we're told that these versions only had an early-stage implementation of the DX11 feature set, and that the demo is more representative of where the game is after patching and what it will be like when it finally launches in the rest of the world early next year.

Since it's an unplayable demo, we're going to forgo any competitive benchmarking (it's not as if anyone else has a DX11 card anyhow), but we will take a quick look at the performance impact of these features, since this is the closest thing we have to a playable game using them at this point.

STALKER makes use of 3 major DX11 features.

  1. High Definition Ambient Occlusion using compute shaders
  2. Tessellation
  3. Contact hardening shadows

We’ve already seen HDAO with Battleforge, and it’s no different here in STALKER. And we’ve covered tessellation in-depth in our look at DirectX 11.

So today, let's talk about contact hardening shadows. Shadowing has been on a path of particularly noticeable evolution. The first real shadows, seen in such titles as Doom 3, had very hard edges: stencil buffering was used to determine where a shadow would fall, and that was it. Diffusion was never taken into account. Newer games have since taken diffusion into account to generate soft shadows, but these shadows aren't necessarily accurate. Currently, soft shadows are implemented with a fixed degree of softness around the entire shadow, which isn't how shadows really work.

With diffusion, the softness of a shadow increases with the distance between the casting object and the surface the shadow falls on. AMD loves to use a light pole as an example: the top of the pole's shadow should be softer than the bottom. These are referred to as contact hardening shadows, and their use in games has been limited by the expense of calculating them with the DX10 feature set. STALKER enables contact hardening shadows in its DX10.1 and DX11 modes.
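The relationship described above can be sketched with the penumbra-width estimate used by percentage-closer soft shadow techniques. This is a minimal illustration, not STALKER's actual shader code, and all the names here are our own:

```python
# Toy sketch of why contact hardening shadows soften with distance.
# The penumbra width grows with the gap between the shadow caster
# (blocker) and the surface receiving the shadow (receiver).

def penumbra_width(light_size, blocker_depth, receiver_depth):
    """Estimate penumbra size from similar triangles: a larger
    caster-to-receiver gap projects a wider soft edge."""
    return light_size * (receiver_depth - blocker_depth) / blocker_depth

# Bottom of a light pole's shadow: the caster nearly touches the ground,
# so the shadow edge stays hard.
hard = penumbra_width(light_size=0.5, blocker_depth=9.9, receiver_depth=10.0)

# Top of the shadow: the caster is well above the ground, so the edge
# is much softer.
soft = penumbra_width(light_size=0.5, blocker_depth=4.0, receiver_depth=10.0)

assert soft > hard  # softness increases with caster-receiver distance
```

In a real shader this estimate drives the size of the filter kernel used when sampling the shadow map, which is what made it expensive under the DX10 feature set.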

Unfortunately a moving benchmark makes for a poor source to take screenshots, so we’re going to stick with AMD’s reference shots here. Only contact hardening shadows are particularly noticeable in the benchmark; tessellation and HDAO are there, but are virtually impossible to catch given the zoomed-out nature of the benchmark and the fact that it’s constantly in motion.

The benchmark comes with 4 different flybys, each under different environmental conditions: day, night, day with rain, and day with sun shafts. We've gone ahead and benchmarked the game during the "day" flyby, once with the DX11 feature set enabled, and once with it disabled. In DX11 mode, tessellation, contact hardening shadows, and Ultra HDAO were enabled; in DX10 mode, tessellation and contact hardening shadows were disabled, and High HDAO was used.

STALKER: Call of Pripyat, Day Benchmark    DX10    DX11
Average FPS                                31.4    35.1
Minimum FPS                                17.7    21.2

Enabling all of these features actually causes performance to rise, thanks to the more efficient implementation of HDAO as a compute shader as opposed to a pixel shader. Ultimately what this means is that unless HDAO is disabled entirely, STALKER is going to be faster on a DX11 card running the entire DX11 feature set than it will be when running the DX10 feature set.
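For a rough sense of scale, the gains from the day-benchmark numbers above work out as follows (simple arithmetic, variable names are ours):

```python
# DX10 vs. DX11 results from the day flyby above.
dx10_avg, dx11_avg = 31.4, 35.1
dx10_min, dx11_min = 17.7, 21.2

# Percentage gains from enabling the full DX11 feature set.
avg_gain = (dx11_avg / dx10_avg - 1) * 100  # average FPS improvement
min_gain = (dx11_min / dx10_min - 1) * 100  # minimum FPS improvement

print(f"avg: +{avg_gain:.1f}%, min: +{min_gain:.1f}%")
# prints "avg: +11.8%, min: +19.8%"
```

Notably, the minimum framerate improves even more than the average, despite the DX11 path doing strictly more work.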

The biggest performance hit, and the reason we're not breaking 40fps here even with a 5970, is due to how anti-aliasing is implemented in STALKER. As it uses deferred rendering, the game does its own anti-aliasing. We used 4X MSAA here along with per-pixel alpha transparency testing (basically Adaptive/Transparency AA). Disabling anti-aliasing improves performance dramatically.
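For readers wondering why a deferred renderer has to "do its own" anti-aliasing: lighting happens after rasterization, so fixed-function hardware can't resolve MSAA for it. The engine instead detects geometry edges in its G-buffer and averages shaded subsamples only there. Here's a toy illustration in plain Python; the names and the depth-edge heuristic are ours, not STALKER's actual implementation:

```python
# Toy sketch of a deferred renderer's custom MSAA resolve: shade one
# sample per pixel in the interior, average all shaded subsamples at
# geometry edges (detected here via depth discontinuity).

def resolve_pixel(subsample_depths, subsample_colors, depth_edge_threshold=0.01):
    """Return the final RGB color for one pixel from its MSAA subsamples."""
    if max(subsample_depths) - min(subsample_depths) < depth_edge_threshold:
        # Interior pixel: all subsamples hit the same surface, so one
        # shaded sample suffices (this is where the cost savings come from).
        return subsample_colors[0]
    # Edge pixel: average every shaded subsample, like hardware MSAA would.
    n = len(subsample_colors)
    return tuple(sum(c[i] for c in subsample_colors) / n for i in range(3))

# Interior pixel: uniform depth, single sample returned.
interior = resolve_pixel([1.0, 1.0, 1.0, 1.0],
                         [(1.0, 0.0, 0.0)] * 4)

# Edge pixel: depth discontinuity, subsamples averaged.
edge = resolve_pixel([1.0, 1.0, 2.0, 2.0],
                     [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0),
                      (0.0, 0.0, 1.0), (1.0, 1.0, 1.0)])
```

Running this edge-detect pass plus per-sample shading at edges is what makes MSAA so much more expensive in a deferred engine than in a forward one.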

Comments

  • palladium - Wednesday, November 18, 2009 - link

    Since AMD is binning their chips to get the 5970 within spec, I suppose it wouldn't make sense to make a 5950 SKU since a 5850 is simply a re-harvested 5870 (which failed the initial binning process), and 2x5850 would be out of the ATX spec anyway.

    Anyway, a great card for those who can afford it, and have the proper case and PSU to handle it.
  • Paladin1211 - Wednesday, November 18, 2009 - link

With 512 SPs, 6.67% more than a GTX 295, I don't see Fermi having any chance of beating the 5970. nVidia will need a dual Fermi to dethrone the 5970, and that's not happening until Q3 or Q4 2010.

nVidia has targeted the wrong, niche market rather than gamers. Sooner or later bezel-less monitors will come out, and Eyefinity will make much more sense. It's really funny that R770s, aka HD 4870s, are in one of the 5 fastest supercomputers, and not Tesla.

    They have taken a long, deep sleep after the 8800GTX and now they're paying for it.
  • cmdrdredd - Wednesday, November 18, 2009 - link

Unfortunately, PC gaming is almost dead. Look at Call of Duty's release. Look at Dragon Age, which is also available on consoles. Sure, the PC version might look a bit better, but when you spend as much on a video card as someone does on an entire system that can download movies, demos, act as a media box, and play Blu-rays... you get the point.
  • Lurker0 - Wednesday, November 18, 2009 - link

    Unfortunately, PC gaming has been declared "nearly dead" for decades. It hasn't died, and as much as console fanboys will rage on hearing this, it isn't going to either.

PC gaming is a niche industry; it always has been and always will be. Yes, console games do tend to be more profitable, which means that most games will be developed for consoles first and then ported to the PC. That doesn't mean there will never be games developed for the PC first (or even exclusively), or that there's no profit to be had in PC games.

    Yes, it can be cheaper to get a console than a mid-level gaming PC, just like it can be cheaper to just buy some econobox off the lot than to buy or build your own hot rod. Sure, one market is much larger and more profitable than the other, but there's still money to be made off of PC games and gearheads alike, and so long as that's true neither will be going away.
  • DominionSeraph - Thursday, November 19, 2009 - link

    PC gaming is no longer an isolated economy, though. That changes things. With most games being written with consoles in mind, there isn't the broad-based software push for hardware advance that there was at the dawn of 3d acceleration.
I could give you dozens of great reasons to have upgraded from a NV4 to a NV15 back in the day, but the upgrade from a G80 to 5970 today? ~$800 when you factor in the PSU, and for what? Where's the must-have game that needs it? TNT to GeForce 2 was two years; it's now been three years since the release of the 8800, and there's been no equivalent to a Half-Life, Quake 2, Deus Ex, Homeworld, Warcraft III, or WoW.
  • GourdFreeMan - Thursday, November 19, 2009 - link

    Unfortunately, this is precisely the problem. When looking at AAA (large budget) games, six years ago PC game sales were predominantly PC exclusives, with some well known console ports (e.g. Halo, Morrowind). Twelve years ago PC game sales were almost entirely exclusives. Today the console ports are approaching the majority of high profile PC titles.

    Being multiplatform isn't necessarily a detriment for a console game. After all, having a larger budget allows more money to be spent on art and polishing the code to get the best performance on console hardware. In most cases, however, the PC version of a multiplatform title is almost always an afterthought. Little to no effort is spent redesigning the user interface and rebalancing game play because of the different controls. Shaders are almost never rewritten to take advantage of effects that could only be accomplished with the power of the graphic cards in modern PCs when porting. At most we seem to get better textures at somewhat higher resolutions.

    The biggest problem with multiplatform development, however, is that multiplatform games are almost always aimed at the lowest common denominator in terms of both technology and content. All this does is produce the same game over and over again -- the clichéd rail shooter in a narrow environment with a macho/reluctant superhuman protagonist thrown against hordes of respawning mooks.

Based on the quarterly reports of sales revenue from the major publishers (EA, Activision and Ubisoft), PC game sales are comparable to PS3 game sales. The PS3, however, has several more exclusives because Sony owns several game studios and forces them to release exclusives. AMD and nVIDIA do not, much to PC gaming's detriment.
  • mschira - Wednesday, November 18, 2009 - link

    Hehe 5970CF to power three screens, now that sounds like a killer setup.
    Besides that one's burning 600+ watts for the graphic. What's the CPU supposed to live on? The BIOS-Battery?
    M.
  • monomer - Wednesday, November 18, 2009 - link

    Wouldn't it be possible to run six screens using a 5970 CF setup, or are there other limitations I'm unaware of?
  • Fleeb - Wednesday, November 18, 2009 - link

    600W for the whole setup. :S
  • maximusursus - Wednesday, November 18, 2009 - link

It really seems weird... :( I've seen some reviews that got way better overclocks than the standard 5870 clocks, and their tests seem to be OK without any "throttling" problems.

    For example:

    Techspot: 900/1250
    HotHardware: 860/1220
    Tom's Hardware: 900/1300
    HardOCP: 900/1350 (!)
    Guru3D: 900/1200

HardwareZone, however, had a similar problem to you guys; could it really be the PSU?
