Assassin's Creed PC

by Jarred Walton on June 2, 2008 3:00 AM EST

Further Anti-Aliasing Investigations

When it comes to anti-aliasing in AC, the heart of the matter is the difference between the two graphics architectures. ATI's current GPUs do not have dedicated anti-aliasing hardware; instead, they rely on the pixel shaders to achieve the same result. Depending on the type of rendering being done, this approach can have a negative impact on performance. ATI appears to get around that through some of the extensions in DirectX 10.1. NVIDIA's graphics chips do include anti-aliasing hardware, but that hardware cannot always be used. Specifically, certain post-processing effects interfere with the hardware anti-aliasing resolve -- that's why some games don't support anti-aliasing at all.
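
To make the ordering issue concrete, here is a toy sketch (our own illustration, with made-up values, not anything from the game's engine) of why shading or post-processing that runs after the resolve gives the wrong answer at polygon edges: averaging the geometry samples first and then shading is not the same as shading each sample and then averaging.

```python
# Toy illustration (values made up): why order matters when combining
# multisample anti-aliasing with shading that happens after the resolve.
import numpy as np

def light(normal_z):
    # Trivial "lighting": brightness proportional to how much the surface
    # faces a light pointing straight down the z axis.
    return max(normal_z, 0.0)

# Two MSAA sub-samples covering one edge pixel: sample 0 hits a surface
# facing the light, sample 1 hits a surface facing away from it.
samples = np.array([+1.0, -1.0])

# Shade each sub-sample, then average -- what per-sample (shader-based) AA does.
shade_then_resolve = np.mean([light(s) for s in samples])   # 0.5

# Average the geometry first, then shade the single resolved value -- what
# happens when later passes only see the already-resolved buffer.
resolve_then_shade = light(np.mean(samples))                # 0.0

print(shade_then_resolve, resolve_then_shade)
```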

If you haven't figured it out already, there is a universal "solution" for applying anti-aliasing that doesn't depend on the use (or non-use) of other shader effects: do the anti-aliasing in the pixel shaders. ATI has apparently decided that's the most practical approach going forward as SM 3.0/SM 4.0 usage increases, but it does require work on the part of game developers. The other downside is that it puts an additional load on the pixel shader hardware, which can mean lower performance. The only real alternative, however, appears to be omitting anti-aliasing in certain types of rendering.
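
As a rough sketch of what that shader pass computes (a conceptual stand-in on our part, not the game's actual shader code), a shader-based resolve simply averages each pixel's multisample data into the final frame. On the GPU this runs for every pixel, which is exactly the extra pixel shader work described above.

```python
# Minimal sketch of a shader-style MSAA resolve: collapse each pixel's
# sub-samples with a box filter (simple average). Data here is random,
# standing in for rendered sub-samples.
import numpy as np

def shader_resolve(msaa_buffer):
    """msaa_buffer has shape (height, width, samples, channels);
    the resolve averages away the samples axis."""
    return msaa_buffer.mean(axis=2)

frame = np.random.rand(2, 2, 4, 3).astype(np.float32)  # 2x2 image, 4x MSAA, RGB
resolved = shader_resolve(frame)                        # shape (2, 2, 3)
print(resolved.shape)
```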

With all that out of the way, let's look at the performance of AC with and without anti-aliasing at several different resolutions. Unfortunately, as previously mentioned, we are unable to test anti-aliasing at the higher resolutions because the option is disabled in the game.

The performance hit from using the pixel shaders to do anti-aliasing is clearly visible. What's noteworthy is that the drop isn't nearly as severe on ATI hardware running AC version 1.00. In other words, the 1.02 patch levels the playing field by forcing both ATI and NVIDIA to use an extra rendering pass for anti-aliasing. That probably sounds fair if you're NVIDIA -- or you own NVIDIA hardware -- but ATI users have every right to be upset.

Something else that's interesting is how performance is clearly CPU limited even with a quad-core 3.0GHz Intel chip. We will look at that next, but right now we are more interested in the steady drop caused by anti-aliasing. If the CPU is the bottleneck, putting more load on the GPU should not hurt performance much. Yet for whatever reason, the extra rendering pass required for anti-aliasing causes a steady drop in frame rates even when we're CPU limited. Have we mentioned yet that you really need some beefy hardware to play AC at higher detail settings?
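
One way to read that result (a back-of-the-envelope model on our part, not measured data) is that frame time is roughly the longer of the CPU and GPU times per frame; once the extra resolve pass pushes GPU time past the CPU time, the bottleneck shifts and frame rates fall even from a previously CPU-limited state. The numbers below are purely hypothetical.

```python
# Hypothetical frame-time model: FPS is limited by whichever of the CPU or
# GPU takes longer per frame. All millisecond values are made up.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 18.0            # assumed CPU time per frame
gpu_ms_no_aa = 16.0      # assumed GPU time without anti-aliasing
resolve_ms = 5.0         # assumed cost of the extra AA resolve pass

print(fps(cpu_ms, gpu_ms_no_aa))               # ~55.6 FPS, CPU limited
print(fps(cpu_ms, gpu_ms_no_aa + resolve_ms))  # ~47.6 FPS, now GPU limited
```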

Comments

  • Griswold - Monday, June 2, 2008 - link

    That's no excuse. Halo sucked, performance- and gameplay-wise, compared to the PC-first titles of the time -- and that is what matters. In essence, the game is bad when you're used to playing that genre on the PC. The same is true for Gears of War, but that port is lackluster in many more ways.

    I fell for console-to-PC ports twice. Never again.
  • bill3 - Monday, June 2, 2008 - link

    An even worse shooter is Resistance on the PS3.
