Assassin's Creed PC

by Jarred Walton on June 2, 2008 3:00 AM EST

Graphics and Design Decisions

AC has some very impressive graphics, but there are also some design decisions that have us scratching our heads (we'll get to those in a moment). The engine supports the latest buzzword features: high dynamic range lighting, motion blur, environmental bump mapping, depth of field effects.... If you want a game that will make good use of your DirectX 10 graphics card, AC certainly qualifies. That's not to say the graphics are better than Crysis, but they do a good job of portraying a 12th-century environment. (Worth mentioning is that DX9 cards are also supported, and in testing on DX10 hardware we didn't immediately notice any major differences between the two rendering paths. If you're still holding on to Windows XP, you can enjoy Assassin's Creed without any serious loss of quality.)


The maps that you visit during the course of the game are the true star of the show, giving the player huge areas to explore with nary a loading screen in sight. There are loading screens in between the main areas, but once you enter a city -- e.g. Jerusalem -- you are free to explore the whole town with no further delays. That's not to say that you can explore all of each map from the start; you will need to unlock additional "memories" before you can enter the three parts of each main city (Damascus, Jerusalem, and Acre). However, late in the game it was nice to be able to navigate an entire city. We just wish some later missions had involved more than one city area.


If there's one area of AC that truly impresses us, it is without doubt the character animations. Watching Altaïr move through the city -- disregarding the slight swing in his hips -- looks natural, but that's nothing special. It's when you start running, jumping, dodging through crowds, and climbing various structures that the work of the artists begins to shine. If you've ever watched some of the online videos of Parkour or Free Running, you will have some idea of what it's like to move through the cities in AC. We wouldn't go so far as to call the animation "perfect", but for the amount of environmental interaction that takes place it comes darn close. The fluid animation extends to the fight sequences, where Altaïr has a huge variety of finishing moves. Which move he uses is essentially random, but they all look great.

One other aspect of the game that certainly warrants discussion is the control system. Most PC gamers have now experienced ports that suffer from a severe case of "consolitis" -- the feeling that a game was ported from a platform where a gamepad was standard, and no one ever took the time to properly modify things so that mouse and keyboard users will feel at home. While there are a few areas where consolitis shows up -- without using the Alt+F4 shortcut, it can take well over a minute to exit the game as you navigate through the various menus and areas -- mouse and keyboard support is very well done. In fact, it's so well done that I almost wonder how people could play this game using a gamepad. I tried it with my Xbox 360 controller, and it only took a few minutes before I was back to using the mouse and keyboard. Others might feel differently, but this is one console port where a gamepad is definitely not required.


Now let's get to our complaints with the graphics and game design. One complaint is that areas tend to look very similar. Building and architectural styles might vary a bit between Damascus, Jerusalem, and Acre -- not to mention the Kingdom map -- but the Middle East setting does necessitate a certain look and feel with little to differentiate one town from another. Initial impressions are very favorable, but the longer you play the game the more repetitive the environments begin to feel.

Perhaps a bigger concern is Ubisoft's choice of presentation. We've complained in the past about games that lack proper widescreen support -- for example, the Battlefield series does not offer proper aspect ratio control, with early titles stretching the image to fill your screen and the last two versions cropping the top and bottom edges. The approach in AC is different, but in some ways it's almost worse. Regardless of what resolution you choose to run at, AC will render at a 16:9 aspect ratio. If you have a 16:10 computer monitor, you will get black bars on the top and bottom. If you have a 4:3 or 5:4 display, you'll get even larger black bars on the top and bottom! Only 16:9 resolutions (e.g. 1280x720 or 1920x1080) avoid the black bars, but unless you're playing on an HDTV you'll end up with a stretched image or black bars regardless. We understand that there's a certain amount of artistic presentation in a game, but it still shocks us that the designers felt it was better to force users to see a widescreen image than to let them decide for themselves how the game looks best.
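To make the letterboxing concrete, here's a minimal sketch (our illustration, not Ubisoft's code) of how a forced 16:9 render area maps onto common display shapes; the function name and structure are purely hypothetical:

```python
def letterbox_16_9(screen_w: int, screen_h: int):
    """Return the 16:9 render area fitted inside a screen, plus the
    total height (in pixels) of the black bars above and below it."""
    target = 16 / 9
    if screen_w / screen_h <= target:
        # Screen is 16:9 or taller: use the full width, shrink the height.
        render_w = screen_w
        render_h = round(screen_w * 9 / 16)
    else:
        # Screen is wider than 16:9: use the full height, shrink the width.
        render_h = screen_h
        render_w = round(screen_h * 16 / 9)
    bars = screen_h - render_h
    return render_w, render_h, bars

# 16:10 monitor at 1920x1200: the image is 1920x1080, leaving 120px of bars.
print(letterbox_16_9(1920, 1200))  # (1920, 1080, 120)
# 5:4 display at 1280x1024: the image is 1280x720, leaving 304px of bars.
print(letterbox_16_9(1280, 1024))  # (1280, 720, 304)
```

As the numbers show, the squarer the display, the more screen real estate the forced widescreen presentation wastes.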

Then we come to the major can of worms: DirectX 10.1. What exactly does DirectX 10.1 allow you to do that can't be done in DirectX 10? In terms of actual graphic effects, it adds nothing. What DirectX 10.1 does allow is the ability to do certain effects in a single pass rather than two separate passes. So if one graphics chip requires five passes to render a scene and a competing chip can render the same scene in four passes, all other things being equal you would expect the four-pass chip to be ~25% faster. As far as we can tell, Ubisoft used DirectX 10.1 (prior to the 1.02 patch) in order to render anti-aliasing without incurring a substantial performance hit. If you have one of ATI's HD 3000 series cards, you may be able to enable anti-aliasing with little to no performance loss. That's the only difference we noticed during testing, though the 1.02 patch removes DX10.1 support. We'll look at this in a moment, but we need to talk about another complaint with the game design first.
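The pass-count arithmetic is a simple back-of-the-envelope calculation (the five-versus-four pass counts here are the hypothetical numbers from the paragraph above, not measured data):

```python
# Assume frame time scales linearly with the number of render passes,
# all else being equal (a simplifying assumption for illustration).
passes_chip_a = 5  # hypothetical chip needing five passes per frame
passes_chip_b = 4  # competing chip doing the same work in four passes

# Chip B finishes a frame in 4/5 the time, so its frame rate is
# 5/4 = 1.25x that of chip A: a ~25% advantage.
speedup = passes_chip_a / passes_chip_b - 1
print(f"{speedup:.0%}")  # 25%
```

In practice other bottlenecks (memory bandwidth, shader throughput) mean the real-world gain from saving a pass is usually smaller than this ideal figure.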

"Free" anti-aliasing sounds great -- if it works. The problem is, at present the only way to get anti-aliasing is via the in-game menus; trying to force anti-aliasing through the ATI or NVIDIA drivers does not work. This wouldn't be a concern at all, except Ubisoft decided to limit anti-aliasing support to certain resolutions. Specifically, the maximum resolution that allows users to enable anti-aliasing is 1680x1050. That was a very shortsighted decision, considering newer and faster graphics cards and other hardware are continuously appearing. While top-end hardware of today might struggle at 2560x1600 with 4xAA, there's little doubt that future hardware will be able to run such settings on AC without difficulty.

There are basically three areas that we would like to see addressed with a future patch. First, we would like to see the aspect ratio support "fixed". Users should be given the option to choose whether they would like a letterbox format with black bars on the top and bottom or an image that fills their entire screen. Second, we would like to see anti-aliasing allowed across all resolutions; we understand that not all computer hardware is going to be able to handle high resolutions with 4xAA, but at least give us the chance to try it out. Computer gamers are generally savvy when it comes to tweaking graphics settings; if it's too slow, we are fully capable of disabling anti-aliasing or lowering the resolution. Finally, Ubisoft really needs to bring back DirectX 10.1 support. It's one thing to not add a feature that a lot of gamers can't use; it's a completely different story to remove a feature that was already present. We can't help but feel there were some conversations between Ubisoft and NVIDIA personnel that resulted in the removal of DX10.1 support.

And with that said, let's take a look at actual graphics performance on a couple of systems.

32 Comments

  • bill3 - Monday, June 2, 2008 - link

    Actually it's terrible, I can't read the graphs AT ALL.

    Seriously, my eyes just glazed over those terrible charts... completely unreadable. I still have no idea what I'm looking at. Is ATI supposed to be faster in this game? Why did they test with version 1.00 on ATI and 1.02 on NVIDIA? I don't know, because the graphs are totally useless.
  • Nihility - Monday, June 2, 2008 - link

    I second that. The graphs are terrible. Maybe bar graphs would have been better?
    Sometimes when you're the one making the graph it's hard to imagine what other people are seeing when they look at them. I suggest having another pair of eyes check the graphs out for readability.

    Besides that, I loved the review. Especially the performance part and the 10.1 controversy.
  • JarredWalton - Tuesday, June 3, 2008 - link

    Charts are colored with similar colors used either for ATI vs. NVIDIA, 1.00 vs. 1.02, or dual-GPU vs. single-GPU. I could have generated four times as many graphs to show the same data, but I figure most people are capable of reading the labels on a chart and figuring out what they mean. Here's a hint: when you can't see the difference between two lines because they overlap, it's a tie.

    If you want to give specific examples and recommendations on what would look better and still convey the same amount of information, I'm all ears. However, simply stating that "the graphs are terrible" does little to help. Tell me what graph specifically is terrible, and tell me why it's terrible.

    As an example of why I used these graphs, page 9 has two charts showing 40 total data points. You can get a clear idea of how performance scales with single or dual GPUs at the various detail settings looking at a single chart. Green is NVIDIA, Red is ATI. That makes a lot of sense to me. Creating ten different bar charts with four lines in each to show the same data makes it more difficult to compare how Medium graphics compares to High graphics performance, and it takes up five times as much space to tell the same "story".

    Page 6 is the same thing, but with green used for dual-GPUs (light and dark for 1.00 and 1.02) and red for single GPUs. 24 data points in two charts instead of using six charts. Having established that 1.00 doesn't perform any different than 1.02 on NVIDIA hardware, I skipped the 1.00 NVIDIA numbers to make those charts easier to read on page 7. Then I put in the four standard test system (0xAA and 4xAA, ATI and NVIDIA) on 1.02, with 1.00 4xAA ATI in blue as a reference.

    Lastly, on page 8 I have two clock speeds on NVIDIA, three on ATI, with different base colors for single and dual GPUs. ATI and NVIDIA are in separate charts, and brighter colors are for a higher overclock.

    There's method to my graphing madness. Are the charts immediately clear to a casual glance? No, but then that's really difficult to do while still conveying all of the information. I spent a lot of time trying to make comprehensible charts, and settled on these as the best option I could come up with. Again, if they're so bad, it must be easy to generate something clearly better - have at it, and I'll be happy to use any sensible suggestions. However, if the only complaint is that you actually have to look at the charts and think for a minute before you understand, I'm not likely to be very sympathetic. I think our readers are smart enough to digest these graphs.
  • mpjesse - Monday, June 2, 2008 - link

    While I appreciate the detailed review, isn't it a little irrelevant now? I mean, the game's been out for nearly 2 months now and it's been reviewed everywhere. The only thing new about this review is the performance benchmarks, in which case I would have made the review solely about performance instead of gameplay.

    Just my 2 cents.
  • ImmortalZ - Monday, June 2, 2008 - link

    It's sad that the companies with money always manage to suppress innovation.

    I hope this article by AT will raise some ruckus in the collective Interwebs and cause something. But I doubt it.
  • ViRGE - Monday, June 2, 2008 - link

    For what it's worth, another forum I read had some screenshots comparing DX10 and DX10.1. The problems the poster had managed to find involved trees; there was some kind of post-processing rendering going on with trees that wasn't occurring with DX10.1, which made them look weird.

    Not fixing 10.1 may be an NVIDIA thing, but there was definitely a problem with it as-is.
  • tuteja1986 - Monday, June 2, 2008 - link

    Well, where the hell is NVIDIA's DX10.1 support if DX10.1 actually brings some kind of performance improvement in AA?

    Why doesn't the GT200 series have DX10.1?

    I thought PC gaming was all about being on the cutting edge on all technology fronts...

    Anyways, this is not the first time Ubisoft or NVIDIA have done this.

  • wyemarn - Monday, June 2, 2008 - link

    Maybe because NVIDIA GPUs can't support AA through shaders. So there's no use supporting DX10.1. ATI GPUs have 320 stream processors that can be utilized for shaders and such. NVIDIA cards have fewer SPs but more ROPs and TMUs, which translates to more brute power if games don't use shaders or SPs much. Technology-wise, I think ATI is ahead, but NVIDIA GPUs have game developer support and more raw horsepower, so performance-wise NVIDIA is ahead, and I think this trend will continue with the GTX200 series. I chose G92 over RV670 because the raw performance is much better, even though on paper the HD 3800 series looks great.
  • SteelSix - Monday, June 2, 2008 - link

    Worthy of a thread in Video. I just started one..
  • Gannon - Monday, June 2, 2008 - link

    The original Halo had performance issues, but they weren't alarming; Halo was actually not too bad a port compared to many other console-to-PC disasters. Halo 1 got better as hardware advanced. Halo 2, on the other hand, is just all-around atrocious. Halo 2 was just not a very well made game, period, despite the addition of cutscenes, etc. Halo 1 had a much better feel and better vehicle design IMHO; I hated how the Warthog looked in Halo 2, it annoyed me to no end.
