Call of Juarez

There has been quite a bit of controversy and drama surrounding the journey of Call of Juarez from DirectX 9 to DirectX 10. As many may remember, AMD handed out demos of the DirectX 10 version of Call of Juarez prior to the launch of R600. This build didn't fully support NVIDIA hardware, so many review sites opted not to test it. On its own, this is certainly fine and no cause for worry. It's only normal to expect a company to want to show off something cool running on their hardware even if it isn't as fully functional as the final product will be.

But after NVIDIA found out about this, they set out to help Techland bring the demo up to par and get it running properly on G80-based systems. Some publications were able to get an updated build of the game from Techland which included NVIDIA's fixes. When we requested the same from them, they declined to provide us with this updated code, citing the fact that they would be releasing a finalized benchmark in the near future. Again, this was fine with us and nothing out of the ordinary. We would have liked to get our hands on the NVIDIA update, but it's Techland's code and they can do what they want with it.

Fast forward to the release of the Call of Juarez benchmark we currently have for testing, and now we have a more interesting situation on our hands. Techland decided to implement something they call "HDR Correct" antialiasing. This feature is designed to properly blend polygon edges in cases of very high contrast due to HDR lighting. Using a straight average, or even a "gamma corrected" blend, of MSAA samples can result in artifacts in extreme cases when paired with HDR.
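To make the problem concrete, here is a minimal sketch of why a straight linear average misbehaves at a high-contrast HDR edge. This is plain C++ with made-up sample values and a simple Reinhard-style operator standing in for the game's tone mapping; it is not Techland's code.

```cpp
#include <cstdio>

// Simple Reinhard-style operator: maps HDR values in [0, inf) into [0, 1).
static float toneMap(float hdr) { return hdr / (1.0f + hdr); }

int main() {
    // Hypothetical 4x MSAA samples at a polygon edge: one sample covers a
    // very bright HDR light source, the other three a dim background.
    const float samples[4] = { 50.0f, 0.1f, 0.1f, 0.1f };

    // Straight average: blend in linear HDR space, then tone map the result.
    float linearAvg = 0.0f;
    for (float s : samples) linearAvg += s;
    linearAvg /= 4.0f;
    float averageThenMap = toneMap(linearAvg);

    // "HDR correct" blend: tone map each sample first, then average.
    float mapThenAverage = 0.0f;
    for (float s : samples) mapThenAverage += toneMap(s);
    mapThenAverage /= 4.0f;

    std::printf("average then tone map: %.3f\n", averageThenMap); // ~0.93
    std::printf("tone map then average: %.3f\n", mapThenAverage); // ~0.31
    return 0;
}
```

The one bright sample washes out the straight average and pushes the edge pixel to nearly full white, while blending after tone mapping preserves the roughly one-quarter coverage of the light source.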



The real caveat here is that HDR correct AA requires a custom MSAA resolve. AMD hardware must always perform AA resolves in its shader hardware (the R/RV6xx line lacks dedicated MSAA resolve hardware in its render backends), so this isn't a big deal for AMD. NVIDIA's dedicated MSAA resolve hardware, on the other hand, is bypassed: DX10's ability to read back individual MSAA samples is used to perform the custom resolve work in the shaders instead. This incurs quite a large performance hit for what NVIDIA says is little to no image quality gain. Unfortunately, we are unable to compare the two methods ourselves, as we don't have the version of the benchmark that actually used NVIDIA's MSAA hardware.
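As a rough illustration of what such a custom resolve involves, here is a hypothetical CPU-side model of the per-pixel work. In the actual game this would run as a D3D10 pixel shader fetching individual samples from a multisampled render target rather than as C++ on the CPU, and the tone map operator here is only a stand-in.

```cpp
#include <cstddef>
#include <vector>

// Stand-in tone map operator; the game's own curve would differ.
static float toneMap(float hdr) { return hdr / (1.0f + hdr); }

// Hypothetical custom MSAA resolve, modeled on the CPU for clarity.
// msaaBuffer holds sampleCount HDR samples per pixel (sample-major within
// each pixel); the result is one display-ready value per pixel.
std::vector<float> customResolve(const std::vector<float>& msaaBuffer,
                                 std::size_t pixelCount,
                                 std::size_t sampleCount)
{
    std::vector<float> resolved(pixelCount);
    for (std::size_t p = 0; p < pixelCount; ++p) {
        float sum = 0.0f;
        // Read back every individual sample of the pixel -- the per-sample
        // access DX10 exposes, and the step that sidesteps the fixed-function
        // resolve path in the render back end.
        for (std::size_t s = 0; s < sampleCount; ++s)
            sum += toneMap(msaaBuffer[p * sampleCount + s]);
        resolved[p] = sum / static_cast<float>(sampleCount);
    }
    return resolved;
}
```

The extra shader work scales with both resolution and sample count, which is why skipping the dedicated resolve hardware shows up as a measurable performance hit.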

NVIDIA also tells us that some code was altered in Call of Juarez's parallax occlusion mapping shader that does nothing but degrade the performance of this shader on NVIDIA hardware. Again, we are unable to verify this claim ourselves. There are also other minor changes that NVIDIA feels unnecessarily paint AMD hardware in a better light than the previous version of the benchmark.

But Techland's response to all of this is that game developers are the ones who have the final say in what happens with their code. This is definitely a good thing, and we generally expect developers to want to deliver the best experience possible to their users. We certainly can't argue with this sentiment. But whether or not anything is going on under the surface, it's very clear that Techland and NVIDIA are having some relationship issues.

No matter what's really going on, it's better for the gamer if hardware designers and software developers are all able to work closely together to design high quality games that deliver a consistent experience to the end user. We want to see all of this as just an unfortunate series of miscommunications. And no matter what the reason, we are here today with what Techland has given us. The performance of their code as it is written is the only thing that really matters, as that is what gamers will experience. We will leave all other speculation in the hands of the reader.

So, what are the important DirectX 10 features this benchmark uses? We see geometry shaders used to simulate water particle effects, alpha-to-coverage for smooth leaf and grass edges, and a custom MSAA resolve for HDR correct AA.
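Of those, alpha-to-coverage is worth a quick illustration: it converts a fragment's alpha value into a mask of MSAA samples to write, so foliage edges dissolve smoothly without sorted alpha blending. The sketch below is a hypothetical model of that conversion; in D3D10 the feature is simply switched on through the blend state, and the mask generation happens in hardware.

```cpp
#include <cmath>
#include <cstdint>

// Hypothetical alpha-to-coverage: turn a fragment's alpha into a bitmask over
// the pixel's MSAA samples (assumes sampleCount <= 8). With 4x MSAA, an alpha
// of 0.5 writes roughly half the samples, so leaf and grass edges fade out
// instead of cutting off at a hard alpha-test threshold. Real hardware also
// dithers which samples are enabled; this sketch just takes the first N.
std::uint32_t alphaToCoverageMask(float alpha, int sampleCount)
{
    int covered = static_cast<int>(std::lround(alpha * sampleCount));
    if (covered <= 0) return 0u;
    if (covered >= sampleCount) return (1u << sampleCount) - 1u;
    return (1u << covered) - 1u;  // enable the first 'covered' samples
}
```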

[Chart: Call of Juarez Performance]


The AMD Radeon HD 2900 XT clearly outperforms the GeForce 8800 GTS here. At the low end, none of our cards are playable at any of the settings the Call of Juarez benchmark offers. While all the numbers shown here were taken with large shadow maps and high quality shadows, even without these features the 2400 XT only posted about 10 fps at 1024x768. We didn't bother testing it against the rest of our cards because it just couldn't stack up.

[Chart: Call of Juarez 4xAA Performance]


With 4xAA enabled, our low-end NVIDIA hardware really tanks. Remember that even these cards must resolve all MSAA samples in their shader hardware. AMD's parts are designed to always handle AA in this manner, but NVIDIA's parts only support the feature inasmuch as DX10 requires it.

We do see some strange numbers from the low-end NVIDIA cards at 1600x1200, but it's likely that they performed so poorly here that parts of the scene failed to render, which actually improved performance (in other words, it's likely not everything was rendered properly even though we didn't notice anything amiss).

Comments

  • DerekWilson - Thursday, July 5, 2007 - link

    this is true -- our current information shows that AMD fares relatively worse against NVIDIA under DX10 than it does under DX9.
  • rADo2 - Thursday, July 5, 2007 - link

    "there are applications where the 2900 xt does outperform its competition" - where? 2900XT has 22FPS, 8800GTX 24FPS nad 8800ULTRA 26FPS. Despite "crippled for NVIDIA" / "paid by ATI" I see still green camp to outperform ATI.

    And in Lost Planet NVIDIA has 2x (!) better performance.

    It is not even worth considering ATI for purchase.
  • smitty3268 - Thursday, July 5, 2007 - link

    Read the comment again: The point is that the 2900XT does not compete against the 8800GTX, it competes against the 8800GTS, which it did outperform in that test.

    It certainly isn't the fastest card available, but I could also make a statement like "The GeForce7300Go outperforms its competition" without saying it's the fastest thing available. I'm just saying it beats the cards in a similar price range.
  • KeithTalent - Thursday, July 5, 2007 - link

    Um, what about price? Last time I checked the 8800GTX still costs about $150 more than the 2900XT. I will not even bring up the Ultra which is still way overpriced.

    So for $150 less you get a card that competes with the GTX some of the time and is more than capable of playing most games maxed out at high resolution. That is why the 2900XT is worth considering for purchase.

    KT

  • rADo2 - Thursday, July 5, 2007 - link

    Problem is, the 2900XT is often NOT playable. Its performance is sometimes close to NVIDIA's, sometimes 2x lower. And this applies to both DX9 and DX10. Of the 60+ games I own, I can assume ATI would "suck" on 30 of them.

    Look at Lost Planet with AA, 22FPS versus 40-50FPS is a huge difference in playability. On 8800GTX/ULTRA you can play even at 1920x1200 (30FPS), with 2900XT even 1280x1024 is giving you problems.
  • KeithTalent - Thursday, July 5, 2007 - link

    Well I have two of them and they work more than fine on every single game I have tried so far, including demos, betas, and a couple of older games (using Catalyst 7.6).

    Lost Planet is a POS port anyway, but when I ran the test benchmark in DX9 with Crossfired 2900XTs I had frames well above 40 with everything maxed at 1920x1200 so I am somewhat confused by the numbers here. I will have to wait until I am home to see my exact numbers, but they were much higher than what was presented here. Maybe there is something wonky with the beta drivers?

    I'll post back tonight once I have verified my numbers.

    KT

  • DerekWilson - Thursday, July 5, 2007 - link

    We didn't use the demo benchmark, and the release version (afaik) does not include an updated version of the benchmark either.

    For the Lost Planet test, we had to use FRAPS running through the snow. This will absolutely give lower performance, as the benchmark also runs through a couple of indoor scenes with higher framerates.

    I mentioned FRAPS on the test page, but I'll add to the Lost Planet section the method we used for testing.
  • defter - Thursday, July 5, 2007 - link

    Do Intel CPUs currently outperform AMD or not? After all, a $200 AMD CPU is about as fast as a $200 Intel CPU....

    It's natural that slower parts have a good price/performance ratio compared to the competition, since otherwise nobody would buy them. However, this has nothing to do with which one is fastest...

  • KeithTalent - Thursday, July 5, 2007 - link

    Not sure what you are getting at, I was responding to this ridiculous statement:

    quote:

    It is not even worth considering ATI for purchase.


    Which is completely untrue, because price can be a big consideration.

    With respect to CPUs, if you spend an extra $50 - $100 for the better Intel processor, you are getting exponentially better performance (I know this from experience), while if you spend $150 more for a GTX, you are getting only marginally better performance.

    KT
