Call of Juarez

There has been quite a bit of controversy and drama surrounding the journey of Call of Juarez from DirectX 9 to DirectX 10. As many may remember, AMD handed out demos of the DirectX 10 version of Call of Juarez prior to the launch of R600. This build didn't fully support NVIDIA hardware, so many review sites opted not to test it. On its own, this is certainly fine and no cause for worry. It's only normal to expect a company to want to show off something cool running on their hardware even if it isn't as fully functional as the final product will be.

But after NVIDIA found out about this, they set out to help Techland bring the demo up to par and get it running properly on G80-based systems. Some publications were able to get an updated build of the game from Techland that included NVIDIA's fixes. When we requested the same, Techland declined to provide us with the updated code, citing the fact that they would be releasing a finalized benchmark in the near future. Again, this was fine with us and nothing out of the ordinary. We would have liked to get our hands on the NVIDIA update, but it's Techland's code and they can do what they want with it.

Fast forward to the release of the Call of Juarez benchmark we currently have for testing, and now we have a more interesting situation on our hands. Techland decided to implement something they call "HDR Correct" antialiasing. This feature is designed to properly blend polygon edges in cases with very high contrast due to HDR lighting. Using a straight average or even a "gamma corrected" blend of MSAA samples can result in artifacts in extreme cases when paired with HDR.
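
To make the distinction concrete, below is a minimal sketch of the two resolve orders. This is our own illustration rather than Techland's shader code, and it assumes a simple Reinhard-style operator as a stand-in for whatever tone mapping the game actually applies.

    // Minimal sketch: why the order of tone mapping and MSAA resolve matters
    // for an HDR edge pixel. Not Techland's code; the tone map operator is an
    // assumption used purely for illustration.
    #include <cstdio>

    // Hypothetical Reinhard-style tone map: compresses HDR values into [0,1].
    static float tone_map(float hdr) { return hdr / (1.0f + hdr); }

    int main() {
        // Four MSAA samples of an edge pixel straddling a very bright light
        // source (HDR value 50.0) and a dark surface (HDR value 0.05).
        const float samples[4] = { 50.0f, 50.0f, 0.05f, 0.05f };

        // Straight-average resolve: average the raw HDR samples, then tone map.
        float avg = 0.0f;
        for (float s : samples) avg += s;
        avg /= 4.0f;
        float resolve_then_map = tone_map(avg);   // ~0.96: the edge comes out nearly white

        // "HDR correct" resolve: tone map each sample first, then average.
        float map_then_resolve = 0.0f;
        for (float s : samples) map_then_resolve += tone_map(s);
        map_then_resolve /= 4.0f;                 // ~0.51: an even blend of bright and dark

        printf("average, then tone map: %.2f\n", resolve_then_map);
        printf("tone map, then average: %.2f\n", map_then_resolve);
        return 0;
    }

Averaging the raw HDR values first lets the bright samples dominate the blend, producing the sort of harsh, blown-out edge that a tone-mapping-aware resolve is meant to avoid.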



The real caveat here is that HDR correct AA requires a custom MSAA resolve. This isn't a big deal for AMD, whose hardware must perform AA resolves in its shader core anyway (the R/RV6xx line lacks dedicated MSAA resolve hardware in its render back ends). NVIDIA's dedicated MSAA resolve hardware, on the other hand, is bypassed: DX10's ability to read back individual MSAA samples is used to perform the custom resolve in the shaders instead. This incurs quite a large performance hit for what NVIDIA says is little to no image quality gain. Unfortunately, we are unable to compare the two methods ourselves, as we don't have the version of the benchmark that actually ran using NVIDIA's MSAA hardware.
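
For reference, the sketch below shows the Direct3D 10 plumbing that makes a shader-based resolve possible. It is our own illustration, not code from the benchmark, and the format and sample count are assumptions: the multisampled HDR render target is created with a shader resource binding, and a TEXTURE2DMS view then lets a pixel shader fetch each sample individually rather than going through the fixed-function resolve path.

    // Sketch of the D3D10 setup for a custom (shader-based) MSAA resolve.
    // Our own illustration; not the benchmark's code.
    #include <d3d10.h>

    ID3D10ShaderResourceView* CreateCustomResolveSource(ID3D10Device* device,
                                                        UINT width, UINT height)
    {
        // 4x multisampled HDR color buffer that will also be read by a shader.
        D3D10_TEXTURE2D_DESC texDesc = {};
        texDesc.Width              = width;
        texDesc.Height             = height;
        texDesc.MipLevels          = 1;
        texDesc.ArraySize          = 1;
        texDesc.Format             = DXGI_FORMAT_R16G16B16A16_FLOAT; // HDR render target
        texDesc.SampleDesc.Count   = 4;                              // 4x MSAA
        texDesc.SampleDesc.Quality = 0;
        texDesc.Usage              = D3D10_USAGE_DEFAULT;
        texDesc.BindFlags          = D3D10_BIND_RENDER_TARGET | D3D10_BIND_SHADER_RESOURCE;

        ID3D10Texture2D* msaaTarget = nullptr;
        if (FAILED(device->CreateTexture2D(&texDesc, nullptr, &msaaTarget)))
            return nullptr;

        // A TEXTURE2DMS view exposes the individual samples to a pixel shader
        // (HLSL Texture2DMS::Load), bypassing the hardware resolve path.
        D3D10_SHADER_RESOURCE_VIEW_DESC srvDesc = {};
        srvDesc.Format        = texDesc.Format;
        srvDesc.ViewDimension = D3D10_SRV_DIMENSION_TEXTURE2DMS;

        ID3D10ShaderResourceView* srv = nullptr;
        device->CreateShaderResourceView(msaaTarget, &srvDesc, &srv);
        msaaTarget->Release();
        return srv;
    }

A full-screen resolve pass then reads all four subsamples per pixel and blends them however the developer sees fit, which is where the extra shader work comes from on hardware whose dedicated resolve units are left sitting idle.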

NVIDIA also tells us that some code was altered in Call of Juarez's parallax occlusion mapping shader that does nothing but degrade the performance of this shader on NVIDIA hardware. Again, we are unable to verify this claim ourselves. There are also other minor changes that NVIDIA feels unnecessarily paint AMD hardware in a better light than the previous version of the benchmark did.

But Techland's response to all of this is that game developers are the ones who have the final say in what happens with their code. This is definitely a good thing, and we generally expect developers to want to deliver the best experience possible to their users. We certainly can't argue with this sentiment. But whether or not anything is going on under the surface, it's very clear that Techland and NVIDIA are having some relationship issues.

No matter what's really going on, it's better for the gamer if hardware designers and software developers are all able to work closely together to design high quality games that deliver a consistent experience to the end user. We want to see all of this as just an unfortunate series of miscommunications. And no matter what the reason, we are here today with what Techland has given us. The performance of their code as it is written is the only thing that really matters, as that is what gamers will experience. We will leave all other speculation in the hands of the reader.

So, what are the important DirectX 10 features that this benchmark uses? We see geometry shaders to simulate water particle effects, alpha-to-coverage for smooth leaf and grass edges, and custom MSAA resolve for HDR correct AA.
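
Of these, alpha-to-coverage is the simplest to picture: it is a blend state that converts the pixel shader's alpha output into an MSAA coverage mask, giving foliage soft edges without sorted alpha blending. The sketch below is our own illustration of how an application enables it in Direct3D 10, not Techland's code.

    // Sketch: enabling alpha-to-coverage in D3D10 so that foliage alpha drives
    // the MSAA coverage mask. Our own illustration, not the benchmark's code.
    #include <d3d10.h>

    ID3D10BlendState* CreateAlphaToCoverageState(ID3D10Device* device)
    {
        D3D10_BLEND_DESC desc = {};
        desc.AlphaToCoverageEnable    = TRUE;             // alpha becomes per-sample coverage
        desc.BlendEnable[0]           = FALSE;            // no conventional alpha blending
        desc.SrcBlend                 = D3D10_BLEND_ONE;  // defaults; unused while blending is off
        desc.DestBlend                = D3D10_BLEND_ZERO;
        desc.BlendOp                  = D3D10_BLEND_OP_ADD;
        desc.SrcBlendAlpha            = D3D10_BLEND_ONE;
        desc.DestBlendAlpha           = D3D10_BLEND_ZERO;
        desc.BlendOpAlpha             = D3D10_BLEND_OP_ADD;
        desc.RenderTargetWriteMask[0] = D3D10_COLOR_WRITE_ENABLE_ALL;

        ID3D10BlendState* state = nullptr;
        device->CreateBlendState(&desc, &state);
        return state;
    }

    // While drawing grass and leaves:
    //   float blendFactor[4] = { 0, 0, 0, 0 };
    //   device->OMSetBlendState(state, blendFactor, 0xffffffff);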

Call of Juarez Performance


The AMD Radeon HD 2900 XT clearly outperforms the GeForce 8800 GTS here. At the low end, none of our cards delivers playable performance under any option the Call of Juarez benchmark presents. While all the numbers shown here are with large shadow maps and high quality shadows, even without these features the 2400 XT only posted about 10 fps at 1024x768. We didn't bother to test it against the rest of our cards because it just couldn't stack up.

Call of Juarez 4xAA Performance


With 4xAA enabled, our low-end NVIDIA hardware really tanks. Remember that even these cards must resolve all MSAA samples in their shader hardware. AMD's parts are designed to always handle AA in this manner, but NVIDIA's parts only support the feature inasmuch as DX10 requires it.

We do see some strange numbers from the low-end NVIDIA cards at 1600x1200, but it's likely that they performed so poorly here that rendering of certain aspects of the scene failed outright, which in turn improved performance (in other words, it's likely that not everything was rendered properly even though we didn't notice anything amiss).


59 Comments


  • slickr - Monday, July 9, 2007 - link

    Great review, that's what we all need to get Nvidia and ATI to stop bitchin' around and stealing our money with slow hardware that can't even outperform last generation's hardware. If you ask me the 8800 Ultra should be the middle $150 class here and the top end should be some graphics card with 320 stream processors, 1GB GDDR4 clocked at 2.4GHz and a 1000MHz core clock; same for AMD, they need the X2900XT to be the middle $150 class and top of the line should be some graphics card with 640 stream processors, 1GB GDDR4 at 2.4GHz and a 1000MHz core clock!

    More reviews of this kind please, so we can show ATI and Nvidia we won't buy their hardware if it's not good!!!!!!!!

  • ielmox - Tuesday, July 24, 2007 - link

    I really enjoyed this review. I have been agonizing over selecting an affordable graphics card that will give me the kind of value I enjoyed for years from my trusty and cheap GF5900xt (which runs Prey, Oblivion, and EQ2 at decent quality and frame rates) and I am just not seeing it.

    I'm avoiding ATI until they bring their power use under control and generally get their act together. I'm avoiding nVidia because they're gouging the hell out of the market. And the previous generation nVidia hardware is still quite costly because nVidia know very well that they've not provided much of an upgrade with the 8xxx family, unless you are willing to pay the high prices for the 8800 series (what possessed them to use a 128bit bus on everything below the 8800?? Did they WANT their hardware to be crippled?).

    As a gamer who doesn't want to be a victim of the "latest and greatest" trends, I want affordable performance and quality and I don't really see that many viable options. I believe we have this half-baked DX10 and Vista introduction to thank for it - system requirements keep rocketing upwards unreasonably but the hardware economics do not seem to be keeping pace.

  • AnnonymousCoward - Saturday, July 7, 2007 - link

    Thanks Derek for the great review. I appreciate the "%DX10 performance of DX9" charts, too.
  • Aberforth - Thursday, July 5, 2007 - link

    This article is ridiculous. Why would Nvidia and other dx10 developers want gamers to buy a G80 card for high dx10 performance? DX10 is all about optimization; the performance factor depends on how well it is implemented, not on blindly using APIs. Vista's driver model is different and dx10 is different. The present state of Nvidia's drivers is horrible, we can't even think of dx10 performance at this stage.

    the dx10 version of lost planet runs horribly even though it is not graphically different from the dx9 version. So this isn't dx10's or the GPU's fault, it's all about the code and the drivers. Also the CEO of Crytek has confirmed that an Nvidia 8800 (possibly 8800GTS) and an E6600 CPU can max Crysis in Dx10 mode.

    Long back when dx9 came out I remember reading an article about how it sucked badly. So I'm definitely not gonna buy this one.
  • titan7 - Thursday, July 12, 2007 - link

    No, it's not about sucky code or sucky drivers. It's about shaders. Look at how much faster cards with more shader power are in d3d9. Now in d3d10 longer, richer, prettier shaders are used that take more power to process.

    It's not about optimization this time as the IHVs have already figured out how to write optimized drivers, it's about raw FLOPS for shader performance.
  • DerekWilson - Thursday, July 5, 2007 - link

    DX9 performance did (and does) "suck badly" on early DX9 hardware.

    DX10 is a good thing, and pushing the limits of hardware is a good thing.

    Yes, drivers and game code can be rocky right now, but the 162 series drivers from NVIDIA are quite stable and NV is confident in their performance. Lost Planet shows that NV's drivers are at least getting close to parity with DX9.

    This isn't an article about DX10 not being good, it's an article about early DX10 hardware not being capable of delivering all that DX10 has to offer.

    Which is as true now as it was about early DX9 hardware.
  • piroroadkill - Friday, July 6, 2007 - link

    Wait, performance on the Radeon 9700 Pro sucked? I seem to remember games several years later that were DirectX 9 still being playable...
  • DerekWilson - Saturday, July 7, 2007 - link

    yeah, 9700 pro sucks ... when actually running real world DX9 code.

    Try running BF2 at any playable setting (100% view distance, high shadows and lighting). This is really where games started using DX9 (to my knowledge, BF2 was actually the first game to require DX9 support to run).

    But many other games still include the ability to run 1.x shaders rather than 2.0 ... Like Oblivion can turn the detail way down to the point where there aren't any DX9 heavy features running. But if you try to enable them on a 9700 Pro it will not run well at all. I actually haven't tested Oblivion at the lowest quality so I don't know if it can be playable on a 9700 Pro, but if it is, it wouldn't even be the same game (visually).
  • DerekWilson - Saturday, July 7, 2007 - link

    BTW, BF2 was released less than 3 years after the 9700 Pro ... (aug 02 to june 05) ...
  • Aberforth - Thursday, July 5, 2007 - link

    Fine...

    Just want to know why a DX10 game called Crysis was running at 2048x1536 res with 60+ FPS equipped with Geforce 8800 GTX.

    crysis-online.com/?id=172
