Let's Talk about PhysX Baby

When AMD and NVIDIA both started talking about GPU accelerated physics the holy grail was the following scenario:

You upgrade your graphics card, but instead of throwing your old one away, you simply use it for GPU accelerated physics. Back then NVIDIA didn't own AGEIA, and this was the way of competing with AGEIA's PPU: why buy another card when you can simply upgrade your GPU and use your old one for pretty physics effects?

It was a great promise, but something that was never enabled in a usable way by either company until today. Previous NVIDIA drivers with PhysX support required that you hook a monitor up to the card that was to accelerate PhysX and extend the Windows desktop onto that monitor. With NVIDIA's 180.48 drivers you can now easily choose which NVIDIA graphics card you'd like to use for PhysX. Disabling PhysX, enabling it on the same GPU as the display, or enabling it on a different GPU are now as easy as picking the right radio button and selecting the card from a drop-down menu.

When we tested this, it worked, which was nice. While it's not a fundamental change in what can be done, the driver has been refined to the point where it should have been in the first place. It is good to have an easy interface to enable and adjust the way PhysX runs on the system, and to be able to pick whether PhysX runs on the display hardware (be it a single card or an SLI setup) or on a secondary card. But this really should have been done long ago.

There is another interesting side effect. When we enabled PhysX on our secondary card, we noticed that the desktop had been extended onto a non-existent monitor.

Windows has a limitation of not allowing GPUs to be used unless they are enabled as display devices, which was the cause of the cumbersome process of enabling PhysX on a secondary card in the first place. Microsoft hasn't fixed anything on its end, but NVIDIA has made all the mucking around with Windows transparent. It seems they simply tell Windows a display is connected when it actually is not. It's a cool trick, but hopefully future versions of Windows will not require such things.

Mirror's Edge: The First Title to Impress us with GPU PhysX?

Around every single GPU release, whether from AMD or NVIDIA, we get a call from NVIDIA reminding us that only on NVIDIA hardware can you get PhysX acceleration (not physics in general, but PhysX specifically). We've always responded by saying that none of the effects enabled by PhysX in the games that support it are compelling enough for us to recommend an underperforming NVIDIA GPU over a more competitive AMD one. Well, NVIDIA promises that Mirror's Edge, upon its PC release in January, will satisfy our needs.

We don't have the game, nor do we have a demo for anything other than consoles, but NVIDIA promises it'll be good and has given us a video that we can share. To underline the differences between the PhysX and non-PhysX versions, here's what to look for: glass fragments are a simple particle system without PhysX, but persistent objects with it (meaning they stick around and can be interacted with). Glass fragments are also smaller and more abundant with PhysX. Cloth is non-interactive and can't be ripped, torn, or shot through without PhysX (it will either not be there at all or won't respond to interaction). Some of the things not shown very clearly: smoke responds to and interacts with characters, and leaves and trash blow around to help portray wind and in response to helicopters.

Another aspect to consider is that PhysX effects can be run without GPU acceleration, at greatly reduced performance. This means that AMD users will be able to see what they're missing. Or maybe an overclocked 8 core (16 thread) Nehalem will do the trick? Who knows... we really can't wait to get our hands on this one to find out.

We'll let you be the judge: is this enough to buy an NVIDIA GPU over an AMD one? What if the AMD one were a little cheaper or a little faster; would it still be enough?

We really want to see what the same sequence would have looked like with PhysX disabled. Unfortunately, we don't have a side-by-side video, and that could significantly impact our recommendation. We are very interested in what Mirror's Edge has to offer, but it looks like getting the full picture means waiting for the game to hit the streets.

63 Comments

  • Finally - Friday, November 21, 2008 - link

    Thank you Derek for your insightful posting, clarity and all.
    The only lesson I can extract from your writing is the common man's knowledge that you shouldn't mess around with SLI/Crossfire, ever.

    @Tejas:
    [quote]As a 3870X2 quadfire and 4870 Crossfire owner I can say without doubt that AMD driver support is lousy and bordering on scandalous... I still do not have a Crossfire profile for Fallout 3 and it has been almost a month.[/quote]

    Stop bitching. You called for your personal grief and you got it delivered alright. If you have too much time on your hands and want to spend it on ridiculous hobbies, so be it - but don't bitch for the common man.
    Reply
  • Finally - Friday, November 21, 2008 - link

    To clarify the meaning of "calling for personal grief":
    Putting too many graphics cards in your rig is like hiring a motorcycle gang to beat you up with sticks and chains and all, and then running around town showing your bruises and blood pouring to everyone, complaining about how bad you are feeling after that paid-for encounter...
    Reply
  • tejas84 - Thursday, November 20, 2008 - link

    Derek Wilson is 100% right. As a 3870X2 quadfire and 4870 Crossfire owner I can say without doubt that AMD driver support is lousy and bordering on scandalous... I still do not have a Crossfire profile for Fallout 3 and it has been almost a month.

    I had to wait for TWO catalyst revisions until Crysis Warhead and Stalker CS had profiles as well as GRID, Assassins Creed, World in Conflict etc etc....

    Nvidia put in the effort to work with developers to ensure the games work with their hardware and integrate SLI profiles. AMD are arrogant, and I remember an AMD moderator saying that the TWIMTBP program was simply paying for a logo. For a company betting everything on multi GPU, isn't it strange that AMD doesn't work with devs to get Crossfire profiles into games?

    Well actually they pay so that their games work well with the latest games. AMD are lazy and cut corners just like with their CPUs and frankly I am going to sell up my AMD cards and go exclusively Nvidia from now on...

    Bottom line... anyone who thinks that Derek is being harsh has NEVER OWNED AN ATI CROSSFIRE SETUP BEFORE....

    Regards

    Reply
  • Griswold - Friday, November 21, 2008 - link

    Case in point for why multi-GPU solutions suck donkey nuts, no matter what team you depend on - you depend on them twice as much as everyone else (once for the raw driver and its bugs or lack thereof, and once for the profiles). No thanks to that.

    Tough luck, I say. And with Nvidia on what seems to be a financial downward slope, it remains to be seen whether they're willing and able to deliver in the future. Good luck, I say.
    Reply
  • Goty - Thursday, November 20, 2008 - link

    So wait, I think you're forgetting the whole "Call of Juarez" deal. ATI had a deal with the developer there in the same manner that NVIDIA has a deal with all the developers that participate in the TWIMTBP program. NVIDIA's hardware performed like crap in the game when it was first released and everyone cried foul, saying that it was "unfair" and "anti-competitive" for AMD to do something like that.

    Now, if we want to talk about anti-competitive, what about NVIDIA's dubious dealings with Ubisoft and Assassin's Creed and DirectX 10.1 support? Hmmm...
    Reply
  • tejas84 - Thursday, November 20, 2008 - link

    addendum,

    Well actually they pay so that their games work well with the latest games - this refers to Nvidia
    Reply
  • chizow - Thursday, November 20, 2008 - link

    It's not the first; Anand recently ripped into ATI drivers in his Core i7 launch review:

    quote:

    We have often had trouble with AMD drivers, especially when looking at CrossFire performance. The method that AMD uses to maintain and test their drivers necessitates eliminating some games from testing for extended periods of time. This can sometimes result in games that used to work well with AMD hardware or scale well with CrossFire to stop performing up to par or to stop scaling as well as they should.

    The consistent fix, unfortunately, has been for review sites to randomly stumble upon these problems. We usually see resolutions very quickly to issues like this, but that doesn't change the fact that it shouldn't happen in the first place.


    It's a problem that has been gaining momentum lately and has drawn a LOT of attention with the recent Far Cry 2 driver debacle. First there was the issue of render errors, hitching in DX10, and overall poor performance without FPS caps. Then there were hot fixes, fixes for hot fixes, and further hot fixes. Then there were CF problems with newer drivers that necessitated using drivers that had the render errors or DX10 stuttering or both. But it comes down to this: if the recommended fix for a problem is to revert to prior drivers, it's pretty clear the monthly WHQL program isn't working.

    ATI gets more heat because their drivers tend to be reactive, while Nvidia tends to be proactive with its TWIMTBP program and driver updates that arrive in advance of or in tandem with hot launch titles. This latest round of reviews and performance in top 5 titles would confirm this.

    ATI has also made multi-GPU their solution for high-end performance, which means their products rely heavily on CF scaling and compatibility. A big problem here is that ATI does not have user-defined profiles for games like Nvidia does, which means there is no recourse if you have poor CF scaling or performance, short of workarounds like renaming game .exes.
    Reply
  • giantpandaman2 - Thursday, November 20, 2008 - link

    Where's the blame on Ubisoft Montreal? Can't a game company release a game that works with a large portion of video cards?

    That said, I think AMD should go to once every other month. Less overhead, more things fixed with the same amount of man hours. nVidia drivers simply take too damn long. They go to the opposite extreme if you ask me. I owned an 8800GT and it took them 9 months to get their video card fully compatible with my monitor in Vista64. That's ridiculous.

    But, seriously, why do people only blame driver makers and not the fricken game makers who have easy access to the hardware?
    Reply
  • DerekWilson - Thursday, November 20, 2008 - link

    game developers are hitting a moving target as well. they don't have the drivers that will be out when their game launches until their game launches ... and it would have been final for month(s) before that.

    in contrast, AMD and NVIDIA can get their hands on those games months beforehand and make sure that drivers work the way they should with the software.

    there is developer responsibility to be sure, but a driver issue is a driver issue ... game devs can't shoulder that burden.
    Reply
  • JonnyDough - Friday, November 21, 2008 - link

    Hence, MORE STANDARDS!
    Reply
