The idea of hardware accelerated physics has been around for a long time and PhysX on NVIDIA GPUs has had some time to mature. There are more games coming out with support for hardware PhysX and not all of them have completely sucked. So we want to get a better picture of the impact of PhysX on our readers. Is this a thing that matters to you?

Before we get to the questions, last week saw the announcement of several upcoming titles that will support PhysX:

Terminator Salvation
Dark Void
Darkest of Days

Until we actually play the games, we won't know whether the PhysX implementations are any good. Many of these ideas, like debris, fog, smoke, contrails, destructible environments, and weapon and melee effects, have seen the light of day in other titles only to fall short of expectations. But at least Mirror's Edge was able to take some of the same effects and package them in a professional and appealing way.

There are still more games that will support PhysX in the near future, but other titles that touted their PhysX support (like Cryostasis) have fallen short of expectations. We certainly see a future in hardware accelerated physics, but, in the eyes of our readers, is physics hardware really "here" with NVIDIA and PhysX, or will OpenCL be the vehicle that ushers in a new era of game physics?
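To see why physics work maps so well onto GPUs in the first place, consider that a basic rigid-particle simulation applies the same independent update to every object every frame. The sketch below is a toy illustration only, not real PhysX or OpenCL code; the function name and particle counts are our own invention. It uses NumPy's data-parallel array operations as a stand-in for what a GPU would do across thousands of debris fragments at once:

```python
import numpy as np

# Toy illustration (not PhysX or OpenCL): one explicit-Euler timestep
# applied to every particle simultaneously. Each particle's update is
# independent of the others, which is exactly the data-parallel shape
# that GPU hardware accelerates.
def step(positions, velocities, dt, gravity=np.array([0.0, -9.81, 0.0])):
    """Advance all particles one timestep in a single vectorized sweep."""
    velocities = velocities + gravity * dt   # same rule for every particle
    positions = positions + velocities * dt  # no cross-particle dependencies
    return positions, velocities

# Thousands of debris fragments updated in one data-parallel pass
pos = np.zeros((10000, 3))
vel = np.random.randn(10000, 3)
pos, vel = step(pos, vel, dt=1.0 / 60.0)
```

The point of the sketch is the shape of the computation: no particle waits on any other, so the work divides cleanly across hundreds of GPU lanes, which is why unscripted debris and destruction scale so much better on a GPU than on a handful of CPU cores.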

To get a better idea of the landscape, we'll be asking two questions about PhysX software and hardware. For the software question, it would be helpful if those who do not have PhysX hardware could answer the question as if they did. We can't limit respondents to NVIDIA hardware owners, but we would like to keep things as fair as possible.

{poll 131:850}


  • shin0bi272 - Friday, May 08, 2009 - link

    Funny... I ran that same demo on High today at 12x10 on an i7 920 (not overclocked) with an 8800 GTX and got an average of 33 fps with a minimum of 16.9. He's basically getting boned because he's recording at the same time (probably with his webcam). Maybe he has his video card's 3D rendering set to quality mode, maybe he turned off hardware physics in the NVIDIA control panel and set the benchmarking program to use hardware (the card supports it; it's just off), or maybe he needs to update his drivers; I don't know exactly. All I know is that if I run at 10x7 on High, I get an average fps in the 40s.
  • Psynaut - Wednesday, May 06, 2009 - link

    If you want only people who own Nvidia cards to vote, you might want to provide a way for people to view the results without having to vote first.
  • Sureshot324 - Wednesday, May 06, 2009 - link


    1. Games are usually GPU limited even with SLI setups, so shifting any job from the CPU (where there are cores doing nothing) to the GPU is a bad thing.

    2. It's detrimental to people who don't own PhysX hardware, or who just don't want to waste GPU power on physics. Games that support hardware physics usually have better physics effects if you use it. However, if you run PhysX in software mode, they usually don't come close to maxing out all your CPU cores, which suggests the developers (or Nvidia) are deliberately crippling the software implementation.

    3. It's detrimental to the advancement of physics technology in games. If PhysX becomes the de facto standard, Nvidia will have no incentive to improve it. It will be like when Creative got a monopoly on the gaming sound market: EAX 6.0 is basically the same thing as EAX 2.0 except that it supports more voices.

    Hopefully Microsoft puts physics into the next DirectX. At least then it would be hardware neutral.
  • rgallant - Wednesday, May 06, 2009 - link

    -First: last week I put a spare 8800 GTS 512 in my machine to go with my GTX 285 to finish playing Mirror's Edge.
    -I did not really see anything improved; it's a fast game, with not much time to look around.
    -But what came to mind is that most people judge a video card's power by how it looks in benchmarks, good or bad, and yet they install a second or third card that adds no value to the gameplay but ups the power draw by 100+ watts. So why all the fuss about power draw for a given card, and then they say it supports PhysX?
    -Also, when these sites compare ATI-to-Nvidia buying recommendations, why do they never test e.g. a 4890 + a 4870 [old card] for extra value to the product, if that still works?
    -So here is why I think Nvidia could have used the R&D money better: make some old cards add to the overall game FPS, e.g. an 8800 GTS + GTX 285 SLI-type thing, instead of the old card sitting in a closet. And if it's sold or given away, it's still one less card that someone buys retail.
    -Given the two, I would pick a second helper card, not some useless eye candy that might or might not work in any given game.
  • AznBoi36 - Wednesday, May 06, 2009 - link

    I wonder what would happen if a game such as World of Warcraft supported PhysX...
  • shin0bi272 - Wednesday, May 06, 2009 - link

    I've been following the idea of hardware-accelerated physics since Ageia announced their idea at E3 in '03. Back then almost no games utilized the thing. Now anyone who's played UT3 or Gears of War, or any of several other games, has played a game that can benefit from hardware-accelerated physics. The coolest part is that the acceleration allows unscripted physics with hundreds of objects larger than a paint can at once. For the past decade and a half we've had to put up with thin wooden fences that can't be shot through with a missile launcher or run over with a tank. Hardware-accelerated physics can fix things like that and make worlds more destructible and more realistic. Now that Nvidia has purchased Ageia and incorporated their software into their cards, we get the added bonus of PhysX on the video card itself, eliminating the Achilles' heel of the Ageia P1 card: the PCI bus. So now with every Nvidia card you not only get one of the top-performing cards in its class, you also get hardware physics as added value.

    I know most people don't agree with me, and that's fine. The idea has been gaining ground for 5 years, and it's not going away now that Nvidia has bought them out and is pushing it. Not only can you do Havok physics or proprietary physics, but you can also do hardware-accelerated physics (which they give away for free instead of charging $250k like Havok does).
  • fausto412 - Wednesday, May 06, 2009 - link

    If you notice, any software-physics game that tries to come close to what you can do with PhysX is majorly taxing on a system. A prime example is Crysis: great physics, but it abuses whatever hardware you throw at it.

    I believe GPU physics is here to stay... but then again, I've never seen what OpenCL looks like. The problem with PhysX is that Nvidia is doing it to sell hardware with good marketing. If they wanted it to be adopted, they would make it openly available and free, and there would be no need for OpenCL. But if ATI were going to make it work on their cards, they would have to pay; that's why ATI is going with OpenCL. Now the funny thing is that we consumers have to deal with this nonsense when all we want is pretty graphics with awesome physics... damn the politics. If only Nvidia played nice, GPU physics would boom within 6 months. Nobody bases their buying decisions on PhysX when we all know this is a battle between Nvidia and ATI.

    Best 3D performance bang for the buck is still king; PhysX is just a bonus, and only if the games you play use it. UT doesn't benefit from it unless you play the PhysX mod... and nobody plays those online!
  • drwheel - Wednesday, May 06, 2009 - link

    I first read about Ageia and PhysX 3 or 4 years ago. When they had a working card out there, I remember reading the reviews, seeing screenshots, and watching the tech videos. I was underwhelmed to say the least. What was most confusing to me at the time was the added performance hit. Wasn't the idea of hw physics acceleration to increase performance?

    "But wait!", they said. "Check out these extra cool effects!" I can't speak for anyone else, but the effects weren't anything ground-breaking. Most of what I saw were added particles here and there. Ooh! To top things off, title support was dismal at the time.

    Needless to say, I wasn't surprised when they were acquired by Nvidia in 2008. I was excited at the time because I owned an 8800 GTX. Now, I thought, Nvidia will take this potentially great thing and do it right. Accelerate PhysX via the GPU, get it out there, and push developer support.

    Well, it's been almost a year and a half, and what do they have to show for it? The only game I can recall doing anything cool with PhysX is Unreal Tournament III, and that implementation consisted of a pack of 3 add-on maps that play more like a tech demo than anything else. Really? Is this what I was looking forward to?

    I am not endlessly loyal to ATI or Nvidia. It so happens that my 8800 GTX died this past month. I bought an HD 4830 (which I am amazed with for $70, but that's another story) to replace it until the DX11 cards start rolling out later this year.

    The bottom line is this: When I do buy a new video card at the end of this year, NOTHING about that decision will be based on PhysX support. It's not even a passing thought.
  • McRhea - Wednesday, May 06, 2009 - link

    +1 to the comment above.
  • GaryJohnson - Wednesday, May 06, 2009 - link

    Marginal is the only right answer to the first two questions. If it's not used to do something in software that you use or plan to use, how is it even relevant?

    It basically comes down to "it's good if it's useful," but the other choices amount to either "it's better than good even if it's not useful" or "it's bad because I'm not using it."
