ASUS PhysX Card

It's not as dramatic as a 7900 GTX or an X1900 XTX, but here it is in all its glory. We welcome the new ASUS PhysX card to the fold.

The chip and the RAM are under the heatsink/fan, and there really isn't that much else going on here. The slot cover on the card has AGEIA PhysX written on it, and there's a 4-pin Molex connector on the back of the card for power. We're happy to report that the fan doesn't make much noise and the card doesn't get very warm (especially when compared to GPUs).

We did run into an occasional issue when installing the card after the drivers were already in place: the first time we powered up the system, we couldn't use the AGEIA hardware until we powered the system down completely and booted up again. This didn't happen every time we installed the card, but it did happen more than once. It's probably not a big deal, and could easily be a consequence of the early software and hardware we're using. Other than that, everything seemed to work well in the two pieces of software it's currently possible to test.

Our test system is set up similarly to our graphics test systems, with the addition of a simulated lower-speed CPU. We were curious to find out whether the PhysX card helps slower processors more than fast ones, so we set our FX-57 to a 9X multiplier (9 x 200MHz = 1.8GHz) to simulate an Opteron 144. Otherwise, the test bed is the same as we've used for recent GPU reviews:

AMD Athlon 64 FX-57
AMD Opteron 144 (simulated)
ASUS NVIDIA nForce4 SLI X16 Motherboard
2GB OCZ DDR RAM
ATI Radeon X1900 XTX
ASUS PhysX PPU
Windows XP SP2
OCZ PowerStream 600W PSU


Now let's see how the card actually performs in practice.

101 Comments

  • DrZoidberg - Sunday, May 7, 2006 - link

    Yes, dual core CPUs aren't being properly utilised by games. Only a handful of games like COD2, Quake 4, etc. show big improvements with the dual core CPU patches. I would rather game companies spend time utilising dual core so the 2nd core gets to do a lot of physics work, rather than sitting mostly idle. Plus, the cost of the AGEIA card is too high. I'm already having trouble justifying buying a 7900 GT or X1900 when I have a decent graphics card; I can't upgrade graphics every year. $300 for a physics card that only a handful of games support is too much. And the AT benchmarks don't show a lot of improvement.
  • Walter Williams - Friday, May 5, 2006 - link

    We are starting to see multithreaded games that basically do this.

    However, it is very difficult and time-consuming to make a game multithreaded, which is why not many games are built this way.
  • Hypernova - Friday, May 5, 2006 - link

    The PhysX API claims to be multi-core compatible. What will probably happen is that the API and engine will load balance the calculations across whatever resources are available: the PPU, the CPU, or better yet both.
  • JarredWalton - Friday, May 5, 2006 - link

    Isn't it difficult to reprogram a game to make use of the hardware accelerated physics as well? If the GRAW tests perform poorly because the support is "tacked on", wouldn't that suggest that doing PhysX properly is at least somewhat difficult? Given that PhysX has both a software and a hardware solution, I really want to be able to flip a switch and watch the performance of the same calculations running off the CPU. Also, if the PhysX API can be programmed to take advantage of multiple threads on the PhysX card (we don't have low level details, so who knows?), it ought to be able to multithread calculations on SMP systems as well (see the sketch below).

    I'd like to see the CellFactor people add the option of *trying* to run everything without the PhysX card. Give us an apples-to-apples comparison. Until we can see that sort of comparison, we're basically in the dark, and things can be hidden quite easily in the dark...
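
A minimal sketch of that idea, using modern C++ for brevity: the per-body integration step of a physics update is independent work that can be split across CPU cores. The Body struct and integrateSlice helper below are invented for illustration and are not AGEIA SDK code:

    #include <cstddef>
    #include <functional>
    #include <thread>
    #include <vector>

    // Hypothetical rigid body state; a real engine tracks far more.
    struct Body { float pos[3]; float vel[3]; };

    // Integrate one contiguous slice of bodies. Slices are independent,
    // so each can run on its own core without locking.
    void integrateSlice(std::vector<Body>& bodies, std::size_t begin,
                        std::size_t end, float dt) {
        for (std::size_t i = begin; i < end; ++i)
            for (int axis = 0; axis < 3; ++axis)
                bodies[i].pos[axis] += bodies[i].vel[axis] * dt;
    }

    int main() {
        std::vector<Body> bodies(10000);
        const float dt = 1.0f / 60.0f;             // one 60Hz frame
        const std::size_t mid = bodies.size() / 2;

        // Run the same calculations on two CPU threads, as an SMP
        // (or dual core) system could.
        std::thread t1(integrateSlice, std::ref(bodies), std::size_t(0), mid, dt);
        std::thread t2(integrateSlice, std::ref(bodies), mid, bodies.size(), dt);
        t1.join();
        t2.join();
        return 0;
    }

The catch, as noted earlier in the thread, is that collision detection and constraint solving are not this cleanly separable, which is part of what makes fully multithreaded physics difficult.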
  • Walter Williams - Friday, May 5, 2006 - link

    PPU support is simply achieved by using the NovodeX physics engine, or any game engine built on the NovodeX engine (e.g. Unreal Engine 3.0). The developers of GRAW decided to take a non-basic approach to adding PPU support, adding extra graphical effects for users of the PPU - similar to how Far Cry 64 advertised better graphics because it is 64-bit, as a marketing gimmick. GRAW seems to have issues in general and is not a very reliable test (see the sketch below for how scene-level PPU support looks in the SDK).

    At QuakeCon '05, I had the opportunity to listen to the CEO of AGEIA speak and then meet with AGEIA representatives. They had a test system with an Athlon 64 X2, a top of the line video card, and a PPU. The demo I was able to play looked to be an early version of the Unreal Engine 3.0 (maybe Huxley?), and during the demo they could turn PPU support on and off. Every time we switched between the two, we would take note of the CPU usage meter and the FPS, and there was a huge difference.

    It will be really interesting to see what happens when Microsoft releases their physics API (think DirectX, but for physics) - this should make everyone's lives better.
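
To make the above concrete: in the AGEIA PhysX 2.x SDK (the former NovodeX engine), targeting the PPU versus the CPU comes down to a flag at scene creation. The sketch below is based on the public 2.x headers; exact member names and enum values may vary between SDK releases, so treat it as illustrative rather than definitive:

    #include <NxPhysics.h>  // AGEIA PhysX 2.x SDK (formerly NovodeX)

    // Create a scene on the PPU if requested, otherwise on the CPU.
    // Flipping this one flag is, in principle, the "same calculations
    // on CPU vs. PPU" comparison discussed above.
    NxScene* createPhysicsScene(NxPhysicsSDK* sdk, bool useHardware) {
        NxSceneDesc desc;
        desc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
        // NX_SIMULATION_HW targets the PPU; NX_SIMULATION_SW runs the
        // same simulation on the host CPU.
        desc.simType = useHardware ? NX_SIMULATION_HW : NX_SIMULATION_SW;
        return sdk->createScene(desc);
    }

If GRAW gates its extra effects on the presence of a PPU rather than exposing a switch like this, that would explain why an apples-to-apples CPU-vs-PPU benchmark isn't currently possible there.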
  • johnsonx - Friday, May 5, 2006 - link

    Having downloaded and viewed the videos, my reaction is "so what?". I guess the PhysX sequence has a bit more crap flying around, but it's also quite a lot slower (slower probably than just letting the CPU process the extra crap). It seems obvious that this game doesn't make proper use of the PhysX card, as I can't otherwise imagine that AGEIA would have wasted so much time and money on it.
  • DerekWilson - Friday, May 5, 2006 - link

    quote: "(slower probably than just letting the CPU process the extra crap)"

    We'd really love to test that, but it is quite impossible to verify right now.
  • Walter Williams - Friday, May 5, 2006 - link

    Have you seen the CellFactor videos yet?
  • Spoonbender - Friday, May 5, 2006 - link

    Well, looking at the videos, I know what I prefer. The framerate hit with PhysX is definitely too noticeable. I'm curious to see how this turns out in other games, and with driver revisions and newer versions of the hardware (and PCIe would probably be a good idea as well).

    In any case, I read somewhere that they weren't expecting these cards to evolve as fast as GPUs. Rather, they'd have a life cycle about the same as sound cards. That seemed a bit encouraging to me. Having to fork out $300+ for yet another card every year or two didn't sound too attractive. But if I just have to buy it once, I guess it might catch on.
  • tk109 - Friday, May 5, 2006 - link

    With quad cores around the corner, this really isn't looking too promising.
