Introduction

A little over a year ago, we first heard about a company called AGEIA whose goal was to bring high-quality physics processing power to the desktop. Today, they have succeeded in that mission. For a short while now, systems with the PhysX PPU (physics processing unit) have been shipping from Dell, Alienware, and Falcon Northwest, and PhysX add-in cards will soon be available in retail channels. Today also marks the release of the very first PhysX-accelerated game, Tom Clancy's Ghost Recon Advanced Warfighter, and to top off the excitement, ASUS has given us an exclusive look at its hardware.

We have put together a couple of benchmarks designed to illustrate the impact of AGEIA's PhysX technology on game performance, and we will certainly comment heavily on our experience while playing the game. The potential benefits have been discussed quite a bit over the past year, but now we finally get a taste of what the first PhysX-accelerated games can do.

With NVIDIA and ATI starting to dip their toes into physics acceleration as well (with Havok FX and in-house demos of other technology), knowing the playing field is very important for all parties involved. Many developers and hardware manufacturers will definitely give this technology some time before jumping on the bandwagon, as should be expected. Will our exploration show enough added benefit for PhysX to be worth the investment?

Before we hit the numbers, we want to take another look at the technology behind the hardware.

Comments

  • DrZoidberg - Sunday, May 7, 2006

    Yes, dual-core CPUs aren't being properly utilized by games; only a handful, like COD2 and Quake 4, show big improvements with their dual-core patches. I would rather game companies spend time utilizing dual core so the second core gets to do a lot of physics work rather than sitting mostly idle. Plus, the cost of the AGEIA card is too high. I already have trouble justifying a 7900GT or X1900 when I have a decent graphics card; I can't upgrade graphics every year. $300 for a physics card that only a handful of games support is too much, and the AT benchies don't show a lot of improvement.
  • Walter Williams - Friday, May 5, 2006

    We are starting to see multithreaded games that basically do this.

    However, it is very difficult and time-consuming to make a game multithreaded, which is why not many games are built this way.
  • Hypernova - Friday, May 5, 2006

    The PhysX API claims to be multi-core compatible; what will probably happen is that the API and engine will load balance the calculations across any available resources - the PPU, the CPU, or better yet both (see the scene-creation sketch after the comments).
  • JarredWalton - Friday, May 5, 2006

    Isn't it difficult to reprogram a game to make use of hardware accelerated physics as well? If the GRAW tests perform poorly because the support is "tacked on", wouldn't that suggest that doing PhysX properly is at least somewhat difficult? Given that PhysX has both a software and a hardware solution, I really want to be able to flip a switch and watch the performance of the same calculations running off the CPU. Also, if their PhysX API can be programmed to take advantage of multiple threads on the PhysX card (we don't have low level details, so who knows?), it ought to be able to multithread calculations on SMP systems as well (see the threading sketch after the comments).

    I'd like to see the CellFactor people add the option of *trying* to run everything without the PhysX card. Give us an apples-to-apples comparison. Until we can see that sort of comparison, we're basically in the dark, and things can be hidden quite easily in the dark....
  • Walter Williams - Friday, May 5, 2006

    PPU support is simply achieved by using the NovodeX physics engine, or any game engine built on it (e.g. Unreal Engine 3.0). The developers of GRAW took a non-basic approach to adding PPU support, layering extra graphical effects on for PPU owners - similar to how Far Cry 64 advertises better graphics because it is 64-bit, as a marketing gimmick. GRAW seems to have issues in general and is not a very reliable test.

    At QuakeCon '05, I had the opportunity to hear the CEO of AGEIA speak and then meet with AGEIA representatives. They had a test system with an Athlon 64 X2, a top-of-the-line video card, and a PPU. The demo I got to play looked to be an early version of Unreal Engine 3.0 (maybe Huxley?), and during the demo they could turn PPU support on and off. Every time we switched between the two, we would take note of the CPU usage meter and the FPS, and there was a huge difference.

    It will be really interesting to see what happens when Microsoft releases their physics API (think DirectX, but for physics) - this should make everyone's lives better.
  • johnsonx - Friday, May 5, 2006

    Having downloaded and viewed the videos, my reaction is "so what?". I guess the PhysX sequence has a bit more crap flying around, but it's also quite a lot slower (slower, probably, than just letting the CPU process the extra crap). It seems obvious that this game doesn't make proper use of the PhysX card, as I can't otherwise imagine that AGEIA would have wasted so much time and money on it.
  • DerekWilson - Friday, May 5, 2006

    quote: "(slower probably than just letting the CPU process the extra crap)"

    We'd really love to test that, but it is quite impossible to verify right now.
  • Walter Williams - Friday, May 5, 2006

    Have you seen the CellFactor videos yet?
  • Spoonbender - Friday, May 5, 2006

    Well, looking at the videos, I know which I prefer. The framerate hit with PhysX is definitely too noticeable. I'm curious to see how this turns out in other games, with driver revisions, and with newer versions of the hardware (PCIe would probably be a good idea as well).

    In any case, I read somewhere they weren't expecting these cards to evolve as fast as GPUs; rather, they'd have a life cycle about the same as sound cards. That seemed a bit encouraging to me. Having to fork out $300+ for yet another card every year or two didn't sound too attractive, but if I just have to buy it once, I guess it might catch on.
  • tk109 - Friday, May 5, 2006

    With quad cores around the corner, this really isn't looking too promising.
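
To ground the API points raised in the comments above: as we understand it, the NovodeX/PhysX SDK of this era exposes the PPU-versus-CPU choice as a per-scene simulation type, so the "flip a switch" comparison Jarred asks for is at least conceptually simple. Below is a minimal sketch assuming a PhysX 2.x-style C++ API; the Nx* names follow that SDK as we understand it, and the fallback logic is our own illustration rather than AGEIA-documented practice.

    #include "NxPhysics.h"  // AGEIA PhysX 2.x SDK header

    // Create a scene, preferring the PPU but falling back to CPU simulation.
    NxScene* CreateScene(NxPhysicsSDK* sdk, bool wantHardware)
    {
        NxSceneDesc desc;
        desc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
        desc.simType = wantHardware ? NX_SIMULATION_HW : NX_SIMULATION_SW;

        NxScene* scene = sdk->createScene(desc);
        if (!scene && wantHardware) {
            // No PPU present (or hardware scene creation failed):
            // same API, same game code, software simulation instead.
            desc.simType = NX_SIMULATION_SW;
            scene = sdk->createScene(desc);
        }
        return scene;
    }

The appeal of this pattern is that the calling game code is identical either way, which is exactly what would make an apples-to-apples CPU-versus-PPU benchmark feasible if a developer chose to expose the switch.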
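
On the SMP side of that discussion, the easy part of multithreading physics is integrating independent bodies in parallel; the hard part is resolving the collisions and constraints between them, which is plausibly why, as Walter Williams notes above, multithreading a game is so difficult and time-consuming. Here is a minimal sketch of the easy part, using modern C++ threads for brevity; the Body type and integrator are hypothetical and not taken from any shipping engine.

    #include <thread>
    #include <vector>

    struct Body { float pos[3]; float vel[3]; };

    // Advance a half-open range [begin, end) of bodies by one timestep.
    void integrate(std::vector<Body>& bodies, size_t begin, size_t end, float dt)
    {
        for (size_t i = begin; i < end; ++i) {
            for (int k = 0; k < 3; ++k)
                bodies[i].pos[k] += bodies[i].vel[k] * dt;
            bodies[i].vel[1] -= 9.81f * dt;  // gravity pulls along -y
        }
    }

    int main()
    {
        std::vector<Body> bodies(10000, Body{{0, 100, 0}, {1, 0, 0}});
        const float dt = 1.0f / 60.0f;

        // One worker per core on a dual-core CPU: a second thread integrates
        // the first half of the bodies while the main thread does the rest.
        const size_t mid = bodies.size() / 2;
        std::thread worker(integrate, std::ref(bodies), size_t(0), mid, dt);
        integrate(bodies, mid, bodies.size(), dt);
        worker.join();
    }

Nothing in that loop contends for shared state, so it scales to a second core almost for free; once bodies start interacting across the partitions, the synchronization cost is what makes "just use the second core for physics" easier said than done.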
