Final Words

Ideally, we would have a few more games to test in order to get a better understanding of what developers are doing with the hardware. We'd also love a little more flexibility in how the software we test handles hardware usage and physics detail. For example, what sort of performance can be had using multithreaded physics calculations on dual-core or multi-core systems? Can a high-end CPU even handle the same level of physics detail as the PhysX card, or has GRAW downgraded the complexity of the software calculations for a reason? It would also be very helpful if we could dig up some low-level technical detail on the hardware. Unfortunately, you can't always get what you want.
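
To put that question in concrete terms: once collision constraints have been resolved for a frame, the per-object integration step is almost embarrassingly parallel, so a second core can simply take half of the work. The sketch below is purely illustrative on our part - it is not taken from GRAW or the PhysX SDK, and every name in it is hypothetical - but it shows the kind of load a dual-core CPU could absorb:

    // Purely illustrative: split per-body integration across two cores.
    // Not from GRAW or the PhysX SDK; all names here are hypothetical.
    #include <cstddef>
    #include <thread>
    #include <vector>

    struct Body { float pos[3]; float vel[3]; };

    // Integrate a contiguous slice of bodies forward by one timestep.
    void integrateSlice(std::vector<Body>& bodies, size_t begin, size_t end, float dt) {
        for (size_t i = begin; i < end; ++i)
            for (int axis = 0; axis < 3; ++axis)
                bodies[i].pos[axis] += bodies[i].vel[axis] * dt;
    }

    // Hand half the bodies to a worker thread; do the rest on the main thread.
    void integrateParallel(std::vector<Body>& bodies, float dt) {
        const size_t half = bodies.size() / 2;
        std::thread worker([&] { integrateSlice(bodies, 0, half, dt); });
        integrateSlice(bodies, half, bodies.size(), dt);
        worker.join();
    }

Of course, the hard part is everything before that step - collision detection and constraint solving don't split nearly so cleanly - which is exactly why we'd like to see real numbers.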

For now, the tests we've run here are quite impressive in terms of visuals, but we can't say for certain whether or not the PPU contributes substantially to the quality. From what GRAW has shown us, and from the list of titles on the horizon, it is clear that developers are taking an interest in this new PPU phenomenon. We are quite happy to see more interactivity and higher levels of realism make their way into games, and we commend AGEIA for their role in speeding up this process.

The added realism and immersion of playing Ghost Recon Advanced Warfighter with hardware physics is a huge success in this gamer's opinion. Granted, the improved visuals aren't the holy grail of game physics, but this is an excellent first step. In a fast firefight with bullets streaming by, helicopters raining destruction from the heavens, and grenades tearing up the streets, the experience is just that much more hair-raising with a PPU plugged in.

If every game out right now supported some type of physics enhancement with a PPU under the hood, it would be easy to recommend it to anyone who wants higher image quality than the most expensive CPU and GPU can currently offer. For now, one or two games aren't going to get a recommendation for spending the requisite $300, especially when we don't know the extent of what other developers are doing. For those with money to burn, it's certainly a great part to play with. Whether it actually becomes worth the price of admission remains to be seen. We are cautiously optimistic having seen these first fruits, especially considering how much more can be done.

Obviously, there's going to be some question of whether or not the PPU will catch on and stay around for the long haul. Luckily, software developers need not worry. AGEIA has worked very hard to do everything right, and we think they're on the right track. Their PhysX SDK is an excellent software physics solution in its own right - Sony is shipping it with every PS3 development console, and there are Xbox 360 games around with the PhysX SDK powering them as well. Even if the hardware totally fails to gain acceptance, games can still fall back to a software solution. Unfortunately, it's still up to developers to provide the option for modifying physics quality under software as well as hardware, as GRAW demonstrates.
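
As a sketch of what that fallback might look like in code: the PhysX 2.x-era SDK (formerly NovodeX) lets a game request a simulation type when creating a scene, so a title can ask for the PPU and retry in software if that fails. We're reconstructing this from memory of the public SDK headers, so treat the exact identifiers and defaults below as assumptions on our part rather than a definitive implementation:

    // Best-effort sketch of a PPU-to-software fallback using the PhysX
    // 2.x-era (NovodeX) API; identifiers are from memory and approximate.
    #include "NxPhysics.h"

    NxScene* createSceneWithFallback(NxPhysicsSDK* sdk) {
        NxSceneDesc sceneDesc;
        sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
        sceneDesc.simType = NX_SIMULATION_HW;      // ask for the PPU first
        NxScene* scene = sdk->createScene(sceneDesc);
        if (!scene) {
            sceneDesc.simType = NX_SIMULATION_SW;  // same API, CPU-only simulation
            scene = sdk->createScene(sceneDesc);
        }
        return scene;
    }

The appeal for developers is that the game code above the scene doesn't change; only where the simulation runs does. What's missing, as GRAW shows, is a commitment to exposing that switch (and matching detail levels) to the end user.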

As of now, the PhysX SDK has been adopted by engines such as UnrealEngine3 (Unreal Tournament 2007), Reality Engine (CellFactor), and Gamebryo (recently used for Elder Scrolls IV: Oblivion, though Havok is implemented there in lieu of PhysX support). This type of developer penetration is good to see, and it will hopefully provide a compelling upgrade argument to consumers in the next 6-12 months.

We are still an incredibly long way off from seeing games that require the PhysX PPU, but it's not outside the realm of possibility. With such easy access to the PhysX SDK for developers, there has got to be some pressure for titles still one to two years out to pack in as many beyond-the-cutting-edge features as possible. Personally, I'm hoping AGEIA PhysX hardware support will make it onto that list. If AGEIA is able to prove their worth on the console middleware side, we may end up seeing a PPU in the XBox3 and PS4 down the line as well. There were plenty of skeptics who doubted the PhysX PPU would ever make it out the door, but having passed that milestone, who knows how far they'll go?

We're still a little skeptical about how much the PhysX card is actually doing that couldn't be done on a CPU -- especially a dual-core CPU. Hopefully this isn't the first "physics decelerator", rather like how the first S3 ViRGE 3D chip was more of a step sideways for 3D than a true enhancement. The promise of high quality physics acceleration is still there, but we can't say for certain at this point how much faster a PhysX card really makes things - after all, we've only seen one shipping title, and it may simply be a matter of better optimizing the PhysX code. With E3 on the horizon and more games coming out "real soon now", rest assured that we will have continuing coverage of AGEIA and the PhysX PPU.

Comments

  • DrZoidberg - Sunday, May 7, 2006 - link

    Yes, dual-core CPUs aren't being properly utilised by games. Only a handful of games like COD2, Quake4 etc. have big improvements with the dual-core CPU patches. I would rather game companies spend time trying to utilise dual core so the 2nd core gets to do a lot of physics work, rather than sitting mostly idle. Plus the cost of the AGEIA card is too high. Already I'm having trouble justifying buying a 7900GT or X1900 when I have a decent graphics card; I can't upgrade graphics every year. $300 for a physics card that only a handful of games support is too much. And the AT benchies don't show a lot of improvement.
  • Walter Williams - Friday, May 5, 2006 - link

    We are starting to see multithreaded games that basically do this.

    However, it is very difficult and time-consuming to make a game multi-threaded, which is why not many games are built this way.
  • Hypernova - Friday, May 5, 2006 - link

    The PhysX API claims to be multi-core compatible. What will probably happen is that the API and engine will load balance the calculations between whatever resources are available: the PPU, the CPU, or better yet both.
  • JarredWalton - Friday, May 5, 2006 - link

    Isn't it difficult to reprogram a game to make use of the hardware accelerated physics as well? If the GRAW tests perform poorly because the support is "tacked on", wouldn't that suggest that doing PhysX properly is at least somewhat difficult? Given that PhysX has a software and hardware solution, I really want to be able to flip a switch and watch the performance of the same calculations running off the CPU. Also, if their PhysX API can be programmed to take advantage of multiple threads on the PhysX card (we don't have low-level details, so who knows?), it ought to be able to multithread calculations on SMP systems as well.

    I'd like to see the CellFactor people add the option of *trying* to run everything without the PhysX card. Give us an apples-to-apples comparison. Until we can see that sort of comparison, we're basically in the dark, and things can be hidden quite easily in the dark....
  • Walter Williams - Friday, May 5, 2006 - link

    PPU support is simply achieved by using the NovodeX physics engine or any game engine that uses the NovodeX engine (e.g. Unreal Engine 3.0). The developers of GRAW decided to go beyond the basic approach to adding PPU support, adding additional graphical effects for users of the PPU - this is similar to how Far Cry 64 advertised better graphics because it is 64-bit, as a marketing gimmick. GRAW seems to have issues in general and is not a very reliable test.

    At QuakeCon '05, I had the opportunity to listen to the CEO of AGEIA speak and then meet with AGEIA representatives. They had a test system with an Athlon 64 X2, a top of the line video card, and a PPU. The demo that I was able to play looked to be an early version of Unreal Engine 3.0 (maybe Huxley?), and during the demo they could turn PPU support on and off. Every time we switched between the two, we would take notice of the CPU usage meter and of the FPS, and there was a huge difference.

    It will be really interesting to see what happens when Microsoft releases their physics API (think DirectX, but for physics) - this should make everyone's lives better.
  • johnsonx - Friday, May 5, 2006 - link

    Having downloaded and viewed the videos, my reaction is "so what?". I guess the PhysX sequence has a bit more crap flying around, but it's also quite a lot slower (slower, probably, than just letting the CPU process the extra crap). It seems obvious that this game doesn't make proper use of the PhysX card, as I can't otherwise imagine that AGEIA would have wasted so much time and money on it.
  • DerekWilson - Friday, May 5, 2006 - link

    quote: "(slower probably than just letting the CPU process the extra crap)"

    We'd really love to test that, but it is quite impossible to verify right now.
  • Walter Williams - Friday, May 5, 2006 - link

    Have you seen the CellFactor videos yet?
  • Spoonbender - Friday, May 5, 2006 - link

    Well, looking at the videos, I know what I prefer. The framerate hit with PhysX is definitely too noticeable. I'm curious to see how this turns out in other games, and with driver revisions and newer versions of the hardware (and PCIe would probably be a good idea as well).

    In any case, I read somewhere that they weren't expecting these cards to evolve as fast as GPUs. Rather, it'd have a life cycle about the same as for sound cards. That seemed a bit encouraging to me. Having to fork out $300+ for yet another card every year or two didn't sound too attractive. But if I just have to buy it once, I guess it might catch on.
  • tk109 - Friday, May 5, 2006 - link

    With quad cores around the corner, this really isn't looking too promising.
