Final Thoughts

More than each other, however, there's one other thing that threatens both camps offering hardware physics acceleration: the CPU. Recent years have seen CPUs go multi-core, first with two cores, and this week brought the introduction of Intel's (practically) cheap quad-core Q6600. Because physics simulations are embarrassingly parallel in nature, they aren't just a good match for GPUs and PPUs with their many sub-processors; they are also a logical fit for multi-core CPUs.

While both AMD and Intel have stated that they intend to avoid getting into a core war as a replacement for the MHz race, all signs point to a core war taking place for the foreseeable future, with Intel going so far as to experiment with 80-core designs. Given the monolithic nature of games, these cores will all be put to work one way or another, and what better use than physics simulations, which can be split neatly among cores? While CPUs are not the floating-point powerhouses that dedicated processors are, with multiple cores they can realistically keep the gap closed well enough to prevent dedicated processors from being viable for consumers. In some ways Havok is already betting on this: its software physics middleware is designed to scale well with additional CPU cores.
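To make the point concrete, here's a minimal sketch of how a physics step can be split across however many cores a CPU offers. This is our own illustration of the "embarrassingly parallel" claim, not Havok's actual threading model; the Body struct and the even partitioning of work are assumptions made for brevity.

```cpp
// Minimal sketch: splitting an embarrassingly parallel physics step across CPU cores.
// The Body struct and even work partitioning are illustrative assumptions only.
#include <algorithm>
#include <thread>
#include <vector>

struct Body {
    float x, y, z;    // position
    float vx, vy, vz; // velocity
};

// Integrate one slice of the body array for a single time step.
void integrateSlice(std::vector<Body>& bodies, size_t begin, size_t end, float dt) {
    for (size_t i = begin; i < end; ++i) {
        bodies[i].vz -= 9.8f * dt;   // gravity
        bodies[i].x  += bodies[i].vx * dt;
        bodies[i].y  += bodies[i].vy * dt;
        bodies[i].z  += bodies[i].vz * dt;
    }
}

// Hand each core its own slice of the simulation; no slice touches another's data.
void integrateAll(std::vector<Body>& bodies, float dt) {
    unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    size_t chunk = (bodies.size() + cores - 1) / cores;
    std::vector<std::thread> workers;
    for (unsigned c = 0; c < cores; ++c) {
        size_t begin = c * chunk;
        size_t end   = std::min(bodies.size(), begin + chunk);
        if (begin >= end) break;
        workers.emplace_back(integrateSlice, std::ref(bodies), begin, end, dt);
    }
    for (auto& t : workers) t.join();
}

int main() {
    std::vector<Body> bodies(10000, Body{0, 0, 10, 1, 0, 0});
    integrateAll(bodies, 1.0f / 60.0f);
    return 0;
}
```

Because each body (collisions aside) can be integrated independently, throughput on a step like this scales almost linearly with core count, which is exactly the property that lets multi-core CPUs keep the gap closed.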

Furthermore, the CPU manufacturers (Intel in particular) have a hefty lead in bringing new manufacturing processes to market, and can exploit this to further close the gap versus GPUs (80nm at the high end) and the PhysX PPU (130nm). All of this makes multi-core CPUs an effective and low-risk way of handling physics compared to a riskier dedicated physics processor. For flagship titles developers may go the extra mile on physics; for most other titles we wouldn't expect such an effort.

So what does all this mean for hardware physics acceleration overall? Although the original battle was between the PPU and the GPU, we're left wondering just how much longer Ageia's PhysX software/hardware package can hold out in a war of attrition, at the risk of being marginalized before a decent software library even arrives. Barring a near-miracle, we're ready to write off the PPU as a piece of impressive hardware that provided a technological solution to a problem few people ended up caring about.

The battle that's shaping up looks to be between the GPU and the CPU, with both sides having the pockets and the manufacturing technology to play for keeps. The CPU is the safe bet for a developer, so it's largely up to NVIDIA to push the GPU as a viable physics solution (AMD has so far not taken a proactive approach to GPU physics outside of Havok FX). We know the GPU can be a viable solution for second-order physics, but what we're really interested in is first-order physics. So far this remains unproven as far as gaming is concerned, since the current GPGPU projects working with physics are all doing so as high-performance computing applications that don't render graphics at the same time.
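The distinction matters because of where the results have to end up. Second-order effects can live and die on the GPU, while first-order results have to come back to the game code every frame, on the same GPU that is simultaneously rendering the scene. The sketch below illustrates that data flow; the dispatch and readback functions are hypothetical stand-ins of our own, not calls from CUDA, CTM, or any shipping SDK.

```cpp
// Illustrative frame loop contrasting second-order and first-order GPU physics.
// The dispatch/readback functions are stand-ins for a GPGPU interface, invented
// here for illustration; they are not part of any real SDK.
#include <vector>

struct Contact { int bodyA, bodyB; float impulse; };

void dispatchParticleKernel(float /*dt*/)  { /* simulate + draw eye-candy particles on the GPU */ }
void dispatchRigidBodyKernel(float /*dt*/) { /* simulate gameplay-relevant bodies on the GPU */ }
std::vector<Contact> readBackResults()     { return {}; /* copy contact data to system memory */ }
void applyDamage(int, int, float)          { /* gameplay reacts to a collision */ }

void gameFrame(float dt) {
    // Second-order physics: debris, smoke, sparks. The GPU simulates and renders them,
    // and the game never reads the results back, so nothing stalls.
    dispatchParticleKernel(dt);

    // First-order physics: anything gameplay depends on. Even if the GPU does the math,
    // the contacts must return to the CPU every frame, while that same GPU is also
    // busy rendering the frame.
    dispatchRigidBodyKernel(dt);
    std::vector<Contact> contacts = readBackResults(); // waits for the GPU to finish

    for (const Contact& c : contacts) {
        applyDamage(c.bodyA, c.bodyB, c.impulse);       // results feed back into game state
    }
}

int main() { gameFrame(1.0f / 60.0f); }
```

Whether the readback and the shared rendering load leave enough headroom is exactly the open question raised above.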

Without an idea of how well a GPU will perform with simultaneous tasks, it's too early to call the victor. At the very least, developers won't wait forever, and the GPU camp will need to prove that their respective GPGPU interfaces can provide enough processing power to justify the cost of developing separate physics systems for each GPU line. However, given the trend of moving work back onto the CPU through projects such as AMD's forthcoming Fusion technology, there's an awful lot in favor of the status quo.

Comments

  • FluffyChicken - Thursday, July 26, 2007 - link

    While it's not mass market like gaming, there is Microsoft Robotics Studio, which implements AGEIA PhysX hardware (& software?)
    So they are trying ;-)

    Microsoft Robotics Studio targets a wide audience in an attempt to accelerate robotics development and adoption. An important part of this effort is the simulation runtime. It was immediately obvious that PC and Console gaming has paved the way when it comes to affordable, widely usable, robotics simulation. Games rely on photo-realistic visualizations with advanced physics simulation running within real time constraints. This was a perfect starting point for our effort.

    We designed the simulation runtime to be used in a variety of advanced scenarios with high demands for fidelity, visualization, and scaling. At the same time, a novice user with little to no coding experience can use simulation; developing interesting applications in a game-like environment. Our integration of the AGEIA PhysX Technologies enables us to leverage a very strong physics simulation product that is mature and constantly evolving towards features that will be invaluable to robotics. The rendering engine is based on Microsoft XNA Framework.


    So expect a large surge at Dell from the 15-year-olds looking to hook up their Lego.
  • DeathBooger - Thursday, July 26, 2007 - link

    There is no need. Not to mention Epic hasn't said anything about it in over two years. If anything it would just be eye candy, since Unreal Tournament 3 relies on its online multiplayer. You can't have added interactive features that only a percentage of players will be able to utilize in a multiplayer game.

    Some Unreal Engine 3 titles are replacing the built-in Ageia SDK in favor of Havok's SDK. Stranglehold and Blacksite are examples of this.
  • Bladen - Friday, July 27, 2007 - link

    Physics cards go here >

    Non physics cards go there <
  • Schrag4 - Thursday, July 26, 2007 - link

    My friends and I have had this 'chicken and egg' discussion on many occasions, specifically about why physics hardware is not taking off. As long as a game only uses the physics for eye-candy, the feature won't affect gameplay at all and therefore can be turned off by those who don't have the resources to play with it turned on (no PhysX card, no multiple cores, no SLI graphics, whatever). So who's gonna buy a 200-400 dollar card that's not needed?

    In order for hardware like PhysX to take off, there MUST be a game where the physics is up front, interactive, what makes the game fun to play, and it MUST be required. Not only that, but it better be one hell of a game, one that people just can't do without. I mean, after all, since this is the 'egg' in the chicken-egg scenario, you're basically spending 400 bucks for the game that you want to play, since there are no other games that are even worth mentioning (again, if it's just eye candy, who cares).

    If you don't believe me about the eye-candy comments (about how eye-candy has its place but is over-valued), then please explain to me why the Wii is outselling its direct competition? It's because the games are FUN (mostly because of the innovative interface), not because they look great (they don't). I mean, come on, who cares what a game looks like if it's tedious and frustrating, shoot, even just boring to play.

    What we're longing for is a game where there are no more canned animations for everything. For instance, you don't press a fire button to swing a sword. You somehow define a sword stroke that's different every time you swing. Also, whether or not you hit your target should not be defined by your distance from your target. It should be defined by the strength of the joints that make up your character, along with the mass of the sword, along with the mass of whatever gets in the way of your swing, etc etc. We're actually working on such a game. It's early in development, and we don't plan on having anything beyond what can be played at LAN parties, but it's a dream we all share and maybe, just maybe, we can eke out something interesting. FYI, we are using the PhysX SDK...
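As a toy illustration of the simulation-driven melee the commenter describes, the sketch below resolves a sword swing from the sword's mass and speed, the wielder's joint strength, and the mass of whatever is in the way, rather than from a distance check and a canned animation. Everything in it (structs, numbers, the resistance heuristic) is invented for illustration; it is not code from their project or from the PhysX SDK.

```cpp
// Toy sketch: resolving a melee swing from physical quantities rather than a canned
// animation. All structs, constants, and heuristics here are invented for illustration.
#include <algorithm>
#include <cstdio>

struct Swing {
    float swordMass;      // kg
    float tipSpeed;       // m/s at the point of contact
    float jointStrength;  // how much resistance the wielder's joints can push through (arbitrary units)
};

struct Target {
    float mass;           // kg of whatever gets in the way of the swing
};

// Returns the impulse delivered to the target, or 0 if the wielder's joints give way
// and the swing is stopped before the blow lands cleanly.
float resolveSwing(const Swing& s, const Target& t) {
    float swingMomentum = s.swordMass * s.tipSpeed;  // what the sword brings to the contact
    float resistance    = t.mass * 0.1f;             // crude stand-in for how hard the target is to shift
    if (resistance > s.jointStrength) {
        return 0.0f;                                 // arms buckle and the swing is deflected
    }
    return std::max(0.0f, swingMomentum - resistance); // leftover impulse becomes damage/knockback
}

int main() {
    Swing  swing  {2.0f, 12.0f, 40.0f};
    Target goblin {60.0f};
    std::printf("impulse delivered: %.1f\n", resolveSwing(swing, goblin));
    return 0;
}
```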
  • Myrandex - Friday, July 27, 2007 - link

    UT3 should use PhysX for environments and not just features. Reading the article shows that PhysX can be done in s/w. That way, everyone can play the same game, and join the same servers, etc., but if they are running on an older system, PhysX will just eat their CPU's resources completely. If they upgrade to 64 core 256bit CPUs, then it will run nice, or if they pop in a little PCI card, it will run nice.

    Either way, it's definite that the game has to be revolutionary, good, and always have PhysX running for at least the environmental aspects (maybe leave it as an option for particle physics so they can get some performance back somehow for playing on their Compy 486).
  • AttitudeAdjuster - Thursday, July 26, 2007 - link

    The issue of getting access to the results of any calculation performed on a GPU is a major one. On that subject you might be interested in the preprint of a scientific paper on using multiple GPUs to perform real physical (not game-related) calculations with the nVidia CUDA SDK. The preprint is by Schive et al. (astro-ph/0707.2991), available at the arXiv.org physics preprint server: http://arxiv.org/abs/0707.2991
  • Warder45 - Thursday, July 26, 2007 - link

    I wonder what that new LucasArts game Fracture (I think) is using for the deformable terrain.
  • jackylman - Thursday, July 26, 2007 - link

    Typo in the last paragraph of Page 3:
    "...if the PhysX hardware is going to take of or not..."
  • Sulphademus - Friday, July 27, 2007 - link

    " We except Ageia will be hanging on for dear life until then."

    Also page 3. I except you mean expect.
  • Regs - Thursday, July 26, 2007 - link

    I would think AMD would be pushing more physics by using a co-processor. Why not have Ageia team up with AMD to make one for games and sell an AMD CPU bundled with the co-processor for gamers? I think that would be a lesser risk than making a completely independent card for it.
