Final Words

Ideally, we would have a few more games to test in order to get a better understanding of what developers are doing with the hardware. We'd also love a little more flexibility in how the software we test handles hardware usage and physics detail. For example, what sort of performance can be had using multithreaded physics calculations on dual-core or multi-core systems? Can a high-end CPU even handle the same level of physics detail as the PhysX card, or has GRAW downgraded the complexity of the software calculations for a reason? It would also be very helpful if we could dig up some low-level technical detail on the hardware. Unfortunately, you can't always get what you want.

For now, the tests we've run here are quite impressive in terms of visuals, but we can't say for certain whether or not the PPU contributes substantially to the quality. From what GRAW has shown us, and from the list of titles on the horizon, it is clear that developers are taking an interest in this new PPU phenomenon. We are quite happy to see more interactivity and higher levels of realism make their way into games, and we commend AGEIA for their role in speeding up this process.

The added realism and immersion of playing Ghost Recon Advanced Warfighter with hardware physics is a huge success in this gamer's opinion. Granted, the improved visuals aren't the holy grail of game physics, but this is an excellent first step. In a fast firefight with bullets streaming by, helicopters raining destruction from the heavens, and grenades tearing up the streets, the experience is just that much more hair-raising with a PPU plugged in.

If every game out right now supported some type of physics enhancement with a PPU under the hood, it would be easy to recommend one to anyone who wants higher image quality than the most expensive CPU and GPU can currently offer. For now, one or two games aren't going to earn a recommendation to spend the requisite $300, especially when we don't know the extent of what other developers are doing. For those with money to burn, it's certainly a great part to play with; whether it actually becomes worth the price of admission remains to be seen. We are cautiously optimistic having seen these first fruits, especially considering how much more can be done.

Obviously, there's going to be some question of whether or not the PPU will catch on and stay around for the long haul. Luckily, software developers need not worry. AGEIA has worked very hard to do everything right, and we think they're on the right track. Their PhysX SDK is an excellent software physics solution in its own right: Sony is shipping it with every PS3 development console, and there are Xbox 360 games powered by the PhysX SDK as well. Even if the hardware totally fails to gain acceptance, games can still fall back to a software solution. Unfortunately, it's still up to developers to provide the option for modifying physics quality in software as well as hardware, as GRAW demonstrates.

As of now, the PhysX SDK has been adopted by engines such as Unreal Engine 3 (Unreal Tournament 2007), Reality Engine (CellFactor), and Gamebryo (recently used for The Elder Scrolls IV: Oblivion, though Havok is implemented in lieu of PhysX support). This type of developer penetration is good to see, and it will hopefully provide a compelling upgrade argument to consumers in the next 6-12 months.

We are still an incredibly long way off from seeing games that require the PhysX PPU, but it's not outside the realm of possibility. With such easy access to the PhysX SDK for developers, there has got to be some pressure now for products in the one-to-two-year pipeline to get in as many beyond-the-cutting-edge features as possible. Personally, I'm hoping AGEIA PhysX hardware support will make it onto the list. If AGEIA is able to prove their worth on the console middleware side, we may end up seeing a PPU in an Xbox 3 and PS4 down the line as well. There were plenty of skeptics who doubted the PhysX PPU would ever make it out the door, but having passed that milestone, who knows how far they'll go?

We're still a little skeptical about how much the PhysX card is actually doing that couldn't be done on a CPU -- especially a dual core CPU. Hopefully this isn't the first "physics decelerator", rather like the first S3 ViRGE 3D chip was more of a step sideways for 3D than a true enhancement. The promise of high quality physics acceleration is still there, but we can't say for certain at this point how much faster a PhysX card really makes things - after all, we've only seen one shipping title, and it may simply be a matter of better optimization of the PhysX code. With E3 on the horizon and more games coming out "real soon now", rest assured that we will have continuing coverage of AGEIA and the PhysX PPU.


  • DerekWilson - Friday, May 5, 2006 - link

    We will be taking a look at CellFactor as soon as we can.
  • Egglick - Friday, May 5, 2006 - link

    quote:

    It seems most likely that the slowdown is the cost of instancing all these objects on the PhysX card and then moving them back and forth over the PCI bus and eventually to the GPU. It would certainly be interesting to see if a faster connection for the PhysX card - like PCIe X1 - could smooth things out....

    You've certainly got a point there. Seeing as how a physics card is more like a co-processor than anything else, the PCI bus is probably even more of a limitation than it would be with a graphics card, where most of the textures can simply be loaded into the framebuffer beforehand.

    I still believe that the best option is to piggyback PPUs onto graphics cards. Not only does this allow them to share the MUCH higher bandwidth PCIe x16 slot, but it would also mean nearly instant communication between the physics chip and the GPU. The two chips could share the same framebuffer (RAM), as well as a cooling solution. This would lower costs significantly and increase performance.
  • DerekWilson - Friday, May 5, 2006 - link

    Combo boards, while not impossible to make, are going to be much more complex. There could also be power issues, as the PhysX card and today's GPUs both require external power. It'd be cool to see, and it might speed up adoption, but I think it's unlikely to happen given the ROI to board makers.

    The framebuffer couldn't really be shared between the two parts either.
  • Rolphus - Friday, May 5, 2006 - link

    On page 2: "A graphics card, even with a 512-bit internal bus running at core speed, has less than 350 Mb/sec internal bandwidth." - er, I'm guessing that should read 350Gb/sec?
  • JarredWalton - Friday, May 5, 2006 - link

    Yes. Correcting....
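
    For what it's worth, the corrected figure is easy to sanity-check: a bus moving data once per clock delivers width × clock bits per second. A minimal sketch (the 650 MHz core clock is an illustrative assumption, roughly a high-end GPU of the day, not a number from the article):

    ```python
    # Sanity check for the page 2 bandwidth figure.
    # Assumption: ~650 MHz core clock (illustrative only).
    bus_width_bits = 512
    core_clock_hz = 650_000_000

    bits_per_second = bus_width_bits * core_clock_hz
    gbits_per_second = bits_per_second / 1e9

    print(f"{gbits_per_second:.1f} Gb/sec")  # prints "332.8 Gb/sec"
    ```

    So at any plausible core clock the result lands in the hundreds of gigabits per second, which is why the unit had to be Gb/sec rather than Mb/sec.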
  • Rolphus - Friday, May 5, 2006 - link

    Thanks for the quick response - I've just finished the article. It's good stuff: interesting analysis, and the commentary and general subtext of "nice but not essential" is extremely useful.

    One random thing - is images.anandtech.com struggling, or is my browser just being a pain? I've been having trouble seeing a lot of the images in the article, needing various reloads to get them to show etc.
  • ATWindsor - Friday, May 5, 2006 - link

    AnandTech images don't work properly if you disable referer logging (pretty annoying) - could that be the root of your problem? (AdBlock disabling it or something)
  • JarredWalton - Friday, May 5, 2006 - link

    Seems to be doing fine from our end. If you're at a company with a firewall or proxy, that could do some screwy stuff. We've also had reports from users that have firewall/browser settings configured to only show images from the source website - meaning since the images aren't from www.anandtech.com, they get blocked.

    As far as I know, both the images and the content are on the same physical server, but there are two different names. I could be wrong, as I don't have anything to do with the system administration. :)
  • Rolphus - Friday, May 5, 2006 - link

    Weird, seems to be fine now I've disabled AdBlock in Firefox... that'll teach me. It's not like I block any of AnandTech's ads anyway, apart from the intellitxt stuff - that drives me NUTS.
  • JarredWalton - Friday, May 5, 2006 - link

    Click the "About" link, then "IntelliTxt". You might be pleasantly surprised to know it can be turned off.
