Last week, we took a first look at the new PhysX add-in physics accelerator from AGEIA. After our article was published, AGEIA released an update to their driver that addresses some of the framerate issues in Ghost Recon Advanced Warfighter. While our main focus this time around will be on BFG's retail part, we will explore the effectiveness of this patch and go a little further in-depth with the details behind our performance analysis.

In addition to the BFG retail PhysX card and Ghost Recon update, we will take a look at a few demos that require the PhysX card to run. While there aren't any games scheduled to come out in the near future that will take this new technology to the extreme, it will be nice to get a glimpse into the vision AGEIA has for the future. Getting there will certainly be a hard road to travel. Until more games come out that support the hardware, we certainly can't recommend PhysX to anyone but the wealthy enthusiasts who enjoy the novelty of hardware for hardware's sake. Even if PhysX significantly enhances the experience of a few games right now, it will be a tough sell to most users until there is either much wider software support, good games which require the hardware, or a killer app with a PhysX hardware accelerated feature that everyone wants to have.

As for games which will include PhysX hardware support, the only three available as of this week are Tom Clancy's Ghost Recon Advanced Warfighter (GRAW), Rise of Nations: Rise of Legends (ROL) and City of Villains (COV). Rise of Legends came out last week, and we have been testing it extensively. Unfortunately, PhysX hardware support will only be added in an upcoming patch, for which we have no real ETA.

We worked very hard to test City of Villains, and we finally succeeded in creating a repeatable benchmark. The specific content in City of Villains which supports the AGEIA PhysX PPU (physics processing unit) is a series of events called the Mayhem Missions. This is a very small subset of the game consisting of timed (15 minute) missions. Currently these missions are part of Issue 7, which is still on the test server and is not ready for primetime. Full support for PhysX was included on the test server as of May 10th, so we have benchmarks and videos available.

Before we jump into the numbers, we are going to take a look at the BFG card itself. As this is a full retail part, we will give it a full retail workup: power, noise, drivers, and pricing will all be explored. Our investigations haven't turned up an on-chip or on-board thermistor, so we won't be reporting heat for this review. Our power draw numbers and the size of the heat sink lead us to believe that heat should not be a big issue for PhysX add-in boards.

BFG PhysX and the AGEIA Driver


Comments

  • hatsurfer - Thursday, May 18, 2006 - link

    I just got my card today from Newegg. I only had about an hour to play before I left for work. I wanted to see the effects when I destroyed a building. I played through the first mission at 1600x1200 and my frames stayed a solid 30 with v-sync enabled, since I play on a large LCD. It was pretty nice to see the MANY building parts flying in every direction with the smoke effect. All in all it looked pretty cool and realistic. I am currently gaming on an EVGA 7800 GTX while awaiting my 7900 GTX from EVGA's step-up program. I would like to take full advantage of my optimal resolution of 1920x1200, but my frames dropped to the 20s. I can only imagine my 7900 GTX will get me to my full HD resolution, which I can't sustain on a single 7800 GTX anyway.

    I think anyone with a high-end system isn't going to have any hangups when it comes to frame rates. Is this for low-end budget gaming systems? Probably not just yet, but neither is the price tag of $300+. So right now it's a nice little extra piece of eye candy for me to enjoy, and in the end that's what is important anyway.

    I hope this technology takes off and drives the number of supported titles up, and if it gets incorporated into other components, so be it (I really hope so too!). We'll just do another upgrade like the ones we're all used to doing every few months anyway. Such it has always been at the cutting edge of PC'ing, and such it will always be.
  • Mabus - Wednesday, May 17, 2006 - link

    Well, I wonder how long before ATI or NVIDIA buys this company and integrates the logic into its GPU. Not only would this allow better performance, it would also allow optimisations in drivers, communications and instruction exclusivity. This would make for a very nice smaller card: one less slot used on the mobo, cooler running, etc.

    Remember the separate math co-processor? This tech will definitely go the same way; it is only a matter of time. And face it, the current benchies show that time is what we need to get that performance where us gamers want it.

    Mabus signing off.....
  • Trisped - Wednesday, May 17, 2006 - link

    Why didn't you compare the stats of the Asus card with those of the BFG card? Sound, power: if they are the same, then say so; otherwise, what is the point of reviewing two different cards for the same thing?

    COV is an unfair test. Max Physics Debris Count should be the same so you can see if there is a performance boost at the same level. I know that if I give my GPU 1000 more objects it is going to go much slower, with or without the PhysX card. What I want to see is runs at both 1500 and 422 "Debris", each with and without the card.

    I would like to see the tests run with more than just two different processors: what about an Intel dual core, and what about different video card configurations? If you have a lower-end video card, should you expect less of an FPS impact, or more?

    How long before there is a PCIe card? Would there be a performance boost using PCIe? I would think that if the physics card and GPU were on the same bus there would be less latency as well as faster communication. The fact that the PhysX card wouldn't have to wait for the processor to stop sending it info before it started transmitting would also be important for speed.

    "If spawning lots of effects on the PhysX card makes the system stutter, then it is defeating its won purpose." Should be own, not won.
  • VooDooAddict - Wednesday, May 17, 2006 - link

    I'm waiting to pass judgment on this tech till after I see reviews with high end Dual core CPUs and a PhysX board connected via PCI-Express.

    I just can't see how connecting something like this via PCI was ever going to work. This isn't mainly one-way communication like an audio board during gaming, where you send the information that needs to be output and the sound board outputs it. My understanding is that the PhysX board also needs to communicate the calculated information back to the CPU and GPU.

    Now would be a great time for NVIDIA to step up with some new demos of their physics hardware acceleration via SLI. For an extra couple of bills... SLI right now is much more justifiable than this physics hardware, AND SLI is really one of those things that isn't a necessity. With SLI-based physics you could run at low physics and max frames for multiplayer twitch games, and just enable hardware physics for single player and MMOs. I'm hoping that will give you the best of both worlds. I'll be waiting for reviews like that.
  • DerekWilson - Wednesday, May 17, 2006 - link

    You are correct in that PhysX requires two-way communication during gameplay in many cases. This hits on why I think the demos run smoother than the games out right now. In GRAW and CoV, data needs to be sent to the PhysX card, the PPU needs to do some work, and data needs to be sent back to the CPU.

    In the demos that use the PhysX hardware for everything, data is loaded into the PhysX card's on-board memory as the scene is being set up. Interacting with the world then requires much less data to be sent to the PPU per frame, as all the objects have already been created, set up and stored locally. This should significantly cut down on traffic and provide less of a performance impact while also enabling more complicated effects and physics.
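    The traffic difference described above can be sketched with a toy back-of-the-envelope model. Every number in this sketch (the 64-byte per-object state, the 10% activity rate, the object counts) is an invented assumption for illustration, not a measured PhysX figure, and `frame_traffic_bytes` is a hypothetical helper, not part of any SDK:

    ```python
    # Toy cost model: per-frame bus traffic when full object state is
    # streamed both ways every frame, vs. when objects are preloaded into
    # the PPU's on-board memory and only active-object deltas cross the bus.
    # All constants below are invented for illustration.

    def frame_traffic_bytes(num_objects, per_object_state=64, preloaded=False):
        """Rough bytes crossing the bus per frame under this toy model."""
        if preloaded:
            # Objects already live on the PPU; assume only ~10% are active
            # in a given frame and need their results sent back.
            active = max(1, num_objects // 10)
            return active * per_object_state
        # Otherwise full state goes out (CPU->PPU) and results come back
        # (PPU->CPU) every frame.
        return num_objects * per_object_state * 2

    objects = 1500  # e.g. CoV's "Max Physics Debris Count" setting
    print("streamed each frame:", frame_traffic_bytes(objects), "bytes")
    print("preloaded on PPU:   ", frame_traffic_bytes(objects, preloaded=True), "bytes")
    ```

    Under these made-up assumptions the preloaded case moves a small fraction of the data per frame, which is the shape of the effect the theory above predicts, though the real ratio would depend entirely on the game and driver.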
  • DerekWilson - Wednesday, May 17, 2006 - link

    Just going to say again that the above is just a theory.
  • poohbear - Wednesday, May 17, 2006 - link

    yea yea yea, so how do we overclock these things? :)
  • Tephlon - Wednesday, May 17, 2006 - link

    I know this doesn't have EXACT bearing on the article, but I thought some readers would be interested.
    I now own one of these cards (the BFG PhysX PPU), and it doesn't seem as bad to me as the benchmarks make it out to be. I installed the card and played the game with the exact settings I had previously, and didn't FEEL a difference. I haven't run straight benchmarks, though, so the numbers might very well end up similar to AnandTech's findings.

    Although it has no real bearing on gameplay, I have noticed more from the addition of the PPU than anyone ever credits. Surrounding cars rock more accurately when nearby explosions erupt, and their suspension and doors move more accurately from bullets and explosions. Even lightpoles bend and react better than without the card. I even had one explosion that caused the 'cross bars' (not sure what to call them) on top of the lightpole to break loose and swing back and forth for a good 20 seconds until they broke free and hit the ground. It was really neat.

    I know it's not important stuff, but I can SEE a positive difference with the card, and don't really FEEL the lost performance it's claimed to have. Often I do see the FRAPS meter on my G15 LCD drop dramatically for a VERY quick instant, but I don't feel it in actual gameplay, and I believe this also happened on my rig before I had the PPU. Maybe it's just that I'm not a 'BEST PERFORMANCE EVER' kind of guy. I've heard guys on forums complain about their FPS dropping from 56 to 51. So what? This isn't a run & gun game. My roommate plays co-op with me, and FRAPS tells him he's running 20fps average. It doesn't look it to me, and he doesn't feel it either. An issue with FRAPS? Maybe. My point is that at this point the loss of FPS (for me) doesn't equal a loss of performance, so I'm pleased with the card. I'm also curious for other owners to throw up their comments as well.

    Again, my findings aren't that SCIENTIFIC, but they're real life. I enjoy it, and REALLY hope they see enough support to get better implementation in future games. I think improved support and tweaked drivers will convince the masses. I hope it happens soon, or it might be a lost cause.

    For those interested. My rig.

    Asus A8N32-SLI
    Corsair XMS 2GB (2 x 1GB) TWINX2048-3500LLPRO
    BFG Geforce 7900GTOC x2
    BFG Ageia Physx PPU
    BFG 650watt PSU
    Creative Audigy 2 ZS
    NEC Black 16X DVD+R
    WD 36gb Raptor
    WD 320gb SATA
  • Tephlon - Wednesday, May 17, 2006 - link

    Oops. I guess I probably have a CPU to go with that rig. Hehe.

    AMD Athlon X2 4400+ Toledo

  • poohbear - Wednesday, May 17, 2006 - link

    Are you sure those physics effects aren't already there without the PhysX card? Havok states GRAW already uses the Havok engine to begin with; PhysX only adds extra particles for grenades and stuff.
