Introduction

Last week, we took a first look at the new PhysX add-in physics accelerator from AGEIA. After our article was published, AGEIA released a driver update that addresses some of the framerate issues in Ghost Recon Advanced Warfighter. While our main focus this time around is BFG's retail card, we will also explore the effectiveness of this patch and dig deeper into the details behind our performance analysis.

In addition to the BFG retail PhysX card and the Ghost Recon update, we will take a look at a few demos that require the PhysX card to run. While no games scheduled for the near future will push this new technology to the extreme, the demos offer a glimpse into AGEIA's vision for the future. Getting there will certainly be a hard road to travel. Until more games support the hardware, we can't recommend PhysX to anyone but wealthy enthusiasts who enjoy the novelty of hardware for hardware's sake. Even if PhysX significantly enhances the experience of a few games right now, it will be a tough sell to most users until there is much wider software support, a good game that requires the hardware, or a killer app with a hardware-accelerated feature that everyone wants.

As for games with PhysX hardware support, the only three out as of this week are Tom Clancy's Ghost Recon Advanced Warfighter (GRAW), Rise of Nations: Rise of Legends (ROL), and City of Villains (COV). Rise of Legends came out last week, and we have been testing it extensively. Unfortunately, its PhysX hardware support will only arrive in an upcoming patch, for which we have no firm ETA.

We worked very hard to test City of Villains, and we finally succeeded in creating a repeatable benchmark. The specific content in City of Villains that supports the AGEIA PhysX PPU (physics processing unit) is a series of events called Mayhem Missions: a very small subset of the game consisting of timed (15-minute) missions. These missions are being added in Issue 7, which is still on the test server and not yet ready for prime time. Full PhysX support arrived on the test server as of May 10th, so we have benchmarks and videos available.
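
For those wondering what "repeatable" entails: the general approach is to run the same scripted sequence every time while logging per-frame render times, then report the average alongside the worst single frame, since stutter disappears into an average framerate. The C++ sketch below is a generic illustration of that technique, not the actual tool used for our numbers; renderFrame() is a hypothetical stand-in for one iteration of the game loop.

```cpp
// Generic frame-time logging harness: a minimal sketch, not the tool
// used for this article's benchmarks.
#include <chrono>
#include <cstdio>
#include <vector>

// Hypothetical stand-in for rendering one frame of the scripted run.
static void renderFrame() { /* one game loop iteration goes here */ }

int main() {
    using clock = std::chrono::steady_clock;
    std::vector<double> frameTimesMs;

    auto prev = clock::now();
    for (int frame = 0; frame < 1000; ++frame) {
        renderFrame();
        auto now = clock::now();
        // Record wall-clock time for this frame in milliseconds.
        frameTimesMs.push_back(
            std::chrono::duration<double, std::milli>(now - prev).count());
        prev = now;
    }

    // An average fps number hides stutter, so report the worst frame too.
    double total = 0.0, worst = 0.0;
    for (double t : frameTimesMs) {
        total += t;
        if (t > worst) worst = t;
    }
    double avgMs = total / frameTimesMs.size();
    std::printf("avg: %.2f ms/frame (%.1f fps), worst frame: %.2f ms\n",
                avgMs, 1000.0 / avgMs, worst);
    return 0;
}
```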

Before we jump into the numbers, we are going to take a look at the BFG card itself. As this is a full retail part, we will give it a full retail workup: power, noise, drivers, and pricing will all be explored. Our investigation hasn't turned up an on-chip or on-board thermistor, so we won't be reporting temperatures in this review. Our power draw numbers and the size of the heatsink lead us to believe that heat should not be a big issue for PhysX add-in boards.

Comments

  • phusg - Wednesday, May 17, 2006 - link

    > Performance issues must not exist, as stuttering framerates have nothing to do with why people spend thousands of dollars on a gaming rig.

    What does this sentence mean? No, really. It seems to try to say more than just "stuttering framerates on a multi-thousand-dollar rig are ridiculous", or is that it?
  • nullpointerus - Wednesday, May 17, 2006 - link

    I believe he means that the card can't survive in the market if it dramatically lowers framerates even on high-end rigs.
  • DerekWilson - Wednesday, May 17, 2006 - link

    check plus ... sorry if my wording was a little cumbersome.
  • QChronoD - Wednesday, May 17, 2006 - link

    It seems to me like you guys forgot to set a baseline for the system with the PPU card installed. From the picture you posted in the CoV test, the number of physics objects looks like it can be adjusted when AGEIA support is enabled. You should have run a benchmark with the card installed but with the level of physics kept the same. That would eliminate the loading on the GPU as a variable: the GPU load would remain nearly the same, with the only difference being the CPU and PPU taking time sending info back and forth.
  • Brunnis - Wednesday, May 17, 2006 - link

    I bet a game like GRAW actually would run faster if the same physics effects were run directly on the CPU instead of this "decelerator". You could add a lot of physics before the game would start running nearly as badly as it does with the PhysX card. What a great product...
  • DigitalFreak - Wednesday, May 17, 2006 - link

    I'm wondering the same thing.

    "We still need hard and fast ways to properly compare the same physics algorithm running on a CPU, a GPU, and a PPU -- or at the very least, on a (dual/multi-core) CPU and PPU."

    Maybe it's a requirement that the developers have to intentionally limit (via the sliders, etc.) how many "objects" can be generated without the PPU in order to keep people from finding out that a dual core CPU could provide the same effects more efficiently than their PPU.
  • nullpointerus - Wednesday, May 17, 2006 - link

    Why would ASUS or BFG want to get mixed up in a performance scam?
  • DerekWilson - Wednesday, May 17, 2006 - link

    Or EPIC with UnrealEngine 3?

    Makes you wonder what we aren't seeing here, doesn't it?
  • Visual - Wednesday, May 17, 2006 - link

    so what you're showing in all the graphs is lower performance with the hardware than without it. WTF?
    yes i understand that testing without the hardware is only faster because it's running lower detail, but that's not clearly visible from a few glances over the article... and you do know how important the first impression really is.

    now i just gotta ask, why can't you test both software and hardware with the same level of detail? that's what a real benchmark should show at least. Can't you request some complete software emulation from AGEIA that can fool the game into thinking the card is present, and turn on all the extra effects? If not from AGEIA, maybe from ATI or nVidia, who seem to have worked on such emulations that even use their GFX cards. In the worst case, if you can't get the software mode to have all the same effects, why not at least turn off those effects when testing the hardware implementation? In City of Villains, for example, why is the software test run with a lower "Max Physics Debris Count"? (though I assume there are other effects that get automatically enabled with the hardware present and aren't configurable)

    I just don't get the point of this article... if you're not able to compare apples to apples yet, then don't even bother with an article.
  • Griswold - Wednesday, May 17, 2006 - link

    I think they clearly stated in the first article that GRAW, for example, doesn't allow higher debris settings in software mode.

    But even if it did, a $300 part that is supposed to be lightning fast and whatnot should be at least as fast as ordinary software calculations - at higher debris counts.

    I really don't care much about apples and oranges here. The message seems clear: right now it isn't performing up to snuff, for whatever reason.
