BFG PhysX and the AGEIA Driver

Let us begin with the BFG PhysX card itself. The specs are identical to those of the ASUS card we previewed. Specifically, we have:

130nm PhysX PPU with 125 million transistors
128MB GDDR3 @ 733MHz Data Rate
32-bit PCI interface
4-pin Molex power connector


The BFG card has a bonus: a blue LED behind the fan. Our BFG card came in a retail box, pictured here:


Inside the box, we find CDs, power cables, and the card itself:



As we can see here, BFG opted to go with Samsung's K4J55323QF-GC20 GDDR3 chips. There are four chips on the board, each organized as 4 banks of 2Mb x 32 (32MB per chip). The chips are rated at 2ns, giving a maximum clock speed of 500MHz (1GHz data rate), but the memory clock speed used on current PhysX hardware is only 366MHz (733MHz data rate). Running the memory below its rated speed could save power and keep the card within a lower thermal envelope, and it might also let board makers be more aggressive with chip timings if latency is a larger concern than bandwidth for the PhysX hardware. This is only speculation at this point, but such an approach is certainly not beyond the realm of possibility.
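For readers who want to check the math behind those figures, here is a minimal sketch of how the 2ns rating and the shipping clock translate into clock speeds and peak bandwidth. The 128-bit aggregate bus width is our assumption based on the four 32-bit chips; AGEIA has not published the bus configuration.

```python
# Sanity check of the memory figures quoted above.
# Assumption: the four 32-bit GDDR3 chips form a 128-bit aggregate bus (not confirmed by AGEIA).

chip_rating_ns = 2.0                            # Samsung K4J55323QF-GC20 rated cycle time
rated_clock_mhz = 1000.0 / chip_rating_ns       # 2ns per cycle -> 500 MHz core clock
rated_data_rate_mhz = rated_clock_mhz * 2       # GDDR3 transfers on both edges -> 1000 MHz

shipping_clock_mhz = 366.0                      # clock used on current PhysX boards
shipping_data_rate_mhz = shipping_clock_mhz * 2 # ~733 MHz data rate

bus_width_bits = 4 * 32                         # assumed 128-bit bus from four x32 chips
bandwidth_gb_s = shipping_data_rate_mhz * 1e6 * bus_width_bits / 8 / 1e9

print(f"Rated:    {rated_clock_mhz:.0f} MHz core / {rated_data_rate_mhz:.0f} MHz data rate")
print(f"Shipping: {shipping_clock_mhz:.0f} MHz core / {shipping_data_rate_mhz:.0f} MHz data rate")
print(f"Approximate peak bandwidth: {bandwidth_gb_s:.1f} GB/s")
```

Under that assumed bus width, the shipping clock works out to roughly 11.7 GB/s of peak memory bandwidth.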


The BFG card sells for about $300 at major online retailers, but can be found for as low as $280. The ASUS PhysX P1 Ghost Recon Edition is bundled with GRAW for about $340, while the BFG part does not come with any PhysX accelerated games. It is possible to download a demo of CellFactor now, which does add some value to the product, but until we see more (and much better) software support, we will have to recommend that interested buyers take a wait-and-see attitude towards this part.

As for software support, AGEIA is constantly working on its driver and releasing new versions. The driver interface is shown here:



There isn't much to the user side of the PhysX driver. We see an informational window, a test application, a diagnostic tool to check or reset hardware, and a help page. There are no real "options" to speak of in the traditional sense. The card itself really is designed to be plugged in and forgotten about. This does make it much easier on the end user under normal conditions.

We also tested the power draw and noise of the BFG PhysX card. Here are our results:

Noise (in dB)
Ambient (PC off): 43.4
No BFG PhysX: 50.5
BFG PhysX: 54.0


The BFG PhysX Accelerator does audibly add to the noise. Of course, the noise increase is nowhere near as bad as listening to an ATI X1900 XTX fan spin up to full speed.
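To put those decibel readings in perspective, the difference between the two system measurements can be converted into a rough sound power ratio using the standard decibel relationship; a minimal sketch using our measured values:

```python
# Convert the measured dB difference into an approximate sound power ratio.
# A difference of X dB corresponds to a power ratio of 10^(X/10).

system_only = 50.5    # system noise without the PhysX card (dB)
with_physx = 54.0     # system noise with the PhysX card installed (dB)

delta_db = with_physx - system_only      # 3.5 dB added by the card's fan
power_ratio = 10 ** (delta_db / 10)      # ~2.2x the acoustic power of the bare system

print(f"Adding the PhysX card raises noise by {delta_db:.1f} dB")
print(f"That is roughly {power_ratio:.1f}x the sound power of the system alone")
```

In other words, the card's fan adds a little over twice the acoustic power of the bare system, which is noticeable but far from dramatic.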

Idle Power (in Watts)
No Hardware: 170
BFG PhysX: 190


Load Power without Physics Load (in Watts)
No Hardware: 324
BFG PhysX: 352


Load Power with Physics Load (in Watts)
No Hardware: 335
BFG PhysX: 300


At first glance these results can be a bit tricky to understand. The load tests were performed with our low quality Ghost Recon Advanced Warfighter physics benchmark. Our test "without Physics Load" is taken before we throw the grenade and blow up everything, while the "with Physics Load" reading is made during the explosion.

Yes, system power draw (measured at the wall with a Kill-A-Watt) decreases under load when the PhysX card is being used. This is made odder by the fact that the power draw of the system without a physics card increases during the explosion. Our explanation is quite simple: The GPU is the leading power hog when running GRAW, and it becomes starved for input while the PPU generates its data. This explanation fits in well with our observations on framerate under the games we tested: namely, triggering events which use PhysX hardware in current games results in a very brief (yet sharp) drop in framerate. With the system sending the GPU less work to do per second, less power is required to run the game as well. While we don't know the exact power draw of the PhysX card itself, it is clear from our data that it doesn't pull nearly the power that current graphics cards require.
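Since we measure total system draw at the wall, the card's contribution can only be estimated by differencing the two configurations; a minimal sketch of that arithmetic using the numbers above (keep in mind that Kill-A-Watt readings include power supply losses, so these deltas slightly overstate what the card itself draws):

```python
# Estimate the PhysX card's contribution by differencing wall measurements.
# Note: wall readings include PSU inefficiency, so the deltas overstate the
# card's actual draw somewhat.

readings = {
    "idle":              {"no_card": 170, "with_physx": 190},
    "load, no physics":  {"no_card": 324, "with_physx": 352},
    "load, physics":     {"no_card": 335, "with_physx": 300},
}

for scenario, watts in readings.items():
    delta = watts["with_physx"] - watts["no_card"]
    print(f"{scenario:>17}: {delta:+d} W at the wall")

# The negative delta under a physics load reflects the GPU being starved for
# input while the PPU generates data, not the card somehow drawing less than zero.
```

The idle and non-physics load deltas of roughly 20 to 30W at the wall give a ballpark upper bound on the card's own draw, which is well below what current graphics cards require.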


67 Comments


  • apesoccer - Wednesday, May 17, 2006 - link

    That's a good question as well...especially for those of us using other additional pci cards...
  • mbhame - Wednesday, May 17, 2006 - link

    You guys are giving Ageia WAY too much slack. :(
    Call a spade a spade and save face.
  • apesoccer - Wednesday, May 17, 2006 - link

    There's no use in throwing in the towel before we get in the ring...
  • mbhame - Wednesday, May 17, 2006 - link

    Throwing in the towel...? How do you infer that from what I said?

    I said "Call a spade a spade". Whether Anandtech.com chooses an easy-out path of "Currently the PPU sucks..." (not in so many words) or not, there is tremendous grace extended to Ageia around here, and frankly, it stinks.

    Obviously there is a fine line between journalism with respect (which 99% of other websites are ignorant of) and brown-nosing, or needing to get a pair. All I'm saying is it's not very clear where this site's stand is amongst these possibilities.
  • Trisped - Wednesday, May 17, 2006 - link

    I think everyone is being cautiously optimistic that the tech will improve. I wasn't around the 3D accelerator scene when that first happened, but from what I hear those cards were expensive and actually were worse than not having them. But now they are required for every game and Windows Vista.

    We want to wait and see if they can work out the bugs, give us better comparisons, and compare it to the GPU-only systems that are supposed to be coming. Once we have all the facts we can pass a final verdict; until then everything is guesswork.
  • apesoccer - Wednesday, May 17, 2006 - link

    There's a lot of grace given to it everywhere...I have yet to see an article bash them. There has been a lot of interest in this product, and frankly, the general consensus is that we want to see it succeed. That aside, I don't think they can make a precise statement saying...This product is going to suck balls...or this is going to be the next Sliced Bread...

    My problem with it is the lack of depth to the findings (and your statement "Call a spade a spade"...). I wish they had tried more kinds of CPUs with different kinds of GPUs, at several resolutions, with hardware and software at both the same settings and different ones. Without those tests, you can't really say you've tested the product.

    Basically...because they haven't done enough work with it yet [imo] (due to time constraints or whatever...), we can't make any real statements about this product. Other than: at the one hardware setting they ran it at, compared to the different software setting ( >< ), the software setting scored better in fps. Which tells us what? The PPU uses overhead CPU cycles when doing at least 3x the amount of work the CPU would be doing at the lower software settings. So let's see some different settings (and some of the hardware/software running at the same), so we can get a better idea of the big picture.
  • mbhame - Wednesday, May 17, 2006 - link

    I don't agree with your assessment on the general consensus. My circles vehemently want it to fail as it's an additional cost to our PCs, an additional heat source, an additional power requirement... and for what?

    I think you're kidding yourself if you think some other CPU:GPU combination would yield appreciably-different results.
  • DerekWilson - Wednesday, May 17, 2006 - link

    we're working very hard to find ways to more extensively test the hardware. you've hit the nail on the head with why people haven't been tearing this part up. we are certainly suspicious of its capabilities at this point, but we just don't have the facts to draw a hard line on its real value (except in the context of "right now").
  • mbhame - Wednesday, May 17, 2006 - link

    Well then make stronger statements in the present-tense. Just because someone makes a product designed to do "X", it doesn't mean that they'll succeed in doing so. You guys come across as if it's a given the PPU *will be* a success and in doing so generate a level of expectation of success. As it stands now this is a total flop - treat it as such. Then IF and when they DO make it worthwhile for some appreciable reason then we can marvel at their about-face collectively.

    It's not cynicism, it's reality.
  • AnnonymousCoward - Friday, May 19, 2006 - link

    Why do you need stronger criticism? You've been able to determine what's going on, and that's because the performance charts speak for themselves. I'd rather read what's currently on the conclusion page, instead of the obvious "This product's performance sucks with current games."
