BFG PhysX and the AGEIA Driver

Let us begin with the BFG PhysX card itself. The specs are exactly the same as those of the ASUS card we previewed. Specifically, we have:

130nm PhysX PPU with 125 million transistors
128MB GDDR3 @ 733MHz Data Rate
32-bit PCI interface
4-pin Molex power connector


The BFG card has a bonus: a blue LED behind the fan. Our BFG card came in a retail box, pictured here:


Inside the box, we find CDs, power cables, and the card itself:



As we can see here, BFG opted to go with Samsung's K4J55323QF-GC20 GDDR3 chips. There are four chips on the board, each organized as four banks of 2M x 32 (32MB per chip). The chips are rated at 2ns, giving a maximum clock speed of 500MHz (1GHz data rate), but the memory clock speed used on current PhysX hardware is only 366MHz (733MHz data rate). Running below the rated clock speed could save power and hit a lower thermal envelope, and it might also allow board makers to be more aggressive with chip timings if latency is a larger concern than bandwidth for the PhysX hardware. This is just speculation at this point, but such an approach is certainly not beyond the realm of possibility.
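
As a back-of-the-envelope check on what those clocks mean for bandwidth, here is a minimal sketch. It assumes the four 32-bit chips form a 128-bit aggregate bus; AGEIA has not published the actual bus configuration, so the width is our assumption:

    # Peak GDDR3 bandwidth estimate for the PhysX board.
    # Assumption: four 32-bit Samsung chips form a 128-bit aggregate bus.
    BUS_WIDTH_BITS = 4 * 32        # four K4J55323QF chips, x32 each
    SHIPPING_DATA_RATE = 733e6     # transfers/sec as clocked on the card
    RATED_DATA_RATE = 1000e6       # transfers/sec the 2ns chips allow

    def peak_bandwidth_gb_s(data_rate):
        # bytes per transfer x transfers per second
        return (BUS_WIDTH_BITS / 8) * data_rate / 1e9

    print(f"as shipped: {peak_bandwidth_gb_s(SHIPPING_DATA_RATE):.1f} GB/s")   # ~11.7
    print(f"at rated speed: {peak_bandwidth_gb_s(RATED_DATA_RATE):.1f} GB/s")  # ~16.0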


The BFG card sells for about $300 at major online retailers, but can be found for as low as $280. The ASUS PhysX P1 Ghost Recon Edition is bundled with GRAW for about $340, while the BFG part does not come with any PhysX accelerated games. It is possible to download a demo of CellFactor now, which does add some value to the product, but until we see more (and much better) software support, we will have to recommend that interested buyers take a wait-and-see attitude towards this part.

As for software support, AGEIA is constantly working on their driver and pumping out newer versions. The driver interface is shown here:



There isn't much to the user side of the PhysX driver. We see an informational window, a test application, a diagnostic tool to check or reset hardware, and a help page. There are no real "options" to speak of in the traditional sense. The card itself really is designed to be plugged in and forgotten about. This does make it much easier on the end user under normal conditions.

We also tested the power draw and noise of the BFG PhysX card. Here are our results:

Noise (in dB)
Ambient (PC off): 43.4
No BFG PhysX: 50.5
BFG PhysX: 54.0


The BFG PhysX Accelerator does audibly add to the noise. Of course, the noise increase is nowhere near as bad as listening to an ATI X1900 XTX fan spin up to full speed.
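
Keep in mind that the dB scale is logarithmic, so the 3.5dB difference is larger than it may look. A quick sketch of the arithmetic:

    # A 3.5 dB rise corresponds to more than double the radiated sound power.
    no_physx_db = 50.5
    with_physx_db = 54.0
    power_ratio = 10 ** ((with_physx_db - no_physx_db) / 10)
    print(f"sound power ratio: {power_ratio:.2f}x")   # ~2.24x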

Idle Power (in Watts)
No Hardware: 170
BFG PhysX: 190


Load Power without Physics Load (in Watts)
No Hardware: 324
BFG PhysX: 352


Load Power with Physics Load (in Watts)
No Hardware: 335
BFG PhysX: 300


At first glance these results can be a bit tricky to understand. The load tests were performed with our low quality Ghost Recon Advanced Warfighter physics benchmark. Our test "without Physics Load" is taken before we throw the grenade and blow up everything, while the "with Physics Load" reading is made during the explosion.

Yes, system power draw (measured at the wall with a Kill-A-Watt) decreases under load when the PhysX card is being used. This is made odder by the fact that the power draw of the system without a physics card increases during the explosion. Our explanation is quite simple: The GPU is the leading power hog when running GRAW, and it becomes starved for input while the PPU generates its data. This explanation fits in well with our observations on framerate under the games we tested: namely, triggering events which use PhysX hardware in current games results in a very brief (yet sharp) drop in framerate. With the system sending the GPU less work to do per second, less power is required to run the game as well. While we don't know the exact power draw of the PhysX card itself, it is clear from our data that it doesn't pull nearly the power that current graphics cards require.
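
For those who want the arithmetic, here is a minimal sketch of the wall-power deltas from the tables above. Remember that Kill-A-Watt readings are taken at the wall and include PSU inefficiency, so these are system-level deltas, not direct measurements of the card:

    # Whole-system wall power (Watts) without and with the BFG PhysX card.
    readings = {
        "idle":                 (170, 190),
        "load, no physics":     (324, 352),
        "load, physics active": (335, 300),
    }
    for state, (no_card, with_card) in readings.items():
        print(f"{state:>22}: {with_card - no_card:+d} W at the wall")
    # idle:                 +20 W  (card plus fan, through PSU losses)
    # load, no physics:     +28 W
    # load, physics active: -35 W  (the starved GPU draws less, masking
    #                               whatever the PPU itself consumes)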

Comments

  • Tephlon - Wednesday, May 17, 2006 - link

    yeah, no. I know that Havok is doing generic physics. And light poles DO normally bend without the card. Cars do shake and explode. Cans can be kicked. All that stuff is normally there.
    I'm just saying the card seems to accentuate all of it. Not just more particles, but better explosions. Better ragdoll. Pots break a bit differently, etc.
    It was definitely there before, but I think it all looks better with the PhysX. My roommate said he noticed the difference as well. I let him borrow it for a while while I was at work.
    Again, I know I have no proof, at least not to show you atm... but to me it all seems better than before.
    If I get a chance I'll fraps a run through a level once with and once without, and throw the links up here. I personally have seen several sites' comparison vids, but I don't feel they show everything very well.
    Again, I'd heard it only adds particles to explosions, like you did, but I swear I can see the difference with everything.
    Anyone ever heard AGEIA say EXACTLY what difference there is for GRAW with their card?
  • DerekWilson - Wednesday, May 17, 2006 - link

    Perception of experiences can be greatly affected by expectations. None of us are ever able to be 100% objective in all cases.

    That being said, in your first post you mention not "feeling" the performance impact. If you'll take a look at our first article on PhysX (and the comments) you will notice that I reported the same thing. There aren't really huge slowdowns in GRAW, only one or two frames that suffer. If the errant frame(s) took a quarter of a second to render, we would definitely notice it. But while an AVERAGE of 15 frames per second can look choppy, having a frame or two take 0.066 seconds to render is not going to significantly impact the experience.

    Minimum framerates are important in analysing performance, but they are much more difficult to properly understand than averages. We want to see high minimum framerates because we see that as meaning less slow-down or stutter. But generally (in GPU limited situations) minimum framerates aren't outliers to the data set -- they mark a low point where framerate dips down for a good handful of frames. In the case of GRAW with PhysX, the minimum is really non-contiguous with the performance of the rest of the frames.
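
    To put rough numbers on that (a toy illustration with made-up frame times, not our benchmark data):

        # One 66 ms hiccup in ~5 seconds of otherwise steady 60 fps frames.
        frame_times_ms = [16.7] * 299 + [66.0]
        avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
        min_fps = 1000 / max(frame_times_ms)
        print(f"average: {avg_fps:.1f} fps, minimum: {min_fps:.1f} fps")
        # -> average ~59.3 fps, minimum ~15.2 fps: the minimum looks
        #    alarming, but a single 66 ms frame is hard to even notice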

    CoV is another story. The framerate drops several times and we see stuttering. It's definitely something easily "felt" during gameplay. But CoV Issue 7 is still beta, so we might see some performance improvements when the code goes live.
  • Tephlon - Wednesday, May 17, 2006 - link

    Derek, I totally agree. I wasn't arguing about anything technical in the article, or the bearing minimum fps has on the 'feel' or 'playability'. It just doesn't seem like most readers (here and elsewhere) understand it. I also won't hide the fact that I DO WANT this tech to succeed, partly because I heard them speak at QuakeCon and I liked what I heard/saw, and partly because I've dropped $300 in good faith that my early adoption will help the cause and push us forward in the area of true physics in gaming. And even though my perception is undoubtedly affected by my expectations, it's not entirely misled either. Even with my bias I can be realistic and objective. If this thing did nothing for visuals/gameplay and killed my experience with crappy performance, I'd of course have a different opinion on the matter.

    I was simply saying that readers seem to lose sight of the big picture. Yeah, it's in the rough stages. Yeah, it only works with a few games. I'm not here to pitch slogans and rants to make you buy it, I just wanted people to understand that the device 'as it is now' isn't without its charm. It seems the only defense that's brought in for the card is that the future could be bright. It DOES have some value now, if you're objective about it and not out to flame it immediately. I like what it does for my game, even if it's not revolutionary. I just hope that there are enough people objective enough to give this company/card a chance to get off the ground. I DO think it's better for the industry if the idea of a separate physics card can get off the ground.
    I dunno, maybe I see too much of 3DFX in them, and it gets me nostalgic.
    Again, Derek, I wasn't knocking the report at all, and I hope it wasn't taken that way. I think it said just what it was supposed to, or even could, say. I was more trying to give readers a balanced look at the card on the use side, since straight numbers send people into frenzies.

    Did all that get to what I was trying to convey? I dunno, I confuse myself sometimes. I wasn't meant to be an author of ANYTHING. In any case, good luck to you.
    Good luck to all.
  • DerekWilson - Wednesday, May 17, 2006 - link

    lol, I certainly didn't take it as a negative commentary on anything I said. I was trying to say that I appreciate what you were saying. :-)

    At a basic level, I very much agree with your perspective. The situation does resemble the 3dfx era with 3d graphics. Hardware physics is a good idea, and it would be cool if it ends up working out.

    But is this part the right part to get behind to push the industry in that direction?

    AnandTech's first and foremost responsibility is to the consumer, not the industry. If the AGEIA PhysX card is really capable of adding significant value to games, then its success is beneficial to the consumer. But if the AGEIA PhysX card falls short, we don't want to see anyone jump on a bandwagon that is headed over a cliff.

    AGEIA has the engine and developer support to have a good chance at success. If we can verify their capabilities, then we can have confidence in recommending purchasing the PhysX card to people who want to push the agenda of physics hardware. There is a large group of people out there who feel the same way you do about hardware and will buy parts in order to benefit a company or industry segment. If you've got the ability and inclination, that's cool.

    Honestly, most people that go out and spend $300 on a card right now will need to find value in something beyond what has been added in GRAW, CoV, and the near term games. If we downplayed the impact of the added effects in GRAW and CoV, it's because the added effects are nowhere near worth the $300 they cost. It is certainly a valid perspective to look towards the future. You have the ability to enjoy the current benefits of the hardware, and you'll already have the part when future games that make more compelling use of the technology come out.

    We just want to make sure that there is a future with PhysX before we start jumping up and down screaming its praises.

    So ... I'm not trying to say that anything is wrong with what you are saying :-)

    I'm just saying that AnandTech has a heavy responsibility to its readers to be more cautious when approaching new markets like this. Even if we would like to see it work out.
  • Tephlon - Thursday, May 18, 2006 - link

    true. I do get your point.

    And again, you're right. With a more balanced perspective on the matter, I sure can't see you suggesting a 300 dollar piece of hardware on a hunch either. I do respect how your articles are based on what's best for the little guy. I think I'd honestly have to say, if you were to suggest this product now AS IF it were as good as sliced bread... I would be unhappy with my purchase based on your excitement for it.
    teheh. Yeah, you made the right call with your article.
    Touché, Derek. TOUCHÉ.



    thehe. I guess not everyone can gamble the $300, and that's understandable. :-(

    Like I said... here's hopin'. :-D
  • RogueSpear - Wednesday, May 17, 2006 - link

    I'm not an expert on CPUs, but all of this has me wondering - isn't physics-type code the kind of mathematical code that MMX and/or SSE and their follow-ons were supposed to accelerate? I'm sure physics was never mentioned way back then, but I do remember things like encryption/decryption and media encoding/decoding being targets for those technologies. Are game developers currently taking advantage of those technologies? I know that to a certain point there is parity between AMD and Intel CPUs as far as compatibility with those instruction sets.
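
    (For the curious, the kind of data-parallel math SSE targets looks something like the sketch below; NumPy stands in for intrinsics or compiler vectorization here, purely to show the shape of the computation.)

        # The pattern SIMD accelerates: one independent update applied
        # across many particles; an engine would use SSE intrinsics here.
        import numpy as np

        n = 10_000
        pos = np.zeros((n, 3), dtype=np.float32)         # positions
        vel = np.random.randn(n, 3).astype(np.float32)   # velocities
        gravity = np.array([0.0, -9.81, 0.0], dtype=np.float32)
        dt = np.float32(1 / 60)                          # one 60 Hz step

        vel += gravity * dt   # same math for every particle
        pos += vel * dt       # SSE would pack four such floats per op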
  • apesoccer - Wednesday, May 17, 2006 - link

    Seems like this was a pretty limited review... Were you guys working with a timetable? Like 4hrs to use this card or something?

    I think I would have tried more than just single core CPUs... since we're heading mostly towards multicore CPUs. I also would have run tests at the same level (if possible; it feels like we're intentionally being kept in the dark here) to compare software and hardware with the same number of effects, at different levels and at different resolutions... At low res, you're maxing the CPU out, right? Well, then if the PPU uses 15% of the CPU but outputs 30% more effects, you're being limited by the CPU even more... You should see greater returns the higher the resolution you go, since you're maxing your GPUs out more (rather than the CPUs) the higher the res. All of this is moot if the overhead CPU usage by the PPU can be run on a second CPU core... since that's where the industry is headed anyway. And making software/hardware run on a dual core should give us a better idea of whether or not this card is worth it.
  • peternelson - Wednesday, May 17, 2006 - link

    To the people who say it's a decelerator: it is a little slower, but it is NOT doing the same amount of work. The visual feast is better in the hardware accelerated game than without the card. But we need a way to quantify that extra, as plain "fps" ignores it.

    Second, Anandtech, PLEASE get yourselves a PCI bus analyser; it need not be expensive. I want to know the % utilisation on the PCI bus. At 32-bit/33MHz the potential max is 133 MByte/sec.

    How much of that is being used to talk to and from the PhysX card, and is it a bottleneck that would be solved by moving to PCI Express? Also, in your demo setups, considering what peripherals you are using, are you hogging some of the PCI bandwidth for (say) a PCI-based soundcard etc., which would be unfair on the PhysX card?
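
    A quick sanity check on those numbers, assuming the classic 32-bit/33MHz PCI configuration:

        # Peak theoretical PCI bandwidth; real throughput is lower once bus
        # arbitration and shared devices (sound, NIC) are factored in.
        bus_width_bytes = 32 / 8
        clock_hz = 33.33e6
        print(f"peak PCI: {bus_width_bytes * clock_hz / 1e6:.0f} MB/s")  # ~133
        # A PCI Express x1 link moves ~250 MB/s each way and, just as
        # important, is dedicated rather than shared with other devices.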

    ALSO one of the main purposes of THIS review I would say is to COMPARE the ASUS card with the BFG card. You don't seem to do that. So assuming I want a physx card, I still don't know which of the two to buy. Please compare/contrast Asus vs BFG.
  • DerekWilson - Wednesday, May 17, 2006 - link

    Honestly, the ASUS and BFG cards perform identically, pull about the same amount of power, and produce similar levels of noise.

    If you are trying to decide, buy the cheaper one. There aren't enough differences to make one better than the other (unless blue LEDs behind fans really do it for you).

    We didn't do a more direct comparison because we have an engineering sample ASUS part, while our BFG is full retail. We generally don't like to make direct comparisons with preproduction hardware in anything other than stock performance. Heat, noise, power, PCB layout, and custom drivers can all change dramatically before a part hits retail.

    We will look into the PCI bus utilization.
  • peternelson - Wednesday, May 17, 2006 - link


    Thanks. So I will have to choose on features, like the nice triangle box on the BFG ;-)

    When gaming on older machines where the sound, network, and possibly other things are all on the same PCI bus, either the PhysX card or the other devices could suffer from bus contention.

    I hope you can either ask or do some analysing to see how much traffic there is.
