Time for a PPU?

by Anand Lal Shimpi on March 11, 2005 12:59 PM EST
Back when Cell was first announced, I was talking to a friend and asked how long it would take for someone to take Cell's SPE array, stick it on a PCIe card, and sell it as a physics processor...well, given AGEIA's PR wave, the answer is apparently not long at all. While AGEIA's PhysX has no architectural relationship to the IBM/Sony/Toshiba Cell processor, the fundamental design philosophy is quite similar.

Our own Derek Wilson just published his thoughts on AGEIA's PPU, which succinctly explain the need for this kind of processor and how AGEIA's new chip fulfills it. Although I'm not sure a separate card is the best way to incorporate this type of unit into a modern-day gaming box, the functionality definitely won't be built into x86 CPUs for a while if the roadmap Intel presented at IDF holds. Intel won't be shipping specialized cores in multi-core IA microprocessors for at least another 5+ years, giving companies like AGEIA ample time to step in and gain control of the market (as well as help it evolve).

I can't help but think that the two current giants of consumer-level parallel processing, ATI and NVIDIA, won't sit idly by while the physics revolution takes place...

Comments

  • Vrocks - Saturday, March 19, 2005 - link

    #10 I agree with you, as there's no reason to think that the PPU must be integrated with a video card.

    The big question is, will Ageia survive against industry titans like Nvidia and ATI? Nvidia branched out into motherboard chipsets years ago; the PPU seems like the next logical step.

    I think the tech sector will be revitalized in the near future. Things had been stagnant until recent announcements like SLI returning to video cards, dual-core processors, Cell technology, and now the PPU. Things are starting to look more exciting than they have for the last 4-5 years.
  • The_Necromancer - Friday, March 18, 2005 - link

    Yes, it's time. I mean, with an extra slot you could have so much more in-game: hair, water, clothing. The interactions are endless. This is an inevitable outcome of simple product demand. The market is ready and waiting for this ingenious idea to come alive. Unreal won't be so unreal any longer.

    Games I can't wait for: HL3, Splinter Cell 4, and many more PPU-enabled games!
  • static1117 - Thursday, March 17, 2005 - link

    I remember back in the day when the 2D card and the 3D card were separate.
    I also remember people bashing the integration of 2D and 3D onto one PCB.
    I for one think that the IDEA of a PPU is a long time coming. If you look at how games have evolved over the past decade, the one thing that is really lacking is environments and the interaction with those environments. Sure, games like Far Cry, HL2, and D3 look fantastic, but what about the details? Clothes that look and act real, landscapes that you can interact with, real collision detection. These are the things that I want.
    Let's not squash the idea of a PPU too quickly; someday it will be in our computers. Whether it ends up as a stand-alone card or as part of the graphics card will be decided by the powers that be.

  • oepapel - Wednesday, March 16, 2005 - link

    There is no need for separate memory for a PPU chip located on a PCIe card or on the MB. Using main memory makes sense here since the CPU needs access to the results. Unless the PPU and GPU are HIGHLY integrated, PPU "rendering" to main memory is the most practical route. In fact, I can't really see a separate PPU chip lasting for long. It will either become a specialized core in a multi-core CPU or it will be integrated into the GPU.
    If it gets integrated into the GPU, then it becomes really easy to offer audio as well as graphics, since the PPU could handle all of the audio processing as well as the physics. This would be a natural progression for GPU chips into true VPUs.
    If integrated into the CPU, we could finally do away with MMX, SSE/SSE2/SSE3/AltiVec instructions and have the CPU keep the PPU "core" pipeline filled. This would mean the end of specialized instruction sets in order to get decent performance. This would be ideal for apps like video transcoding, SETI, and solid modeling.

    Oscar
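
To make the comment above concrete: a minimal sketch, with a hypothetical function name and data layout, of the kind of hand-tuned SSE inner loop that game physics leans on today and that a PPU "core" in the CPU would be meant to absorb. It is not AGEIA or vendor code.

```cpp
// Illustrative only: an SSE-accelerated particle integration loop of the sort
// a dedicated physics core would subsume. Hypothetical name and data layout.
#include <xmmintrin.h>  // SSE intrinsics (the MMX/SSE family named above)
#include <cstddef>

// Structure-of-arrays layout: one 16-byte-aligned float array per coordinate
// axis, so this is called once each for x, y, and z. count is a multiple of 4,
// so every SSE operation advances four particles at once.
void integrate_particles(float* pos, float* vel, const float* accel,
                         std::size_t count, float dt)
{
    const __m128 vdt = _mm_set1_ps(dt);
    for (std::size_t i = 0; i < count; i += 4) {
        __m128 a = _mm_load_ps(accel + i);
        __m128 v = _mm_load_ps(vel + i);
        __m128 p = _mm_load_ps(pos + i);

        // Semi-implicit Euler step: v += a*dt, then p += v*dt
        v = _mm_add_ps(v, _mm_mul_ps(a, vdt));
        p = _mm_add_ps(p, _mm_mul_ps(v, vdt));

        _mm_store_ps(vel + i, v);
        _mm_store_ps(pos + i, p);
    }
}
```

Whether that work lands in a PPU, a GPU, or wider CPU SIMD, the shape of the loop stays the same; only the hardware running it changes.
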
  • Orsin - Wednesday, March 16, 2005 - link

    Not completely sure it will fail. The high-end video cards are fighting very hard for the "who is best" spot, and cost is no object. Remember, not too long ago $200 was high end; now some top $500, and in a few years $600 or even $700 wouldn't be insane. So to one-up your arch foe, you add in a PPU even if it costs another $100. I can see them doing it.
    Over time the PPU will drop in price, and I think an entry-level PPU, if placed directly on the video card, would run the makers less than $20-50.

    The other thing: have you noticed that Intel's Graphics Media Accelerator 900 onboard graphics solution is very robust in its feature set but lacks any processing power? The GMA 900 has all the features of high-end cards, plus a few more that make media content even better. The real killer is that it has no vertex shaders or transform and lighting. Guess what: a PPU as an add-in card could fill this gap nicely. If it could use a PCIe bus (even a 1x slot), it would function as a cheap add-in card that greatly increases GMA performance in games. Remember, Intel still holds 60% of the video market. Us gamers might not care, but for the other billion users it might be a very good alternative.
  • Anonymous - Wednesday, March 16, 2005 - link

    The PPU will fail, plain and simple. They were talking about it needing 128MB of RAM; that's simply way too much. Imagine adding another 128MB of RAM to a video card, plus the cost of the PPU itself. It will never fly. PC game developers who think this is a good idea, and who already have a tough time predicting the market's video card upgrades, should think twice.

    It's no wonder console gaming is taking over completely: 3D video cards fragmented the gaming market into players who could have good graphics and players who could have just bearable ones. If you do that with physics, it will fuck the whole market up even more; now they won't have to worry just about graphics, but about whether the user has the latest "PPU".
  • UlricT - Monday, March 14, 2005 - link

    #5: I've been noticing that in a lot of AT articles. They really need another editor.
  • Anonymous - Saturday, March 12, 2005 - link

    Was this article edited at all or just rushed out? There are grammatical and wording errors everywhere in the article. It was very frustrating to read through this piece - a shame.
  • Davediego - Saturday, March 12, 2005 - link

    Personally, I was wondering how long it would take ATI/NVIDIA to announce physics processing on the video card. Since the newer shader models are starting to allow the video card to be used for non-graphics tasks, really to crunch any highly parallel problem, it seems like a perfect fit. At the very least, having all the vertex data on the video card should allow efficient processing of rigid bodies...
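
A rough sketch of why that mapping is plausible: the per-body half of a rigid-body update touches only that body's own state, which is the same "one small program over a big array" shape of work a vertex shader already runs. The types and fields below are hypothetical and leave out collision handling entirely.

```cpp
// Illustrative only: each body's integration step is independent of every
// other body, so the loop parallelizes the way per-vertex shader work does.
// Hypothetical types and fields; collision response is omitted.
#include <vector>
#include <cstddef>

struct Vec3 { float x, y, z; };

struct RigidBody {
    Vec3  position;
    Vec3  velocity;
    Vec3  force;     // accumulated this frame (gravity, contacts, etc.)
    float inv_mass;  // 0 for immovable geometry
};

void integrate_bodies(std::vector<RigidBody>& bodies, float dt)
{
    for (std::size_t i = 0; i < bodies.size(); ++i) {
        RigidBody& b = bodies[i];

        // v += F/m * dt, then x += v * dt; no cross-body reads or writes,
        // so each iteration could run on its own parallel lane with the
        // data kept resident on the card.
        b.velocity.x += b.force.x * b.inv_mass * dt;
        b.velocity.y += b.force.y * b.inv_mass * dt;
        b.velocity.z += b.force.z * b.inv_mass * dt;

        b.position.x += b.velocity.x * dt;
        b.position.y += b.velocity.y * dt;
        b.position.z += b.velocity.z * dt;

        b.force = Vec3{0.0f, 0.0f, 0.0f};  // clear the accumulator
    }
}
```

Collision detection and contact resolution are far less independent, which is presumably where dedicated hardware would have to earn its keep.
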
  • SDA - Friday, March 11, 2005 - link

    #2: They do seem like natural companions, but how much would an onboard PPU add to the cost of a video card? It could be a major issue for budget cards if the cost is significant... I guess budget-grade cards could always just go without one, of course, to give the consumer another reason to go higher-end. Also, video cards ARE rather cluttered as it is, but I doubt that'll really matter. We'll always find space for something more.

    I'm curious as to how sound integration will move forward. Add-on sound cards have been around since the dawn of time (the dawn of the time that matters, anyway), and there are several advantages to using an add-on solution for both sides. I'd guess that integrated and add-on will continue to coexist peacefully for a while, but I really have no idea about what exactly the future will hold.
