We continue to hear new details about G72 and G73 here in Taiwan, and the latest batch of info from our vendors is that G72 and G73 will be pin compatible with NV40 and NV43. In other words, your next NVIDIA video card might have the same PCB as the 6600GT, just with a different GPU.

This means lower cost to the manufacturer - there is no need for new R&D or board designs. It also means G72 and G73 can launch very quickly once the decision comes from NVIDIA, as vendors can easily switch production from the older chips to the new ones. Two vendors confirmed with us that they are already retooling their PCBs for a six-pin 12V power connector in anticipation that G72 and G73 SLI might need the additional power, but NVIDIA won't comment to the manufacturers on that point yet.

A lot seems to hinge on ATI's future choices with R580, X1600 and X1300. As of now, the launch date for the X1600 is still late November, and NVIDIA isn't exactly hurting for new value and midrange SKUs given the success of the 6600GT. The X800GTO and X800GTO2 really give the 6600GT a run for its money, but we digress.

NVIDIA's Secret Flip Chip GPU

Manufacturers seem to think G72 and G73 will be an easy tool-over from NV40/NV43, but another vendor claims NVIDIA has bigger plans. They claim that NVIDIA is working on flip chip GPU sockets for motherboards. Apparently, NVIDIA engineering teams already have several prototypes in which the GPU, rather than the CPU, is the main focus of a motherboard with two sockets: one for the GPU and another for the CPU. Whether or not such a machine will ever see the light of day is difficult to say right now. However, the idea of pin compatible GPUs already suggests that we are halfway to buying GPUs the same way we buy CPUs: as flip chips. We have plenty of questions, like how the memory interface will work and how that will affect performance, but GPU sockets are likely less a question of "if" than of "when".
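The memory interface question matters because a socketed GPU would be bound by whatever memory the motherboard supplies. As a rough illustration of the numbers involved (using the 6600GT's published specs - GDDR3 at 500 MHz, double data rate, on a 128-bit bus - as a sketch, not an official formula):

```python
# Peak memory bandwidth = effective transfer rate x bus width in bytes.
# The 6600GT's 500 MHz GDDR3 transfers on both clock edges, for an
# effective 1000 MT/s on its 128-bit (16-byte) bus.

def peak_bandwidth_gb_s(effective_mt_s: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s (1 GB = 10^9 bytes)."""
    return effective_mt_s * 1e6 * (bus_width_bits / 8) / 1e9

print(peak_bandwidth_gb_s(1000, 128))  # 6600GT: 16.0 GB/s
```

A socketed GPU on such a board would see that 16 GB/s ceiling no matter how fast the chip in the socket gets, which is exactly the upgrade-path concern readers raise below.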

View All Comments

  • DeanO - Wednesday, October 19, 2005 - link

    Does this mean we might see dual socket GPUs? Or would there be dual core GPUs in place of SLI configs? Hmmm...
  • squeezee - Wednesday, October 19, 2005 - link

    This is kind of a half-good idea. Being able to swap in a faster GPU would be nice, but it offers a limited upgrade path as well. Some things benefit from pure GPU power, but you still need a fair bit of memory bandwidth these days. With a GPU socket design, the user will be stuck with the same memory performance no matter how fast the GPU gets, and when the memory technology changes, say from GDDR3 to GDDR4, or if they want more or faster video RAM, they will have to buy a whole new motherboard.
  • RaynorWolfcastle - Wednesday, October 19, 2005 - link

    People forget that one of the reasons that video cards have fast RAM is because the RAM chips are soldered directly to the PCB.

    For one thing, that makes trace length management simpler since you always know exactly where each chip will be. It also saves the manufacturer money because they don't have to put in a socket. Thirdly, it makes it possible to add many more contacts (and thus wider data paths), as high-precision alignment for reflow soldering is done at the factory, instead of having Joe Six-Pack trying to force the RAM into place in his basement. This also means that you get a much cleaner signal path to the soldered RAM.

    With that said, if upcoming GPUs are no longer limited by bandwidth but by pixel and vertex shading power, this could be a viable solution.

    Then again, maybe nVidia will push to have RAM chips soldered directly to the mobo. I'm not sure how good of an idea that is with the current state of RAM and video cards, however. Mobos generally use fewer PCB layers than video cards (6 vs 10 last I knew), so routing could become a nightmare.

    Just my 2 cents.
  • DeanO - Wednesday, October 19, 2005 - link

    I don't see why the RAM for the graphics core couldn't be changeable like RAM for the CPU currently is... In fact, it looks like you could have more control over how much RAM you have for the GPU.
    True about the change from say GDDR3 to GDDR4 though :-(
  • Schadenfroh - Wednesday, October 19, 2005 - link

    Great idea - nvidia could sell the GPUs directly to the consumer and cut out the AIB makers. I bet they are not happy at all about this idea, save the ones that make mobos.
  • bersl2 - Wednesday, October 19, 2005 - link

    I wonder, can the whole graphics card concept be split up into discrete components, as the components that support the CPU are? Can we have a GPU daughterboard, pluggable GDDR RAM, and a socketed GPU, all made by different manufacturers? Of course, that would seem to raise costs in some areas, but reduce it in others; and by how much, I wouldn't know.

    Another thing I wonder about is whether this is a prelude to finally having an open graphics ISA (which need not be standard for all or any GPUs). I certainly hope so.
  • semo - Thursday, October 20, 2005 - link

    I think I read somewhere that a 6800 GPU costs $40. I really doubt that if you could upgrade your GPU now, you would be paying that price for it.
  • bob661 - Wednesday, October 19, 2005 - link

    That would be fantastic, to be able to swap out GPUs like CPUs. For someone like me, that would mean no cards in any of my slots.
