We continue to hear new details about G72 and G73 here in Taiwan, and the latest batch of info from our vendors is that G72 and G73 will be pin compatible with NV40 and NV43. In other words, your next NVIDIA video card might have the same PCB as the 6600GT, but with a different GPU.

This means lower costs for manufacturers - no new R&D or board designs are needed. It also means G72 and G73 can launch very quickly once the decision comes from NVIDIA, as vendors can easily switch production from the older chips to the new ones. Two vendors confirmed to us that they are already retooling their PCBs for a six-pin 12V molex connector in anticipation that G72 and G73 SLI might need the additional power, but NVIDIA won't comment even to its manufacturers at this point.

A lot seems to hinge on ATI's future choices with R580, X1600 and X1300. As of now, the launch date for X1600 is still late November, and NVIDIA isn't exactly hurting for new value and midrange SKUs given the success of the 6600GT. The X800GTO and X800GTO2 really give the 6600GT a run for its money, but we digress.

NVIDIA's Secret Flip Chip GPU

Manufacturers seem to think G72 and G73 will be an easy tool-over from NV40/43, but another vendor claims NVIDIA has bigger plans. They claim that NVIDIA is working on flip chip GPU sockets for motherboards. Apparently, NVIDIA's internal engineering teams have several prototypes where the GPU, rather than the CPU, is the main focus of a motherboard with two sockets: one for the GPU and another for the CPU. Whether or not such a machine will ever see the light of day is difficult to say right now. However, the idea of pin compatible GPUs already suggests that we are halfway to buying GPUs the same way we buy CPUs: as flip chips. We have plenty of questions, like how the memory interface would work and how that would affect performance, but GPU sockets are likely less a question of "if" than of "when".
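To put the memory interface question in perspective, here is a rough back-of-envelope comparison. This is only a sketch using period-typical figures we've assumed (a 6600GT's 128-bit GDDR3 at 1000 MT/s effective versus dual-channel DDR400 system memory); an actual socketed design could use an entirely different memory interface:

```python
def bandwidth_gb_s(bus_bits, transfers_per_sec):
    """Peak theoretical bandwidth in GB/s for a memory bus."""
    return bus_bits / 8 * transfers_per_sec / 1e9

# GeForce 6600GT-class card: 128-bit GDDR3, 500 MHz (1000 MT/s effective)
card = bandwidth_gb_s(128, 1000e6)       # 16.0 GB/s
# Dual-channel DDR400 system memory: 2 x 64-bit channels at 400 MT/s
system = bandwidth_gb_s(2 * 64, 400e6)   # 6.4 GB/s

print(card, system, card / system)
```

Even on paper, a socketed GPU leaning on system memory would see only about 40% of the peak bandwidth of a mainstream card's dedicated memory - one reason on-package or on-die RAM keeps coming up in these discussions, and before latency is even considered.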

Comments

  • sethborg - Thursday, December 22, 2005 - link

    How about a follow-up article? Where are we now? Yeah, we'd be missing out on SLI, but most people don't use it anyway. I'd rather have a huge heatsink on my GPU and get 100MHz more out of it than have SLI anyway. Do you think we'd ever see two cores on one chip? I dunno if that makes sense since it's all parallel anyway - just use that chip area for more lanes. I dunno.
  • Rock Hydra - Sunday, October 30, 2005 - link

    This seems like a great idea, especially for SLI, although the memory would have to be on the die or package, or else memory latency will be a killer. Motherboards are already cramped as they are, so how big the package is will make a big difference. If there is a Zalman cooler like the CPU coolers, then there won't be room for PCI/PCI Express devices. Additionally, GPUs generate more heat than CPUs, and CPUs already have dedicated exhausts in the case (rear panel/PSU fans). That would most likely mean there would have to be an exhaust fan near the PCI cutouts on cases. The idea is very interesting, but I think it will be hard to implement. I think there has to be a third-party bus committee that both companies go to and whose standards they adhere to.
  • SniperWulf - Tuesday, October 25, 2005 - link

    I remember when I mentioned sockets back in the 3dfx days and people laughed at me.......
  • Rock Hydra - Sunday, October 30, 2005 - link

    Don't worry, man. Most ideas that seem revolutionary or extraordinary now were laughed at in their infancy.
  • tyborg - Tuesday, October 25, 2005 - link

    I predict that NVidia will become the chipset/gfx king, and ATI will become dedicated to PPUs
  • KingofCamelot - Wednesday, October 19, 2005 - link

    Due to the fact that ATI and NVIDIA never seem to get along, I doubt there would be one standard socket. This would lead to four models for every mobo maker: one AMD motherboard with an ATI socket, one with NVIDIA; one Intel motherboard with an ATI socket, one with NVIDIA. Also, CPU sockets and GPU sockets would likely not be synchronized time-wise. What if ATI comes out with a new socket while AMD and Intel still have their old sockets? That would mean buying a new mobo just to upgrade your graphics card. What about backward compatibility? If GPU sockets change, will you have to buy a new graphics card if you want to get a newer mobo?

    Don't tell me the sockets won't change. Just look at the history of CPUs. You can't limit next-generation architecture by a pin count. Even if NVIDIA's next GPUs are pin compatible, we don't know if they will continue to be that way down the road of GPU architecture.
  • bob661 - Thursday, October 20, 2005 - link


    "Due to the fact that ATI and NVIDIA never seem to get along"
    They'll agree, or the mobo manufacturers won't make the sockets. Look at BTX.
  • Regs - Wednesday, October 19, 2005 - link

    We all know what's going to happen. Just like with our CPU sockets, we'd have to upgrade our motherboards to the latest and greatest every year to get the best-performing CPU. You can't run a 3 GHz Barton on a Socket 939, right? This is just another way to increase mid-year profits for vendors.
  • gibhunter - Wednesday, October 19, 2005 - link

    It would work great if you had RAM embedded in the chip or on the GPU package itself, like they do with notebook parts. Cool both the GPU and CPU with one big heatsink with dual contact points and one huge, slow, quiet fan and you have a killer solution on your hands. I like this idea a lot.
  • tuteja1986 - Wednesday, October 19, 2005 - link

    Great idea, but what is NVIDIA really trying to achieve with this? Are they making sure people need to buy motherboards that are only compatible with NVIDIA cards? Standards need to be set, I think, or else it will turn into a very ugly war. SLI and CrossFire are starting to look like the beginning of total-domination plans. I want a standard technology that lets the user do either CrossFire or SLI on the same motherboard. Anyway, I'm not liking where the motherboard and graphics card industry is going. This is all thanks to that stupid software I'm now starting to hate, called "3DMark".
