It would now appear we are saturated with two phase immersion liquid cooling (2PILC) – pun intended. One common element from the annual Supercomputing trade show, as well as the odd system at Computex and Mobile World Congress, is the push from some parts of the industry towards fully immersed systems in order to drive cooling. Last year at SC19 we saw a large number of systems featuring this technology – this year the presence was limited to a few key deployments.

Two Phase Immersion Liquid Cooling (2PILC) involves taking a server with next to no heatsinks and submerging it in a liquid with a low boiling point. These liquids are often engineered organic compounds (so not water, or oil) that make direct contact with the silicon; as the silicon is used, it gives off heat that transfers into the surrounding liquid, causing it to boil. The most common liquids are variants of 3M Novec or Fluorinert, which can have boiling points around 59°C. As the liquid boils into gas, the bubbles rise, driving natural convection through the liquid. The vapor then condenses on a cold plate / water pipe and drips back into the system.
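The boiling cycle above can be sketched as a simple energy balance: at steady state, the chip's power all goes into vaporizing fluid. The latent heat figure below is purely illustrative (real Novec/Fluorinert variants are roughly in the 80-150 kJ/kg range; check the datasheet for the actual fluid):

```python
# Rough energy balance for a 2PILC tank: how much fluid boils off per second
# for a given chip power. Latent heat value is illustrative, not a datasheet
# number for any specific 3M Novec / Fluorinert product.

def vapor_rate_g_per_s(chip_power_w: float, latent_heat_kj_per_kg: float = 100.0) -> float:
    """Mass of liquid converted to vapor per second, in grams.

    Assumes all chip power goes into boiling (the liquid is already at
    its boiling point), which is the steady-state ideal for 2PILC.
    """
    return chip_power_w / (latent_heat_kj_per_kg * 1000.0) * 1000.0

# Under these assumptions, a 300 W chip boils roughly 3 g of fluid per
# second; that vapor rises, condenses on the cold plate, and drips back.
print(vapor_rate_g_per_s(300.0))  # 3.0
```

The same vapor then has to be condensed at the same rate, which is why the cold plate / water loop at the top of the tank is sized to the total tank power.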


GIGABYTE from a previous show

These liquids are obviously non-ionic and so do not conduct electricity, and they have a medium viscosity in order to facilitate effective natural convection. Some deployments add forced convection, which helps with liquid transport and supports higher TDPs. But the idea is that with a server or PC immersed in this material, everything can be kept at a reasonable temperature, and it also enables super dense designs.


OTTO automated system with super dense racking

We reported on TMGcore’s OTTO systems, which use this 2PILC technology to create data center units of up to 60 kilowatts in 16 square feet – all the customer needs to do is supply power, water, and a network connection. Those systems also had automated pickup and removal, should maintenance be required. Companies like TMGcore claim that 2PILC often increases hardware longevity, due to the controlled environment.
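To put the OTTO figures in perspective, a quick density comparison helps. The air-cooled numbers below (10 kW per rack over ~25 sq ft including aisle allowance) are illustrative assumptions for the sake of comparison, not vendor figures:

```python
# Power-density comparison: the OTTO figure from the article (60 kW in
# 16 sq ft) vs. an assumed conventional air-cooled rack (10 kW, ~25 sq ft
# including aisle space -- an illustrative ballpark, not a measured value).

def power_density(kw: float, sq_ft: float) -> float:
    """Power density in kW per square foot."""
    return kw / sq_ft

otto = power_density(60, 16)   # 3.75 kW/sq ft
air = power_density(10, 25)    # 0.4 kW/sq ft
print(f"{otto / air:.1f}x denser")  # 9.4x denser, under these assumptions
```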

One of the key directions of this technology last year was for crypto systems, or super-dense co-processors. We saw some of that again at SC19 this year, but not nearly as much. We also didn’t see any 2PILC servers directed towards 5G compute at the edge, which was a common theme last year. All the 2PILC companies on the show floor this year were geared towards self-contained easy-to-install data center cubes that require little maintenance. This is perhaps unsurprising, given that supporting 2PILC outside a self-contained unit is quite difficult without a ground-up data center design.

One thing we did see was that component companies, such as companies building VRMs, were validating their hardware for 2PILC environments.

Typically a data center will discuss its energy efficiency in terms of PUE, or Power Usage Effectiveness. A PUE of 1.50, for example, means that for every 1.5 megawatts of power drawn, 1 megawatt goes to useful IT work. Standard air-cooled data centers can have a PUE of 1.3-1.5, while purpose-built air-cooled data centers can go as low as 1.07. Liquid-cooled data centers are also around this 1.05-1.10 PUE, depending on the construction. The self-contained 2PILC units we saw at Supercomputing this year were advertising PUE values of 1.028, which is the lowest I’ve ever seen. That being said, given the technology behind them, I wouldn’t be surprised if a 2PILC rack cost 10x as much as a standard air-cooled rack.
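The PUE arithmetic is simple enough to sketch directly. For a fixed IT load, the overhead power (cooling, power distribution, etc.) scales with PUE minus one, so the gap between the figures quoted above translates into real kilowatts:

```python
# PUE (Power Usage Effectiveness) = total facility power / IT equipment power.
# A PUE of 1.50 means 1.5 MW drawn for every 1.0 MW of useful IT load.
# This compares overhead power for a fixed 1 MW IT load at the PUE figures
# quoted in the article.

def overhead_kw(it_load_kw: float, pue: float) -> float:
    """Non-IT power (cooling, distribution) for a given IT load and PUE."""
    return it_load_kw * (pue - 1.0)

for label, pue in [("standard air-cooled", 1.40),
                   ("purpose-built air-cooled", 1.07),
                   ("liquid-cooled", 1.08),
                   ("2PILC (advertised)", 1.028)]:
    print(f"{label}: {overhead_kw(1000, pue):.0f} kW overhead per MW of IT load")
```

At 1 MW of IT load, the difference between a PUE of 1.40 and 1.028 is roughly 372 kW of overhead, which is where the economic case against the (presumably much higher) capital cost of a 2PILC rack gets made.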

35 Comments

  • sharath.naik - Sunday, December 1, 2019 - link

    I heard there is something similar but a better way to use boiling liquid for cooling without all the hassles. It's called heatpipes, and from what I hear it is very efficient.
  • Dragonstongue - Sunday, December 1, 2019 - link

    even better is liquid within the CPU or GPU core itself, IBM did this design many years back, still has not hit "mainstream" and likely will once full optical style comes into play.

    I would <3 to see the bubbles

    nothing is saying cannot be liquid to heatpipe to free air rad style i.e no fans required just takes difference between the hot producing part and the cooler ambient air around it (even just a small TEC unit to provide smaller bursts of cooling per heatpipe or heatsink)
  • mode_13h - Tuesday, December 3, 2019 - link

    In a 2-phase immersion cooling setup, you could just run with a bare die. Perhaps the die could even be textured in some way, to assist convection & nucleation.
  • destorofall - Monday, December 2, 2019 - link

    heat pipes and vapor chambers are great but you still rely on relatively large mass flow rates of air to remove the heat from the fin stack in such a confined and restricted space.
  • mode_13h - Tuesday, December 3, 2019 - link

    Funny thing is that both heat pipes and this two-phase liquid cooling work by basically the same principle. The main difference is that heat pipes have a low vapor pressure (i.e. near vacuum) and very little fluid, while these are mostly full of fluid.

    If you would try to build a computer enclosure like a heatpipe (i.e. with low vapor pressure and little fluid), the failure mode would be much worse (and more likely). Also, you'd need to run capillaries so the fluid could move from the condensation sites to the heat sources.
  • mode_13h - Tuesday, December 3, 2019 - link

    BTW, I know that's not what you were proposing, but I thought it was an interesting observation & tangent.
  • Duncan Macdonald - Friday, November 29, 2019 - link

    Looking at the top picture - I would not trust this cooling as shown - the bottom of the chip would be far better cooled than the top. Much of the top half of the chip is covered by the gas bubbles and will have poorer cooling than the bottom. For the cooling to be good it needs a pump that supplies the liquid quickly enough to the hot surface for it not to get insulated by bubbles of gas.
  • azfacea - Friday, November 29, 2019 - link

    nah small problem. if problem at all. the question is can all components take it for the duration of their life. if so this could be a big deal, for power and space efficiency
  • PVG - Friday, November 29, 2019 - link

    If that's really a concern, a slight angle of the board would suffice to resolve the issue.
  • azfacea - Friday, November 29, 2019 - link

    even vertically its not clear that its a problem. additional turbulence on top might compensate for less contact with liquid. you'd have to measure it but yea your approach can work if its a problem
