New Systems for Cooling Chips

andino

Jackass Extraordinaire
https://www.smithsonianmag.com/scie...lutionary-invention-cool-computers-180975807/

In 1965, Gordon Moore, a co-founder of Intel, forecast that computing would increase in power and decrease in price exponentially. For decades what later became known as Moore’s Law proved true, as microchip processing power roughly doubled and costs dropped every couple of years. But as power increased exponentially, so did the heat produced by packing billions of transistors atop a chip the size of a fingernail.

As electricity passes through those processors and meets resistance, it creates heat. More processors mean higher temperatures, threatening the continued growth of computer power because as they get hotter, chips decrease in efficiency and eventually fail. There’s also an environmental cost. Those chips, and the cooling they require, devour power with an insatiable hunger. Data centers use roughly one percent of the world's electricity. In the United States alone, they consume electricity and water for cooling roughly equivalent to that used by the entire city of Philadelphia in a year.

Now, Swiss researchers have published a study in the journal Nature which says they have one solution to the cooling problem. “Data centers consume a huge amount of electricity and water, and as we rely more and more on this data, this consumption is just going to increase,” says Elison Matioli, a professor in the Institute of Electrical Engineering at Ecole Polytechnique Fédérale de Lausanne (EPFL) who led the study. “So finding ways to deal with the dissipated heat or dissipated power is an extremely important issue.”

Previous attempts to cool microchips have relied on metal heat sinks, often combined with fans, that absorb heat and act like an exhaust system. Some data centers rely on fluid flowing through servers to draw away heat. But those systems are designed and fabricated separately and then combined with the chips. Matioli and his team have instead designed and fabricated chips and their fluid cooling systems together. In the new design, the cooling elements are integrated throughout, with microchannels for fluid created within the semiconductor itself to spirit away the heat, save energy, and mitigate the environmental problems created by data centers.

Their work also could have important applications in an electrified future, helping eliminate the heat problem and reducing the size of power converters on cars, solar panels and other electronics. “The proposed technology should enable further miniaturization of electronics, potentially extending Moore's Law and greatly reducing the energy consumption in cooling of electronics,” they write.

...

Researchers explored microfluidics, the science of controlling fluids in tiny channels, as far back as the early 1990s. Efforts increased after the U.S. Department of Defense’s Defense Advanced Research Projects Agency (DARPA) first became interested in the technology in the late 1990s; the agency took a deeper interest in 2008 as the number of heat-producing transistors on a microprocessor chip went from thousands to billions. Joshi estimates that the agency has spent $100 million on research, including funding what it called ICECool programs at IBM and Georgia Tech beginning in 2012.

Over the years, embedding liquid cooling in chips has been explored through three basic designs. The first two designs did not bring cooling fluid into direct contact with the chip. One used a cold plate lid with microfluidic channels to cool chips. Another featured a layer of material on the back of chips to transfer heat to a fluid-cooled plate without the lid. The third design, the one that Matioli and his team explored, brings the coolant into direct contact with the chip.

Matioli’s research builds on work by Joshi and others. In 2015, Joshi and his team reported cutting fluid channels directly into integrated circuits yielding temperatures 60 percent lower than air cooling. “Cooling technology is absolutely going to be critical and using fluids other than air is a key part of being able to take away these very large heat rejection requirements put out by the computers,” Joshi says. “And you want to have the coolant where the heat is being produced. The further away it is, the less effective at a very high level it's going to be.”

That’s what Matioli’s research advanced. To test their concept, the team designed a water-cooled chip that converts alternating current (AC) into direct current (DC), integrating microchannels filled with water into the same semiconductor substrate. The substrate they used was gallium nitride, which enabled far greater miniaturization than the typically used silicon. The result, according to the paper, is cooling power up to 50 times greater than conventional designs.
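To put the idea of water-filled microchannels in rough numbers, the sketch below applies the basic calorimetry relation Q = ṁ·c_p·ΔT to estimate how much heat a small water flow can carry away. The flow rate and temperature rise are illustrative assumptions, not figures from the paper.

```python
# Rough back-of-the-envelope sketch (not from the paper): how much heat a small
# water flow can carry away, using Q = m_dot * c_p * dT. All numbers below are
# illustrative assumptions, not values reported by the EPFL team.

WATER_DENSITY = 997.0         # kg/m^3 at ~25 C
WATER_SPECIFIC_HEAT = 4180.0  # J/(kg*K)

def heat_removed_watts(flow_ml_per_min: float, temp_rise_c: float) -> float:
    """Heat absorbed by a water stream for a given flow rate and temperature rise."""
    flow_m3_per_s = flow_ml_per_min * 1e-6 / 60.0        # mL/min -> m^3/s
    mass_flow_kg_per_s = flow_m3_per_s * WATER_DENSITY   # kg/s
    return mass_flow_kg_per_s * WATER_SPECIFIC_HEAT * temp_rise_c

if __name__ == "__main__":
    # Assumed operating point: 20 mL/min of water allowed to warm by 40 K.
    q = heat_removed_watts(flow_ml_per_min=20.0, temp_rise_c=40.0)
    print(f"~{q:.0f} W of heat carried away")  # roughly 55 W for these assumptions
```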

The trick was finding a new way to fabricate chips so the fluid channels, ranging from 20 microns (about the width of a human skin cell) to 100 microns, were as close as possible to the electronics. They combined those with larger channels on the back of the chip to reduce the pressure needed to make the liquid flow. “The analogy is it's like our bodies,” Matioli says. “We have the larger arteries and the smaller capillaries and that's how the entire body minimizes the pressure necessary to distribute blood.”
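As a rough illustration of why that artery-and-capillary hierarchy matters, the sketch below uses the Hagen-Poiseuille relation for laminar flow in a circular channel; the fourth-power dependence on diameter shows how quickly pumping pressure grows as channels shrink. The channel length and per-channel flow rate are assumed values for illustration, not the paper's geometry.

```python
# Minimal sketch of why channel size matters so much for pumping pressure.
# Hagen-Poiseuille for laminar flow in a circular channel:
#     dP = 128 * mu * L * Q / (pi * d^4)
# The d^4 term is the point: halving the channel diameter raises the required
# pressure sixteen-fold for the same flow, which is why wide "artery" channels
# feeding narrow "capillaries" keep the overall pumping pressure low.
# All dimensions below are illustrative assumptions, not the paper's geometry.

import math

WATER_VISCOSITY = 0.00089  # Pa*s at ~25 C

def pressure_drop_pa(diameter_m: float, length_m: float, flow_m3_per_s: float) -> float:
    """Laminar pressure drop across one circular channel (Hagen-Poiseuille)."""
    return 128.0 * WATER_VISCOSITY * length_m * flow_m3_per_s / (math.pi * diameter_m ** 4)

if __name__ == "__main__":
    flow = 1e-10  # 0.1 mm^3/s through a single channel (assumed)
    for d_um in (20, 50, 100):
        dp = pressure_drop_pa(diameter_m=d_um * 1e-6, length_m=0.005, flow_m3_per_s=flow)
        print(f"{d_um:>3} um channel, 5 mm long: ~{dp / 1000:.1f} kPa")
```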

I had read a few articles on integrated cooling of cpus by cutting fluid channels in them a few years back. The hard part then was trying to find a good way to mass produce the units. It'll be interesting to see if this comes to fruition.
 
Up to 50 times cooler? Wouldn't they run into the old condensation problem with that design?
 
Only if the water is sub-ambient or below dew point, which wouldn't be possible without an external source cooling the water.
 
Only if the water is sub-ambient or below dew point, which wouldn't be possible without an external source cooling the water.

It's hard for me to envision this without that external source given this bit.

designed a water-cooled chip that converts alternating current (AC) into direct current (DC), integrating microchannels filled with water into the same semiconductor substrate.

What exactly is the alternating current passing through? Is it the fluid itself? Is that alternating current what causes the fluid to flow and transfer heat?

Maybe my mind is just stuck in the past? :p
 
The current is what will cause the fluid to flow through the substrate. This concept would help greatly with heat monsters like Ryzen if they can perfect it. They run hot because the thermal density of the chip is so high that, even though the cooling system is not heat-soaked, it cannot transfer the heat quickly enough. The fluid would move due to the AC/DC current, through the substrate, and basically cool the chip down internally. The closer you are (or in this case, inside it), the better thermal transfer you'll have. This is why Direct-Die works so well on Intel chips - removing the IHS and connecting your big ol' piece of copper straight onto the die of the processor results in significantly better thermal transfer.

I'm not sure how the liquid would keep itself cool, or fix the issues of high thermal density, since this would appear to be a closed system. I think this would work best if they managed to create grooves within the substrate and basically have the processor come with its own waterblock, which you would then connect a custom loop to and flow your own liquid through. I don't see why they couldn't make the grooves run above, and just make the chips taller to compensate for the height, but I'm no engineer.

This doesn't really fix the problem of heat not transferring through the IHS fast enough. That's why the high-end Intel chips run cooler in comparison to Ryzen despite having a higher TDP; there's a lot of **** packed in on the Ryzen chips and the IHS cannot spread/transfer the heat fast enough to your heatsink/waterblock, so the thermals climb quickly despite your cooling setup not being overwhelmed.
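A rough way to see the thermal-resistance argument from the comments above: junction temperature is just coolant temperature plus power times the sum of the resistances along the heat path, so stripping out layers (the IHS and its TIM) or putting coolant inside the die itself shortens the path. All the numbers in this sketch are made-up assumptions, not measured values for any particular CPU.

```python
# Rough sketch of the thermal-resistance argument: junction temperature is the
# coolant temperature plus power times the sum of resistances along the heat
# path, so removing layers (IHS, TIM) or moving coolant into the die itself
# drops the temperature for the same power. Resistance values are illustrative
# assumptions, not measured figures for any specific CPU.

def junction_temp_c(power_w: float, coolant_temp_c: float,
                    resistances_k_per_w: list[float]) -> float:
    """Steady-state junction temperature for a series stack of thermal resistances."""
    return coolant_temp_c + power_w * sum(resistances_k_per_w)

if __name__ == "__main__":
    POWER = 150.0   # W dissipated by the die (assumed)
    COOLANT = 30.0  # C loop/water temperature (assumed)

    # die -> solder TIM -> IHS -> paste -> waterblock (assumed K/W values)
    stock_stack = [0.05, 0.05, 0.05, 0.10, 0.05]
    # die -> paste -> waterblock, IHS removed ("direct die")
    direct_die = [0.05, 0.08, 0.05]
    # coolant inside the die itself: almost nothing left in the path
    in_chip_cooling = [0.02]

    for name, stack in [("stock", stock_stack), ("direct die", direct_die),
                        ("in-chip channels", in_chip_cooling)]:
        print(f"{name:>17}: ~{junction_temp_c(POWER, COOLANT, stack):.0f} C")
```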
 