andino
Jackass Extraordinaire
https://www.smithsonianmag.com/scie...lutionary-invention-cool-computers-180975807/
I had read a few articles on integrated cooling of cpus by cutting fluid channels in them a few years back. The hard part then was trying to find a good way to mass produce the units. It'll be interesting to see if this comes to fruition.
In 1965, Gordon Moore, a co-founder of Intel, forecast that computing would increase in power and decrease in price exponentially. For decades what later became known as Moore’s Law proved true, as microchip processing power roughly doubled and costs dropped every couple of years. But as power increased exponentially, so did the heat produced by packing billions of transistors atop a chip the size of a fingernail.
As electricity passes through those processors, it meets resistance and creates heat. More processors mean higher temperatures, threatening the continued growth of computing power, because as chips get hotter they decrease in efficiency and eventually fail. There’s also an environmental cost. Those chips, and the cooling they require, devour power with an insatiable hunger. Data centers use roughly one percent of the world's electricity. In the United States alone, they consume electricity and water for cooling roughly equivalent to that used by the entire city of Philadelphia in a year.
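For intuition on where that heat comes from: resistive (Joule) heating dissipates power proportional to resistance and to the square of the current. A minimal sketch with made-up numbers, not figures from the article:

```python
# Illustrative Joule-heating estimate (P = I^2 * R); the current and
# resistance values are assumptions, not from the article or the paper.
def joule_heating_watts(current_amps: float, resistance_ohms: float) -> float:
    """Power dissipated as heat by a resistive element."""
    return current_amps ** 2 * resistance_ohms

# A hypothetical chip drawing 100 A through an effective resistance of
# 10 milliohms turns 100 W into heat, all of which cooling must remove.
print(joule_heating_watts(100.0, 0.010))  # -> 100.0
```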
Now, Swiss researchers have published a study in the journal Nature that they say offers one solution to the cooling problem. “Data centers consume a huge amount of electricity and water, and as we rely more and more on this data, this consumption is just going to increase,” says Elison Matioli, a professor in the Institute of Electrical Engineering at Ecole Polytechnique Fédérale de Lausanne (EPFL) who led the study. “So finding ways to deal with the dissipated heat or dissipated power is an extremely important issue.”
Previous attempts to cool microchips have relied on metal heat sinks, often combined with fans, that absorb heat and act like an exhaust system. Some data centers rely on fluid flowing through servers to draw away heat. But those systems are designed and fabricated separately and then combined with the chips. Matioli and his team have designed and fabricated chips and their fluid cooling systems together. In the new design, the cooling elements are integrated throughout by creating microchannels for fluid within the semiconductors themselves, which spirit away the heat, save energy, and mitigate the environmental problems created by data centers.
Their work also could have important applications in an electrified future, helping eliminate the heat problem and reducing the size of power converters on cars, solar panels and other electronics. “The proposed technology should enable further miniaturization of electronics, potentially extending Moore's Law and greatly reducing the energy consumption in cooling of electronics,” they write.
...
Researchers explored microfluidics, the science of controlling fluids in tiny channels, as far back as the early 1990s. Efforts increased after the U.S. Department of Defense’s Defense Advanced Research Projects Agency (DARPA) first became interested in the technology in the late 1990s, and the agency took a deeper interest in 2008 as the number of heat-producing transistors on a microprocessor chip went from thousands to billions. Joshi estimates that the agency has spent $100 million on research, including funding what it called ICECool programs at IBM and Georgia Tech beginning in 2012.
Over the years, embedding liquid cooling in chips has been explored through three basic designs. The first two designs did not bring cooling fluid into direct contact with the chip. One used a cold plate lid with microfluidic channels to cool chips. Another featured a layer of material on the back of chips to transfer heat to a fluid-cooled plate without the lid. The third design, the one that Matioli and his team explored, brings the coolant into direct contact with the chip.
Matioli’s research builds on work by Joshi and others. In 2015, Joshi and his team reported cutting fluid channels directly into integrated circuits yielding temperatures 60 percent lower than air cooling. “Cooling technology is absolutely going to be critical and using fluids other than air is a key part of being able to take away these very large heat rejection requirements put out by the computers,” Joshi says. “And you want to have the coolant where the heat is being produced. The further away it is, the less effective at a very high level it's going to be.”
That’s what Matioli’s research advanced. To test their concept, the team designed water-cooled chips that convert alternating current (AC) into direct current (DC), integrating microchannels filled with water into the same semiconductor substrate. The substrate they used was gallium nitride rather than the typically used silicon, which enabled much greater miniaturization. The result, according to the paper, is cooling power up to 50 times greater than conventional designs.
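For a sense of scale on what water cooling can carry away, here is a minimal sensible-heat sketch (heat absorbed = mass flow × specific heat × temperature rise); the heat load and coolant temperature rise below are assumptions for illustration, not numbers from the study:

```python
# Minimal sensible-heat estimate: mass flow of water needed to absorb a
# heat load q with a temperature rise dT. Numbers are illustrative only.
WATER_CP = 4186.0  # specific heat of water, J/(kg*K)

def required_flow_kg_per_s(heat_watts: float, delta_t_kelvin: float) -> float:
    """Mass flow rate of water needed: m_dot = q / (c_p * dT)."""
    return heat_watts / (WATER_CP * delta_t_kelvin)

# Removing a hypothetical 100 W with a 20 K coolant temperature rise takes
# about 1.2 g/s of water -- a trickle, which is why microchannels suffice.
print(required_flow_kg_per_s(100.0, 20.0))  # ~0.0012 kg/s
```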
The trick was finding a new way to fabricate chips so the fluid channels, ranging from 20 microns (the width of a human skin cell) to 100 microns, were as close as possible to the electronics. They combined those with larger channels on the back of the chip to reduce the pressure needed to make the liquid flow. “The analogy is it's like our bodies,” Matioli says. “We have the larger arteries and the smaller capillaries and that's how the entire body minimizes the pressure necessary to distribute blood.”
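That pressure argument can be made concrete with the Hagen–Poiseuille relation for laminar flow, where pressure drop scales with the inverse fourth power of channel radius. A minimal sketch assuming circular channels, water as the coolant, and made-up flow and length values (the paper's actual geometry is not given here):

```python
import math

# Hagen-Poiseuille pressure drop for laminar flow in a circular channel:
#   delta_p = 8 * mu * L * Q / (pi * r^4)
# Channel dimensions and flow rate are illustrative assumptions.
MU_WATER = 1.0e-3  # dynamic viscosity of water at ~20 C, Pa*s

def pressure_drop_pa(radius_m: float, length_m: float, flow_m3_per_s: float) -> float:
    """Pressure needed to push flow Q through one circular channel."""
    return 8 * MU_WATER * length_m * flow_m3_per_s / (math.pi * radius_m ** 4)

flow = 1.0e-9  # 1 mm^3/s through a single channel (assumed)
# Radii of 10 and 50 microns correspond to the 20- and 100-micron channel
# widths mentioned above. Because of the r^4 term, the narrow channel needs
# about 625x more pressure for the same flow and length:
print(pressure_drop_pa(10e-6, 1e-3, flow))  # ~2.5e5 Pa
print(pressure_drop_pa(50e-6, 1e-3, flow))  # ~4.1e2 Pa
```

The r^4 scaling is why feeding many narrow channels from a few wide ones, the way arteries feed capillaries, keeps the overall pumping pressure manageable.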