Modeling liquid flows to optimize electronics cooling


A liquid jet mixes into a pool of water, one of many simulations Mechanical Engineering Assistant Professor Mario Trujillo uses to study in detail the boundaries between fluids and other phases of matter.

In many of today's electronics, the price we pay for speed comes in the form of heat. However, a University of Wisconsin-Madison mechanical engineer is capitalizing on the computational power of those very electronics in detailed simulations that could improve the way certain electronics are cooled.

A UW-Madison assistant professor of mechanical engineering, Mario Trujillo is pursuing the secrets of how liquids undergo heat transfer and phase changes and how to model those to optimize cooling for radar and other high-density electronics. The U.S. Office of Naval Research is funding his effort via a $236,470 award.

There's definitely a need: Year after year, the number of transistors manufacturers can fit onto a computer chip continues to grow exponentially. Increase that density, and you also increase how much heat each chip gives off; that heat needs to be managed if the electronics system is to avoid overheating.

At some point in that growth of power density, says Trujillo, fans alone won't be enough to cool those computers. As an alternative to insufficient airflow, engineers are turning to liquid cooling.

In liquid cooling, the evaporation of liquid removes enough heat from the chips to keep them running at low temperatures, much as sweat cools people. While liquid cooling itself has been used in select systems for years, the computational tools for simulating the process are only now maturing enough to allow detailed study of the physical phenomena behind how it works.

Trujillo's focus is on the interface between different phases of matter, such as the points where liquids mix with gases during engine fuel injection, or, in the case of radar system cooling, where a jet of fresh coolant enters a pool of standing water. Using algorithms derived from physical law rather than from any single experiment's data, researchers can track each droplet down to an extremely small scale.
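To make that idea concrete, here is a deliberately simplified Python sketch of how an interface-capturing calculation can work: each grid cell stores the fraction of its volume occupied by liquid, and that fraction is transported with the flow. This is only a toy illustration under assumed conditions (a 1-D grid, a constant velocity, periodic boundaries, a first-order scheme); the article does not specify the numerical methods Trujillo's group actually uses, and research codes rely on far more accurate geometric interface-reconstruction schemes.

```python
import numpy as np

# Toy illustration of interface capturing on a grid (volume-of-fluid flavor):
# each cell stores alpha, the fraction of the cell filled with liquid, and the
# field is advected with the flow. Assumptions: 1-D grid, constant velocity,
# periodic boundaries, first-order upwind scheme.

nx = 200                       # number of grid cells
dx = 1.0 / nx                  # cell size on a unit-length domain
u = 1.0                        # constant advection velocity (assumed)
dt = 0.4 * dx / abs(u)         # time step satisfying the CFL stability condition
steps = 100

x = (np.arange(nx) + 0.5) * dx                       # cell-center coordinates
alpha = np.where((x > 0.2) & (x < 0.4), 1.0, 0.0)    # liquid slug between x = 0.2 and 0.4

for _ in range(steps):
    # upwind flux for u > 0: liquid crossing each cell face comes from the cell behind it
    flux = u * alpha
    alpha = alpha - (dt / dx) * (flux - np.roll(flux, 1))

# the liquid-gas interface sits wherever alpha crosses 0.5
crossings = np.where(np.diff((alpha > 0.5).astype(int)) != 0)[0]
print("approximate interface locations:", x[crossings])
```

Even this crude scheme shows the basic bookkeeping: the interface is never stored explicitly but recovered from the liquid-fraction field, which is why the quality of the advection scheme determines how sharply droplets and jets are resolved.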

Trujillo uses these algorithms to determine key pieces of information about an interaction: how the flow behaves in specific regions of the interface, which areas have very good heat transfer and which do not, and how fast specific particles are moving. From these algorithms, powerful clusters of computers can create visual simulations.

And such simulation, he says, can provide a level of detail well beyond what a single experiment can produce. His work can help create pictures of behavior not just under one set of conditions, but at any point and time in an interaction. “We can generate terabytes of data where all that information is available,” he says. “We can use that to investigate and interrogate physical phenomena on a level of detail that is unprecedented. We know the governing equations that nature obeys. Now let's use these governing equations and solve them, in full detail, with very little numerical error, and then let's play as if we have our own nature.”
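For readers curious what “the governing equations that nature obeys” look like in this context, two-phase incompressible flows are commonly written in a one-fluid form along the following lines; this is a representative textbook statement, not necessarily the exact formulation used in Trujillo's project:

```latex
% Continuity (incompressibility) and momentum for a single velocity field u
% shared by both phases; density rho and viscosity mu jump across the interface.
\nabla \cdot \mathbf{u} = 0
\qquad
\rho\left(\frac{\partial \mathbf{u}}{\partial t} + \mathbf{u}\cdot\nabla\mathbf{u}\right)
  = -\nabla p
  + \nabla\cdot\left[\mu\left(\nabla\mathbf{u} + \nabla\mathbf{u}^{\mathsf{T}}\right)\right]
  + \rho\,\mathbf{g}
  + \sigma\,\kappa\,\mathbf{n}\,\delta_{s}
```

Here \sigma is the surface tension, \kappa the local interface curvature, and \delta_s concentrates the surface-tension force at the interface; when boiling or evaporation is present, an energy equation with a latent-heat source at the interface closes the system.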

Producing accurate simulations will be key to optimizing processes like liquid cooling and fuel combustion. "If you understand it, you know exactly what the limits are and what is causing those limits," he says. "That gives you tremendous insight into how to design it to do away with or stretch those limits."

For the Office of Naval Research, he is working on an even more fundamental question: What requirements must a simulation methodology meet to accurately predict how boiling and cooling behave, and how will this accuracy be measured?

For example, once the coolant interface can be simulated, researchers can find the most effective way to apply it to large nodes of servers, cooling thousands of plates of chips instead of just one, and recirculating coolant so that it can quickly be used again in the process.

However, the boiling and vaporization that occur in such processes complicate the equations immensely, and at this point in the research, what’s needed is a set of benchmarks just to identify when the math is correct, and what the symptoms of errors might look like. “What we want to know is, what are the errors, what are the current methods for dealing with them, and how do we improve on them?” Trujillo says. “As these types of simulations become ever more popular within the academic and industrial communities, yardsticks that quantify the degree of accuracy need to be established.”

He says the effort here is in the same spirit as earlier simulations of pure liquid or pure gas flows during the 1970s and 1980s, but is now applied to a violent mixture of these two phases, as in boiling processes. “There are no measuring sticks right now,” he says.

As a result, flawed numerical methods can still produce realistic-looking simulations in which errors aren’t always obvious.

The metrics Trujillo is working to create need to be relatively simple to implement, and must directly measure the more complicated aspects of the computation that specifically target phase change. “The aim is to enable researchers to assess their code so that they can compute these processes with confidence,” Trujillo says.
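The article does not spell out the metrics themselves, but as a hypothetical sketch of what a simple, directly computable yardstick could look like, one might compare a simulated quantity against a trusted reference solution and check that total mass is conserved as liquid turns to vapor:

```python
import numpy as np

# Hypothetical illustration of simple accuracy "yardsticks" for a phase-change
# simulation: error norms against a reference solution on the same grid, plus a
# global mass-conservation check. The actual metrics Trujillo's group is
# developing are not specified in the article.

def error_norms(simulated: np.ndarray, reference: np.ndarray) -> dict:
    """Return relative L2 and L-infinity errors between two fields."""
    diff = simulated - reference
    return {
        "rel_L2": np.linalg.norm(diff) / np.linalg.norm(reference),
        "rel_Linf": np.max(np.abs(diff)) / np.max(np.abs(reference)),
    }

def mass_conservation_error(liquid_mass: np.ndarray, vapor_mass: np.ndarray) -> float:
    """Fractional drift in total (liquid + vapor) mass over a simulation history."""
    total = liquid_mass + vapor_mass
    return abs(total[-1] - total[0]) / total[0]

# Toy usage with synthetic data standing in for solver output.
t = np.linspace(0.0, 1.0, 50)
reference = np.sqrt(t + 0.01)                    # stand-in "exact" interface position
simulated = reference + 0.002 * np.sin(20 * t)   # stand-in numerical result
print(error_norms(simulated, reference))

liquid = 1.0 - 0.3 * t                           # liquid mass decreasing as it evaporates
vapor = 0.3 * t + 1e-4 * t**2                    # vapor mass, with a small spurious drift
print("mass drift:", mass_conservation_error(liquid, vapor))
```

The appeal of checks like these is exactly what Trujillo describes: they are cheap to run on any solver's output, yet they expose errors in the parts of the computation that phase change makes hardest to get right.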

Down the road, the same research and resulting benchmarks could apply to engine research into fuel injection behavior, and even to nuclear engineering, for optimizing how reactors are cooled with boiling water. No matter where it’s used, Trujillo says, such calculation will be a valuable cost-saving measure for researchers. “It is too expensive both in time and money to design and run experiments for every imaginable application,” he says.

Christie Taylor
7/9/2012