Big Data from Little Droplets: Unlocking Condensation Insights with AI and Machine Learning
Oct. 26, 2021 – Tiny droplets hold big data. Researchers from UC Irvine and the University of Illinois at Urbana-Champaign are using artificial intelligence and machine learning to extract information from condensation droplets far faster and more accurately than before.
Condensation is everywhere in nature and industry. While the transformation of vapor/gas to liquid is an efficient mass-transfer process in nature, it is also essential to many industrial applications such as thermoelectric and nuclear power generation, water-harvesting systems, heat exchangers and desalination plants.
Yoonjin Won, lead author and Samueli School associate professor of mechanical and aerospace engineering, along with her team and collaborators at the University of Illinois at Urbana-Champaign, has published the research “A Deep Learning Perspective on Dropwise Condensation” in Advanced Science. The work is also highlighted as a hot topic in the journal’s artificial intelligence and machine learning category and will be featured on the back cover of issue 22.
“Our deep learning strategy has the potential to revolutionize thermofluidic sciences by providing meaningful physical descriptors and their inextricable relationships in the context of heat and mass transfer performance,” Won said. “Our strategy develops a framework for applying machine learning methods to take immense amounts of visual data and rapidly extract nuanced patterns that are unobservable with human analysis. These patterns then help to elucidate thermofluidic mechanisms previously not seen or understood. New data can pave the way to understand new principles about nucleation-initiated phase-change science, develop prediction models, and suggest design rules for next-generation surfaces.”
Condensation involves the nucleation of droplets on a surface, a process that can lead to significant heat and mass transfer improvement depending on the droplet dynamics. Condensation on surfaces is commonly a continuous cycle of droplet nucleation, growth and departure. However, extracting quantifiable measurements has proven challenging for many researchers. Collecting this information with physical sensors is impractical because a one-hour experiment requires thousands of images, each containing thousands of droplets, amounting to more than a million data points.
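A back-of-envelope count, with illustrative numbers not taken from the study, shows how quickly that data volume accumulates:

```python
# Illustrative data-volume estimate (assumed numbers, not from the study):
# even a modest frame rate sustained over an hour yields more than a
# million individual droplet measurements.
frames = 3600 // 3           # one frame every 3 s over a one-hour run
droplets_per_frame = 1000    # order-of-magnitude droplet count per image
measurements = frames * droplets_per_frame
print(measurements)          # 1200000
```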
An alternative approach has been to apply standard computer vision algorithms. These traditional methods, however, cannot adapt to changing environmental conditions and are therefore unsuitable for autonomous data collection. For example, even a small light reflection greatly changes how droplets are detected and must be corrected manually. To address this challenge, Won and colleagues developed an intelligent vision-based framework that blends classical thermofluidic imaging techniques with deep learning. Deep convolutional neural networks (DCNNs) emulate the human visual cortex by learning features through multiple operational layers. DCNNs are spatially invariant, meaning they recognize image features regardless of where those features appear in the frame. This property helps DCNNs generalize quickly to new imaging conditions and allows them to significantly outperform conventional image analysis methods.
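The spatial invariance that makes convolutional networks well suited to droplet imagery can be illustrated with the convolution operation at their core. The following minimal NumPy sketch (not the team's actual network) shows that a disk-shaped filter produces the same peak response for a droplet wherever it sits in the frame; only the location of the peak moves:

```python
import numpy as np

def disk(shape, center, radius):
    """Binary image with a filled circular 'droplet' at the given center."""
    yy, xx = np.indices(shape)
    return ((yy - center[0]) ** 2 + (xx - center[1]) ** 2 <= radius ** 2).astype(float)

def conv2d_valid(image, kernel):
    """Plain 2D cross-correlation ('valid' mode), the core op of a conv layer."""
    kh, kw = kernel.shape
    h, w = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A disk-shaped filter responds maximally wherever a matching droplet sits.
kernel = disk((9, 9), (4, 4), 4)

img_a = disk((32, 32), (10, 10), 4)   # droplet near the top-left
img_b = disk((32, 32), (20, 24), 4)   # same droplet, shifted

resp_a = conv2d_valid(img_a, kernel)
resp_b = conv2d_valid(img_b, kernel)

# The peak response is identical for both positions; only its location moves.
print(np.max(resp_a) == np.max(resp_b))                   # True
print(np.unravel_index(np.argmax(resp_a), resp_a.shape))  # droplet center minus the (4, 4) kernel offset
print(np.unravel_index(np.argmax(resp_b), resp_b.shape))
```

A learned DCNN filter behaves the same way: one set of weights detects droplets everywhere in the image, which is what lets the network generalize across frames.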
This deep learning framework enables the researchers to autonomously harness physical descriptors and quantify thermal performance at extreme spatio-temporal resolutions of approximately 300 nanometers and approximately 200 milliseconds for the first time, revolutionizing the way heat flux from a condensing surface is characterized. More important, the measured heat flux and droplet statistics had never been quantitatively correlated in past work. Contrary to classical understanding, the data-centric analysis conclusively showed that overall condensation performance is governed by a key tradeoff between the heat transfer rate of individual droplets and the droplet population density.
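One plausible way to turn tracked droplet growth into a per-droplet heat flux, sketched below with assumed water properties and a spherical-cap geometry (the paper's exact formulation may differ), is to attribute the latent heat released by each droplet's volume change to its footprint area:

```python
import math

# Assumed water properties near saturation (illustrative values)
RHO_WATER = 998.0   # liquid density, kg/m^3
H_FG = 2.26e6       # latent heat of vaporization, J/kg

def droplet_heat_flux(r0, r1, dt, contact_angle_deg=90.0):
    """Estimate heat flux (W/m^2) through one droplet's footprint from its
    tracked radius growth r0 -> r1 (meters) over dt seconds.

    Latent heat released by condensation: Q = rho * h_fg * dV/dt;
    dividing by the contact area gives a per-droplet flux.
    Assumes a spherical-cap droplet shape.
    """
    theta = math.radians(contact_angle_deg)
    # Spherical-cap volume: V = f(theta) * r^3
    f = math.pi * (2 - 3 * math.cos(theta) + math.cos(theta) ** 3) / 3
    dv_dt = f * (r1 ** 3 - r0 ** 3) / dt                # m^3/s
    q_latent = RHO_WATER * H_FG * dv_dt                 # W
    footprint = math.pi * (r1 * math.sin(theta)) ** 2   # contact area, m^2
    return q_latent / footprint

# Example: a droplet growing from 10 um to 10.5 um radius over 200 ms
q = droplet_heat_flux(10e-6, 10.5e-6, 0.2)
print(f"{q / 1e3:.1f} kW/m^2")
```

At a 90-degree contact angle the cap reduces to a hemisphere; the constants, geometry, and example numbers here are assumptions for illustration, not values from the study.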
“The development of precise and automatic object prototyping of instances that can provide physical descriptors represents a game-changing innovation for thermofluidic engineering,” researchers said. “A promising avenue to address these challenges is to use emerging AI-based computer vision techniques to track and analyze moving objects.”
The vision-based approach presents a powerful tool for the study of not only phase-change processes but also any nucleation-based process within and beyond the thermal science community through the harnessing of big data.
“Machine vision and artificial intelligence represent a significant potential for advancement in the fields of energy and thermal science,” said Nenad Miljkovic, co-lead author and associate professor of mechanical science and engineering at University of Illinois at Urbana-Champaign. “Visualization is the most widely used method to understand the fundamental mechanisms governing thermal processes such as flows, mixing, boiling, evaporation, condensation, icing, freezing and more.”
He added, “The work has potential to impact the public through its future use in product design and development. Although the developed machine vision methods help to elucidate fundamental mechanisms, it is the understanding of these mechanisms that enables designers to design reduced order models that can be used by engineers to create next-generation products. These products include heat exchangers, heat sinks, heat spreaders, thermal management systems, and other components that significantly affect a plethora of real-life applications such as electric vehicles, consumer electronics and building energy systems.”
The video clip below shows deep-learning-based, microscale, droplet-resolved heat flux mapping: (left) time-lapse images of raw and tracked results for dropwise condensation on the HP surface; (center) the spatio-temporal heat flux as a 3D surface; and (right) a 2D contour plot, in which circles mark the centroid locations of individual droplets.
Contributors include Youngjoon Suh, Jonggyu Lee and Peter Simadiris, graduate students in the UCI Department of Mechanical and Aerospace Engineering; and Soumyadip Sett, Xiao Yan, Longnan Li and Kazi Fazle Rabbi from the University of Illinois at Urbana-Champaign Department of Mechanical Science and Engineering.
The research was supported by the UCI Department of Mechanical and Aerospace Engineering Graduate Fellowship, the National Science Foundation, the Office of Naval Research and the International Institute for Carbon Neutral Energy Research (WPI-I2CNER), sponsored by the Japanese Ministry of Education, Culture, Sports, Science and Technology.
– Tonya Becerra