Researchers Offer Advanced Design Space Exploration Based on Deep Neural Network Using Active Transfer Learning and Data Augmentation
A new study has proposed an advanced design approach based on a deep neural network that enables an efficient search for superior materials far beyond the domain of the initial training set. The approach compensates for the weak predictive power of neural networks on unseen domains by progressively updating the network through active transfer learning and data augmentation.
Professor Seunghwa Ryu believes this study will help solve a variety of optimization problems that have an astronomical number of possible design configurations. For the composite grid optimization problem, the proposed framework provided excellent designs close to the global optimum, even when adding a very small data set corresponding to less than 0.5% of the size of the initial training set. This study was published in npj Computational Materials last month.
“We wanted to alleviate the limitation of the neural network, the low predictive power beyond the domain of the learning set for the design of materials or structures,” said Professor Ryu from the Department of Mechanical Engineering.
Generative models based on neural networks have been actively investigated as an inverse design method for finding new materials in a large design space. However, the applicability of conventional generative models is limited because they cannot reach designs outside the range of their training sets. Advanced generative models designed to overcome this limitation also suffer from poor predictive power in the unseen domain.
Professor Ryu’s team, working with researchers in Professor Grace Gu’s group at UC Berkeley, devised a design method that simultaneously expands the domain of reliable prediction of a deep neural network and searches for the optimal design by iteratively performing three key steps.
First, it searches for a few candidates with improved properties located close to the training set via a genetic algorithm that mixes superior designs within the training set. Then it checks whether the candidates really have improved properties, and it extends the training set by duplicating the validated designs through a data augmentation method. Finally, it extends the domain of reliable prediction by updating the neural network with the new superior designs via transfer learning. Because the expansion takes place along relatively narrow but correct routes to the optimal design (shown in the diagram in Figure 1), the framework allows for an efficient search.
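The three-step loop described above can be sketched on a toy problem. In this illustrative example (not the authors' implementation), a ridge regression stands in for the deep neural network and is simply refit each round instead of being fine-tuned via transfer learning; the binary design vectors, the hidden linear objective, and all function names are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)

N_BITS = 20
TRUE_W = rng.uniform(0.0, 1.0, N_BITS)  # hidden ground truth, unknown to the surrogate


def true_property(x):
    """Stand-in for the expensive simulation that validates a candidate design."""
    return x @ TRUE_W


def fit_surrogate(X, y):
    """Ridge regression as a toy surrogate for the deep neural network."""
    A = X.T @ X + 1e-3 * np.eye(X.shape[1])
    return np.linalg.solve(A, X.T @ y)


def ga_candidates(parents, n_children=60, p_mut=0.05):
    """Step 1: mix superior designs via uniform crossover, then mutate."""
    kids = []
    for _ in range(n_children):
        a, b = parents[rng.choice(len(parents), 2, replace=False)]
        mask = rng.random(N_BITS) < 0.5
        child = np.where(mask, a, b)
        flip = rng.random(N_BITS) < p_mut
        kids.append(np.where(flip, 1 - child, child))
    return np.array(kids)


# Initial training set: random designs, typically far from the optimum.
X = rng.integers(0, 2, (50, N_BITS)).astype(float)
y = true_property(X)
best_initial = y.max()

for step in range(5):
    w = fit_surrogate(X, y)                 # step 3 (simplified): update the model
    top = X[np.argsort(y)[-10:]]            # superior designs found so far
    cands = ga_candidates(top)              # step 1: candidates near the training set
    scores = cands @ w                      # surrogate-predicted properties
    picks = cands[np.argsort(scores)[-5:]]  # keep only a few promising candidates
    y_new = true_property(picks)            # step 2: validate with the simulation
    # Step 2 (data augmentation): duplicate validated designs in the training set.
    X = np.vstack([X, np.repeat(picks, 3, axis=0)])
    y = np.concatenate([y, np.repeat(y_new, 3)])

print(f"best before: {best_initial:.3f}, best after: {y.max():.3f}")
```

Because only a handful of candidates per round are validated with the expensive simulation, the training set grows slowly along the route toward better designs, mirroring the small-data expansion the study reports.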
As a data-intensive method, a deep neural network model tends to have reliable predictive power only in and near the domain of the learning set. When the optimal configuration of materials and structures is well beyond the initial training set, which is frequently the case, design methods based on neural networks suffer from poor predictive power and become ineffective.
The researchers expect the framework to be applicable to a wide range of optimization problems in other scientific and engineering disciplines with astronomically large design spaces, as it provides an efficient way to gradually expand the domain of reliable prediction toward the target design while avoiding the risk of getting stuck in local minima. In particular, because the framework is less data-intensive, design problems in which data generation is time-consuming and expensive will benefit the most.
The research team is currently applying the optimization framework to the design of metamaterial structures, segmented thermoelectric generators, and optimal sensor distributions. “From these ongoing studies, we hope to better understand the pros and cons, as well as the potential, of the suggested algorithm. Ultimately, we want to devise more efficient machine learning-based design approaches,” explained Professor Ryu.
Yongtae Kim, Youngsoo Kim, Charles Yang, Kundo Park, Grace X. Gu and Seunghwa Ryu, “Deep learning framework for material design space exploration using active transfer learning and data augmentation”, npj Computational Materials (https://doi.org/10.1038/s41524-021-00609-2)