Fourier Transformations Reveal How AI Learns Complex Physics

Accepted submission by guest reader at 2023-03-22 18:52:07
Science

Fourier Transformations Reveal How AI Learns Complex Physics [scitechdaily.com]:

One of the oldest tools in computational physics — a 200-year-old mathematical technique known as Fourier analysis [wikipedia.org] — can reveal crucial information about how a form of artificial intelligence called a deep neural network learns to perform tasks involving complex physics like climate and turbulence modeling, according to a new study.
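To make the technique concrete, here is a minimal sketch of Fourier analysis in Python; the signal and its two frequencies are illustrative choices, not data from the study:

```python
import numpy as np

# Build a signal from two known sine waves (2 Hz and 5 Hz), sampled at 100 Hz.
fs = 100
t = np.arange(0, 1, 1 / fs)
signal = np.sin(2 * np.pi * 2 * t) + 0.5 * np.sin(2 * np.pi * 5 * t)

# Fourier analysis decomposes the signal into its constituent frequencies.
spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

# The two component frequencies appear as the dominant peaks in the spectrum.
top_two = sorted(freqs[np.argsort(np.abs(spectrum))[-2:]])
print(top_two)  # [2.0, 5.0]
```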

In the paper, Hassanzadeh and co-authors Adam Subel and Ashesh Chattopadhyay, both his former students, and Yifei Guan, a postdoctoral research associate, detailed their use of Fourier analysis to study a deep learning neural network that was trained to recognize complex flows of air in the atmosphere or water in the ocean and to predict how those flows would change over time. Their analysis revealed “not only what the neural network had learned, it also enabled us to directly connect what the network had learned to the physics of the complex system it was modeling,” Hassanzadeh said.

“Deep neural networks are infamously hard to understand [nature.com] and are often considered ‘black boxes,’” he said. “That is one of the major concerns with using deep neural networks in scientific applications. The other is generalizability: These networks cannot work for a system that is different from the one for which they were trained.”

Hassanzadeh’s team first performed the Fourier transformation on the equation of its fully trained deep-learning model. Each of the model’s approximately 1 million parameters acts like a multiplier, applying more or less weight to specific operations in the equation during model calculations. In an untrained model, parameters have random values. These are adjusted and honed during training as the algorithm gradually learns to arrive at predictions that are closer and closer to the known outcomes in training cases. Structurally, the model parameters are grouped in some 40,000 five-by-five matrices, or kernels.
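A minimal sketch of that step in Python, using random values as a stand-in for the trained weights (the study analyzed the kernels of its trained climate/turbulence model, not random ones):

```python
import numpy as np

# Placeholder for a trained model's convolution kernels: per the article,
# roughly 1 million parameters grouped into some 40,000 five-by-five
# matrices (kernels). Random values stand in for trained weights here.
rng = np.random.default_rng(0)
kernels = rng.standard_normal((40_000, 5, 5))

# Fourier-transform each kernel. In the Fourier domain, a convolution
# kernel acts as a multiplier on each spatial frequency, so its spectrum
# shows which frequencies it amplifies and which it suppresses.
spectra = np.fft.fftshift(np.fft.fft2(kernels), axes=(-2, -1))
magnitudes = np.abs(spectra)
print(magnitudes.shape)  # (40000, 5, 5): one 5x5 spectrum per kernel
```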

“When we took the Fourier transform of the equation, that told us we should look at the Fourier transform of these matrices,” Hassanzadeh said. “We didn’t know that. Nobody has done this part ever before, looked at the Fourier transforms of these matrices and tried to connect them to the physics.

“And when we did that, it popped out that what the neural network is learning is a combination of low-pass filters, high-pass filters and Gabor filters,” he said.
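As an illustration of what those three filter types look like in the Fourier domain, here is a hedged sketch; these are textbook constructions with arbitrary cutoffs, not the kernels the network actually learned:

```python
import numpy as np

# Frequency grid for a 5x5 kernel, zero frequency (DC) at the center.
k = np.fft.fftshift(np.fft.fftfreq(5))
kx, ky = np.meshgrid(k, k)
radius = np.sqrt(kx**2 + ky**2)

# Low-pass: keeps low spatial frequencies (smooths the field).
# High-pass: the complement (emphasizes fine structure).
# Gabor: a Gaussian bump centered on a band of frequencies, selective
# for a particular spatial scale and orientation.
low_pass = (radius <= 0.25).astype(float)
high_pass = 1.0 - low_pass
gabor = np.exp(-((kx - 0.2) ** 2 + ky ** 2) / (2 * 0.05 ** 2))

for name, f in [("low-pass", low_pass), ("high-pass", high_pass), ("Gabor", gabor)]:
    print(f"{name} passes DC: {f[2, 2] > 0.5}")
```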

Subel said the findings have important implications for scientific deep learning, and even suggest that some things scientists have learned from studying machine learning in other contexts, like classification of static images, may not apply to scientific machine learning.

Journal Reference:
Adam Subel, Yifei Guan, Ashesh Chattopadhyay, Pedram Hassanzadeh, Explaining the physics of transfer learning in data-driven turbulence modeling, PNAS Nexus, Volume 2, Issue 3, March 2023, pgad015, https://doi.org/10.1093/pnasnexus/pgad015 [doi.org]


Original Submission