Physics of Foam Strangely Resembles AI Training

Accepted submission by hubie at 2026-01-18 18:02:08
News

Physics of Foam Strangely Resembles AI Training [upenn.edu]:

Foams are everywhere: soap suds, shaving cream, whipped toppings and food emulsions like mayonnaise. For decades, scientists believed that foams behave like glass, their microscopic components trapped in static, disordered configurations.

Now, engineers at the University of Pennsylvania have found that foams actually flow ceaselessly inside while holding their external shape. More strangely, from a mathematical perspective, this internal motion resembles the process of deep learning, the method typically used to train modern AI systems.

The discovery could hint that learning, in a broad mathematical sense, may be a common organizing principle across physical, biological and computational systems, and provide a conceptual foundation for future efforts to design adaptive materials. The insight could also shed new light on biological structures that continuously rearrange themselves, like the scaffolding in living cells.

In a paper in Proceedings of the National Academy of Sciences, the team describes using computer simulations to track the movement of bubbles in a wet foam. Rather than eventually staying put, the bubbles continued to meander through possible configurations. Mathematically speaking, the process mirrors how deep learning involves continually adjusting an AI system's parameters — the information that encodes what an AI "knows" — during training.
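As an aside for readers who want a feel for that kind of analysis: one common way to tell "settled" from "still meandering" in a simulation is to track the mean-squared displacement (MSD) of the particles over time, which plateaus if configurations get trapped and keeps growing if rearrangement never stops. The sketch below illustrates that check on a toy random walk; it is a stand-in for illustration, not the paper's actual foam simulation:

```python
import random

# Sketch of the "did the system settle?" check: track mean-squared
# displacement (MSD) of particle positions over time. A plateau would mean
# the configuration is trapped (glass-like); steady growth means it keeps
# meandering. The random walk below is a stand-in, not a foam model.

random.seed(0)
n_bubbles, n_steps = 100, 1000
start = [[0.0, 0.0] for _ in range(n_bubbles)]
pos = [p[:] for p in start]

for t in range(1, n_steps + 1):
    for p in pos:
        p[0] += random.gauss(0.0, 0.01)   # small random rearrangement
        p[1] += random.gauss(0.0, 0.01)
    if t % 250 == 0:
        msd = sum((p[0] - s[0]) ** 2 + (p[1] - s[1]) ** 2
                  for p, s in zip(pos, start)) / n_bubbles
        print(f"step {t:4d}: MSD = {msd:.4f}")  # grows roughly linearly
```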

"Foams constantly reorganize themselves," says John C. Crocker, Professor in Chemical and Biomolecular Engineering (CBE) and the paper's co-senior author. "It's striking that foams and modern AI systems appear to follow the same mathematical principles. Understanding why that happens is still an open question, but it could reshape how we think about adaptive materials and even living systems."

In some ways, foams behave mechanically like solids: they more or less hold their shape and can rebound when pressed. At a microscopic level, however, foams are "two-phase" materials, made up of bubbles suspended in a liquid or solid. Because foams are relatively easy to create and observe yet exhibit complex mechanical behavior, they have long served as model systems for studying other crowded, dynamic materials, including living cells.

[...] During training, modern AI systems continually adjust their parameters — the numerical values that encode what they "know." Much like bubbles in foams were once thought to descend into metaphorical valleys, searching for the positions that require the least energy to maintain, early approaches to AI training aimed to optimize systems as tightly as possible to their training data.

Deep learning accomplishes this using optimization algorithms related to the mathematical technique "gradient descent," which involves repeatedly nudging a system in the direction that most improves its performance. If an AI's internal representation of its training data were a landscape, the optimizers guide the system downhill, step by step, toward configurations that reduce error — those that best match the examples it has seen before.
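In code, plain gradient descent is only a few lines. The sketch below runs it on a made-up one-dimensional loss; the function, starting point, and learning rate are all illustrative choices, not anything from the paper:

```python
# Minimal gradient-descent sketch: repeatedly nudge a parameter downhill
# on a toy one-dimensional "loss landscape" with its minimum at x = 2.

def loss(x):
    return (x - 2.0) ** 2

def grad(x):
    # Analytic derivative of the loss above.
    return 2.0 * (x - 2.0)

x = -5.0    # initial parameter value
lr = 0.1    # learning rate: size of each downhill nudge

for step in range(50):
    x -= lr * grad(x)   # move against the gradient, i.e. downhill

print(f"final x = {x:.4f}, loss = {loss(x):.8f}")  # x approaches 2
```

The learning rate controls the size of each nudge: too large and the steps overshoot the valley, too small and the descent crawls.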

Over time, researchers realized that forcing systems into the deepest possible valleys was counterproductive. Models that optimized too precisely became brittle, unable to generalize beyond the data they had already seen. "The key insight was realizing that you don't actually want to push the system into the deepest possible valley," says Robert Riggleman, Professor in CBE and co-senior author of the new paper. "Keeping it in flatter parts of the landscape, where lots of solutions perform similarly well, turns out to be what allows these models to generalize."
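That intuition is easy to demonstrate numerically: the same small parameter perturbation barely changes the loss in a flat valley but changes it dramatically in a sharp one. A toy comparison, with both landscapes invented for illustration:

```python
# Toy illustration of why flat minima are more forgiving than sharp ones:
# nudge the parameter the same small amount away from each minimum and
# compare how much the loss changes. Both landscapes are made up.

def sharp_loss(x):
    return 50.0 * x ** 2    # narrow, steep valley around x = 0

def flat_loss(x):
    return 0.5 * x ** 2     # broad, shallow valley around x = 0

eps = 0.1  # a small perturbation away from the minimum
print("sharp valley loss change:", sharp_loss(eps) - sharp_loss(0.0))  # 0.5
print("flat  valley loss change:", flat_loss(eps) - flat_loss(0.0))    # 0.005
```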

When the Penn researchers looked again at their foam data through this lens, the parallel was hard to miss. Rather than settling into "deep" positions in this metaphorical landscape, bubbles in foams also remained in motion, much like the parameters in modern AI systems, continuously reorganizing within broad, flat regions where many configurations cost nearly the same energy. The same mathematics that explains why deep learning works turned out to describe what foams had been doing all along.

[...] "Why the mathematics of deep learning accurately characterizes foams is a fascinating question," says Crocker. "It hints that these tools may be useful far outside of their original context, opening the door to entirely new lines of inquiry."

Journal Reference: Amruthesh Thirumalaiswamy et al., Slow relaxation and landscape-driven dynamics in viscous ripening foams, PNAS (2025). DOI: https://doi.org/10.1073/pnas.2518994122 [doi.org]; preprint: https://dx.doi.org/10.48550/arxiv.2301.13400 [arXiv]


Original Submission