Noise is often perceived as an unwanted disturbance, a bug to be eliminated from engineered systems. However, mounting evidence from biological systems suggests that noise serves essential functional roles, acting as a feature that enhances robustness, flexibility, and adaptive capabilities. This perspective shift from "noise as nuisance" to "noise as necessity" has profound implications for understanding both natural and artificial systems.
In a recent paper titled "Long Range Communication via Gap Junctions and Stress in Planarian Morphogenesis: A Computational Study" (Blattner & Levin, 2023), we explored how noise contributes to the stability of regenerative processes in planarian flatworms. Our computational model revealed that stochastic elements in cell-cell communication networks, far from being detrimental, actually stabilize the system's ability to achieve correct morphological outcomes during regeneration.
The model centered on planarian regeneration, a remarkable biological phenomenon where these flatworms can regenerate complete organisms from small tissue fragments. We implemented a system of coupled proportional-integral (PI) controllers representing cellular stress and membrane voltage dynamics, connected through gap junction networks. The key finding was counterintuitive: introducing noise into the communication channels between cells enhanced the system's stability rather than degrading it.
Specifically, our simulations demonstrated that when connections between cells were made probabilistic (following a directed bond percolation model), the resulting stochastic dynamics prevented the system from getting trapped in unstable oscillatory states. The noise effectively widened the basin of attraction for stable regenerative outcomes, allowing the system to navigate morphological state space more robustly. This aligns with the biological observation that planarians reliably regenerate their target morphology despite significant environmental and internal perturbations.
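To make the mechanism concrete, the sketch below simulates a chain of cells whose states are regulated by PI controllers and coupled through gap junctions that transmit only with some probability at each step. This is an illustration under assumed equations, gains, and parameters; it simplifies the directed bond percolation scheme of the paper to independently sampled bonds, and is not the published model.

```python
import numpy as np

rng = np.random.default_rng(0)

n_cells, steps, dt = 50, 2000, 0.01
p_open = 0.7          # probability that a gap junction transmits on a given step
kp, ki = 1.5, 0.3     # PI controller gains (assumed values)
target = 1.0          # homeostatic set point for each cell's state

v = rng.normal(0.0, 0.5, n_cells)   # per-cell state (a membrane-voltage-like variable)
integral = np.zeros(n_cells)        # integral term of each cell's PI controller

for _ in range(steps):
    error = target - v
    integral += error * dt
    control = kp * error + ki * integral        # PI control action per cell

    # Stochastic gap-junction coupling between neighbours: each bond is
    # open with probability p_open, otherwise no signal passes this step.
    open_bonds = rng.random(n_cells - 1) < p_open
    diff = v[1:] - v[:-1]                       # neighbour state differences
    flux = np.zeros(n_cells)
    flux[:-1] += open_bonds * diff              # diffusive exchange through open bonds
    flux[1:] -= open_bonds * diff

    v += dt * (control + 0.5 * flux)

print("spread of final cell states around the target:", round(v.std(), 4))
```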
The GitHub repository for this experiment can be found here: https://github.com/marcelbtec/NNNoise.git
This biological insight naturally extends to artificial neural networks, where noise injection has emerged as a powerful regularization technique. Just as biological systems leverage noise for robust morphogenesis, neural networks can benefit from carefully introduced stochasticity during training. The parallel is striking: both systems must navigate high-dimensional spaces (morphological state space for planarians, parameter space for neural networks) to reach desired outcomes while maintaining stability and generalization capabilities.
To demonstrate the beneficial effects of noise injection in neural networks, we implemented a comprehensive experimental framework examining out-of-distribution (OOD) generalization. Our approach directly parallels the biological findings: just as noise helps planarian cells achieve robust morphological outcomes under varied conditions, we hypothesized that noise injection during neural network training would improve generalization to distribution shifts.
We employed a feedforward neural network classifier.
The critical modification was the introduction of Gaussian noise to inputs during training:
def forward(self, x, training=True):
    # Add zero-mean Gaussian noise to the inputs, but only during training
    if training and self.noise_std > 0:
        x = x + torch.randn_like(x) * self.noise_std
    # ... rest of forward pass
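For context, here is a minimal, self-contained sketch of the kind of classifier this forward pass belongs to. The layer sizes, activation, and default noise level are assumptions for illustration, not the exact configuration used in the experiments (which lives in the linked repository).

```python
import torch
import torch.nn as nn

class NoisyMLP(nn.Module):
    """Small feedforward classifier with Gaussian input-noise injection (assumed sizes)."""

    def __init__(self, in_dim=2, hidden=64, n_classes=2, noise_std=0.3):
        super().__init__()
        self.noise_std = noise_std
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_classes),
        )

    def forward(self, x, training=True):
        # Inject zero-mean Gaussian noise into the inputs only during training
        if training and self.noise_std > 0:
            x = x + torch.randn_like(x) * self.noise_std
        return self.net(x)
```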
We created a challenging scenario using a combination of the two-moons and two-circles datasets, designed to test generalization under various distribution shifts.
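One way such a benchmark can be assembled is sketched below, assuming scikit-learn's make_moons and make_circles generators; the noise levels, offsets, and the simple translation used as a distribution shift are illustrative choices, not the exact experimental settings.

```python
import numpy as np
from sklearn.datasets import make_circles, make_moons

def make_dataset(n=2000, shift=0.0, seed=0):
    """Combine two-moons and two-circles; `shift` translates inputs to mimic covariate shift."""
    rng = np.random.RandomState(seed)
    X_m, y_m = make_moons(n_samples=n // 2, noise=0.1, random_state=seed)
    X_c, y_c = make_circles(n_samples=n // 2, noise=0.1, factor=0.5, random_state=seed + 1)
    X_c = X_c + np.array([2.5, 0.0])             # place the circles next to the moons
    X = np.vstack([X_m, X_c]).astype(np.float32) + shift
    y = np.concatenate([y_m, y_c]).astype(np.int64)
    idx = rng.permutation(len(X))
    return X[idx], y[idx]

X_train, y_train = make_dataset(shift=0.0)       # in-distribution training data
X_ood, y_ood = make_dataset(shift=0.5, seed=1)   # shifted (OOD) evaluation data
```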
Models were trained with different noise-injection levels (σ = 0.0, 0.1, 0.3, 0.5) under an otherwise identical training setup.
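A sketch of the corresponding training sweep, reusing the NoisyMLP and dataset sketches above; the optimizer, learning rate, and epoch count are assumed values, not the reported configuration.

```python
import torch
import torch.nn as nn

def train(model, X, y, epochs=200, lr=1e-3):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    X_t, y_t = torch.from_numpy(X), torch.from_numpy(y)
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(X_t, training=True), y_t)   # noise is injected inside forward()
        loss.backward()
        opt.step()
    return model

# One model per noise level
models = {s: train(NoisyMLP(noise_std=s), X_train, y_train) for s in (0.0, 0.1, 0.3, 0.5)}
```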
The impact of noise injection on learned decision boundaries was dramatic. Models trained without noise (σ = 0.0) produced sharp, complex decision boundaries that tightly fit the training data. In contrast, noise-trained models (σ = 0.3) exhibited smoother, more regular boundaries—a direct parallel to the smoothed morphological gradients observed in our planarian simulations.
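This kind of visualization can be reproduced with a simple class-probability contour over a grid of inputs; the sketch below reuses the models dictionary from the training sweep, with arbitrary plotting ranges.

```python
import numpy as np
import torch
import matplotlib.pyplot as plt

def plot_boundary(model, X, y, title):
    # Evaluate the class-1 probability on a dense grid covering the data
    xs = np.linspace(X[:, 0].min() - 1, X[:, 0].max() + 1, 300)
    ys = np.linspace(X[:, 1].min() - 1, X[:, 1].max() + 1, 300)
    xx, yy = np.meshgrid(xs, ys)
    grid = torch.from_numpy(np.c_[xx.ravel(), yy.ravel()].astype(np.float32))
    with torch.no_grad():
        probs = torch.softmax(model(grid, training=False), dim=1)[:, 1].numpy()
    plt.contourf(xx, yy, probs.reshape(xx.shape), levels=20, cmap="RdBu", alpha=0.6)
    plt.scatter(X[:, 0], X[:, 1], c=y, s=5, cmap="RdBu")
    plt.title(title)
    plt.show()

plot_boundary(models[0.0], X_train, y_train, "no input noise (sigma = 0.0)")
plot_boundary(models[0.3], X_train, y_train, "input noise (sigma = 0.3)")
```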
The results show that moderate noise injection (σ = 0.3) provides the best OOD generalization of the levels tested, reducing the performance drop from 26.4% to 13.9%, a relative improvement of roughly 47%.
A key mechanistic insight comes from analyzing input gradient magnitudes: noise-trained models exhibited roughly a six-fold reduction in input gradient magnitude compared with the noise-free baseline. This reduction indicates that noise injection encourages the network to learn flatter, more robust functions, precisely analogous to how noise in gap junction communication leads to more stable morphological patterns in planarians.
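Input gradient magnitudes are straightforward to measure with autograd. The sketch below, reusing the models from the sweep above, computes the mean norm of the gradient of the loss with respect to the inputs; the exact reduction factor obtained will depend on the assumed setup.

```python
import torch
import torch.nn.functional as F

def mean_input_grad_norm(model, X, y):
    X_t = torch.from_numpy(X).requires_grad_(True)
    y_t = torch.from_numpy(y)
    loss = F.cross_entropy(model(X_t, training=False), y_t)
    grad, = torch.autograd.grad(loss, X_t)       # d loss / d x for every sample
    return grad.norm(dim=1).mean().item()

for s in (0.0, 0.3):
    print(f"sigma = {s}: mean input gradient norm = {mean_input_grad_norm(models[s], X_train, y_train):.4f}")
```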
The mathematical framework reveals why noise injection is beneficial. For input noise ε ~ 𝒩(0, σ²I) with small variance σ², the expected loss under noise can be approximated as:
E[ℓ(f(x + ε), y)] ≈ ℓ(f(x), y) + (σ²/2)‖∇_x f(x)‖²
This shows that noise injection implicitly adds a gradient penalty term, encouraging the network to learn functions with smaller input gradients, without any explicit regularization term in the loss.
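For readers who want the intermediate step, here is a sketch of the Taylor argument behind the approximation, assuming ε ~ 𝒩(0, σ²I) and, for the final simplification, a squared loss:

```latex
% Second-order Taylor expansion of the loss in the input noise \epsilon \sim \mathcal{N}(0, \sigma^2 I),
% with H_x the Hessian of \ell(f(x), y) with respect to x:
\mathbb{E}_{\epsilon}\big[\ell(f(x+\epsilon), y)\big]
  \approx \ell(f(x), y)
  + \underbrace{\mathbb{E}_{\epsilon}[\epsilon]^{\top}\nabla_x \ell}_{=\,0}
  + \tfrac{1}{2}\,\mathbb{E}_{\epsilon}\big[\epsilon^{\top} H_x\, \epsilon\big]
  = \ell(f(x), y) + \tfrac{\sigma^2}{2}\,\operatorname{tr}(H_x).
% For a squared loss \ell = \tfrac{1}{2}(f(x) - y)^2 one has
% \operatorname{tr}(H_x) = \lVert \nabla_x f(x) \rVert^2 + (f(x) - y)\,\Delta_x f(x),
% which is approximately \lVert \nabla_x f(x) \rVert^2 when the residual is small,
% recovering the gradient-penalty form above.
```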
The parallels between our neural network experiments and the planarian regeneration model are profound:
In planarians, stochastic gap junction communication prevents cells from getting locked into rigid signaling patterns. Similarly, noise injection prevents neural networks from memorizing exact input-output mappings, promoting more flexible representations.
Planarian regeneration succeeds despite tissue damage, environmental variations, and cellular heterogeneity. Noise-trained neural networks similarly maintain performance under distribution shifts, input corruptions, and adversarial perturbations.
Neither system requires explicit regularization terms. In planarians, robust morphogenesis emerges from noisy cell-cell communication. In neural networks, generalization emerges from noisy gradient descent dynamics.
Both systems exhibit homeostatic properties—planarians regenerate toward a target morphology, while neural networks converge toward generalizable solutions. Noise facilitates this convergence by preventing premature commitment to suboptimal states.
This work demonstrates that principles governing biological robustness can inform artificial intelligence design. The success of noise injection in both contexts suggests several promising directions for further research.
Noise injection represents a fundamental principle for achieving robust, generalizable behavior in complex systems. Our experiments demonstrate that the same stochastic mechanisms that enable reliable morphogenesis in planarians can enhance out-of-distribution generalization in neural networks. This convergence of biological and artificial intelligence insights suggests that embracing noise—rather than eliminating it—may be key to building more capable and resilient intelligent systems.
The quantitative improvements observed (47% reduction in OOD performance drop) validate the practical importance of this approach. More broadly, this work exemplifies how biological principles can guide the development of more robust machine learning systems, highlighting the value of interdisciplinary research at the intersection of developmental biology and artificial intelligence.
Blattner, M., & Levin, M. (2023). Long Range Communication via Gap Junctions and Stress in Planarian Morphogenesis: A Computational Study. Bioelectricity, 5(3), 196-209.