Noise as a Feature: From Biological Morphogenesis to Neural Network Robustness

marcel blattner | June 20, 2025

Introduction

Noise is often perceived as an unwanted disturbance, a bug to be eliminated from engineered systems. However, mounting evidence from biological systems suggests that noise serves essential functional roles, acting as a feature that enhances robustness, flexibility, and adaptive capabilities. This perspective shift from "noise as nuisance" to "noise as necessity" has profound implications for understanding both natural and artificial systems.

In a recent paper titled "Long Range Communication via Gap Junctions and Stress in Planarian Morphogenesis: A Computational Study" (Blattner & Levin, 2023), we explored how noise contributes to the stability of regenerative processes in planarian flatworms. Our computational model revealed that stochastic elements in cell-cell communication networks, far from being detrimental, actually stabilize the system's ability to achieve correct morphological outcomes during regeneration.

The model centered on planarian regeneration, a remarkable biological phenomenon where these flatworms can regenerate complete organisms from small tissue fragments. We implemented a system of coupled proportional-integral (PI) controllers representing cellular stress and membrane voltage dynamics, connected through gap junction networks. The key finding was counterintuitive: introducing noise into the communication channels between cells enhanced the system's stability rather than degrading it.

Specifically, our simulations demonstrated that when connections between cells were made probabilistic (following a directed bond percolation model), the resulting stochastic dynamics prevented the system from getting trapped in unstable oscillatory states. The noise effectively widened the basin of attraction for stable regenerative outcomes, allowing the system to navigate morphological state space more robustly. This aligns with the biological observation that planarians reliably regenerate their target morphology despite significant environmental and internal perturbations.
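
To make this concrete, here is a deliberately minimal toy sketch (not the paper's actual implementation) of PI controllers coupled through links that are only intermittently active, in the spirit of directed bond percolation. The cell count, link probability, and controller gains are illustrative assumptions:

import numpy as np

rng = np.random.default_rng(0)

n_cells, p_link = 10, 0.7      # illustrative: 10 cells, 70% chance a link is open
kp, ki, dt = 0.8, 0.2, 0.1     # illustrative PI gains and time step
target = np.ones(n_cells)      # setpoint each controller tries to reach

state = rng.normal(0.0, 0.5, n_cells)   # e.g. a membrane-voltage-like variable
integral = np.zeros(n_cells)

for _ in range(500):
    # Stochastic gap-junction coupling: each directed link is open
    # with probability p_link on this step (bond-percolation style)
    adjacency = rng.random((n_cells, n_cells)) < p_link
    np.fill_diagonal(adjacency, False)
    neighbor_mean = (adjacency.astype(float) @ state) / np.maximum(adjacency.sum(axis=1), 1)

    # Each cell runs a PI controller on the error between its setpoint
    # and a mix of its own state and what it "hears" from neighbors
    error = target - 0.5 * (state + neighbor_mean)
    integral += error * dt
    state += (kp * error + ki * integral) * dt

print(np.abs(target - state).max())   # approaches 0 as the network settles

Despite the links flickering on and off at random, the population converges to the shared setpoint; the intermittency averages out rigid signaling patterns rather than destabilizing them.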

The GitHub repository for this experiment can be found here: https://github.com/marcelbtec/NNNoise.git

The Neural Network Connection

This biological insight naturally extends to artificial neural networks, where noise injection has emerged as a powerful regularization technique. Just as biological systems leverage noise for robust morphogenesis, neural networks can benefit from carefully introduced stochasticity during training. The parallel is striking: both systems must navigate high-dimensional spaces (morphological state space for planarians, parameter space for neural networks) to reach desired outcomes while maintaining stability and generalization capabilities.

Experimental Setup

To demonstrate the beneficial effects of noise injection in neural networks, we implemented a comprehensive experimental framework examining out-of-distribution (OOD) generalization. Our approach directly parallels the biological findings: just as noise helps planarian cells achieve robust morphological outcomes under varied conditions, we hypothesized that noise injection during neural network training would improve generalization to distribution shifts.

Architecture

We employed a feedforward neural network with the following specifications:

  • Input dimension: 2 (for 2D classification tasks)
  • Hidden layers: 2 layers with 64 units each
  • Output dimension: 2 (binary classification)
  • Activation: ReLU
  • Dropout: 0.2 (applied during training)

The critical modification was the introduction of Gaussian noise to inputs during training:

def forward(self, x, training=True):
    if training and self.noise_std > 0:
        x = x + torch.randn_like(x) * self.noise_std
    # ... rest of forward pass
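
For completeness, here is a self-contained sketch of the full model implied by the specifications above; the class name NoisyMLP and the exact layer ordering are our assumptions, while the noise line is taken verbatim from the snippet:

import torch
import torch.nn as nn

class NoisyMLP(nn.Module):
    """2 -> 64 -> 64 -> 2 ReLU network with Gaussian input-noise injection."""
    def __init__(self, noise_std=0.3, dropout=0.2):
        super().__init__()
        self.noise_std = noise_std
        self.net = nn.Sequential(
            nn.Linear(2, 64), nn.ReLU(), nn.Dropout(dropout),
            nn.Linear(64, 64), nn.ReLU(), nn.Dropout(dropout),
            nn.Linear(64, 2),
        )

    def forward(self, x, training=True):
        if training and self.noise_std > 0:
            # Gaussian input noise, applied only during training
            x = x + torch.randn_like(x) * self.noise_std
        return self.net(x)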

Data Design

We created a challenging scenario using a combination of the two-moons and two-circles datasets, designed to test generalization under various distribution shifts (a data-generation sketch follows the list):

  1. Training distribution: Standard two-moons dataset with noise level 0.15
  2. OOD test scenarios:
    • Mild shift: Translation by (0.3, 0.2) units
    • Higher noise: Increased intrinsic noise (0.35 vs 0.15)
    • Mild rotation: 15-degree rotation
    • Mixed structure: 70% moons + 30% circles interpolation
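
A sketch of how these scenarios might be generated with scikit-learn; sample counts, seeds, and the pointwise mixing rule for the moons/circles interpolation are assumptions:

import numpy as np
from sklearn.datasets import make_moons, make_circles

def make_ood_scenarios(n=1000, seed=0):
    # Training distribution: two moons with intrinsic noise 0.15
    X, y = make_moons(n_samples=n, noise=0.15, random_state=seed)

    # Mild shift: translate by (0.3, 0.2) units
    shift = (X + np.array([0.3, 0.2]), y)

    # Higher intrinsic noise: same generator with noise 0.35
    noisy = make_moons(n_samples=n, noise=0.35, random_state=seed)

    # Mild rotation: 15 degrees about the origin
    t = np.deg2rad(15.0)
    R = np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]])
    rot = (X @ R.T, y)

    # Mixed structure: pointwise 70% moons + 30% circles
    # (keeping the moons labels; this mixing rule is an assumption)
    Xc, _ = make_circles(n_samples=n, noise=0.15, factor=0.5, random_state=seed)
    mixed = (0.7 * X + 0.3 * Xc, y)

    return (X, y), {"shift": shift, "noise": noisy, "rotation": rot, "mixed": mixed}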

Training Protocol

Models were trained with different noise injection levels (σ = 0.0, 0.1, 0.3, 0.5); a minimal training-loop sketch follows the list. Training used:

  • Adam optimizer with learning rate 0.01
  • Cosine annealing learning rate schedule
  • Cross-entropy loss
  • 300 epochs
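
A minimal training loop consistent with this protocol, reusing NoisyMLP and the data sketch above (full-batch optimization is an assumption; the original experiments may have used mini-batches):

import torch
import torch.nn as nn

def train(model, X, y, epochs=300, lr=0.01):
    X_t = torch.as_tensor(X, dtype=torch.float32)
    y_t = torch.as_tensor(y, dtype=torch.long)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    sched = torch.optim.lr_scheduler.CosineAnnealingLR(opt, T_max=epochs)
    loss_fn = nn.CrossEntropyLoss()

    model.train()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(X_t, training=True), y_t)  # input noise injected here
        loss.backward()
        opt.step()
        sched.step()
    return model

# One model per noise level
(X, y), scenarios = make_ood_scenarios()
models = {s: train(NoisyMLP(noise_std=s), X, y) for s in (0.0, 0.1, 0.3, 0.5)}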

Results

Decision Boundary Analysis

The impact of noise injection on learned decision boundaries was dramatic. Models trained without noise (σ = 0.0) produced sharp, complex decision boundaries that tightly fit the training data. In contrast, noise-trained models (σ = 0.3) exhibited smoother, more regular boundaries—a direct parallel to the smoothed morphological gradients observed in our planarian simulations.

Quantitative Performance

The results demonstrate that moderate noise injection (σ = 0.3) provides the best OOD generalization, reducing the average performance drop from 26.4 to 13.9 percentage points, a relative improvement of 47% (an evaluation sketch follows the table).

Neural Network Performance with Noise Injection

Training Noise (σ) | In-Distribution Acc | Avg OOD Acc | Performance Drop (pp)
0.0                | 98.5%               | 72.1%       | 26.4
0.1                | 97.8%               | 78.6%       | 19.2
0.3                | 95.2%               | 81.3%       | 13.9
0.5                | 88.1%               | 75.6%       | 12.5
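
A sketch of how these numbers can be computed, assuming the models dictionary from the training sketch, the scenarios dictionary from the data sketch, and a fresh in-distribution test set:

import numpy as np
import torch
from sklearn.datasets import make_moons

def accuracy(model, X, y):
    model.eval()
    with torch.no_grad():
        logits = model(torch.as_tensor(X, dtype=torch.float32), training=False)
    return (logits.argmax(dim=1).numpy() == np.asarray(y)).mean()

# Fresh in-distribution sample, drawn like the training set
X_test, y_test = make_moons(n_samples=1000, noise=0.15, random_state=1)

for sigma, model in models.items():
    in_acc = accuracy(model, X_test, y_test)
    ood_acc = np.mean([accuracy(model, Xo, yo) for Xo, yo in scenarios.values()])
    print(f"σ={sigma}: in-dist {in_acc:.1%}, avg OOD {ood_acc:.1%}, drop {in_acc - ood_acc:.1%}")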

Gradient Magnitude Analysis

A key mechanistic insight comes from analyzing input gradient magnitudes:

  • No noise training: ‖∇_x f‖ = 2.13
  • With noise (σ = 0.3): ‖∇_x f‖ = 0.34

This 6-fold reduction in gradient magnitude indicates that noise injection encourages the network to learn flatter, more robust functions—precisely analogous to how noise in gap junction communication leads to more stable morphological patterns in planarians.
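
One way such gradient magnitudes can be estimated is shown below; scalarizing the network output via the max logit is our assumption, and the original analysis may have used a different reduction:

import torch

def mean_input_grad_norm(model, X):
    model.eval()  # disable dropout so gradients reflect the deployed function
    x = torch.as_tensor(X, dtype=torch.float32).requires_grad_(True)
    # Scalarize via the max logit; summing lets one backward pass
    # produce per-sample input gradients
    model(x, training=False).max(dim=1).values.sum().backward()
    return x.grad.norm(dim=1).mean().item()

print(mean_input_grad_norm(models[0.0], X), mean_input_grad_norm(models[0.3], X))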

Theoretical Understanding

The mathematical framework reveals why noise injection is beneficial. For ε ~ N(0, σ²I) with small variance σ², a second-order Taylor expansion of the loss around x (using E[ε] = 0 and E[εεᵀ] = σ²I, and, for a squared-error loss, dropping a residual curvature term) gives:

E[ℓ(f(x + ε), y)] ≈ ℓ(f(x), y) + (σ²/2)‖∇_x f(x)‖²

This shows that noise injection implicitly adds a gradient penalty term, encouraging the network to learn functions with smaller input gradients; a toy numerical check of the approximation follows the list below. This regularization effect:

  1. Penalizes sharp decision boundaries that would be sensitive to input perturbations
  2. Encourages smoothness in the learned function
  3. Prevents overfitting to training data specifics
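
As a toy numerical check of the approximation, the sketch below uses a squared-error loss with targets placed on the function itself, so the noiseless loss and the residual curvature term both vanish and the Monte Carlo estimate should roughly match the predicted penalty. All names here are illustrative:

import torch

torch.manual_seed(0)
f = torch.nn.Sequential(torch.nn.Linear(2, 16), torch.nn.Tanh(), torch.nn.Linear(16, 1))
x = torch.randn(256, 2)
y = f(x).detach()   # targets on the function: l(f(x), y) = 0 exactly
sigma = 0.05

def sq_loss(inp):
    return 0.5 * ((f(inp) - y) ** 2).mean()

# Monte Carlo estimate of E[l(f(x + eps), y)]
with torch.no_grad():
    noisy = sum(sq_loss(x + sigma * torch.randn_like(x)).item()
                for _ in range(2000)) / 2000

# Predicted penalty: (sigma^2 / 2) * E ||grad_x f||^2
xg = x.clone().requires_grad_(True)
f(xg).sum().backward()   # per-sample input gradients in one backward pass
penalty = (sigma ** 2 / 2) * (xg.grad ** 2).sum(dim=1).mean().item()

print(noisy, penalty)    # the two numbers should roughly agree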

Connection to Biological Systems

The parallels between our neural network experiments and the planarian regeneration model are profound:

Stability Through Stochasticity

In planarians, stochastic gap junction communication prevents cells from getting locked into rigid signaling patterns. Similarly, noise injection prevents neural networks from memorizing exact input-output mappings, promoting more flexible representations.

Robustness to Perturbations

Planarian regeneration succeeds despite tissue damage, environmental variations, and cellular heterogeneity. Noise-trained neural networks similarly maintain performance under distribution shifts, input corruptions, and adversarial perturbations.

Emergent Regularization

Neither system requires explicit regularization terms. In planarians, robust morphogenesis emerges from noisy cell-cell communication. In neural networks, generalization emerges from noisy gradient descent dynamics.

Homeostatic Behavior

Both systems exhibit homeostatic properties—planarians regenerate toward a target morphology, while neural networks converge toward generalizable solutions. Noise facilitates this convergence by preventing premature commitment to suboptimal states.

Implications and Future Directions

This work demonstrates that principles governing biological robustness can inform artificial intelligence design. The success of noise injection in both contexts suggests several research directions:

  1. Adaptive Noise Schedules: Just as biological noise levels vary with developmental stage, optimal noise injection in neural networks may benefit from dynamic scheduling (a schedule sketch follows this list).
  2. Structured Noise: Biological systems exhibit spatially and temporally correlated noise. Exploring structured noise patterns in neural networks could yield further improvements.
  3. Multi-Scale Integration: Our planarian model showed that noise operates across multiple scales (cellular, tissue, organism). Hierarchical noise injection in deep networks merits investigation.
  4. Theoretical Foundations: Developing unified mathematical frameworks that capture noise benefits across biological and artificial systems could reveal deeper organizational principles.
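
For direction 1, a purely illustrative example of what an annealed noise schedule might look like; the cosine form and the σ bounds are assumptions, not results from our experiments:

import math

def noise_at(epoch, total=300, sigma_max=0.5, sigma_min=0.05):
    # Cosine decay from sigma_max down to sigma_min over training,
    # mirroring the cosine learning-rate schedule used above
    return sigma_min + 0.5 * (sigma_max - sigma_min) * (1 + math.cos(math.pi * epoch / total))

# Inside the training loop one would then set, per epoch:
#   model.noise_std = noise_at(epoch)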

Conclusion

Noise injection represents a fundamental principle for achieving robust, generalizable behavior in complex systems. Our experiments demonstrate that the same stochastic mechanisms that enable reliable morphogenesis in planarians can enhance out-of-distribution generalization in neural networks. This convergence of biological and artificial intelligence insights suggests that embracing noise—rather than eliminating it—may be key to building more capable and resilient intelligent systems.

The quantitative improvements observed (47% reduction in OOD performance drop) validate the practical importance of this approach. More broadly, this work exemplifies how biological principles can guide the development of more robust machine learning systems, highlighting the value of interdisciplinary research at the intersection of developmental biology and artificial intelligence.

References

Blattner, M., & Levin, M. (2023). Long Range Communication via Gap Junctions and Stress in Planarian Morphogenesis: A Computational Study. Bioelectricity, 5(3), 196-209.
