Summary
Implement the Mamba Neural Operator, which applies selective state-space models (Mamba) to PDE solving with global receptive fields and linear complexity.
Reference
- "Alias-Free Mamba Neural Operator," NeurIPS 2024. Paper
Description
The Mamba Neural Operator applies the Mamba selective state-space model architecture to operator learning. It achieves global receptive fields with linear complexity in sequence length (vs. quadratic for transformers), uses input-dependent (selective) state-space parameters, and includes an alias-free design to prevent spectral aliasing. The paper reports up to roughly 90% error reduction over transformer baselines and substantially improved long-time stability in autoregressive rollouts.
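As a rough starting point, the sketch below shows the core selective-scan recurrence a Mamba-style block computes over a discretized PDE field: a diagonal state-space recurrence whose B, C, and step-size dt are projected from the input at each position, giving a global receptive field in O(L) time. This is a minimal, non-optimized reference, not the paper's implementation; the class and parameter names (`SelectiveSSM`, `d_state`, etc.) are illustrative assumptions, and the paper's alias-free filtering and hardware-aware parallel scan are omitted.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SelectiveSSM(nn.Module):
    """Minimal selective (input-dependent) diagonal state-space layer.

    Per-channel recurrence: h_t = exp(dt_t * A) * h_{t-1} + dt_t * B_t * u_t,
    y_t = C_t . h_t + D * u_t, where B_t, C_t, dt_t all depend on the input u_t.
    Names and initialization are illustrative, not the paper's API.
    """

    def __init__(self, d_model: int, d_state: int = 16):
        super().__init__()
        # Log-parameterized negative-real diagonal A keeps the recurrence stable.
        a_init = torch.arange(1, d_state + 1, dtype=torch.float32)
        self.A_log = nn.Parameter(torch.log(a_init).repeat(d_model, 1))
        # "Selective" projections: B, C, and the step size dt are input-dependent.
        self.proj_B = nn.Linear(d_model, d_state)
        self.proj_C = nn.Linear(d_model, d_state)
        self.proj_dt = nn.Linear(d_model, d_model)
        self.D = nn.Parameter(torch.ones(d_model))  # direct feed-through / skip

    def forward(self, u: torch.Tensor) -> torch.Tensor:
        # u: (batch, length, d_model) -- e.g. a PDE field flattened along the grid.
        batch, length, d_model = u.shape
        A = -torch.exp(self.A_log)               # (d_model, d_state)
        B = self.proj_B(u)                       # (batch, length, d_state)
        C = self.proj_C(u)                       # (batch, length, d_state)
        dt = F.softplus(self.proj_dt(u))         # positive step sizes

        h = u.new_zeros(batch, d_model, A.shape[1])
        outputs = []
        for t in range(length):                  # sequential scan: O(L) time
            dA = torch.exp(dt[:, t].unsqueeze(-1) * A)         # discretized transition
            dBu = dt[:, t].unsqueeze(-1) * B[:, t].unsqueeze(1) * u[:, t].unsqueeze(-1)
            h = dA * h + dBu                                    # state update
            outputs.append((h * C[:, t].unsqueeze(1)).sum(-1))  # read-out
        y = torch.stack(outputs, dim=1)          # (batch, length, d_model)
        return y + self.D * u


# Usage: a 1D field sampled at 256 grid points with 64 channels.
block = SelectiveSSM(d_model=64)
out = block(torch.randn(2, 256, 64))
print(out.shape)  # torch.Size([2, 256, 64])
```

The Python loop makes the O(L) scan explicit for clarity; a real implementation would replace it with a fused parallel-scan kernel, and a full operator would stack such blocks with lifting/projection layers and (per the paper) anti-aliasing filters around the nonlinearities.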