Implement Mamba Neural Operator (Alias-Free MNO) #123

@ChrisRackauckas-Claude

Summary

Implement the Mamba Neural Operator, which applies selective state-space models (Mamba) to PDE solving with global receptive fields and linear complexity.

Reference

  • "Alias-Free Mamba Neural Operator," NeurIPS 2024.

Description

The Mamba Neural Operator applies the Mamba selective state-space architecture to operator learning. It achieves a global receptive field at linear cost in sequence length (versus quadratic for transformer-based operators), uses input-dependent (selective) state-space matrices, and adopts an alias-free design that prevents spectral aliasing. The paper reports up to ~90% error reduction over transformer baselines and substantially improved long-time stability for autoregressive rollouts.
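
For reference while implementing, here is a minimal NumPy sketch of the core idea: a selective state-space recurrence in which Δ, B, and C are input-dependent. This is an illustrative toy, not the paper's alias-free implementation; all names (`selective_ssm_scan`, the `W_*` projections) and the simplified Euler discretization of B are assumptions for the sketch.

```python
import numpy as np

def selective_ssm_scan(x, A, W_delta, W_B, W_C):
    """Sequential scan of a selective (input-dependent) state-space model.

    x: (T, d) input sequence; A: (d, n) state matrix (kept negative for
    stability); W_delta (d, d), W_B (d, n), W_C (d, n): projections that
    make Delta, B, C depend on the input -- the "selective" mechanism.
    Cost is O(T): each step updates a fixed-size hidden state, yet y[t]
    depends on every earlier input, i.e. a global receptive field.
    """
    T, d = x.shape
    n = A.shape[1]
    h = np.zeros((d, n))
    y = np.empty((T, d))
    for t in range(T):
        delta = np.log1p(np.exp(x[t] @ W_delta))   # softplus -> Delta_t > 0, (d,)
        B = x[t] @ W_B                             # input-dependent B_t, (n,)
        C = x[t] @ W_C                             # input-dependent C_t, (n,)
        A_bar = np.exp(delta[:, None] * A)         # ZOH discretization of A
        B_bar = delta[:, None] * B[None, :]        # simplified (Euler) input matrix
        h = A_bar * h + B_bar * x[t][:, None]      # recurrent state update
        y[t] = h @ C                               # readout
    return y
```

Because the state is fixed-size, perturbing `x[0]` still changes `y[-1]` through the recurrence, which is the linear-cost global receptive field the issue description refers to; the alias-free machinery of the paper would sit on top of this core scan.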
