Summary
Implement LSM, which encodes input functions into a latent space, constructs an orthogonal spectral basis there, and composes mappings using multiple basis operators.
Reference
- Wu et al., "Solving High-Dimensional PDEs with Latent Spectral Models," ICML 2023.
Description
LSM uses cross-attention to encode input functions into a latent space, then constructs orthogonal basis functions in that latent space (inspired by classical spectral methods). The operator mapping is decomposed into multiple basis operators, enabling efficient learning for high-dimensional PDEs.
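The pipeline described above (cross-attention encoding, projection onto an orthogonal spectral basis, one learned operator per basis mode) can be sketched as follows. This is a minimal illustrative NumPy sketch, not the paper's exact architecture: the function names (`cross_attention`, `spectral_basis`, `lsm_forward`), the choice of a sine basis, and all shapes are assumptions made for clarity.

```python
import numpy as np

def cross_attention(queries, keys, values):
    # Scaled dot-product attention: learned latent queries attend to input points.
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ values

def spectral_basis(n_points, n_modes):
    # Orthogonal sine basis on [0, 1), analogous to classical spectral methods.
    x = np.linspace(0.0, 1.0, n_points, endpoint=False)
    return np.stack(
        [np.sqrt(2.0) * np.sin(np.pi * (k + 1) * x) for k in range(n_modes)],
        axis=1,
    )  # shape (n_points, n_modes)

def lsm_forward(u, W_q, W_k, W_v, operator_weights):
    """Hypothetical LSM-style forward pass (illustrative only).

    u:                input function samples, shape (n_points, d_in)
    W_q:              learned latent queries, shape (n_latent, d_latent)
    W_k, W_v:         input projections, shape (d_in, d_latent)
    operator_weights: one linear operator per basis mode,
                      shape (n_modes, d_latent, d_latent)
    """
    n_modes = operator_weights.shape[0]
    # 1. Encode the input function into latent tokens via cross-attention.
    latent = cross_attention(W_q, u @ W_k, u @ W_v)        # (n_latent, d_latent)
    # 2. Project latent tokens onto the orthogonal spectral basis.
    basis = spectral_basis(latent.shape[0], n_modes)       # (n_latent, n_modes)
    coeffs = basis.T @ latent                              # (n_modes, d_latent)
    # 3. Apply one basis operator per mode, then reconstruct in latent space.
    mapped = np.einsum('mij,mj->mi', operator_weights, coeffs)
    return basis @ mapped                                  # (n_latent, d_latent)
```

A decoder (e.g. another cross-attention back to query coordinates) would map the latent output to the solution function; it is omitted here for brevity.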