Awesome Mixture of Experts (MoE): A Curated List of Mixture of Experts (MoE) and Mixture of Multimodal Experts (MoME)
Updated Oct 6, 2025