Your on-device AI, built the Apple way.
A native local LLM experience for macOS, iOS, and iPadOS — powered by Swift, SwiftUI, and MLX.
CoreLLM is a fully native Swift application that brings large language model (LLM) capabilities directly to your Apple devices — no cloud required.
Built with MLX and SwiftUI, it offers a seamless, fast, and private AI experience across macOS, iOS, and iPadOS.
Whether you're running queries, generating content, or experimenting with prompts, CoreLLM puts the power of language models in your hands — locally.
- 🖥️ Cross-platform: Available for Mac, iPhone, and iPad.
- ⚡ Fast & Local: All inference is done on-device — no internet or server needed.
- 🧱 Powered by MLX: Optimized for Apple Silicon with Apple’s machine learning stack.
- 🧑‍💻 Built in Swift: Clean architecture with SwiftUI and native performance.
- ✨ Chat-style UI: Minimal and modern experience to interact with your LLM.
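To give a feel for the chat-style UI, here is a minimal SwiftUI sketch of what such a view could look like. The `Message` model, `ChatView` name, and layout are illustrative assumptions, not CoreLLM's actual source; the point where a prompt would be handed to the on-device model is marked with a comment.

```swift
import SwiftUI

// Hypothetical message model — CoreLLM's actual types may differ.
struct Message: Identifiable {
    let id = UUID()
    let text: String
    let isUser: Bool
}

struct ChatView: View {
    @State private var messages: [Message] = []
    @State private var input = ""

    var body: some View {
        VStack {
            // Scrollable transcript of the conversation.
            ScrollView {
                LazyVStack(spacing: 8) {
                    ForEach(messages) { message in
                        Text(message.text)
                            .padding(10)
                            .background(message.isUser
                                ? Color.accentColor.opacity(0.2)
                                : Color.gray.opacity(0.15))
                            .clipShape(RoundedRectangle(cornerRadius: 12))
                            .frame(maxWidth: .infinity,
                                   alignment: message.isUser ? .trailing : .leading)
                    }
                }
                .padding()
            }

            // Prompt input bar.
            HStack {
                TextField("Ask anything…", text: $input)
                    .textFieldStyle(.roundedBorder)
                Button("Send") {
                    messages.append(Message(text: input, isUser: true))
                    input = ""
                    // Here the prompt would be passed to the local model
                    // for on-device inference.
                }
                .disabled(input.isEmpty)
            }
            .padding()
        }
    }
}
```

Because all state lives in the view via `@State`, this sketch runs entirely on-device with no networking code at all, which matches the project's local-first design.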
- Supported Platforms: macOS (Apple Silicon), iOS, iPadOS
⚠️ You’ll need Xcode 15+, a Mac with Apple Silicon, and the MLX framework installed.
```shell
# Clone the repository
git clone https://github.com/yourusername/CoreLLM.git
cd CoreLLM

# Open the project in Xcode
open CoreLLM.xcodeproj
```