React Ollama UI is a web interface for ollama.ai, a tool that enables running Large Language Models (LLMs) on your local machine.
Check out the live preview!
- Download and install the Ollama CLI.
- Run your selected model from the Ollama library:

  ```sh
  ollama run <model-name>
  ```

- Clone the repository and start your dev server:
  ```sh
  git clone https://github.com/AshmaDev/react-ollama-ui.git
  cd react-ollama-ui
  pnpm install
  pnpm run dev
  ```

> **Note**
> The current Docker Compose configuration runs Ollama on CPU only. If you wish to use an NVIDIA or AMD GPU, you will need to modify the `docker-compose.yml` file. For more details, visit the Ollama Docker Hub page.
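For NVIDIA GPUs, the usual change is adding a device reservation to the Ollama service. The following is a minimal sketch, not the repository's actual file; it assumes the service is named `ollama` and uses the stock `ollama/ollama` image, and it requires the NVIDIA Container Toolkit on the host:

```yaml
services:
  ollama:
    image: ollama/ollama
    # Reserve all available NVIDIA GPUs for the container.
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
```

For AMD GPUs, the Ollama Docker Hub page documents using the `ollama/ollama:rocm` image with the `/dev/kfd` and `/dev/dri` devices passed through to the container.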
```sh
docker compose up -d
```

Licensed under the MIT License. See the LICENSE file for details.
