🌌 vynUI

A self-hosted, minimalist web interface that connects to your Ollama API from any device on your network.


⚠️ Project status: vynUI is under active development. Expect frequent updates and experimental features.

If you'd like to follow progress or contribute to planning, check out the Miro / Trello project boards.


✨ Key features

  • 🌐 Connect to your host Ollama machine over your local network
  • ⚑ Real-time chat with streaming responses
  • πŸ’» Lightweight, responsive UI focused on minimalism
  • πŸ”’ 100% open-source & self-hosted

πŸ›  Configuration

  • The app connects to Ollama by host IP address and port; set the connection string in the UI connection modal (an illustrative example is shown below).
  • No server-side configuration is required yet; environment variables or config files will be documented here if server-side logic is added later.
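
For illustration, the connection string is just the base URL of the Ollama API on your host machine. The sketch below uses placeholder values (the IP address and names are not vynUI defaults; 11434 is Ollama's standard port) and checks reachability against Ollama's `/api/tags` model-listing endpoint:

```ts
// Hypothetical values for illustration only; nothing here is baked into vynUI.
// 11434 is Ollama's default API port.
const OLLAMA_HOST = "192.168.1.50"; // IP of the machine running Ollama
const OLLAMA_PORT = 11434;
const baseUrl = `http://${OLLAMA_HOST}:${OLLAMA_PORT}`;

// Simple reachability check against Ollama's model-listing endpoint.
async function checkConnection(url: string): Promise<boolean> {
  try {
    const res = await fetch(`${url}/api/tags`);
    return res.ok;
  } catch {
    return false;
  }
}
```

Note that the Ollama server itself must be listening on your network interface (for example by setting the OLLAMA_HOST environment variable to 0.0.0.0), not only on localhost, before other devices can reach it.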

🧠 How it works (short)

vynUI connects through your local network to a host running Ollama. Messages typed in the UI are forwarded to Ollama, which replies with streaming responses. The UI is deliberately minimal to keep latency and resource usage low.
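
As a rough sketch of that flow (not vynUI's actual implementation), the snippet below sends a message to Ollama's `/api/chat` endpoint with streaming enabled and hands each partial reply to a callback as it arrives; the `baseUrl`, the model name, and the callback shape are assumptions:

```ts
// Minimal streaming-chat sketch against Ollama's /api/chat endpoint.
// "llama3" and the callback shape are assumptions, not vynUI's actual code.
async function streamChat(
  baseUrl: string,
  prompt: string,
  onToken: (text: string) => void,
): Promise<void> {
  const res = await fetch(`${baseUrl}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3",
      messages: [{ role: "user", content: prompt }],
      stream: true, // Ollama then replies with newline-delimited JSON chunks
    }),
  });
  if (!res.body) throw new Error("No response body from Ollama");

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";

  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // Each complete line is one JSON object carrying a partial assistant message.
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? "";
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      onToken(chunk.message?.content ?? "");
      if (chunk.done) return;
    }
  }
}
```

For example, `streamChat("http://192.168.1.50:11434", "Hello!", (t) => console.log(t))` would print the reply as it streams in; the UI does the equivalent by appending each chunk to the chat view.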


🀝 Contributing

We'd love help. Please read CONTRIBUTING.md before opening issues or PRs.

Good first issues:

  • Improve documentation or add examples
  • Report or fix bugs in the code

When opening PRs:

  • Keep commits focused and atomic
  • Document behavior changes in the PR description
  • Add screenshots if the UI is modified

πŸ“£ Support & Community

If you like this project:

  • Star the repo ⭐
  • Share it with developers who run local LLM setups
  • Open issues for bugs or feature requests

Where we plan to share updates:

  • Repo releases
  • Project boards (Miro / Trello)

πŸ“œ License

Licensed under the MIT License. See LICENSE.md.

© 2025 Lunar Productions