This repository packages the documentation, prompt template, helper scripts, and launcher shortcuts needed to distribute {{PRODUCT_NAME}} as an Ollama-based assistant exposed through Open WebUI.
- `QUICKSTART.md` – concise setup instructions for installing Ollama, baking the {{ASSISTANT_NAME}} model, and running Open WebUI without Docker.
- `Modelfile` – template for `ollama create`; replace `{{INSTRUCTIONS_PLACEHOLDER}}` with the full prompt from `prompts/instructions.md` before building the model.
- `prompts/` – system instructions template with placeholders for {{COMPANY_NAME}} to customize.
- `knowledge/` – placeholder directory where users can drop PDFs/TXTs/CSVs prior to uploading through Open WebUI's Knowledge feature.
- `scripts/` – cross-platform helpers to pull the base model, bake instructions, and launch Open WebUI against the local Ollama service.
- `launchers/` – OS-specific shortcuts that open http://localhost:8080 in the default browser.
- `CONFIG.example.env` – sample environment variables for the default Ollama and WebUI endpoints.
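For orientation, a baked Modelfile might look like the sketch below. This is illustrative only: the base model comes from the {{MODEL_TAG}} placeholder, and the parameter value shown is an assumption, not something shipped with this repository.

```
# Illustrative sketch; {{MODEL_TAG}} and the parameter value are placeholders.
FROM {{MODEL_TAG}}
PARAMETER temperature 0.7
SYSTEM """
<full prompt from prompts/instructions.md, spliced in place of {{INSTRUCTIONS_PLACEHOLDER}}>
"""
```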
- Read and follow `QUICKSTART.md` to install prerequisites, customize the Modelfile, and create the {{ASSISTANT_NAME}} model from {{MODEL_TAG}}.
- Update placeholders (`{{PRODUCT_NAME}}`, `{{ASSISTANT_NAME}}`, `{{CONTACT_EMAIL}}`, etc.) across the prompt and documentation before sharing with end users.
- Use the helper scripts (`create_model_*`, `start_webui_*`) for a streamlined setup on macOS, Linux, or Windows.
- Share the launchers with end users so they can open the hosted WebUI quickly once the server is running.
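The "bake instructions, then create the model" step can be sketched as a small shell helper. `bake_modelfile` is a hypothetical name for illustration; the repository's actual `create_model_*` scripts may implement this differently.

```shell
# Hypothetical helper: splice the full prompt into the Modelfile template,
# replacing the {{INSTRUCTIONS_PLACEHOLDER}} token verbatim.
bake_modelfile() {
  local template="$1" prompt="$2" out="$3"
  # Read the prompt file first (FNR==NR), then emit the template,
  # substituting the placeholder line with the accumulated prompt text.
  awk 'FNR==NR { p = p $0 ORS; next }
       index($0, "{{INSTRUCTIONS_PLACEHOLDER}}") { printf "%s", p; next }
       { print }' "$prompt" "$template" > "$out"
}
```

Usage would then be `bake_modelfile Modelfile prompts/instructions.md Modelfile.baked`, followed by `ollama create {{ASSISTANT_NAME}} -f Modelfile.baked` against a running Ollama service.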
- Local network sharing: run Open WebUI with `--host 0.0.0.0` and a fixed port (e.g., 8080), ensure the firewall allows inbound traffic, then have teammates visit `http://<your-hostname>:8080`. Keep Ollama bound to `127.0.0.1` so only the UI is network-exposed, and require authentication inside WebUI.
- Public exposure: front the WebUI with a reverse proxy (Nginx, Caddy, or a cloud load balancer) that terminates TLS, enforces authentication, and optionally adds rate limits. If hosting externally, lock down the Ollama endpoint with a VPN or IP allow-list to prevent unauthorized access. Avoid publishing the raw Ollama port to the internet.
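For the public-exposure case, a TLS-terminating Nginx reverse proxy might be sketched as below. This assumes Open WebUI listens on `127.0.0.1:8080`; the server name and certificate paths are placeholders to replace with your own.

```
# Hypothetical Nginx sketch; hostname and certificate paths are placeholders.
server {
    listen 443 ssl;
    server_name assistant.example.com;

    ssl_certificate     /etc/ssl/certs/assistant.pem;
    ssl_certificate_key /etc/ssl/private/assistant.key;

    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        # WebSocket upgrade headers so streaming chat responses work.
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```

Authentication and rate limiting can be layered on in the same server block or handled by WebUI itself.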
- Tailor `prompts/instructions.md` whenever {{COMPANY_NAME}} updates operating policies, then rebuild the model via the create script.
- Mirror edits to `QUICKSTART.md` or this README whenever processes change.
- Keep binaries, build artifacts, and large files out of version control; rely on the scripts and instructions instead of packaging compiled assets.
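To keep those artifacts out of version control, a `.gitignore` along these lines could work; the entries are illustrative and should be adjusted to the repository's actual build outputs.

```
# Illustrative .gitignore entries; adjust to actual build outputs.
*.gguf
*.bin
dist/
build/
Modelfile.baked
knowledge/*.pdf
```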
Document licensing separately (e.g., add a LICENSE file) and direct customers to {{CONTACT_EMAIL}} for assistance. Update copyright notices with {{YEAR}}.