Description

This PR bundles Ollama with the application, adds auto-start functionality, implements a model management UI button, and sets up automated binary downloads for contributors. Users no longer need to install Ollama separately.

Type of Change

  • New feature (non-breaking change that adds functionality)
  • Code quality improvement (refactoring, formatting, etc.)
  • Documentation update

Changes Made

Core Features

  • Bundled Ollama v0.12.9: Application now includes Ollama executables for Windows (x64) and macOS (Intel + Apple Silicon)
  • Auto-start on Launch: Ollama server automatically starts when the application launches (no manual ollama serve required)
  • Model Manager UI Button: Added visible button in NavigationBar to open Model Manager (previously only accessible via Ctrl+M)
  • Automatic Binary Downloads: Created scripts/download-ollama.js to download Ollama binaries automatically during npm install

Implementation Details

  • Modified src/main/index.ts:
    • Added ollamaService.ensureRunning() in app.whenReady() callback
    • Added cleanup on quit (ollamaService.stop())
    • Made callback async to support auto-start
  • Updated src/main/services/ollama.ts:
    • Added getBundledOllamaPath() method to locate platform-specific Ollama executable
    • Updated start() method to use bundled executable instead of system PATH
    • Set environment variables for Windows to find required DLLs
    • Fixed 4 ESLint errors by prefixing unused catch variables with underscore
  • Modified src/renderer/components/Browser/NavigationBar.tsx:
    • Added Model Manager button with CPU/chip icon
    • Integrated with useModelStore for state management
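The bundled-executable lookup described above can be sketched roughly as follows. This is a minimal illustration, not the actual code: the real method lives in `src/main/services/ollama.ts`, and the `resourcesDir` parameter here is an assumption standing in for Electron's `process.resourcesPath`.

```typescript
import * as path from "path";

// Hypothetical sketch of the platform-specific lookup added in
// src/main/services/ollama.ts; names and directory layout are
// assumptions based on this PR's description.
function getBundledOllamaPath(
  resourcesDir: string,
  platform: NodeJS.Platform = process.platform
): string {
  switch (platform) {
    case "win32":
      // The Windows build ships ollama.exe alongside its CUDA/ROCm DLLs,
      // so the service also points the child process at this directory
      // (via cwd or PATH) so the DLLs resolve.
      return path.join(resourcesDir, "bin", "win32", "ollama.exe");
    case "darwin":
      // A single universal binary covers Intel and Apple Silicon.
      return path.join(resourcesDir, "bin", "darwin", "ollama");
    default:
      throw new Error(`Unsupported platform: ${platform}`);
  }
}
```

`start()` then spawns this executable directly instead of relying on an `ollama` binary being on the user's system PATH.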

Build & Distribution

  • Created electron-builder.json with configuration to bundle Ollama in production builds
  • Added extraResources to include resources/bin/ in distribution
  • Updated .gitignore to exclude resources/bin/ directory (~1.8GB)
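The `extraResources` wiring amounts to roughly the following excerpt. This is an illustrative sketch of `electron-builder.json`, not the exact file; any fields beyond `extraResources` are omitted, and the `to` destination shown here is an assumption.

```json
{
  "extraResources": [
    {
      "from": "resources/bin",
      "to": "bin",
      "filter": ["**/*"]
    }
  ]
}
```

At package time electron-builder copies `resources/bin/` into the app's resources directory, which is where the main process resolves the bundled executable from.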

Developer Experience

  • Created scripts/download-ollama.js:
    • Downloads Ollama v0.12.9 for Windows and macOS from GitHub releases
    • Extracts to resources/bin/win32/ and resources/bin/darwin/
    • Shows download progress
    • Skips if binaries already exist
  • Added npm scripts:
    • setup:ollama - Manually download Ollama binaries
    • postinstall - Automatically runs after npm install
  • Created scripts/README.md with usage and troubleshooting documentation

Documentation

  • Updated README.md:
    • Documented bundled Ollama version (v0.12.9, released Nov 1, 2025)
    • Added "Using AI Features" section
    • Added "Bundled Ollama Version" section with update instructions
    • Updated Getting Started to reflect automatic binary download
    • Updated Contributing guidelines
  • Updated TECH_BRIEFING.md:
    • Updated LLM Integration section with bundled version info
    • Updated file structure to show resources/bin/ directory
    • Updated code examples to show bundled executable usage
    • Updated build documentation
    • Updated environment requirements

Code Quality

  • Updated eslint.config.js:
    • Added caughtErrorsIgnorePattern: '^_' to ignore underscore-prefixed catch variables
  • Fixed ESLint errors in ollama.ts (4 unused variable warnings)

Testing

  • Tested locally in development mode
  • Tested production build
  • Manually tested affected features

Test Results

  • ✅ Build completes successfully (npm run build)
  • ✅ Ollama auto-starts when app launches (tested on Windows)
  • ✅ Model Manager button visible and functional in NavigationBar
  • ✅ Download script works correctly (npm run setup:ollama)
  • ✅ Binaries are properly excluded from git
  • ✅ ESLint passes with no errors for modified files
  • ✅ Platform detection works for both Windows and macOS paths

Screenshots

Model Manager Button in NavigationBar:

  • New button added between Bookmarks and AI Chat buttons
  • Icon: CPU/chip design (matching other navigation controls)
  • Tooltip: "Model Manager (Ctrl+M)"

Download Script Output:

```
╔═══════════════════════════════════════════════════════════╗
║        Ollama Binary Download Script                      ║
║        Version: v0.12.9                                   ║
╚═══════════════════════════════════════════════════════════╝

[1/2] Windows (x64)
────────────────────────────────────────────────────────────
  Downloading from: https://github.com/ollama/ollama/releases/...
  Progress: 100%
  Extracting to: resources/bin/win32
  ✓ Complete

[2/2] macOS (Universal)
────────────────────────────────────────────────────────────
  ✓ Already downloaded, skipping...

╔═══════════════════════════════════════════════════════════╗
║              Download Complete! ✓                         ║
╚═══════════════════════════════════════════════════════════╝
```

Checklist

  • My code follows the project's code style (ESLint and Prettier)
  • I have performed a self-review of my code
  • I have commented my code where necessary
  • My changes generate no new warnings or errors
  • I have tested my changes locally

Additional Notes

Bundle Size Impact

  • Git Repository: No change (binaries excluded via .gitignore)
  • Final Installer: ~1.8GB (Windows includes CUDA 12/13, ROCm libraries for GPU acceleration)
  • macOS Bundle: ~46MB (smaller, no CUDA required)

Breaking Changes

None. This is a fully backward-compatible enhancement. The app now works out-of-the-box without requiring users to install Ollama separately.

Future Improvements

  • Consider adding Linux support (currently Windows + macOS only)
  • Could add automatic update checks for newer Ollama versions
  • May want to add option to use system-installed Ollama if user prefers

Migration Notes for Contributors

After pulling this PR:

  1. Run npm install (this will download Ollama binaries automatically)
  2. If npm install was already run before this PR, manually run: npm run setup:ollama
  3. The binaries are stored in resources/bin/ (excluded from git, ~1.8GB)

@jasielmacedo jasielmacedo merged commit ce50865 into main Nov 5, 2025
1 check passed
@jasielmacedo jasielmacedo deleted the ollama-service branch November 5, 2025 17:04