
feat: one-click Ollama installer, OS-aware download buttons, and AI parsing fixes (#3)

Merged
joelnishanth merged 5 commits into joelnishanth:main from rahulraonatarajan:main
Mar 30, 2026
Conversation

@rahulraonatarajan (Contributor) commented Mar 27, 2026

Summary

  • One-click Windows installer (install-win.bat) handles the full setup end-to-end: downloads the Ollama runtime, pulls the two required AI models (llama3.2:1b + nomic-embed-text), compiles the native messaging helper.exe, and sets OLLAMA_ORIGINS, with progress messages written for non-technical users
  • OS-aware download button in the AI setup onboarding step detects Windows vs macOS and links directly to the right installer, with a privacy/cost benefit line explaining why running offline keeps data local and free
  • AI parsing engine fully fixed -- replaced broken AI SDK calls with direct Ollama HTTP fetches, fixed a TypeError that crashed the legacy parser, and updated all model references to llama3.2:1b

Changes by area

scripts/native-host/install-win.bat

  • Downloads Ollama installer from official source and runs silent install
  • Refreshes PATH so ollama is available immediately in the same session
  • Sets OLLAMA_ORIGINS=chrome-extension://*,moz-extension://* as a persistent user env var
  • Starts ollama serve, waits for it to be ready, then pulls both models
  • Compiles helper.exe (a small C# stdin/stdout relay) using PowerShell Add-Type -- needed because Chrome CreateProcess cannot launch .bat or .ps1 files directly as native messaging hosts
  • Clear echo messages with download sizes (~2.5 GB total) and estimated times
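
The helper.exe relay is needed because Chrome's native messaging protocol frames every message as a 4-byte little-endian length prefix followed by UTF-8 JSON on stdin/stdout. A minimal TypeScript sketch of that framing (function names are illustrative, not code from this PR -- the actual relay is C# compiled via Add-Type):

```typescript
// Sketch of Chrome's native-messaging wire format: a 32-bit
// little-endian byte length, then the UTF-8 JSON payload.
// Helper names are hypothetical, for illustration only.
function encodeNativeMessage(msg: unknown): Buffer {
  const json = Buffer.from(JSON.stringify(msg), "utf8");
  const header = Buffer.alloc(4);
  header.writeUInt32LE(json.length, 0); // payload length, little-endian
  return Buffer.concat([header, json]);
}

function decodeNativeMessage(buf: Buffer): unknown {
  const len = buf.readUInt32LE(0);
  return JSON.parse(buf.subarray(4, 4 + len).toString("utf8"));
}
```

Any relay that shuttles bytes between the browser and a backend must preserve this framing exactly, which is why a plain .bat or .ps1 wrapper is not enough on Windows.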

scripts/setup-ollama/setup-win.ps1

  • Updated ollama pull target from llama3.2 to llama3.2:1b

Onboarding UI (Chrome + Firefox)

  • Download installer button now detects OS via navigator.userAgentData / navigator.platform and renders "Download for Windows (.bat)" or "Download for Mac" with the matching URL
  • Added one-line privacy/cost explanation: running the model offline means no data leaves the device and no per-query cost
  • console.warn changed to console.log for informational non-error messages
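
The OS detection described above can be sketched as a small pure function. The function name and the navigator-like parameter shape are hypothetical (the real code reads the browser globals directly):

```typescript
// Sketch of OS detection for the download button. Prefers the modern
// User-Agent Client Hints API (navigator.userAgentData) and falls back
// to the legacy navigator.platform string. Names are illustrative.
type NavigatorLike = {
  userAgentData?: { platform?: string };
  platform?: string;
};

function detectInstallerOS(nav: NavigatorLike): "windows" | "mac" | "other" {
  const p = (nav.userAgentData?.platform ?? nav.platform ?? "").toLowerCase();
  if (p.includes("win")) return "windows"; // e.g. "Windows", "Win32"
  if (p.includes("mac")) return "mac";     // e.g. "macOS", "MacIntel"
  return "other";
}
```

Taking the navigator-like object as a parameter keeps the logic unit-testable outside a browser.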

AI parsing engine (Chrome + Firefox)

  • mastra-agent.ts: Removed ollama-ai-provider-v2 and generateText (the AI SDK v3 Responses API is incompatible with Ollama 0.18.2) and replaced them with a private ollamaGenerate() method that makes direct fetch calls to /api/generate. Also fixed TypeError: Cannot read properties of undefined (reading 'text') in parseSection, where response.text referenced a non-existent property instead of rawText
  • ollama-client.ts: Default model updated to llama3.2:1b; removed config.enabled guard so the stored model name is always applied
  • ollama-config.ts: Default chatModel updated to llama3.2:1b
  • ollama-service.ts: All hardcoded llama3.2 references updated to llama3.2:1b
  • error-classify.ts: Pull command in enriched error message updated to ollama pull llama3.2:1b
  • background.ts (both): Dual-parser error now includes individual failure reasons: Both parsers failed -- RAG: reason | Legacy: reason
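
The direct-fetch replacement described above can be sketched as follows. This is an illustrative reconstruction, not the PR's actual ollamaGenerate(): the request/response shape follows Ollama's public /api/generate API, and the injectable fetch parameter is an addition here so the function can be exercised without a running Ollama:

```typescript
// Sketch of calling Ollama directly instead of going through the AI SDK.
// With stream: false, Ollama returns one JSON object whose "response"
// field holds the generated text. fetchImpl/baseUrl are illustrative.
type FetchLike = (url: string, init?: RequestInit) => Promise<Response>;

async function ollamaGenerate(
  prompt: string,
  model = "llama3.2:1b",
  fetchImpl: FetchLike = fetch,
  baseUrl = "http://localhost:11434",
): Promise<string> {
  const res = await fetchImpl(`${baseUrl}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, prompt, stream: false }),
  });
  if (!res.ok) throw new Error(`Ollama HTTP ${res.status}`);
  const data = (await res.json()) as { response?: string };
  // Guard against the undefined-property TypeError mentioned above.
  return data.response ?? "";
}
```

Defaulting missing fields to an empty string is what prevents the "Cannot read properties of undefined" class of crash described in the parseSection fix.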

Test plan

  • Run install-win.bat on a fresh Windows machine -- should complete without prompts, leaving Ollama running and both models listed in ollama list
  • Open onboarding on Windows -- download button should show "Download for Windows (.bat)"
  • Open onboarding on macOS -- download button should show "Download for Mac"
  • Upload a resume PDF -- both parsers should succeed and profile should be extracted
  • Confirm no "Not Found" errors in the extension console with llama3.2:1b loaded

Note

Medium Risk
Adds a Windows installer that downloads/installs Ollama, sets user env vars, registers native messaging, and pulls models, plus changes the LLM integration layer; failures here can break onboarding/AI parsing across both extensions.

Overview
Adds a one-click Windows setup flow for local AI: a revamped install-win.bat now installs/starts Ollama, configures OLLAMA_ORIGINS, pulls required models, and registers a native-messaging host (including building a helper.exe bridge for Chrome). Onboarding is updated to be OS-aware (Windows vs macOS links), show clearer offline/privacy messaging, and surface native-helper detection errors to users.

Refactors AI parsing in both Chrome and Firefox to stop using the AI SDK and instead call Ollama directly via fetch to /api/generate, while standardizing defaults and error messages around the smaller llama3.2:1b model; resume dual-parse failures now include per-parser error reasons for easier debugging.

Written by Cursor Bugbot for commit 20cc5a1. This will update automatically on new commits.

rahulraonatarajan and others added 5 commits March 24, 2026 17:39
…inal command)

Windows users now get a "Download Installer" button identical to Mac,
instead of having to copy-paste a PowerShell command into a terminal.
The .bat file is already double-click ready — no admin rights needed.

Made-with: Cursor
…al setup script

- Create helper.bat wrapper so Chrome/Firefox can launch the PS1 host
  (native messaging cannot execute .ps1 files directly)
- Use forward slashes in manifest JSON path to avoid invalid escape sequences
- Download setup-win.ps1 locally for offline reliability
- host.ps1 now prefers local setup-win.ps1 if present (matching host.py)
- Point download URLs at feature branch for testing

Made-with: Cursor
…arsing fixes

Windows installer (install-win.bat):
- Downloads and silently installs Ollama runtime (~1.6 GB)
- Auto-pulls llama3.2:1b and nomic-embed-text models
- Compiles helper.exe (C# relay) for Chrome native messaging
- Sets OLLAMA_ORIGINS env var for CORS
- Non-technical friendly echo messages with size/time estimates
- Refreshes PATH so ollama is usable immediately post-install

Onboarding UI:
- Download button auto-detects Windows vs macOS and links to correct installer
- Privacy/cost benefit line: explains offline-only, no per-query cost
- Informational console messages downgraded from warn to log

AI parsing engine:
- Replace ollama-ai-provider-v2 (AI SDK) with direct fetch to /api/generate
  to fix APICallError: Not Found on Ollama 0.18.2
- Fix TypeError: response.text undefined in mastra-agent.ts parseSection
- Update all hardcoded llama3.2 references to llama3.2:1b
- Remove config.enabled guard so model name is always loaded from storage
- Dual-parser error now surfaces individual RAG and Legacy failure reasons
- setup-win.ps1 updated to pull llama3.2:1b

Both Chrome and Firefox extensions updated throughout.

Made-with: Cursor
feat: one-click Ollama installer, OS-aware download buttons, and AI parsing fixes

@cursor (bot) left a comment


Cursor Bugbot has reviewed your changes and found 4 potential issues.


'https://raw.githubusercontent.com/rahulraonatarajan/offlyn-apply/Windows-ollama-setup/scripts/native-host';
const HELPER_PKG_URL =
'https://github.com/joelnishanth/offlyn-apply/releases/download/v0.5.0/offlyn-helper.pkg';
'https://raw.githubusercontent.com/rahulraonatarajan/offlyn-apply/Windows-ollama-setup/scripts/native-host/install-mac-linux.sh';

Mac download link changed from .pkg to shell script

High Severity

HELPER_PKG_URL was changed from a .pkg macOS installer to a raw .sh shell script URL. The "Download for Mac" button now downloads a shell script, but the UI text still says "No terminal required." macOS users cannot double-click a .sh file to run it — they need to open Terminal and bash it. This completely breaks the Mac onboarding flow.

Additional Locations (1)


Bugbot Autofix determined this is a false positive.

HELPER_PKG_URL correctly points to offlyn-helper.pkg, not a shell script, and has not been changed to a fork branch.


set RAW_BASE=https://raw.githubusercontent.com/rahulraonatarajan/offlyn-apply/Windows-ollama-setup/scripts/native-host
set SETUP_BASE=https://raw.githubusercontent.com/rahulraonatarajan/offlyn-apply/Windows-ollama-setup/scripts/setup-ollama
set CHROME_EXT_ID=bjllpojjllhfghiemokcoknfmhpmfbph
set CHROME_DEV_EXT_ID=nfflflctcndcpdmoclbcasiblbgjng

Chrome dev extension ID is truncated

Medium Severity

CHROME_DEV_EXT_ID is set to nfflflctcndcpdmoclbcasiblbgjng which is only 30 characters long. Chrome extension IDs are always exactly 32 characters. This invalid ID in the native messaging manifest's allowed_origins will prevent Chrome from authorizing native messaging connections from the dev extension.



Bugbot Autofix determined this is a false positive.

CHROME_DEV_EXT_ID does not exist in the codebase; CHROME_EXT_ID is correctly 32 characters (bjllpojjllhfghiemokcoknfmhpmfbph).


'https://github.com/joelnishanth/offlyn-apply/releases/download/v0.5.0/offlyn-helper.pkg';
'https://raw.githubusercontent.com/rahulraonatarajan/offlyn-apply/Windows-ollama-setup/scripts/native-host/install-mac-linux.sh';
const HELPER_WIN_BAT_URL =
'https://raw.githubusercontent.com/rahulraonatarajan/offlyn-apply/Windows-ollama-setup/scripts/native-host/install-win.bat';

All download URLs point to fork feature branch

High Severity

All hardcoded download URLs (HELPER_INSTALL_BASE, HELPER_PKG_URL, HELPER_WIN_BAT_URL, RAW_BASE, SETUP_BASE, $SCRIPT_BASE) now point to rahulraonatarajan/offlyn-apply/Windows-ollama-setup — a fork's feature branch. These URLs will break when the branch is deleted after merge, causing all installer downloads and setup scripts to 404 in production.

Additional Locations (2)


Bugbot Autofix determined this is a false positive.

All URLs correctly point to joelnishanth/offlyn-apply/main, not rahulraonatarajan/offlyn-apply/Windows-ollama-setup.


Comment thread: test-cors.ps1
Write-Host "Response: $($content.response)"
} catch {
Write-Host "POST FAILED: $($_.Exception.Message)"
}

Debug test script committed to repository root

Low Severity

test-cors.ps1 is a local CORS debugging script committed to the repository root. It contains a hardcoded (and incorrect) extension origin and is not referenced by any other code. This appears to be a personal test artifact that wasn't meant for the codebase.



Bugbot Autofix determined this is a false positive.

test-cors.ps1 file does not exist in the repository.


@joelnishanth (Owner) commented:

Thanks for the contribution @rahulraonatarajan — the Windows Ollama installer, OS-aware download buttons, and the AI SDK → direct HTTP migration are solid improvements. Merged and shipping in v0.7.0.

A few things we cleaned up post-merge that would be good to watch for next time:

  1. Hardcoded URLs pointed to your fork's feature branch (rahulraonatarajan/offlyn-apply/Windows-ollama-setup). These would 404 once the branch is deleted after merge. Always use joelnishanth/offlyn-apply/main for any URLs that ship in the extension.

  2. CHROME_DEV_EXT_ID was 30 characters (nfflflctcndcpdmoclbcasiblbgjng). Chrome extension IDs are always exactly 32 characters. The invalid ID would silently prevent native messaging from working for that origin. We removed it since there's no known dev ID.

  3. test-cors.ps1 was committed — looks like a local debugging script. Best to .gitignore or unstage test artifacts before pushing.

  4. Mac installer URL was changed from .pkg to .sh — the .pkg provides a no-terminal install experience on macOS. The .sh requires users to open Terminal, which breaks the UX promise of "No terminal required."
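
For point 2, a simple format check would have caught the truncated ID before it shipped. Chrome extension IDs are exactly 32 characters drawn from the letters a-p (a base-16 encoding of the key hash using that alphabet). A hypothetical TypeScript sketch of such a check:

```typescript
// Sanity-check a Chrome extension ID before writing it into a native
// messaging manifest's allowed_origins. Function name is hypothetical.
// Valid IDs are exactly 32 characters from the letters a-p.
function isValidChromeExtensionId(id: string): boolean {
  return /^[a-p]{32}$/.test(id);
}
```

The 30-character dev ID in the original installer fails this check, while the production ID passes.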

Joel Nishanth · offlyn.AI

@joelnishanth joelnishanth merged commit 26d9643 into joelnishanth:main Mar 30, 2026
4 of 6 checks passed
