
Fix/ollama test connection issue#17

Open
Yaswanth-ampolu wants to merge 4 commits into superhq-ai:main from Yaswanth-ampolu:fix/ollama-connection-issue

Conversation

@Yaswanth-ampolu

closes #16

Code change summary

  • Added explicit handling for 403 responses from Ollama in:
    • generate()
    • listModels()
  • When a 403 occurs, Superpowers now shows a clear, actionable error instead of a generic failure.
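A minimal sketch of what such a status-to-message mapping could look like (the helper name and shape are hypothetical, not the actual Superpowers code; the 403 text mirrors the message this PR introduces):

```typescript
// Hypothetical helper: map an Ollama HTTP status to a user-facing error.
// Ollama answers 403 when the request comes from a browser origin that
// OLLAMA_ORIGINS does not allow.
function ollamaErrorMessage(status: number): string {
  if (status === 403) {
    return [
      "Connection refused (403).",
      "Please stop Ollama and run:",
      "",
      'OLLAMA_ORIGINS="*" ollama serve',
    ].join("\n");
  }
  // Fall back to a generic message for other failures.
  return `Ollama request failed with status ${status}.`;
}
```

Both `generate()` and `listModels()` could then call one shared helper like this, so the wording stays consistent across code paths.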

New user-facing error message

```
Connection refused (403).
Please stop Ollama and run:

OLLAMA_ORIGINS="*" ollama serve
```
This helps users immediately identify and fix the Ollama CORS issue when running Superpowers in a browser context.



### Update

Root cause identified: Ollama returns `403` when accessed from a browser context unless
`OLLAMA_ORIGINS` is set. This affects Superpowers because it runs inside a browser extension.
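The mechanism can be illustrated with a small sketch: a browser or extension attaches an `Origin` header to every request, and Ollama rejects origins not allowed by `OLLAMA_ORIGINS` with a 403. The URL matches the default Ollama port; the extension id below is a placeholder:

```typescript
// Build the same /api/tags request a browser extension would send.
// A plain curl sends no Origin header, which is why it succeeds while
// the extension gets a 403 from the same server.
function buildTagsRequest(origin: string): Request {
  return new Request("http://127.0.0.1:11434/api/tags", {
    headers: { Origin: origin },
  });
}

const req = buildTagsRequest("chrome-extension://example-id");
// Header names are normalized to lowercase by the Headers class.
console.log(req.headers.get("origin")); // "chrome-extension://example-id"
```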

I’ve confirmed the fix by:
1. Restarting Ollama with `OLLAMA_ORIGINS="*" ollama serve`
2. Verifying `/api/tags` responds correctly
3. Retesting the Superpowers connection (now works)


### Systemd (persistent) setup

For users running Ollama as a systemd service, the fix needs to be applied at the service level so it persists across restarts:

```bash
sudo systemctl stop ollama
sudo systemctl edit ollama
```

Add the following override:

```
[Service]
Environment="OLLAMA_ORIGINS=*"
```

Then reload and restart:

```bash
sudo systemctl daemon-reload
sudo systemctl restart ollama
```

Verification:

```bash
curl http://127.0.0.1:11434/api/tags
```

When Ollama is running without `OLLAMA_ORIGINS="*"`, it rejects requests from the extension with a 403 Forbidden error. This change catches that specific error in both the `listModels` and `generate` methods and gives the user a clear instruction to configure the environment variable.
@Yaswanth-ampolu

[screenshot]

@Yaswanth-ampolu

also closes #11

Fix: Replace generic LLM response in “Test Connection” with app-specific success message

Problem

When clicking Test Connection, the UI displayed the raw LLM greeting
(e.g. “Hi there! I'm a large language model…”), which is generic and not aligned
with Superpowers branding or intent.

Solution

  • Stop streaming and displaying the LLM’s default response during connection tests
  • Replace it with a deterministic, application-specific success message

Implementation

  • Updated src/hooks/useTestLlm.ts
  • Removed per-chunk streaming updates to setTestResponse
  • After a successful test request, set a custom confirmation message instead
```ts
await readStream(stream);

const successMessage = `Your API key for model ${settings.model} is working perfectly.
Welcome to Superpowers — your ultimate productivity extension that helps you do more, right from your favorite browser. No more switching tabs. Just ask, and get things done.`;

setTestResponse(successMessage);
```

Result

Users now see a clear, branded confirmation message instead of a generic LLM intro,
making the Test Connection flow more meaningful and polished.

[screenshot]

@harshdoesdev

Hi @Yaswanth-ampolu, thanks for raising this PR! Please fix the issues raised by Biome. Otherwise, this PR looks good to me!



Development

Successfully merging this pull request may close these issues.

🐞 Ollama (Local) connection test fails with 403 status code (no body)
