fix(ai): allow non-default ports in Tauri HTTP plugin scope #259

Open

DirkScharff wants to merge 1 commit into avihaymenahem:main from DirkScharff:main

Conversation

@DirkScharff

Summary

  • Replaces http://*/* with http://*:* and https://*/* with https://*:* in the Tauri HTTP plugin capability scope
  • The URL Pattern spec treats http://* as matching port 80 only — the *:* form is required to allow any port
  • This fixes connections to local AI servers running on non-default ports, such as LM Studio (:1234) and Ollama (:11434)
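In a Tauri v2 capability file, the change described above might look like the sketch below. The file path and the surrounding fields (`identifier`, permission name) are assumptions based on the plugin's usual conventions, not taken from this PR's diff:

```json
{
  "identifier": "default",
  "permissions": [
    {
      "identifier": "http:default",
      "allow": [
        { "url": "http://*:*" },
        { "url": "https://*:*" }
      ]
    }
  ]
}
```

With `http://*/*`, the pattern's implicit port component matches only the scheme's default port (80 for http), so requests to `http://localhost:1234` fall outside the scope; the explicit `:*` wildcard lifts that restriction.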

Test plan

  • Set AI provider to "Local AI (Ollama / LMStudio)" in Settings
  • Set Server URL to http://localhost:1234 (LM Studio) or http://localhost:11434 (Ollama)
  • Click "Test Connection" — should succeed where it previously silently failed

🤖 Generated with Claude Code

The URL Pattern spec treats http://* as port 80 only. Replace http://*/*
with http://*:* to allow any port, fixing connections to local AI servers
like LM Studio (:1234) and Ollama (:11434).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>