Conversation
Automated review (bot): thanks for pushing LMStudio support. I think the cleanest/most maintainable approach here is to use OpenCode's native provider system instead of adding a generic HTTP command.

Why this matters (OpenWork philosophy / parity)
What OpenCode already supports
Suggested direction for this PR
Happy to re-review if you pivot the implementation to the native provider path.
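For context, here is a rough sketch of what the native-provider route could look like, assuming OpenCode's documented custom-provider config in `opencode.json` (via the `@ai-sdk/openai-compatible` package) and LM Studio's default OpenAI-compatible server on `http://localhost:1234/v1`. The model id below is a placeholder for whatever model LM Studio is actually serving:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "lmstudio": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "LM Studio (local)",
      "options": {
        "baseURL": "http://localhost:1234/v1"
      },
      "models": {
        "qwen2.5-7b-instruct": {
          "name": "Qwen 2.5 7B Instruct"
        }
      }
    }
  }
}
```

Since LM Studio's local server already speaks the OpenAI API, this route would need no Tauri-side tunnel at all.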
Automated message: this PR adds a general-purpose HTTP tunnel in Tauri, hard-codes LMStudio as the default, and targets the old base branch. Recommendation: use OpenCode's provider system for LMStudio and avoid a generic HTTP command (or strictly allowlist localhost). Please rebase onto the current base branch.
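If the generic HTTP command is kept, the "strictly allowlist localhost" suggestion could look roughly like the sketch below. The command name `http_request` and its signature are hypothetical, not the PR's actual code, and it assumes the `url` and `reqwest` crates are available:

```rust
use url::{Host, Url};

// Hypothetical Tauri command: proxies an HTTP GET, but only to loopback
// targets, so it cannot be abused as an open proxy to arbitrary hosts.
#[tauri::command]
async fn http_request(target: String) -> Result<String, String> {
    let parsed = Url::parse(&target).map_err(|e| e.to_string())?;

    // Accept "localhost" plus IPv4/IPv6 loopback addresses; reject the rest.
    let is_local = match parsed.host() {
        Some(Host::Domain(d)) => d.eq_ignore_ascii_case("localhost"),
        Some(Host::Ipv4(ip)) => ip.is_loopback(),
        Some(Host::Ipv6(ip)) => ip.is_loopback(),
        None => false,
    };
    if !is_local {
        return Err(format!("refusing non-localhost target: {target}"));
    }

    reqwest::get(parsed)
        .await
        .map_err(|e| e.to_string())?
        .text()
        .await
        .map_err(|e| e.to_string())
}
```

Checking the parsed host (rather than substring-matching the URL string) avoids trivial bypasses like `http://localhost.evil.com/`.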
Hi.
This is an extension I made so I can use LLMs running locally in LMStudio. Very useful for avoiding direct dependencies on hyperscalers, obviously.
This isn't a "pretty" PR, but if you're interested in this functionality, I can clean it up into a pretty one. Please say "please try" or words to that effect and I will; if not, I won't. In any case: awesome project. Love it!