Description
DeepSeek run locally is not very usable within Theia AI because it is too verbose.
Feature Description:
Local LLMs such as DeepSeek are thinking models: they output large reasoning blurbs before returning the desired answer.
https://ollama.com/blog/thinking
e.g. https://files.ollama.com/think.mp4
To fix the problem, one needs to add `"think": false` to the OpenAI-style API request sent to Ollama, for example:
"model": "deepseek-r1",
"messages": [
{
"role": "user",
"content": "how many r in the word strawberry?"
}
],
"think": false,
"stream": true
}'
I believe the place to edit would be here.
`messages: request.messages.map(m => this.toOllamaMessage(m)).filter(m => m !== undefined) as Message[],`
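A minimal sketch of what the fix could look like, assuming the request object is built near the line quoted above. The `OllamaChatRequest` interface, the `buildChatRequest` helper, and its parameter names are hypothetical illustrations, not the actual Theia source; only the `think: false` field reflects Ollama's documented API.

```typescript
// Hypothetical shapes, assumed for illustration; the real Theia
// Ollama provider may name these differently.
interface Message {
    role: string;
    content: string;
}

interface OllamaChatRequest {
    model: string;
    messages: Message[];
    stream: boolean;
    // Ollama supports disabling the thinking output on reasoning models.
    think?: boolean;
}

// Hypothetical helper showing where `think: false` would be added
// when assembling the chat request sent to Ollama.
function buildChatRequest(model: string, messages: Message[]): OllamaChatRequest {
    return {
        model,
        messages,
        stream: true,
        // Suppress the "thinking" preamble emitted by models such as
        // deepseek-r1, so only the final answer is streamed back.
        think: false,
    };
}
```

Ideally this would be a user-configurable setting rather than hard-coded, since some users may want to see the model's reasoning.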