Description
Describe the bug
When using the official AGUI example with OllamaChatModel, calling the /agui/run endpoint returns an error response: messageId cannot be null.
To Reproduce
For easier debugging and to inspect raw requests/responses, I changed the stream parameter in the streamWithHttpClient call inside OllamaChatModel from true to false. Specifically, in the doStream method:

```java
@Override
protected Flux<ChatResponse> doStream(
        List<Msg> messages, List<ToolSchema> tools, GenerateOptions options) {
    return streamWithHttpClient(
            messages,
            tools,
            options.getToolChoice(),
            OllamaOptions.fromGenerateOptions(options),
            false); // changed from true to inspect the raw response
}
```

When sending a request as per the example, debugging shows that the raw response from Ollama does not contain an id field.
As a result, AguiEvent fails to extract a messageId, and the client throws an error.
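A possible workaround, sketched below with a hypothetical helper (the names `MessageIdFallback` and `resolveMessageId` are not part of AgentScope-Java), would be for the AGUI event mapping to fall back to a locally generated id whenever the provider response carries none, instead of propagating a null messageId:

```java
import java.util.UUID;

// Illustrative sketch only: when the raw model response has no id
// (as with Ollama), synthesize one so messageId is never null.
public class MessageIdFallback {

    public static String resolveMessageId(String responseId) {
        // Keep the provider-supplied id when present (e.g. OpenAI's
        // "chatcmpl-..." id); otherwise generate a UUID locally.
        if (responseId != null && !responseId.isEmpty()) {
            return responseId;
        }
        return UUID.randomUUID().toString();
    }

    public static void main(String[] args) {
        System.out.println(resolveMessageId("chatcmpl-123"));
        System.out.println(resolveMessageId(null));
    }
}
```

This keeps OpenAI-style ids intact while making providers that omit the field still produce a usable messageId.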
Expected behavior
The system should return the LLM’s response content successfully.
Environment
- AgentScope-Java Version: 1.0.8
- Java Version: 17
- OS: Windows
Additional context
When using OpenAIChatModel under identical conditions, everything works as expected.
For comparison, when using JdkHttpTransport with OpenAI, the raw HTTP response does include an id field, so AguiEvent correctly obtains the messageId and the client receives the expected response.
