
[Bug]: AGUI protocol fails to correctly serialize LLM responses when using OllamaChatModel #708

@CodeWang20

Description


Describe the bug
When using the official AGUI example with OllamaChatModel, calling the /agui/run endpoint returns an error response: "messageId cannot be null".

[Screenshot: error response from /agui/run]

To Reproduce
To make debugging easier and to inspect the raw requests/responses, I changed the stream parameter in the streamWithHttpClient call inside OllamaChatModel from true to false. Specifically, in the doStream method:

    @Override
    protected Flux<ChatResponse> doStream(
            List<Msg> messages, List<ToolSchema> tools, GenerateOptions options) {
        return streamWithHttpClient(
                messages,
                tools,
                options.getToolChoice(),
                OllamaOptions.fromGenerateOptions(options),
                // changed from true to false so the raw response is easier to inspect
                false);
    }

When sending a request as per the example, debugging shows that the raw response from Ollama does not contain an id field:

[Screenshot: raw Ollama response with no id field]
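
For reference, a non-streaming response from Ollama's /api/chat endpoint looks roughly like the following (field values are illustrative). There is no top-level id field anywhere in the payload:

    {
      "model": "llama3",
      "created_at": "2025-01-01T12:00:00.000000Z",
      "message": {
        "role": "assistant",
        "content": "Hello! How can I help you today?"
      },
      "done": true,
      "done_reason": "stop"
    }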

As a result, AguiEvent fails to extract a messageId, and the client side throws an error:

[Screenshots: AguiEvent with missing messageId; resulting client-side error]
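
One possible fix, sketched below: fall back to a locally generated id when the provider omits one, instead of letting null reach the AGUI event. The class and method names here (MessageIdFallback, resolveMessageId) are hypothetical, not part of the AgentScope-Java API:

    import java.util.UUID;

    final class MessageIdFallback {

        // Hypothetical helper: return the upstream id when present,
        // otherwise generate a local one so AGUI events never carry
        // a null messageId.
        static String resolveMessageId(String upstreamId) {
            return (upstreamId == null || upstreamId.isBlank())
                    ? "msg-" + UUID.randomUUID()
                    : upstreamId;
        }

        public static void main(String[] args) {
            System.out.println(resolveMessageId(null));              // Ollama case: generated id
            System.out.println(resolveMessageId("chatcmpl-abc123")); // OpenAI case: passed through
        }
    }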

Expected behavior
The system should return the LLM’s response content successfully.

[Screenshot: expected successful response]

Environment

  • AgentScope-Java Version: 1.0.8
  • Java Version: 17
  • OS: Windows

Additional context
When using OpenAIChatModel under identical conditions, everything works as expected.

In contrast, when using JdkHttpTransport with OpenAI, the actual HTTP response does include an id field:

[Screenshot: OpenAI HTTP response containing an id field]
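
For comparison, an OpenAI chat completion response always carries a top-level id (values illustrative):

    {
      "id": "chatcmpl-abc123",
      "object": "chat.completion",
      "created": 1735689600,
      "model": "gpt-4o-mini",
      "choices": [
        {
          "index": 0,
          "message": { "role": "assistant", "content": "Hello!" },
          "finish_reason": "stop"
        }
      ]
    }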

Thus, AguiEvent correctly obtains the messageId, and the client receives the expected response:

[Screenshots: AguiEvent with populated messageId; successful client response]
