Description
Describe the bug
The chat history summarizer is not working here: the complete chat history is still passed to the LLM API call.
This is one of the agents; each agent has its own kernel instance. It is used in GroupChatOrchestration as one of the agent members.
summarization_reducer = ChatHistorySummarizationReducer(
    service=kernel.get_service(),
    target_count=3,
    threshold_count=2,
    auto_reduce=True,
    include_function_content_in_summary=True,
)

system_message = """
Summarize the chat history to reduce its size while retaining key information.
The summary should be concise, capturing the essence of the conversation without losing important details.
"""
summarization_reducer.add_system_message(system_message)

return ChatCompletionAgent(
    kernel=kernel,
    name='agent1',
    prompt_template_config=prompt_config,
    arguments=KernelArguments(
        chat_history=summarization_reducer,
        settings=PromptExecutionSettings(temperature=temperature),
    ),
)
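For comparison, the reducer does expose a direct path for summarization. A minimal sketch of driving it standalone (outside the orchestration), assuming the async reduce() method and the standard add_user_message/add_assistant_message helpers; the message texts and counts are illustrative only:

# Standalone check of the reducer, outside GroupChatOrchestration (illustrative sketch).
import asyncio

async def check_reducer(summarization_reducer):
    # Simulate a multi-turn conversation that exceeds target_count + threshold_count.
    for i in range(10):
        summarization_reducer.add_user_message(f"user turn {i}")
        summarization_reducer.add_assistant_message(f"assistant turn {i}")

    # reduce() is expected to summarize older messages down toward target_count;
    # it returns the reducer when a reduction happened, otherwise None (assumption).
    reduced = await summarization_reducer.reduce()
    print("reduced:", reduced is not None, "messages:", len(summarization_reducer.messages))

# asyncio.run(check_reducer(summarization_reducer))

When the same reducer is passed as chat_history to the agent inside GroupChatOrchestration, no equivalent reduction seems to happen before the chat completion call.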
To Reproduce
Steps to reproduce the behavior:
Create a multi-turn agent collaboration using 2 or 3 agents and observe the Semantic Kernel debug log messages when it makes the LLM chat completion API call through the OpenAI base client: the full, unsummarized history is sent in the request. A logging setup that makes the request visible is sketched below.
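One way to make the outgoing request visible is to enable debug logging for the relevant loggers (the logger names below are assumptions based on the package names):

# Enable verbose logging so the outgoing chat completion request body shows up in the console.
import logging

logging.basicConfig(level=logging.DEBUG)
logging.getLogger("semantic_kernel").setLevel(logging.DEBUG)  # Semantic Kernel internals
logging.getLogger("openai").setLevel(logging.DEBUG)           # OpenAI / Azure OpenAI base client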
Expected behavior
The chat history should be summarized first, and only the reduced history should be passed in the LLM request body.
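Roughly, with the settings above (target_count=3), the messages array in the request should stay near the target count plus the generated summary rather than growing with every turn. An illustrative check (the +2 allowance for the summary and system message is an assumption):

# Illustrative expectation only: after auto-reduce, the history handed to the LLM
# should be bounded by target_count plus the summary/system messages, not the full transcript.
assert len(summarization_reducer.messages) <= summarization_reducer.target_count + 2, \
    "chat history was not summarized before the chat completion call"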
Platform
- Language: Python
- Source: main branch of repository
- AI model: gpt-4o Azure
- IDE: VS Code
- OS: Windows