Status: Open
Labels: bug (Something isn't working)
Description
Search terms
ValidationError, TextEvent
Expected Behavior
The workflow should not crash; the auto-reply should be generated normally.
Actual Behavior
I get the following error:
---------------------------------------------------------------------------
ValidationError Traceback (most recent call last)
Cell In[11], line 1
----> 1 await main()
Cell In[10], line 81, in main(on_event, state_json)
79 for index, result in enumerate(results):
80 result_events = []
---> 81 await result.process() # [v5] this is an issue..
82 async for event in result.events:
83 try:
File ...\.venv\lib\site-packages\autogen\io\run_response.py:289, in AsyncRunResponse.process(self, processor)
287 async def process(self, processor: AsyncEventProcessorProtocol | None = None) -> None:
288 processor = processor or AsyncConsoleEventProcessor()
--> 289 await processor.process(self)
File ....\.venv\lib\site-packages\autogen\io\processors\console_event_processor.py:38, in AsyncConsoleEventProcessor.process(self, response)
37 async def process(self, response: "AsyncRunResponseProtocol") -> None:
---> 38 async for event in response.events:
39 await self.process_event(event)
File ....\.venv\lib\site-packages\autogen\io\run_response.py:244, in AsyncRunResponse._queue_generator(self, q)
241 break
243 if isinstance(event, ErrorEvent):
--> 244 raise event.content.error # type: ignore[attr-defined]
245 except queue.Empty:
246 continue
File ....\.venv\lib\site-packages\autogen\agentchat\group\multi_agent_chat.py:296, in a_run_group_chat.<locals>._initiate_group_chat(pattern, messages, max_rounds, safeguard_policy, safeguard_llm_config, mask_llm_config, iostream, response, a_pause_event)
294 with IOStream.set_default(iostream):
295 try:
--> 296 chat_result, context_vars, agent = await a_initiate_group_chat(
297 pattern=pattern,
298 messages=messages,
299 max_rounds=max_rounds,
300 safeguard_policy=safeguard_policy,
301 safeguard_llm_config=safeguard_llm_config,
302 mask_llm_config=mask_llm_config,
303 a_pause_event=a_pause_event,
304 )
306 IOStream.get_default().send(
307 RunCompletionEvent( # type: ignore[call-arg]
308 history=chat_result.chat_history,
(...)
313 )
314 )
315 except Exception as e:
File c....\.venv\lib\site-packages\autogen\agentchat\group\multi_agent_chat.py:192, in a_initiate_group_chat(pattern, messages, max_rounds, safeguard_policy, safeguard_llm_config, mask_llm_config, a_pause_event)
188 raise ValueError("No agent selected to start the conversation")
190 manager._a_pause_event = a_pause_event
--> 192 chat_result = await last_agent.a_initiate_chat(
193 manager,
194 message=last_message, # type: ignore[arg-type]
195 clear_history=clear_history,
196 summary_method=pattern.summary_method,
197 )
199 cleanup_temp_user_messages(chat_result)
201 return chat_result, context_variables, manager.last_speaker
File ....\.venv\lib\site-packages\autogen\agentchat\conversable_agent.py:1663, in ConversableAgent.a_initiate_chat(self, recipient, clear_history, silent, cache, max_turns, summary_method, summary_args, message, **kwargs)
1661 else:
1662 msg2send = await self.a_generate_init_message(message, **kwargs)
-> 1663 await self.a_send(msg2send, recipient, silent=silent)
1664 summary = self._summarize_chat(
1665 summary_method,
1666 summary_args,
1667 recipient,
1668 cache=cache,
1669 )
1670 for agent in [self, recipient]:
File ...\.venv\lib\site-packages\autogen\agentchat\conversable_agent.py:1176, in ConversableAgent.a_send(self, message, recipient, request_reply, silent)
1174 valid = self._append_oai_message(message, recipient, role="assistant", name=self.name)
1175 if valid:
-> 1176 await recipient.a_receive(message, self, request_reply, silent)
1177 else:
1178 raise ValueError(
1179 "Message can't be converted into a valid ChatCompletion message. Either content or function_call must be provided."
1180 )
File ....\.venv\lib\site-packages\autogen\agentchat\conversable_agent.py:1273, in ConversableAgent.a_receive(self, message, sender, request_reply, silent)
1271 if request_reply is False or (request_reply is None and self.reply_at_receive[sender] is False):
1272 return
-> 1273 reply = await self.a_generate_reply(messages=self.chat_messages[sender], sender=sender)
1274 if reply is not None:
1275 await self.a_send(reply, sender, silent=silent)
File ....\.venv\lib\site-packages\autogen\agentchat\conversable_agent.py:2954, in ConversableAgent.a_generate_reply(self, messages, sender, exclude)
2952 if self._match_trigger(reply_func_tuple["trigger"], sender):
2953 if inspect.iscoroutinefunction(reply_func):
-> 2954 final, reply = await reply_func(
2955 self,
2956 messages=messages,
2957 sender=sender,
2958 config=reply_func_tuple["config"],
2959 )
2960 else:
2961 final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
File ...\.venv\lib\site-packages\autogen\agentchat\groupchat.py:1447, in GroupChatManager.a_run_chat(self, messages, sender, config)
1444 reply["content"] = self.clear_agents_history(reply, groupchat)
1446 # The speaker sends the message without requesting a reply
-> 1447 await speaker.a_send(reply, self, request_reply=False, silent=silent)
1448 message = self.last_message(speaker)
1449 if self.client_cache is not None:
File...\.venv\lib\site-packages\autogen\agentchat\conversable_agent.py:1176, in ConversableAgent.a_send(self, message, recipient, request_reply, silent)
1174 valid = self._append_oai_message(message, recipient, role="assistant", name=self.name)
1175 if valid:
-> 1176 await recipient.a_receive(message, self, request_reply, silent)
1177 else:
1178 raise ValueError(
1179 "Message can't be converted into a valid ChatCompletion message. Either content or function_call must be provided."
1180 )
File ....\.venv\lib\site-packages\autogen\agentchat\conversable_agent.py:1270, in ConversableAgent.a_receive(self, message, sender, request_reply, silent)
1240 async def a_receive(
1241 self,
1242 message: dict[str, Any] | str,
(...)
1245 silent: bool | None = False,
1246 ):
1247 """(async) Receive a message from another agent.
1248
1249 Once a message is received, this function sends a reply to the sender or stop.
(...)
1268 ValueError: if the message can't be converted into a valid ChatCompletion message.
1269 """
-> 1270 self._process_received_message(message, sender, silent)
1271 if request_reply is False or (request_reply is None and self.reply_at_receive[sender] is False):
1272 return
File ...\.venv\lib\site-packages\autogen\agentchat\conversable_agent.py:1201, in ConversableAgent._process_received_message(self, message, sender, silent)
1196 raise ValueError(
1197 "Received message can't be converted into a valid ChatCompletion message. Either content or function_call must be provided."
1198 )
1200 if not ConversableAgent._is_silent(sender, silent):
-> 1201 self._print_received_message(message, sender)
File ...\.venv\lib\site-packages\autogen\agentchat\conversable_agent.py:1184, in ConversableAgent._print_received_message(self, message, sender, skip_head)
1182 def _print_received_message(self, message: dict[str, Any] | str, sender: Agent, skip_head: bool = False):
1183 message = message_to_dict(message)
-> 1184 message_model = create_received_event_model(event=message, sender=sender, recipient=self)
1185 iostream = IOStream.get_default()
1186 # message_model.print(iostream.print)
File ...\.venv\lib\site-packages\autogen\events\agent_events.py:277, in create_received_event_model(uuid, event, sender, recipient)
270 if content is not None and "context" in event:
271 content = OpenAIWrapper.instantiate(
272 content, # type: ignore [arg-type]
273 event["context"],
274 allow_format_str_template,
275 )
--> 277 return TextEvent(
278 content=content,
279 sender=sender.name,
280 recipient=recipient.name,
281 uuid=uuid,
282 )
File ....\.venv\lib\site-packages\autogen\events\base_event.py:70, in wrap_event.<locals>.WrapperBase.__init__(self, *args, **data)
68 if "content" in data:
69 content = data.pop("content")
---> 70 super().__init__(*args, content=event_cls(*args, **data, content=content), **data)
71 else:
72 super().__init__(content=event_cls(*args, **data), **data)
File ....\.venv\lib\site-packages\autogen\events\base_event.py:24, in BaseEvent.__init__(self, uuid, **kwargs)
22 def __init__(self, uuid: UUID | None = None, **kwargs: Any) -> None:
23 uuid = uuid or uuid4()
---> 24 super().__init__(uuid=uuid, **kwargs)
File ....\.venv\lib\site-packages\pydantic\main.py:250, in BaseModel.__init__(self, **data)
248 # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
249 __tracebackhide__ = True
--> 250 validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
251 if self is not validated_self:
252 warnings.warn(
253 'A custom validator is returning a value other than `self`.\n'
254 "Returning anything other than `self` from a top level model validator isn't supported when validating via `__init__`.\n"
255 'See the `model_validator` docs (https://docs.pydantic.dev/latest/concepts/validators/#model-validators) for more details.',
256 stacklevel=2,
257 )
ValidationError: 5 validation errors for TextEvent
content.str
Input should be a valid string [type=string_type, input_value=<coroutine object AsyncTh...t at 0x000002B5FFBE8EB0>, input_type=coroutine]
For further information visit https://errors.pydantic.dev/2.12/v/string_type
content.int
Input should be a valid integer [type=int_type, input_value=<coroutine object AsyncTh...t at 0x000002B5FFBE8EB0>, input_type=coroutine]
For further information visit https://errors.pydantic.dev/2.12/v/int_type
content.float
Input should be a valid number [type=float_type, input_value=<coroutine object AsyncTh...t at 0x000002B5FFBE8EB0>, input_type=coroutine]
For further information visit https://errors.pydantic.dev/2.12/v/float_type
content.bool
Input should be a valid boolean [type=bool_type, input_value=<coroutine object AsyncTh...t at 0x000002B5FFBE8EB0>, input_type=coroutine]
For further information visit https://errors.pydantic.dev/2.12/v/bool_type
content.list[dict[str,union[str,dict[str,any]]]]
Input should be a valid list [type=list_type, input_value=<coroutine object AsyncTh...t at 0x000002B5FFBE8EB0>, input_type=coroutine]
For further information visit https://errors.pydantic.dev/2.12/v/list_type
Steps to reproduce the bug
This happens in a group-chat-managed workflow when, at the user agent's input prompt, I either take too long to reply or just press Enter with no text (i.e., the reply is None/empty user input).
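If it helps triage: the symptom is consistent with an async input hook being called without `await`, so its coroutine object ends up as the message content. This is only a guess at the mechanism; the function and message names below are hypothetical, not the real autogen internals:

```python
# Hypothetical sketch of the suspected mechanism (illustrative names only):
# calling an async input hook without `await` puts a coroutine object into
# the message dict, which later fails TextEvent validation.
import asyncio


async def a_get_human_input(prompt: str) -> str:
    # Simulates the empty reply case (user just presses Enter)
    return ""


def build_message_buggy() -> dict:
    # BUG: missing `await` -- "content" becomes a coroutine object
    return {"content": a_get_human_input("> ")}


async def build_message_fixed() -> dict:
    # FIX: await the coroutine so "content" is a plain string
    return {"content": await a_get_human_input("> ")}


buggy = build_message_buggy()
print(asyncio.iscoroutine(buggy["content"]))  # True -> ValidationError later
buggy["content"].close()

fixed = asyncio.run(build_message_fixed())
print(asyncio.iscoroutine(fixed["content"]))  # False -> validates fine
```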
Environment
- Versions:
- Python version: 3.10
- waldiez extension version: 0.6.6
- OS: Windows 11