fix: prevent duplicate messages during long response streaming #162
Closed
joshuaking42 wants to merge 1 commit into openabdev:main from
Conversation
The edit-streaming loop was re-splitting content every 1.5s tick and calling channel.say() for all overflow chunks, creating duplicate messages each tick. Fix by tracking posted message IDs in a Vec so overflow chunks are only sent once, and existing messages are edited in-place. The final edit also reuses these IDs instead of blindly posting new messages.
Contributor
Hi @joshuaking42, this PR overlaps with #135 which fixes the same duplicate message issue and has already been reviewed and approved. Would you be okay closing this one in favor of #135? Thanks 🙏

Contributor
Closing in favor of #135 which addresses the same issue. Thanks for the heads up! 🙏

Collaborator
Closing — the duplicate message fix has been merged via #135. Thanks @joshuaking42 for the contribution and for being gracious about consolidating efforts! 🙏
Problem
When a Discord response exceeds 1900 characters, the edit-streaming loop (which runs every 1.5s) re-splits the entire content and calls `channel.say()` for all overflow chunks on every tick. This causes the same content to be posted as new messages repeatedly, producing a "broken record" effect.

Root cause
`current_edit_msg` was a single `MessageId` that got overwritten with the result of the last `say()` on each tick:

- tick 1: `say()` chunk 2 → `current_edit_msg` = chunk 2
- tick 2: `say()` chunk 2 + chunk 3 as new messages, again

The final edit had the same issue: it didn't know about overflow messages created during streaming.
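The failure mode can be reproduced without Discord at all. The following is a minimal sketch (not the actual bot code): `buggy_tick` is a hypothetical stand-in for the streaming loop, with posted messages modeled as a `Vec<String>`, showing how keeping only the last ID makes every tick re-post every overflow chunk.

```rust
// Minimal repro of the bug: each tick re-splits the full content and
// posts every overflow chunk as a brand-new message, because only the
// *last* message id is remembered. `buggy_tick` and the Vec-based
// "channel" are illustrative stand-ins, not the real serenity calls.

fn buggy_tick(posted: &mut Vec<String>, current_edit_msg: &mut Option<usize>, content: &str) {
    // ASCII-only split for the sketch; real code must respect char boundaries.
    let chunks: Vec<&str> = content
        .as_bytes()
        .chunks(1900)
        .map(|c| std::str::from_utf8(c).unwrap())
        .collect();
    // The first chunk edits the original message (omitted here); every
    // overflow chunk is posted as a new message on *every* tick:
    for chunk in &chunks[1..] {
        posted.push(chunk.to_string()); // duplicate say()
        *current_edit_msg = Some(posted.len() - 1); // only the last id survives
    }
}

fn main() {
    let mut posted = Vec::new();
    let mut current_edit_msg = None;
    let content = "a".repeat(4000); // 3 chunks → 2 overflow chunks per tick

    buggy_tick(&mut posted, &mut current_edit_msg, &content);
    buggy_tick(&mut posted, &mut current_edit_msg, &content);

    // Two ticks posted the same two overflow chunks twice: 4 messages.
    assert_eq!(posted.len(), 4);
    println!("overflow messages after 2 ticks: {}", posted.len());
}
```

With the content unchanged across ticks, a correct loop would post each overflow chunk once; here the count grows by two every tick.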
Fix
Replace `current_edit_msg: MessageId` with `msg_ids: Vec<MessageId>` to track all posted messages:

- existing messages are `edit()`ed in-place
- new overflow chunks are `say()`ed once and their ID is pushed
- the final edit reuses `msg_ids` from the streaming task

Full content is visible during streaming, with no truncation and no duplication.
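The fixed loop can be sketched as follows. This is a standalone illustration under stated assumptions: `StubChannel`, `split_chunks`, and `stream_tick` are hypothetical names, and message IDs are plain `usize` indices standing in for serenity's `MessageId`; the real code calls serenity's `say`/`edit` APIs instead.

```rust
// Sketch of the fixed streaming loop: track *all* posted message ids in
// a Vec, edit existing messages in place, and say() each overflow chunk
// exactly once. StubChannel replaces the Discord channel so this runs
// standalone.

const CHUNK_LIMIT: usize = 1900;

/// Stand-in for a Discord channel: `say` posts a new message and
/// returns its id; `edit` rewrites an existing message in place.
struct StubChannel {
    messages: Vec<String>, // index doubles as the message id
}

impl StubChannel {
    fn say(&mut self, content: &str) -> usize {
        self.messages.push(content.to_string());
        self.messages.len() - 1
    }
    fn edit(&mut self, id: usize, content: &str) {
        self.messages[id] = content.to_string();
    }
}

// ASCII-only split for the sketch; real code must respect char boundaries.
fn split_chunks(content: &str) -> Vec<&str> {
    content
        .as_bytes()
        .chunks(CHUNK_LIMIT)
        .map(|c| std::str::from_utf8(c).unwrap())
        .collect()
}

/// One 1.5s tick: edit the chunks we already posted, say() only the
/// chunks that have no message yet, and remember their ids.
fn stream_tick(channel: &mut StubChannel, msg_ids: &mut Vec<usize>, content: &str) {
    for (i, chunk) in split_chunks(content).into_iter().enumerate() {
        match msg_ids.get(i) {
            Some(&id) => channel.edit(id, chunk),     // existing: edit in place
            None => msg_ids.push(channel.say(chunk)), // new overflow: post once
        }
    }
}

fn main() {
    let mut channel = StubChannel { messages: Vec::new() };
    let mut msg_ids = Vec::new();

    // Simulate a response growing past the limit across ticks.
    stream_tick(&mut channel, &mut msg_ids, &"a".repeat(1000)); // 1 chunk
    stream_tick(&mut channel, &mut msg_ids, &"a".repeat(2500)); // 2 chunks
    stream_tick(&mut channel, &mut msg_ids, &"a".repeat(4000)); // 3 chunks

    // Exactly one message per chunk, no duplicates.
    assert_eq!(channel.messages.len(), 3);
    assert_eq!(msg_ids.len(), 3);
}
```

Because the final edit also walks `msg_ids`, it updates the overflow messages created during streaming instead of blindly posting new ones.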