
Conversation


@carlosrivera commented Mar 24, 2023

Great work on building this project! I recently discovered that committing large binary files can still cause a crash. Upon further investigation, I found that the issue stems from the binary data being encoded as text within the patch string.

To address this, I propose dropping the unnecessary encoding of binary data before the array is sliced for processing by ChatGPT. This should prevent the crash when committing large files.
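A minimal sketch of what I mean, assuming the file list comes from GitHub's compare/pull-request files API (where binary files generally carry no `patch` field); `DiffFile`, `MAX_PATCH_CHARS`, and `collectPatches` are illustrative names, not this project's actual code:

```typescript
// Drop binary entries instead of encoding their bytes as text.
interface DiffFile {
  filename: string;
  patch?: string; // typically undefined for binary files in the GitHub API
}

const MAX_PATCH_CHARS = 12_000; // hypothetical budget for the ChatGPT prompt

function collectPatches(files: DiffFile[]): string[] {
  return files
    .filter((f): f is DiffFile & { patch: string } => typeof f.patch === "string")
    // Slice only the textual patches; binary blobs never enter the array,
    // so oversized encoded data can no longer crash the request.
    .map((f) => f.patch.slice(0, MAX_PATCH_CHARS));
}
```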

While this approach may still prioritize binary files over other changes (since the commit list is sorted by length), the same issue applies to large code changes. To improve the function further, we could send a separate request to OpenAI for each `diff --git` section and combine the results into the summary afterwards. That would avoid dismissing significant changes based solely on their length and let changes be ranked by relevance instead.
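A sketch of that per-diff approach, splitting the combined patch on each `diff --git` header and summarizing every file's diff with its own request; `summarize` stands in for whatever call the project makes to the ChatGPT API, and all names here are hypothetical:

```typescript
// Split the combined diff so each chunk starts at its "diff --git" header.
function splitByFile(combinedDiff: string): string[] {
  return combinedDiff
    .split(/^(?=diff --git )/m) // lookahead keeps the header with each chunk
    .filter((chunk) => chunk.trim().length > 0);
}

async function summarizeCommit(
  combinedDiff: string,
  summarize: (diff: string) => Promise<string>
): Promise<string> {
  // One request per file diff, in parallel.
  const perFile = await Promise.all(splitByFile(combinedDiff).map(summarize));
  // Each file is summarized on its own, so one huge diff can no longer push
  // the others out of the prompt; the final summary can weigh relevance
  // rather than length.
  return perFile.join("\n");
}
```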
