Preserve all original tool input fields instead of whitelisting#17
Open
wjessup wants to merge 2 commits into peteromallet:main from
Conversation
added 2 commits on February 28, 2026 12:36
Parses conversations from Cursor's local SQLite database (`state.vscdb`), extracting user/assistant messages, tool calls, thinking blocks, and token counts. Cursor-specific code lives in `dataclaw/parsers/cursor.py`, with the main `parser.py` delegating to it. Includes 11 tests covering discovery, parsing, tool calls, MCP prefix stripping, thinking, and nested JSON unwrapping.

Made-with: Cursor
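The commit above describes reading chat data out of `state.vscdb`. As a rough sketch of what that involves: VS Code-derived editors keep workspace state in a key-value table, so a parser can scan it for chat-related keys and decode the JSON blobs. The table name, key pattern, and blob shape below are assumptions for illustration, not Cursor's actual schema.

```python
# Hypothetical sketch of pulling chat blobs out of a state.vscdb file.
# The "ItemTable" key-value layout and the "%chat%" key pattern are
# assumptions, not the schema dataclaw actually targets.
import json
import sqlite3
from pathlib import Path


def load_chat_blobs(db_path: Path):
    """Yield (key, decoded_json) pairs for chat-related entries."""
    conn = sqlite3.connect(db_path)
    try:
        rows = conn.execute(
            "SELECT key, value FROM ItemTable WHERE key LIKE ?",
            ("%chat%",),  # assumed key pattern
        )
        for key, value in rows:
            try:
                yield key, json.loads(value)
            except (TypeError, ValueError):
                continue  # skip entries that are not valid JSON
    finally:
        conn.close()
```

A real parser would then walk the decoded structures to extract messages, tool calls, and token counts, as the commit message describes.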
Summary
`_parse_tool_input` previously hardcoded a whitelist of known fields per tool, silently dropping any unrecognized parameters. For example, `Read` only kept `file_path`; if the original input had `offset`, `limit`, or any other field, those were lost. This made the exported data less useful for training and analysis, since the tool call signatures didn't match what was actually sent.

This PR replaces the per-tool whitelist approach with a single recursive anonymizer that preserves all original fields while still applying the correct anonymization:

- Path fields (`file_path`, `targetFile`, `cwd`, `workdir`, etc.) → `anonymizer.path()`
- Command fields (`command`, `cmd`) → `redact_text()` + `anonymizer.text()`
- All other string fields → `anonymizer.text()`

This is a net deletion of ~75 lines and passes all 301 existing tests unchanged (the previously tested fields still appear in output; extra fields are now additionally preserved).
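The field-routing rules above can be sketched as a single recursive walk over the tool input. The helper names (`Anonymizer`, `redact_text`) and key sets below are illustrative assumptions, not the actual dataclaw implementation.

```python
# Minimal sketch of a recursive anonymizer that preserves every key.
# Key sets and helper classes are assumptions for illustration only.
PATH_KEYS = {"file_path", "targetFile", "cwd", "workdir", "path"}
COMMAND_KEYS = {"command", "cmd"}


def redact_text(text: str) -> str:
    # Placeholder for secret redaction (tokens, API keys, etc.).
    return text


class Anonymizer:
    def path(self, p: str) -> str:
        # Keep only the basename, under an anonymized prefix.
        return "/anon/" + p.rsplit("/", 1)[-1]

    def text(self, t: str) -> str:
        return t


def anonymize_tool_input(value, anonymizer, key=None):
    """Recursively anonymize a tool input, keeping all original fields."""
    if isinstance(value, dict):
        return {k: anonymize_tool_input(v, anonymizer, key=k)
                for k, v in value.items()}
    if isinstance(value, list):
        return [anonymize_tool_input(v, anonymizer, key=key) for v in value]
    if isinstance(value, str):
        if key in PATH_KEYS:
            return anonymizer.path(value)
        if key in COMMAND_KEYS:
            return anonymizer.text(redact_text(value))
        return anonymizer.text(value)
    return value  # numbers, booleans, None pass through unchanged
```

The key property is that no branch drops a field: unknown keys fall through to the generic string/text case or are returned as-is, so the exported tool call keeps its original shape.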
What changes for exported data
`input` is still a `dict`, just with more keys when the original had them

Made with Cursor