
Conversation

jtpio
Member

@jtpio jtpio commented Oct 15, 2025

Implement support for discovering commands using a query argument. This reduces the number of commands returned by the tool, and with it the amount of data sent to the LLM as part of the prompt and the overall token usage.

  • Support a query argument
  • Adjust the system prompt to mention the use of this query argument
  • Add UI tests
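The filtering described above can be sketched roughly as follows. This is a minimal illustration, not the actual implementation: the `CommandInfo` shape, the `discoverCommands` name, and the case-insensitive substring matching are all assumptions made for the example.

```typescript
// Hypothetical sketch of query-based command discovery: given the full
// list of command descriptions, return only those matching an optional
// query, so fewer entries are sent back to the LLM in the prompt.
interface CommandInfo {
  id: string;
  label: string;
}

function discoverCommands(
  commands: CommandInfo[],
  query?: string
): CommandInfo[] {
  // Without a query, fall back to returning everything.
  if (!query) {
    return commands;
  }
  // Case-insensitive substring match on the command id and label.
  const q = query.toLowerCase();
  return commands.filter(
    c => c.id.toLowerCase().includes(q) || c.label.toLowerCase().includes(q)
  );
}
```

With a query like `"run"`, only run-related commands are returned instead of the whole registry, which is what keeps the tool output (and the prompt) small.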

@jtpio jtpio added the enhancement New feature or request label Oct 15, 2025
@jtpio jtpio added this to the 0.9.0 milestone Oct 15, 2025
@jtpio jtpio marked this pull request as ready for review October 15, 2025 13:47
@jtpio jtpio mentioned this pull request Oct 17, 2025
@jtpio
Member Author

jtpio commented Oct 17, 2025

This will actually be quite useful for built-in AI, since these models have a maximum number of tokens they can process per session:

https://developer.chrome.com/docs/ai/prompt-api#session_management

@jtpio jtpio requested a review from brichet October 17, 2025 11:52
