Fix(llm-cli): Ensure script exits after displaying help message #13329
The `llm-cli` script did not exit after displaying the help message when run with the `-h` or `--help` flags. This caused the script to continue execution and display an "Invalid model_family" error.

This commit adds an `exit 0` command after the `display_help` function is called, ensuring that the script exits gracefully after printing the help message.
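For context, a minimal sketch of the kind of argument-parsing flow this fixes; only `display_help`, `model_family`, and the "Invalid model_family" message come from this PR, while the flag names and the rest of the structure are illustrative assumptions, not the actual script:

```bash
#!/bin/bash
# Sketch only: display_help, model_family and the "Invalid model_family"
# error come from the PR description; the rest of the structure is assumed.

display_help() {
  echo "Usage: llm-cli [-h|--help] -x <model_family> ..."
}

model_family=""

while [[ $# -gt 0 ]]; do
  case "$1" in
    -h|--help)
      display_help
      exit 0        # the fix: exit after printing help instead of falling through
      ;;
    -x)
      model_family="$2"
      shift 2
      ;;
    *)
      shift
      ;;
  esac
done

# Without the `exit 0` above, `llm-cli -h` reached this validation with an
# empty model_family and printed the error.
if [[ -z "$model_family" ]]; then
  echo "Invalid model_family: $model_family" >&2
  exit 1
fi
```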
A new test case has been added to `python/llm/test/cli/test_cli.py` to verify that the fix works correctly and to prevent future regressions.
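A sketch of what such a regression test could look like (the actual test in `test_cli.py` is not shown here; the test name, the use of `subprocess`, and the `-h` invocation are illustrative assumptions):

```python
# Illustrative sketch of a regression test in the spirit of the one added to
# python/llm/test/cli/test_cli.py; names and invocation details are assumed.
import subprocess


def test_help_flag_exits_cleanly():
    """`llm-cli -h` should print usage and exit 0 without the model_family error."""
    result = subprocess.run(
        ["llm-cli", "-h"],
        capture_output=True,
        text=True,
    )
    assert result.returncode == 0
    assert "Invalid model_family" not in result.stdout + result.stderr
```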