Add max token limit support#408

Merged
s-kostyaev merged 2 commits into main from fix-issue-342-max-tokens
Apr 29, 2026
Conversation

@s-kostyaev
Owner

Fixes #342.

Adds an Ellama-level max token setting, passes it through llm-make-chat-prompt, exposes it in the model transient, and documents/tests the behavior.

Checks run:

  • make build
  • make test-detailed
  • make test
  • make checkdocs
  • make check-compile-warnings
  • make refill-readme
  • make manual

Added an ellama-max-tokens user option and wired it into chat prompt creation so generated responses can be capped via llm-make-chat-prompt. The setting is exposed in the model transient menu, documented in the README, covered by tests for both the prompt and transient behavior, and the manual has been regenerated.
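As a rough illustration of the wiring described above (not the actual patch), the option could look roughly like this; `ellama-max-tokens` is the name used in this PR, `llm-make-chat-prompt` is the real llm-library constructor, and the exact keyword plumbing here is an assumption for illustration:

```elisp
;; Sketch only, assuming `llm-make-chat-prompt' accepts a
;; max-tokens keyword argument for capping response length.
(defcustom ellama-max-tokens nil
  "Maximum number of tokens the model may generate, or nil for no limit."
  :group 'ellama
  :type '(choice integer (const :tag "No limit" nil)))

(defun my-ellama-make-prompt (text)
  "Build a chat prompt from TEXT, applying `ellama-max-tokens' when set.
Hypothetical helper, not part of the merged change."
  (llm-make-chat-prompt text :max-tokens ellama-max-tokens))
```

With `ellama-max-tokens` left at nil the cap is simply not applied, which matches the opt-in behavior the PR describes.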
@s-kostyaev s-kostyaev merged commit 03e1029 into main Apr 29, 2026
14 checks passed
@s-kostyaev s-kostyaev deleted the fix-issue-342-max-tokens branch April 29, 2026 20:26


Development

Successfully merging this pull request may close these issues.

Feature request: limit number of output tokens

1 participant