
Conversation


@Butanium Butanium commented Jan 7, 2026

Summary

  • Support passing token IDs as list of ints or list of lists of ints
  • Support passing HuggingFace tokenizer output dicts (input_ids + attention_mask)
  • Filter padding tokens using attention_mask when provided
  • Fix deprecated vllm.transformers_utils.tokenizer import
  • Set pad_token to eos_token if not set (mirroring LanguageModel behavior)
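
The input shapes above can be collapsed into one normalization step. The helper below is a hypothetical sketch of that behavior (plain lists of ints, nested lists, and HF tokenizer output dicts, with padding filtered via `attention_mask`), not the PR's actual implementation:

```python
def normalize_token_input(tokens):
    """Normalize supported token inputs into a batch of token-ID lists.

    Accepts a list of ints, a list of lists of ints, or an HF-style
    dict with "input_ids" and an optional "attention_mask".
    (Illustrative sketch only; names and shapes are assumptions.)
    """
    if isinstance(tokens, dict):
        input_ids = tokens["input_ids"]
        mask = tokens.get("attention_mask")
        # HF tokenizers may return tensors; convert to nested lists.
        if hasattr(input_ids, "tolist"):
            input_ids = input_ids.tolist()
        if mask is not None and hasattr(mask, "tolist"):
            mask = mask.tolist()
        # Single prompt: wrap into a batch of one.
        if input_ids and isinstance(input_ids[0], int):
            input_ids = [input_ids]
            mask = [mask] if mask is not None else None
        if mask is not None:
            # Drop positions the attention mask marks as padding.
            return [
                [tok for tok, keep in zip(seq, m) if keep]
                for seq, m in zip(input_ids, mask)
            ]
        return input_ids
    if tokens and isinstance(tokens[0], int):
        return [tokens]  # single prompt -> batch of one
    return [list(seq) for seq in tokens]
```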

Test plan

  • test_single_token_list - single list of token IDs
  • test_batched_token_lists - multiple lists of token IDs
  • test_hf_tokenizer_dict_single - HF tokenizer output for single prompt
  • test_hf_tokenizer_dict_batched - HF tokenizer output for batched prompts
  • test_hf_tokenizer_with_padding_mask - padding tokens filtered correctly
  • test_token_list_in_invoker - token list within invoker
  • test_mixed_string_and_token_invokers - mixing string and token inputs
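
As an illustration of the behavior exercised by `test_hf_tokenizer_with_padding_mask`, the sketch below filters a right-padded batch using its attention mask. The `filter_padding` helper is hypothetical, not the PR's actual test code:

```python
def filter_padding(input_ids, attention_mask):
    """Keep only the token positions where the attention mask is 1."""
    return [
        [tok for tok, keep in zip(seq, mask) if keep]
        for seq, mask in zip(input_ids, attention_mask)
    ]

# Right-padded batch: the second prompt is shorter and padded with 0s.
ids = [[5, 6, 7], [8, 9, 0]]
mask = [[1, 1, 1], [1, 1, 0]]
assert filter_padding(ids, mask) == [[5, 6, 7], [8, 9]]
```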

Note: test_invoker_group_batching fails on upstream/main (pre-existing bug, unrelated to this PR)

🤖 Generated with Claude Code

Butanium and others added 2 commits January 7, 2026 23:18
- Support passing token IDs as list of ints or list of lists of ints
- Support passing HuggingFace tokenizer output dicts (input_ids + attention_mask)
- Filter padding tokens using attention_mask when provided
- Fix deprecated vllm.transformers_utils.tokenizer import
- Set pad_token to eos_token if not set
- Add tests for token input compatibility

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Collaborator Author


Fixes deprecated imports


@Butanium Butanium left a comment


Tests were Claude-generated; all the code is mine.

@JadenFiotto-Kaufman
Member

@Butanium Amazing, thank you!

Right, our vLLM doesn't work with empty invokes as of now: #590

@JadenFiotto-Kaufman JadenFiotto-Kaufman merged commit ed49406 into ndif-team:main Jan 8, 2026
1 check passed

Butanium commented Jan 8, 2026

@JadenFiotto-Kaufman do you mean this is a bug introduced by this PR?

@JadenFiotto-Kaufman
Member

> @JadenFiotto-Kaufman do you mean this is a bug introduced by this PR?

No, I never got it working. It's on the list.

