
Relax version upper bounds on huggingface-hub and transformers#472

Merged
JoelNiklaus merged 6 commits into huggingface:main from JoelNiklaus:relax/dependency-version-bounds
Mar 11, 2026

Conversation

Contributor

@JoelNiklaus commented Mar 11, 2026

Problem

pyproject.toml had upper-bound constraints that prevented installing newer versions of key dependencies:

  • huggingface-hub>=0.34.0,<1.0 — blocked huggingface-hub 1.x
  • transformers>=4.57 — pinned to v4 only
  • datatrove[decont] was commented out in testing because lighteval had very restrictive vllm requirements (vllm>=0.10.0,<0.10.2)

These constraints were originally added because transformers v4 required huggingface-hub<1.0, and vllm did not yet support transformers v5. However, datatrove itself does not depend on these upper bounds, and they unnecessarily restrict downstream users.

Solution

  • huggingface-hub>=0.34.0 — remove the <1.0 cap
  • transformers — remove the version pin entirely
  • Re-enable datatrove[decont] in the testing extras, since lighteval has relaxed its vllm dependency
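Taken together, the relevant pyproject.toml fragments would look roughly like the following. This is an illustrative sketch, not the exact file contents; the surrounding keys and extras layout are assumptions, while the constraint strings come from the description above:

```toml
# Sketch of the relaxed constraints (illustrative, not the literal file).
[project]
dependencies = [
    "huggingface-hub>=0.34.0",  # previously "huggingface-hub>=0.34.0,<1.0"
    "transformers",             # version pin removed entirely
]

[project.optional-dependencies]
inference = [
    "vllm>=0.10.0",  # lower bound keeps the resolver off legacy CUDA-build-only releases
]
testing = [
    "datatrove[decont]",  # re-enabled now that lighteval relaxed its vllm requirement
]
```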

This lets the dependency resolver pick the best compatible versions based on the user's environment. If vllm is installed alongside, its own constraint (transformers<5,>=4.56.0 as of vllm 0.17.1) will still enforce the appropriate transformers version — datatrove does not need to duplicate that restriction.

Note

vllm 0.17.1 still constrains transformers<5. Users installing datatrove[inference] will get transformers 4.x until vllm releases a version with transformers 5 support. Users who don't need vllm are now free to use transformers 5.x and huggingface-hub 1.x.
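The resolver behaviour described above can be sanity-checked with the `packaging` library. A hedged sketch, where the vllm 0.17.1 constraint string is the one quoted in this description:

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# vllm 0.17.1's transformers constraint, as quoted above.
vllm_transformers = SpecifierSet(">=4.56.0,<5")

# With vllm installed, the resolver must stay on the transformers 4.x series.
assert Version("4.57.0") in vllm_transformers
assert Version("5.0.0") not in vllm_transformers

# Without vllm, datatrove itself now imposes no upper bound on huggingface-hub,
# so 1.x is acceptable to the resolver.
datatrove_hub = SpecifierSet(">=0.34.0")
assert Version("1.0.0") in datatrove_hub
print("constraints behave as described")
```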

Testing

  • Verified that uv pip install -e ".[dev]" resolves successfully
  • Confirmed vllm 0.17.1 correctly constrains transformers to 4.x at install time
  • Confirmed that without vllm, transformers 5.x and huggingface-hub 1.x install cleanly

Made with Cursor


Note

Low risk: this only adjusts dependency constraints and test extras, with no runtime logic changes. The main risk is CI/install variability if newer dependency versions introduce incompatibilities.

Overview
Relaxes dependency constraints in pyproject.toml by removing the <1.0 upper bound on huggingface-hub and dropping the transformers version pin so resolvers can select newer compatible releases.

Re-enables datatrove[decont] in the testing extra (previously excluded), expanding the default test dependency set.

Written by Cursor Bugbot for commit 245bed5.

Joel Niklaus added 6 commits March 11, 2026 14:46

Remove unnecessary upper-bound constraints from pyproject.toml so
downstream users can install newer versions. vllm's own constraints
still enforce transformers<5 when installed alongside.

Also re-enable datatrove[decont] in testing extras now that lighteval
has relaxed its vllm dependency.

Made-with: Cursor

Set a lower bound on vllm in inference extras so dependency
resolution no longer backtracks to vllm 0.2.5, which attempts a
CUDA source build on CI runners.

Made-with: Cursor

Set vllm>=0.17.1 so dependency resolution cannot select legacy
vllm==0.2.5 (which triggers a CUDA source build and fails in
CPU-only CI).

Made-with: Cursor

Lower the vllm minimum version from 0.17.1 to 0.10.0 in inference
dependencies while still preventing resolver backtracking to legacy
CUDA-build-only releases.

Made-with: Cursor

Set huggingface-hub>=1.0 and adjust disk writer retry tests to
construct HfHubHTTPError with a real httpx Response, which is
required by huggingface-hub 1.x.

Made-with: Cursor
@JoelNiklaus JoelNiklaus merged commit da1713e into huggingface:main Mar 11, 2026
4 checks passed
