Update Python dependencies to RHOAI index 3.4#176

Draft
syedriko wants to merge 1 commit into lightspeed-core:main from syedriko:syedriko-rhoai-3.4

Conversation

@syedriko
Collaborator

@syedriko syedriko commented Apr 22, 2026

Description

Update Python dependencies to RHOAI index 3.4

Type of change

  • Refactor
  • New feature
  • Bug fix
  • CVE fix
  • Optimization
  • Documentation Update
  • Configuration Update
  • Bump-up service version
  • Bump-up dependent library
  • Bump-up library or tool used for development (does not change the final image)
  • CI configuration change
  • Konflux configuration change
  • Unit tests improvement
  • Integration tests improvement
  • End to end tests improvement

Tools used to create PR

Identify any AI code assistants used in this PR (for transparency and review context)

  • Assisted-by: (e.g., Claude, CodeRabbit, Ollama, etc., N/A if not used)
  • Generated by: (e.g., tool name and version; N/A if not used)

Related Tickets & Documents

  • Related Issue #
  • Closes #

Checklist before requesting a review

  • I have performed a self-review of my code.
  • PR has passed all pre-merge test jobs.
  • If it is a core feature, I have added thorough tests.

Testing

  • Please provide detailed steps to perform tests related to this code change.
  • How were the fix/results from this change verified? Please provide relevant screenshots or results.

Summary by CodeRabbit

Release Notes

  • Chores
    • Updated build infrastructure to latest base image and repository indices
    • Refreshed Python dependency versions for improved compatibility and stability
    • Removed upper version constraint on docling library to support newer releases
    • Simplified build pipeline configuration for enhanced maintainability

@syedriko syedriko marked this pull request as draft April 22, 2026 00:03
@coderabbitai

coderabbitai Bot commented Apr 22, 2026

Important

Review skipped

Draft detected.

Please check the settings in the CodeRabbit UI or the .coderabbit.yaml file in this repository. To trigger a single review, invoke the @coderabbitai review command.

⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 78f0e325-05e9-4222-8f35-4dbf593580b9

You can disable this status message by setting the reviews.review_status to false in the CodeRabbit configuration file.


Walkthrough

This PR updates project infrastructure and dependencies from RHOAI version 3.3 to 3.4, including updated Tekton pipeline pip package lists, removed version constraints for docling dependency, simplified build scripts by eliminating arch-specific wheel injection logic, and regenerated dependency lock files reflecting new package versions.

Changes

Cohort / File(s) Summary
Tekton Pipeline Configuration
.tekton/rag-tool-pull-request.yaml, .tekton/rag-tool-push.yaml
Updated pip.binary.packages lists with new dependencies (e.g., accelerate, anyio, aiosqlite, llama-stack, mcp, opentelemetry-*, pgvector, semchunk, starlette) and removed others, reflecting updated pipeline requirements.
Base Image and Index References
README.md, build-args-konflux.conf
Updated RHOAI index references from version 3.3 to 3.4 for CUDA wheels and base image configuration.
Dependency Constraints
pyproject.toml, requirements.overrides.txt
Removed upper version bound for docling dependency in pyproject.toml; updated multiple package version pins in requirements.overrides.txt to versions from RHOAI 3.4 index.
Generated Build Dependencies
requirements-build.txt
Pruned build-related dependencies by removing versioning/templating tools (dunamai, jinja2, uv-build, etc.) and updated transitive dependency provenance.
Dependency Lock Files
requirements.hashes.source.txt, requirements.hashes.wheel.pypi.txt, requirements.hashes.wheel.txt
Substantially updated pinned package versions and SHA256 hashes across all three lock files, reflecting new dependency universe from RHOAI 3.4 (e.g., removed torch==2.9.0, tokenizers==0.22.1, rapidocr==3.8.0; added psycopg2-binary==2.9.12 and updated many others).
Build Script Updates
scripts/konflux_requirements.sh
Updated RHOAI index URL to 3.4, removed Tekton YAML file targets for wheel injection, and eliminated post-processing logic for arch-specific wheel pins (aiohttp, markupsafe, torch, torchvision, triton).

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~50 minutes

Possibly related issues

  • #1381: This PR modifies Tekton pipeline prefetch pip.binary.packages lists in both pull-request and push YAML files, directly addressing the referenced issue's scope.
🚥 Pre-merge checks | ✅ 5
✅ Passed checks (5 passed)
Check name Status Explanation
Description Check ✅ Passed Check skipped - CodeRabbit’s high-level summary is enabled.
Title check ✅ Passed The title 'Update Python dependencies to RHOAI index 3.4' directly and clearly describes the main change across all modified files—updating dependency versions and configurations to align with RHOAI index version 3.4.
Docstring Coverage ✅ Passed No functions found in the changed files to evaluate docstring coverage. Skipping docstring coverage check.
Linked Issues check ✅ Passed Check skipped because no linked issues were found for this pull request.
Out of Scope Changes check ✅ Passed Check skipped because no linked issues were found for this pull request.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.



Comment @coderabbitai help to get the list of available commands and usage tips.


@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 3

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
scripts/konflux_requirements.sh (1)

118-128: ⚠️ Potential issue | 🟠 Major

Keep per-arch hash files generated or stop consuming them.

The script still leaves requirements.hashes.wheel.cpu.x86_64.txt and requirements.hashes.wheel.cpu.aarch64.txt referenced by both Tekton configs, but this flow no longer writes them. Regenerating requirements can therefore leave stale arch-specific pins in files that hermetic prefetch still consumes. Either regenerate harmless placeholders or remove those files from both requirements_files lists.

🛠️ Minimal placeholder-generation fix
 if grep -qE '^(torch|torchvision|triton)==' "$WHEEL_HASH_FILE"; then
 	awk '
 /^torch==|^torchvision==|^triton==/ { skip=1; next }
 skip && /^[ \t]/ { next }
 skip && /^[a-zA-Z0-9]/ { skip=0 }
 { print }
 ' "$WHEEL_HASH_FILE" > "${WHEEL_HASH_FILE}.strip" && mv "${WHEEL_HASH_FILE}.strip" "$WHEEL_HASH_FILE"
 fi
+for _arch_hash_file in "$WHEEL_HASH_CPU_X86" "$WHEEL_HASH_CPU_AARCH"; do
+	printf '%s\n' "# No architecture-specific wheel pins; generated by konflux_requirements.sh." > "$_arch_hash_file"
+done
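For reference, the awk strip step quoted above can be sketched in Python. This is illustrative only (the repository's actual tooling is the shell script shown in the diff); it drops the named pins together with their indented hash-continuation lines, mirroring the awk state machine:

```python
# Illustrative re-implementation of the awk filter: remove torch/torchvision/
# triton pin lines and any indented continuation (--hash) lines that follow.
def strip_arch_pins(text: str, names=("torch", "torchvision", "triton")) -> str:
    out, skipping = [], False
    for line in text.splitlines():
        if any(line.startswith(f"{n}==") for n in names):
            skipping = True  # start of a pin we want to drop
            continue
        if skipping and line[:1] in (" ", "\t"):
            continue  # indented hash line belonging to the dropped pin
        skipping = False
        out.append(line)
    return "\n".join(out)

sample = "numpy==2.1.0\ntorch==2.10.0 \\\n    --hash=sha256:abc\npandas==2.2.0"
print(strip_arch_pins(sample))  # numpy==2.1.0 and pandas==2.2.0 survive
```

Like the awk version, this keeps every other pin (and its hashes) untouched, which is why generating placeholder per-arch files afterwards is safe.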
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@scripts/konflux_requirements.sh` around lines 118 - 128, The script modifies
WHEEL_HASH_FILE but no longer generates per-arch files
(requirements.hashes.wheel.cpu.x86_64.txt and
requirements.hashes.wheel.cpu.aarch64.txt) that Tekton recipes still reference;
either ensure placeholder per-arch files are regenerated after the strip step
(create harmless empty or header-only files named
requirements.hashes.wheel.cpu.x86_64.txt and
requirements.hashes.wheel.cpu.aarch64.txt when WHEEL_HASH_FILE is mutated), or
remove those filenames from the Tekton requirements_files lists so they are not
consumed; update scripts/konflux_requirements.sh to write the placeholders
(using the WHEEL_HASH_FILE basename logic) or update the Tekton configs'
requirements_files entries to stop referencing the arch-specific files, and make
sure references to WHEEL_HASH_FILE and the two arch-specific filenames are
consistent across both Tekton configs.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In @.tekton/rag-tool-push.yaml:
- Line 64: The binary.packages allowlist is missing "llama-stack-api" while
"llama-stack" and "llama-stack-client" are present and requirements include
llama-stack-api==0.5.0; update the "packages" comma-separated string in the
binary.packages entry (the same list shown in the diff) to include
"llama-stack-api" and apply the same change in the peer pipeline config that has
an identical allowlist (the other .tekton rag-tool file), ensuring the package
name is added exactly as "llama-stack-api".

In `@requirements-build.txt`:
- Around line 23-33: The build requirements file contains conflicting duplicate
pins (e.g., hatchling and setuptools) produced by pip-compile; fix this by
resolving the source dependency conflicts in your
requirements.source.txt.build—identify where hatchling and setuptools are
required with different versions (direct or transitive), choose a single
compatible version (or add an explicit constraint/compatibility marker) in
requirements.source.txt.build, then re-run pip-compile to regenerate
requirements-build.txt so it contains a single pinned entry per package.

In `@requirements.overrides.txt`:
- Around line 21-22: The requirements override pins for torch and torchvision
violate pyproject.toml constraints: change the pinned versions in
requirements.overrides (currently "torch==2.10.0" and "torchvision==0.25.0") to
versions that respect the upper bounds specified in pyproject.toml (e.g.,
upgrade to allowed ranges such as "torch>=2.9.0,<2.10.0" and
"torchvision>=0.24.0,<0.25.0") or alternatively update pyproject.toml to
explicitly allow the newer exact versions; ensure consistency between the
override entries for "torch" and "torchvision" and the version ranges declared
in pyproject.toml.

---

Outside diff comments:
In `@scripts/konflux_requirements.sh`:
- Around line 118-128: The script modifies WHEEL_HASH_FILE but no longer
generates per-arch files (requirements.hashes.wheel.cpu.x86_64.txt and
requirements.hashes.wheel.cpu.aarch64.txt) that Tekton recipes still reference;
either ensure placeholder per-arch files are regenerated after the strip step
(create harmless empty or header-only files named
requirements.hashes.wheel.cpu.x86_64.txt and
requirements.hashes.wheel.cpu.aarch64.txt when WHEEL_HASH_FILE is mutated), or
remove those filenames from the Tekton requirements_files lists so they are not
consumed; update scripts/konflux_requirements.sh to write the placeholders
(using the WHEEL_HASH_FILE basename logic) or update the Tekton configs'
requirements_files entries to stop referencing the arch-specific files, and make
sure references to WHEEL_HASH_FILE and the two arch-specific filenames are
consistent across both Tekton configs.
🪄 Autofix (Beta)

Fix all unresolved CodeRabbit comments on this PR:

  • Push a commit to this branch (recommended)
  • Create a new PR with the fixes

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: ac2f76b9-682d-4dfd-a7cf-6bd2aa5eca09

📥 Commits

Reviewing files that changed from the base of the PR and between 6946c22 and 55d8a73.

⛔ Files ignored due to path filters (1)
  • uv.lock is excluded by !**/*.lock
📒 Files selected for processing (11)
  • .tekton/rag-tool-pull-request.yaml
  • .tekton/rag-tool-push.yaml
  • README.md
  • build-args-konflux.conf
  • pyproject.toml
  • requirements-build.txt
  • requirements.hashes.source.txt
  • requirements.hashes.wheel.pypi.txt
  • requirements.hashes.wheel.txt
  • requirements.overrides.txt
  • scripts/konflux_requirements.sh

"requirements_build_files": ["requirements-build.txt"],
"binary": {
"packages": "aiohappyeyeballs,aiohttp,aiosignal,annotated-doc,annotated-types,antlr4-python3-runtime,asyncpg,beautifulsoup4,cffi,colorama,cryptography,dataclasses-json,defusedxml,deprecated,dill,distro,docling-parse,einops,et-xmlfile,faiss-cpu,filetype,fire,frozenlist,greenlet,h11,hf-xet,httpcore,httpx,httpx-sse,idna,importlib-metadata,jinja2,joblib,jsonlines,jsonref,jsonschema,jsonschema-specifications,lxml,markdown-it-py,marko,markupsafe,marshmallow,mdurl,mpire,mpmath,multidict,mypy-extensions,nest-asyncio,networkx,numpy,omegaconf,openpyxl,packaging,pandas,pillow,pluggy,prompt-toolkit,propcache,psycopg2-binary,pycparser,pydantic,pydantic-core,pylatexenc,pypdfium2,python-dateutil,python-docx,python-pptx,pyyaml,referencing,rpds-py,rtree,safetensors,scikit-learn,scipy,setuptools,shapely,shellingham,six,sniffio,sympy,termcolor,threadpoolctl,tiktoken,tomlkit,torchvision,tqdm,transformers,tree-sitter,tree-sitter-c,tree-sitter-javascript,tree-sitter-python,tree-sitter-typescript,triton,typing-extensions,typing-inspect,typing-inspection,urllib3,xlsxwriter,zipp,uv-build,uv,pip,maturin,jiter,opencv-python,rapidocr,tokenizers,torch",
"packages": "accelerate,aiohappyeyeballs,aiohttp,aiosignal,aiosqlite,annotated-doc,annotated-types,antlr4-python3-runtime,anyio,asyncpg,attrs,banks,beautifulsoup4,certifi,cffi,chardet,charset-normalizer,circuitbreaker,click,colorama,colorlog,cryptography,dataclasses-json,defusedxml,deprecated,dill,dirtyjson,distro,docling-ibm-models,docling-parse,einops,et-xmlfile,faiss-cpu,filetype,fire,frozenlist,fsspec,googleapis-common-protos,greenlet,griffe,griffecli,griffelib,h11,hf-xet,httpcore,httpx,httpx-sse,importlib-metadata,jinja2,jiter,joblib,jsonlines,jsonref,jsonschema,jsonschema-specifications,latex2mathml,llama-index-embeddings-openai,llama-index-instrumentation,llama-index-llms-openai,llama-index-workflows,llama-stack,llama-stack-client,markdown-it-py,marko,markupsafe,marshmallow,mcp,mdurl,mpire,mpmath,multidict,mypy-extensions,nest-asyncio,networkx,nltk,numpy,omegaconf,opencv-python,openpyxl,opentelemetry-api,opentelemetry-exporter-otlp-proto-common,opentelemetry-exporter-otlp-proto-http,opentelemetry-proto,opentelemetry-sdk,opentelemetry-semantic-conventions,oracledb,packaging,pandas,pgvector,pillow,platformdirs,pluggy,polyfactory,prompt-toolkit,propcache,protobuf,psutil,pyaml,pycparser,pygments,pyjwt,pylatexenc,pyopenssl,pypdf,pypdfium2,python-dateutil,python-docx,python-dotenv,python-multipart,python-pptx,pytz,pyyaml,referencing,regex,requests,rich,rpds-py,rtree,safetensors,scikit-learn,scipy,semchunk,setuptools,shapely,shellingham,six,sniffio,soupsieve,sqlalchemy,sse-starlette,starlette,striprtf,sympy,tabulate,tenacity,termcolor,threadpoolctl,tiktoken,tinytag,tokenizers,tomlkit,torch,torchvision,tornado,tqdm,transformers,tree-sitter,tree-sitter-c,tree-sitter-javascript,tree-sitter-python,tree-sitter-typescript,triton,typer,typing-extensions,typing-inspect,typing-inspection,tzdata,urllib3,wcwidth,websockets,wrapt,xlsxwriter,yarl,zipp,uv-build,uv,pip,maturin,psycopg2-binary,rapidocr",

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

# First, locate and examine the two YAML files
find . -name "rag-tool-push.yaml" -o -name "rag-tool-pull-request.yaml" | head -20

Repository: lightspeed-core/rag-content

Length of output: 136


🏁 Script executed:

# Search for wheel hash requirements files
find . -name "*requirements*" -o -name "*wheel*" | grep -E "\.(txt|lock)$" | head -20

Repository: lightspeed-core/rag-content

Length of output: 626


🏁 Script executed:

# Check if llama-stack-api appears in the YAML files
rg "llama-stack-api" .tekton/ 2>/dev/null || echo "No matches found"

Repository: lightspeed-core/rag-content

Length of output: 87


🏁 Script executed:

# Get the full content of rag-tool-push.yaml to verify line 64
cat .tekton/rag-tool-push.yaml | head -80

Repository: lightspeed-core/rag-content

Length of output: 4671


🏁 Script executed:

# Check all requirements files for llama-stack-api
rg "llama-stack-api" ./requirements*.txt 2>/dev/null || echo "No matches in requirements files"

Repository: lightspeed-core/rag-content

Length of output: 397


🏁 Script executed:

# Check the pull-request YAML for comparison
cat .tekton/rag-tool-pull-request.yaml | grep -A 5 "llama-stack"

Repository: lightspeed-core/rag-content

Length of output: 2128


🏁 Script executed:

# Extract the packages line from pull-request YAML to see if it has llama-stack-api
cat .tekton/rag-tool-pull-request.yaml | sed -n '50,80p'

Repository: lightspeed-core/rag-content

Length of output: 2835


🏁 Script executed:

# Extract and compare the packages lists from both YAML files
echo "=== PUSH YAML packages ===" && \
cat .tekton/rag-tool-push.yaml | sed -n '64p' | grep -o '"packages": "[^"]*"' | cut -d'"' -f4 | tr ',' '\n' | grep llama && \
echo "=== PULL-REQUEST YAML packages ===" && \
cat .tekton/rag-tool-pull-request.yaml | grep -o '"packages": "[^"]*"' | cut -d'"' -f4 | tr ',' '\n' | grep llama

Repository: lightspeed-core/rag-content

Length of output: 402


🏁 Script executed:

# Verify that llama-stack-api is in source requirements files
grep -l "llama-stack-api" ./requirements.hashes.source*.txt

Repository: lightspeed-core/rag-content

Length of output: 141


Add llama-stack-api to the binary packages allowlist.

Both .tekton/rag-tool-push.yaml and .tekton/rag-tool-pull-request.yaml use identical package allowlists that include llama-stack and llama-stack-client but omit llama-stack-api. However, llama-stack-api==0.5.0 appears in requirements.hashes.source.txt and requirements.hashes.source.cuda.txt. Add it to the binary.packages list to maintain consistency with requirements files across both pipeline configurations.
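The kind of drift flagged here can be caught mechanically. A minimal sketch, assuming the hash files use standard `name==version` pins and the allowlist is the comma-separated string from the Tekton config (the helper names are illustrative, not project tooling):

```python
import re

def pinned_packages(requirements_text: str) -> set[str]:
    """Collect package names from 'name==version' pins, skipping hash lines."""
    names = set()
    for line in requirements_text.splitlines():
        m = re.match(r"^([A-Za-z0-9_.-]+)==", line.strip())
        if m:
            names.add(m.group(1).lower())
    return names

def missing_from_allowlist(requirements_text: str, packages_csv: str) -> set[str]:
    """Pins present in requirements but absent from the binary.packages list."""
    allow = {p.strip().lower() for p in packages_csv.split(",") if p.strip()}
    return pinned_packages(requirements_text) - allow

reqs = "llama-stack==0.5.0\nllama-stack-client==0.5.0\nllama-stack-api==0.5.0"
allowlist = "llama-stack,llama-stack-client,numpy"
print(missing_from_allowlist(reqs, allowlist))  # → {'llama-stack-api'}
```

Run against the real hash files and both Tekton allowlists, an empty result would confirm the two pipeline configs stay consistent with the lock files.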

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.tekton/rag-tool-push.yaml at line 64, The binary.packages allowlist is
missing "llama-stack-api" while "llama-stack" and "llama-stack-client" are
present and requirements include llama-stack-api==0.5.0; update the "packages"
comma-separated string in the binary.packages entry (the same list shown in the
diff) to include "llama-stack-api" and apply the same change in the peer
pipeline config that has an identical allowlist (the other .tekton rag-tool
file), ensuring the package name is added exactly as "llama-stack-api".

Comment thread requirements-build.txt
Comment on lines 23 to 33
hatch-vcs==0.5.0
# via
# attrs
# chardet
# filelock
# fsspec
# platformdirs
# via filelock
hatchling==1.26.3
# via
# hatch-fancy-pypi-readme
# llama-stack-client
# openai
hatchling==1.29.0
# via
# attrs
# banks
# chardet
# expandvars
# filelock
# fsspec
# griffe
# griffecli
# griffelib
# hatch-fancy-pypi-readme
# hatch-vcs

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Detect duplicate pinned package names in requirements-build.txt.
set -euo pipefail

python - <<'PY'
import re
from collections import defaultdict

fname = "requirements-build.txt"
pat = re.compile(r'^([A-Za-z0-9_.-]+)==([^\s]+)')
seen = defaultdict(list)

with open(fname, encoding="utf-8") as f:
    for i, line in enumerate(f, 1):
        m = pat.match(line.strip())
        if m:
            seen[m.group(1).lower()].append((i, m.group(2)))

dups = {k:v for k,v in seen.items() if len(v) > 1}
if not dups:
    print("No duplicate pinned package names found.")
else:
    print("Duplicate pinned package names:")
    for pkg, rows in sorted(dups.items()):
        print(f"- {pkg}: " + ", ".join([f'line {ln} -> {ver}' for ln, ver in rows]))
PY

Repository: lightspeed-core/rag-content

Length of output: 203


🏁 Script executed:

cat -n requirements-build.txt | head -90

Repository: lightspeed-core/rag-content

Length of output: 2561


Fix conflicting duplicate package pins in auto-generated build requirements.

This auto-generated file contains multiple unpinned versions of the same packages (hatchling at lines 25 and 29, setuptools at lines 79 and 81). pip-compile should not produce this output. The duplicates indicate unresolved dependency conflicts in your source specification. When pip installs this file, it uses only the last occurrence of each package, silently dropping the others, making the resolution non-deterministic and hiding the underlying constraint conflict.

Fix by resolving the conflicting requirements in your source dependency file (requirements.source.txt.build).

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@requirements-build.txt` around lines 23 - 33, The build requirements file
contains conflicting duplicate pins (e.g., hatchling and setuptools) produced by
pip-compile; fix this by resolving the source dependency conflicts in your
requirements.source.txt.build—identify where hatchling and setuptools are
required with different versions (direct or transitive), choose a single
compatible version (or add an explicit constraint/compatibility marker) in
requirements.source.txt.build, then re-run pip-compile to regenerate
requirements-build.txt so it contains a single pinned entry per package.

Comment on lines +21 to +22
torch==2.10.0
torchvision==0.25.0

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Verify torch/torchvision constraints in pyproject.toml and overrides side-by-side.
set -euo pipefail

echo "== pyproject constraints =="
rg -n '^\s*"torch(vision)?' pyproject.toml -n -C1

echo
echo "== requirements.overrides pins =="
rg -n '^(torch|torchvision)=='

Repository: lightspeed-core/rag-content

Length of output: 360


Torch and torchvision override pins violate project constraints in pyproject.toml.
Lines 21-22 pin torch==2.10.0 and torchvision==0.25.0, but pyproject.toml constrains these to torch>=2.9.0,<2.10.0 and torchvision>=0.24.0,<0.25.0. Update the overrides to respect the upper bound constraints or update pyproject.toml to align with the intended versions.
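The conflict is easy to see numerically. A minimal sketch using plain tuple comparison rather than the `packaging` library (an assumption; real specifiers handle pre-releases and epochs that dotted integers do not):

```python
# Check an exact pin against a half-open range like '>=LOWER,<UPPER'.
def parse(version: str) -> tuple[int, ...]:
    return tuple(int(part) for part in version.split("."))

def within(version: str, lower: str, upper: str) -> bool:
    """True if lower <= version < upper, the shape of the pyproject bounds."""
    return parse(lower) <= parse(version) < parse(upper)

# Pins and bounds as quoted in the review comment:
print(within("2.10.0", "2.9.0", "2.10.0"))   # torch pin → False (violates <2.10.0)
print(within("0.25.0", "0.24.0", "0.25.0"))  # torchvision pin → False
print(within("2.9.1", "2.9.0", "2.10.0"))    # a compliant pin → True
```

Both pinned versions sit exactly on their excluded upper bounds, so either the overrides or the pyproject.toml ranges must move.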

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@requirements.overrides.txt` around lines 21 - 22, The requirements override
pins for torch and torchvision violate pyproject.toml constraints: change the
pinned versions in requirements.overrides (currently "torch==2.10.0" and
"torchvision==0.25.0") to versions that respect the upper bounds specified in
pyproject.toml (e.g., upgrade to allowed ranges such as "torch>=2.9.0,<2.10.0"
and "torchvision>=0.24.0,<0.25.0") or alternatively update pyproject.toml to
explicitly allow the newer exact versions; ensure consistency between the
override entries for "torch" and "torchvision" and the version ranges declared
in pyproject.toml.

@syedriko syedriko force-pushed the syedriko-rhoai-3.4 branch 2 times, most recently from 7ea3892 to 55d6924 Compare April 22, 2026 03:55
@syedriko syedriko force-pushed the syedriko-rhoai-3.4 branch from 55d6924 to 262793b Compare April 22, 2026 04:50