
fix(examples): add missing runtime dependencies for cloud-edge LLM example#324

Open
iashutoshyadav wants to merge 1 commit into kubeedge:main from iashutoshyadav:fix/llm-example-missing-deps

Conversation

@iashutoshyadav

What does this PR do?

Fixes missing runtime dependencies in the
examples/cloud-edge-collaborative-inference-for-llm example that cause
sequential ModuleNotFoundError failures in a fresh environment.

Issues observed

Running the example fails due to missing dependencies:

  • prettytable – result visualization
  • onnx – required by MultiedgeInference
  • retry – required by API-based LLM module
  • colorlog, pyyaml – required by ianvs runtime
  • Rust/Cargo – required for building tokenizers

These were not fully documented or included in the example requirements,
blocking new users from running the benchmark successfully.
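These failures can be surfaced all at once instead of one import at a time. A minimal sketch (the package list is taken from this PR's description; note that pyyaml provides the `yaml` import name):

```python
import importlib.util

def find_missing(packages):
    """Return the subset of import names not resolvable in this environment."""
    return [name for name in packages if importlib.util.find_spec(name) is None]

# Import names for the dependencies this PR reports as missing
# (pyyaml installs the "yaml" module).
required = ["prettytable", "onnx", "retry", "colorlog", "yaml"]
print(find_missing(required))
```

In a fresh environment this prints the full list, matching the sequence of `ModuleNotFoundError` failures described above.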

What was changed

  • Updated examples/cloud-edge-collaborative-inference-for-llm/requirements.txt
    to include missing Python runtime dependencies
  • Aligns example dependencies with actual imports used at runtime

How it was tested

  • Fresh conda environment (Python 3.8)
  • Ubuntu Linux (WSL2)
  • Reproduced failures, installed dependencies, verified progress

Related issue

Fixes #322

@kubeedge-bot
Collaborator

[APPROVALNOTIFIER] This PR is NOT APPROVED

This pull-request has been approved by: iashutoshyadav
To complete the pull request process, please assign moorezheng after the PR has been reviewed.
You can assign the PR to them by writing /assign @moorezheng in a comment when ready.

The full list of commands accepted by this bot can be found here.

Details Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@gemini-code-assist

Summary of Changes

Hello @iashutoshyadav, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request significantly improves the user experience for the cloud-edge-collaborative-inference-for-llm example by resolving critical runtime dependency issues. By updating the requirements.txt file, it ensures that all necessary Python packages are readily available, preventing ModuleNotFoundError failures and allowing new users to successfully execute the benchmark without manual dependency resolution.

Highlights

  • Dependency Resolution: Addresses and resolves multiple ModuleNotFoundError issues in the cloud-edge-collaborative-inference-for-llm example, preventing execution failures in fresh environments.
  • Enhanced Usability: Ensures the example is immediately runnable for new users by including all necessary Python runtime dependencies in its requirements.txt file.
  • Specific Additions: Integrates prettytable, scikit-learn, numpy, pandas, tqdm, matplotlib, onnx, retry, colorlog, and pyyaml into the example's requirements.txt.


Changelog
  • examples/cloud-edge-collaborative-inference-for-llm/requirements.txt
    • Added prettytable~=2.5.0 for result visualization.
    • Included scikit-learn, numpy, pandas, tqdm, and matplotlib to support various data processing and plotting functionalities.
    • Added onnx as a requirement for MultiedgeInference.
    • Integrated retry for the APIBasedLLM module.
    • Added colorlog for logging and pyyaml for configuration parsing.
    • Reorganized existing dependencies and added clarifying comments for better readability.
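Putting the changelog together, the updated requirements.txt plausibly looks like the following sketch (only the prettytable pin is stated in this PR; the grouping comments are illustrative, not the actual file contents):

```
# Result visualization
prettytable~=2.5.0
# Data processing and plotting
scikit-learn
numpy
pandas
tqdm
matplotlib
# MultiedgeInference
onnx
# APIBasedLLM
retry
# Ianvs runtime
colorlog
pyyaml
```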
Activity
  • The author, iashutoshyadav, created this pull request to fix missing runtime dependencies in the cloud-edge-collaborative-inference-for-llm example.
  • The PR description clearly outlines the observed issues, specifically sequential ModuleNotFoundError for several packages and a mention of Rust/Cargo for tokenizers.
  • It details the changes made, which primarily involve updating requirements.txt to align with actual imports used at runtime.
  • Testing was performed in a fresh conda environment on Ubuntu Linux (WSL2), where failures were reproduced and the fix verified.
  • This PR addresses issue Missing Dependencies in Cloud-Edge Collaborative Inference for LLM Example #322.

@kubeedge-bot kubeedge-bot added the size/S Denotes a PR that changes 10-29 lines, ignoring generated files. label Feb 5, 2026

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request addresses missing runtime dependencies for the cloud-edge LLM example, which is a valuable fix. The changes in requirements.txt correctly add several necessary packages. However, I've noticed a couple of duplicate dependency entries (openai and groq) that should be removed to keep the file clean. My review comment points out these specific redundancies and suggests a fix.

Comment on lines 16 to +22
vllm
transformers
openai
accelerate
datamodel_code_generator
kaggle
groq (no newline at end of file)
groq


Severity: medium

This block of dependencies appears to be the contents of the original requirements.txt file. However, openai and groq are now also specified on lines 14 and 15, making them duplicates. To keep the dependency list clean and avoid redundancy, please remove the duplicated entries from this block.

vllm
transformers
accelerate
datamodel_code_generator
kaggle
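Duplicates like this are easy to catch mechanically. A small helper (not part of the PR, shown for illustration) that flags repeated entries in a requirements list:

```python
from collections import Counter

def duplicate_requirements(lines):
    """Return requirement entries appearing more than once, ignoring comments/blanks."""
    entries = [ln.strip() for ln in lines
               if ln.strip() and not ln.strip().startswith("#")]
    return [name for name, count in Counter(entries).items() if count > 1]

# The situation flagged above: openai and groq listed twice.
print(duplicate_requirements(
    ["openai", "groq", "vllm", "transformers", "openai", "accelerate", "groq"]
))  # → ['openai', 'groq']
```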

Signed-off-by: Ashutosh Yadav <ashutosh2213072@akgec.ac.in>
@iashutoshyadav iashutoshyadav force-pushed the fix/llm-example-missing-deps branch from 2eea313 to f757da4 Compare February 5, 2026 21:29
