
Added optional LoRA adapter support for vLLM inference. #66

Merged
DzmitryPihulski merged 4 commits into main from LORA
Apr 26, 2026

Conversation

@ViktoriaNov
Collaborator

No description provided.

@codecov

codecov Bot commented Jan 8, 2026

Codecov Report

❌ Patch coverage is 61.53846% with 5 lines in your changes missing coverage. Please review.

| Files with missing lines | Patch % | Lines |
| --- | --- | --- |
| llmsql/inference/inference_vllm.py | 61.53% | 5 Missing ⚠️ |


@DzmitryPihulski
Collaborator

Please add some tests for the new feature to satisfy the target code coverage (81.25%).

Comment thread on llmsql/inference/inference_vllm.py (Outdated)
Comment on lines +74 to +78
# === LoRA Parameters ===
lora_path: str | None = None,
lora_name: str = "default",
lora_scale: float = 1.0,
max_lora_rank: int = 64,
Collaborator


We could change these parameters to a single optional `lora_params` dict holding the LoRA arguments to pass to the class.

@DzmitryPihulski DzmitryPihulski linked an issue Feb 23, 2026 that may be closed by this pull request
@DzmitryPihulski DzmitryPihulski merged commit bea17d8 into main Apr 26, 2026
4 of 5 checks passed

Labels

enhancement (New feature or request), feature

Projects

None yet

Development

Successfully merging this pull request may close these issues.

Add handling of LoRA adapters for vLLM inference

2 participants