
Added optional LoRA adapter support for vLLM inference. #66

Open
ViktoriaNov wants to merge 3 commits into main from LORA

Conversation

@ViktoriaNov
Collaborator

No description provided.

@codecov

codecov bot commented Jan 8, 2026

Codecov Report

❌ Patch coverage is 61.53846% with 5 lines in your changes missing coverage. Please review.

Files with missing lines Patch % Lines
llmsql/inference/inference_vllm.py 61.53% 5 Missing ⚠️


@DzmitryPihulski
Collaborator

Please add some tests for the new feature to satisfy the target code coverage (81.25%).

Comment on lines 74 to 78
# === LoRA Parameters ===
lora_path: str | None = None,
lora_name: str = "default",
lora_scale: float = 1.0,
max_lora_rank: int = 64,
Collaborator


We could collapse these parameters into a single optional `lora_params` dict holding the LoRA arguments to pass to the class.

@DzmitryPihulski DzmitryPihulski linked an issue Feb 23, 2026 that may be closed by this pull request

Labels

enhancement (New feature or request), feature

Projects

None yet

Development

Successfully merging this pull request may close these issues.

Add handling of LoRA adapters for vLLM inference

2 participants