[Test][e2e][LoRA] Add more e2e tests to cover scenarios of LoRA (#4075)
### What this PR does / why we need it?
This PR depends on https://github.com/vllm-project/vllm-ascend/pull/4046 and will only take effect once that PR is merged.
It aims to resolve https://github.com/vllm-project/vllm-ascend/issues/3240.
The newly added Llama-2-7b-hf and Qwen3-0.6B test cases cover the scenarios in which LoRA weights are added to the q_proj, v_proj, k_proj, o_proj, gate_proj, up_proj, down_proj, embed_tokens, and lm_head modules.
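For context, the module coverage described above can be sketched in plain Python. This is an illustrative sketch only: the module names come from the PR description, while the grouping into attention, MLP, and embedding families and the helper `covers_all_scenarios` are hypothetical, not part of the actual test files.

```python
# Hypothetical sketch of the LoRA target-module coverage exercised by the new
# test cases. Module names mirror the PR description; the grouping below is
# illustrative only and does not reflect the real test implementation.
TARGET_MODULES = {
    "q_proj", "v_proj", "k_proj", "o_proj",  # attention projections
    "gate_proj", "up_proj", "down_proj",     # MLP projections
    "embed_tokens", "lm_head",               # embedding table / output head
}


def covers_all_scenarios(modules: set) -> bool:
    """Return True if every module family targeted by the tests is present."""
    attention = {"q_proj", "v_proj", "k_proj", "o_proj"}
    mlp = {"gate_proj", "up_proj", "down_proj"}
    embeddings = {"embed_tokens", "lm_head"}
    # Each family must be a subset of the configured target modules.
    return all(group <= modules for group in (attention, mlp, embeddings))
```

The point of covering all three families is that LoRA support for attention/MLP projections and for embedding/lm_head layers goes through different code paths, so an e2e suite needs adapters touching each of them.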
### Does this PR introduce _any_ user-facing change?
No.
### How was this patch tested?
```shell
pytest -sv tests/e2e/singlecard/test_llama2_lora.py
pytest -sv tests/e2e/singlecard/test_qwen3_multi_loras.py
```
- vLLM version: v0.11.0
- vLLM main: 83f478bb19
---------
Signed-off-by: paulyu12 <507435917@qq.com>
Changed files: .github/workflows/_e2e_test.yaml (4 changes)
@@ -104,8 +104,9 @@ jobs:
          pytest -sv --durations=0 tests/e2e/singlecard/test_cpu_offloading.py
          # xgrammar has parameter mismatching bug, please follows: https://github.com/vllm-project/vllm-ascend/issues/5524
          # pytest -sv --durations=0 tests/e2e/singlecard/test_guided_decoding.py
          # torch 2.8 doesn't work with lora, fix me
          pytest -sv --durations=0 tests/e2e/singlecard/test_ilama_lora.py
          pytest -sv --durations=0 tests/e2e/singlecard/test_llama32_lora.py
          pytest -sv --durations=0 tests/e2e/singlecard/test_qwen3_multi_loras.py
          pytest -sv --durations=0 tests/e2e/singlecard/test_models.py
          pytest -sv --durations=0 tests/e2e/singlecard/test_multistream_overlap_shared_expert.py
          pytest -sv --durations=0 tests/e2e/singlecard/test_profile_execute_duration.py
@@ -215,7 +216,6 @@ jobs:
          pytest -sv --durations=0 tests/e2e/multicard/2-cards/test_expert_parallel.py
          pytest -sv --durations=0 tests/e2e/multicard/2-cards/test_external_launcher.py
          pytest -sv --durations=0 tests/e2e/multicard/2-cards/test_full_graph_mode.py
          # torch 2.8 doesn't work with lora, fix me
          pytest -sv --durations=0 tests/e2e/multicard/2-cards/test_ilama_lora_tp2.py