[CI] Pin transformers<4.53.0 and fix EPLB load_weights to make CI pass (#1482)
### What this PR does / why we need it?
- Fix the vLLM EPLB breakage introduced in e9fd658a73 by temporarily reverting `load_weights` to the [v0.9.1 version](07b8fae219).
- Fix the image processor breakage caused by transformers>=4.53.0. Related: https://github.com/vllm-project/vllm-ascend/issues/1470
- Mirror the torch_npu requirements to pyproject.toml (see the sketch below).

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
CI passed

---------

Signed-off-by: MengqingCao <cmq0113@163.com>
Signed-off-by: Yikun Jiang <yikunkero@gmail.com>
Co-authored-by: Yikun Jiang <yikunkero@gmail.com>
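The pyproject.toml side of the torch_npu mirror is not part of the hunk below. As a rough, hypothetical sketch (PEP 621 layout assumed; the actual dependency list in vllm-ascend's pyproject.toml contains more entries than shown), the pin carried over from requirements.txt would look roughly like:

```toml
# Rough sketch only: illustrates mirroring the torch_npu pin from
# requirements.txt into pyproject.toml. Surrounding entries and table
# names in the real file are not reproduced here.
[project]
dependencies = [
    "torch-npu == 2.5.1.post1.dev20250619",
]
```

Note that the `--extra-index-url` line from requirements.txt has no equivalent in PEP 621 project metadata; the Ascend PyPI mirror still has to be supplied at install time (for example via pip configuration).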
@@ -25,3 +25,6 @@ numba
 --pre
 --extra-index-url https://mirrors.huaweicloud.com/ascend/repos/pypi
 torch-npu==2.5.1.post1.dev20250619
+
+# Remove after https://github.com/vllm-project/vllm-ascend/issues/1470
+transformers<4.53.0