[CI] Pin transformers<4.53.0 and fix EPLB load_weights to make CI pass (#1482)
### What this PR does / why we need it?
- Fix the vLLM EPLB break (e9fd658a73) by temporarily reverting `load_weights` back to the [v0.9.1 version](07b8fae219).
- Fix the image processor break introduced by transformers>=4.53.0. Related: https://github.com/vllm-project/vllm-ascend/issues/1470
- Mirror the torch_npu requirements into pyproject.toml.

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
CI passed

---------

Signed-off-by: MengqingCao <cmq0113@163.com>
Signed-off-by: Yikun Jiang <yikunkero@gmail.com>
Co-authored-by: Yikun Jiang <yikunkero@gmail.com>
@@ -12,12 +12,14 @@ requires = [
     "scipy",
     "setuptools>=64",
     "setuptools-scm>=8",
-    "torch-npu==2.5.1.post1.dev20250528",
+    "torch-npu==2.5.1.post1.dev20250619",
     "torch>=2.5.1",
     "torchvision<0.21.0",
     "wheel",
     "msgpack",
     "quart",
     "numba",
+    # Remove after https://github.com/vllm-project/vllm-ascend/issues/1470
+    "transformers<4.53.0",
 ]
 build-backend = "setuptools.build_meta"
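A quick way to sanity-check which transformers releases the new `<4.53.0` pin admits is the `packaging` library (the same specifier semantics pip uses when resolving pyproject.toml requirements). A minimal sketch; the version strings below are illustrative examples, not versions named in this PR:

```python
from packaging.specifiers import SpecifierSet

# The pin this PR adds to pyproject.toml.
spec = SpecifierSet("<4.53.0")

# Releases before the image-processor break satisfy the pin;
# 4.53.0 itself is excluded.
print("4.52.4" in spec)  # True
print("4.53.0" in spec)  # False
```

Once issue #1470 is resolved upstream, dropping the `"transformers<4.53.0"` line from `requires` lifts the pin again.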