xc-llm-ascend/pyproject.toml
Mengqing Cao d59e7fa095 [CI] Pin transformers<4.53.0 and fix EPLB load_weights to make CI passed (#1482)
### What this PR does / why we need it?

- Fix the vLLM EPLB break (e9fd658a73) by temporarily reverting load_weights to the [v0.9.1 version](07b8fae219).

- Fix the image processor break caused by transformers>=4.53.0 by pinning transformers<4.53.0.
Related: https://github.com/vllm-project/vllm-ascend/issues/1470

- Mirror torch_npu requirements to pyproject.toml
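
The temporary pin above amounts to a simple version comparison. A minimal sketch of such a runtime guard, comparing only the numeric release segments (the helper names here are illustrative and not part of this PR):

```python
# Illustrative version guard mirroring the temporary "transformers<4.53.0"
# pin; these helpers are hypothetical, not part of this PR.
def parse_release(version: str) -> tuple[int, ...]:
    """Split a release version like "4.53.0" into a comparable tuple of ints."""
    return tuple(int(part) for part in version.split("."))

def transformers_is_compatible(installed: str, max_exclusive: str = "4.53.0") -> bool:
    """True when the installed version predates the breaking 4.53.0 release."""
    return parse_release(installed) < parse_release(max_exclusive)
```

For example, `transformers_is_compatible("4.52.4")` passes, while 4.53.0 itself fails the check, matching the exclusive upper bound of the pin.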

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
CI passed

---------

Signed-off-by: MengqingCao <cmq0113@163.com>
Signed-off-by: Yikun Jiang <yikunkero@gmail.com>
Co-authored-by: Yikun Jiang <yikunkero@gmail.com>
2025-06-28 00:12:43 +08:00


```toml
[build-system]
# Should be mirrored in requirements.txt
requires = [
    "cmake>=3.26",
    "decorator",
    "einops",
    "numpy<2.0.0",
    "packaging",
    "pip",
    "pybind11",
    "pyyaml",
    "scipy",
    "setuptools>=64",
    "setuptools-scm>=8",
    "torch-npu==2.5.1.post1.dev20250619",
    "torch>=2.5.1",
    "torchvision<0.21.0",
    "wheel",
    "msgpack",
    "quart",
    "numba",
    # Remove after https://github.com/vllm-project/vllm-ascend/issues/1470
    "transformers<4.53.0",
]
build-backend = "setuptools.build_meta"
```