From 2a2d527e967ac5de32df4de4f230fcc073eda012 Mon Sep 17 00:00:00 2001
From: shaopeng-666
Date: Tue, 23 Dec 2025 23:55:40 +0800
Subject: [PATCH] fix transformer version to 4.57.3 (#5250)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

### What this PR does / why we need it?

In certain scenarios (such as smoke testing), vllm-ascend is updated from source so that newer models (such as Qwen3-VL) can be run. However, neither vllm nor vllm-ascend places a lower bound on the transformers version, so the transformers package is not upgraded along with it, and launching the model fails.

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

- vLLM version: release/v0.13.0
- vLLM main: https://github.com/vllm-project/vllm/commit/ad32e3e19ccf0526cb6744a5fed09a138a5fb2f9

---------

Signed-off-by: 李少鹏
---
 requirements.txt | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/requirements.txt b/requirements.txt
index 57a695ef..c32d0817 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -30,5 +30,5 @@ numba
 #--extra-index-url https://mirrors.huaweicloud.com/ascend/repos/pypi
 torch-npu==2.8.0
-transformers<=4.57.1
+transformers>=4.57.3
 fastapi<0.124.0
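
As a quick illustration (not part of the patch), the sketch below shows one way to verify at startup that the installed transformers release satisfies the new `>=4.57.3` lower bound before launching a model such as Qwen3-VL. The `MIN_TRANSFORMERS` constant and the check itself are assumptions made for this example, not code from vllm or vllm-ascend.

```python
# Illustrative sketch only: confirm the installed transformers release
# meets the lower bound introduced by this patch (transformers>=4.57.3).
# MIN_TRANSFORMERS and this startup check are assumptions for the example,
# not part of vllm or vllm-ascend; `packaging` ships with most pip setups.
from importlib.metadata import PackageNotFoundError, version

from packaging.version import Version

MIN_TRANSFORMERS = Version("4.57.3")  # lower bound from requirements.txt

try:
    installed = Version(version("transformers"))
except PackageNotFoundError:
    raise SystemExit("transformers is not installed; run: pip install -r requirements.txt")

if installed < MIN_TRANSFORMERS:
    raise SystemExit(
        f"transformers {installed} is older than the required {MIN_TRANSFORMERS}; "
        "upgrade it before launching models such as Qwen3-VL"
    )

print(f"transformers {installed} satisfies the requirements.txt constraint")
```

pip already enforces the constraint when `requirements.txt` is (re)installed; a runtime check like this only helps catch environments that were updated from source without reinstalling dependencies, which is the scenario described above.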