[CI] Fix broken CI (#2302)

1. disable test_eagle_ccorrectness test, we'll reopen it once oom error
fixed.
2. drop transformers version limit for main, since vLLM rely on
>=4.55.0, see:
65552b476b
3. fix kv_connector_output bug, see:
796bae07c5

- vLLM version: v0.10.0
- vLLM main:
d1af8b7be9

Signed-off-by: wangxiyuan <wangxiyuan1007@gmail.com>
This commit is contained in:
wangxiyuan
2025-08-11 11:22:32 +08:00
committed by GitHub
parent ee6f79c44a
commit 9260910c8d
5 changed files with 13 additions and 7 deletions

View File

@@ -13,8 +13,6 @@ setuptools-scm>=8
torch>=2.7.1
torchvision
wheel
# Remove after https://github.com/vllm-project/vllm-ascend/issues/2034
transformers<4.54.0
# requirements for disaggregated prefill
msgpack