[CI] Fix broken ci (#2530)

vLLM PR https://github.com/vllm-project/vllm/pull/22711 changed the
encoder cache entry logic; this PR adapts the same change for vLLM
Ascend to make CI happy.

Co-Authored-By: zhoux77899 <zhouxiang100@huawei.com>

- vLLM version: v0.10.1.1
- vLLM main: 0ff902f3b4

Signed-off-by: wangxiyuan <wangxiyuan1007@gmail.com>
Author: wangxiyuan
Date: 2025-08-26 07:42:24 +08:00 (committed by GitHub)
Parent: 99bf25af76
Commit: 7e494e94a9
5 changed files with 256 additions and 123 deletions
@@ -215,6 +215,7 @@ def _construct_cached_request_state(req_id_suffix: int):
generator=None,
num_computed_tokens=len(output_token_ids),
output_token_ids=output_token_ids,
mm_hashes=None,
)