Update vllm pin to 12.25 (#5342)

### What this PR does / why we need it?
- Fix vLLM breakage introduced by the PR:
1. [Drop v0.14 deprecations](https://github.com/vllm-project/vllm/pull/31285)
### Does this PR introduce _any_ user-facing change?
No
### How was this patch tested?
- vLLM version: release/v0.13.0
- vLLM main: bc0a5a0c08

---------

Signed-off-by: ZT-AIA <1028681969@qq.com>
This commit is contained in:
ZT-AIA, 2025-12-26 14:05:40 +08:00, committed by GitHub
parent c2f776b846, commit adaa89a7a5
20 changed files with 22 additions and 22 deletions


@@ -28,7 +28,7 @@ from unittest.mock import patch
 import openai
 import pytest
 from modelscope import snapshot_download  # type: ignore
-from vllm.utils import get_open_port
+from vllm.utils.network_utils import get_open_port
 from tests.e2e.conftest import RemoteOpenAIServer, VllmRunner
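Downstream code that must run against both the old and new vLLM layouts can guard the moved import. The sketch below is an assumption about a compatibility pattern, not part of this PR; the local fallback is a stand-in that mimics what a free-port helper typically does (bind to port 0 so the OS picks an unused port), not vLLM's actual implementation:

```python
import socket

# Hedged sketch: tolerate the import move from vllm-project/vllm#31285.
try:
    from vllm.utils.network_utils import get_open_port  # new path (vLLM main)
except ImportError:
    try:
        from vllm.utils import get_open_port  # old path (pre-#31285 releases)
    except ImportError:
        # Local stand-in when vLLM is not installed: ask the OS for a free TCP port.
        def get_open_port() -> int:
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
                s.bind(("127.0.0.1", 0))
                return s.getsockname()[1]

port = get_open_port()
```

A shim like this is only a stopgap; since this PR pins vLLM to a single version, updating the import in place (as the diff does) is the cleaner fix.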