Update vllm pin to 12.25 (#5342)
### What this PR does / why we need it?
- Fix vLLM breakage introduced by the PR:
1. [Drop v0.14 deprecations](https://github.com/vllm-project/vllm/pull/31285)
### Does this PR introduce _any_ user-facing change?
No
### How was this patch tested?
- vLLM version: release/v0.13.0
- vLLM main: bc0a5a0c08
---------
Signed-off-by: ZT-AIA <1028681969@qq.com>
@@ -28,7 +28,7 @@ from unittest.mock import patch
 
 import openai
 import pytest
 from modelscope import snapshot_download  # type: ignore
-from vllm.utils import get_open_port
+from vllm.utils.network_utils import get_open_port
 
 from tests.e2e.conftest import RemoteOpenAIServer, VllmRunner
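For context, `get_open_port` (which moved to `vllm.utils.network_utils` in the upstream PR) is the usual "ask the OS for a free port" test helper. A minimal self-contained sketch of what such a helper typically does, not vLLM's exact implementation:

```python
import socket


def get_open_port() -> int:
    """Return a TCP port that is currently free on localhost.

    Binding to port 0 lets the OS pick an unused port; we read it back
    from the socket before closing. Note the port is only guaranteed
    free at the moment of the call (a benign race for test setups).
    """
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind(("127.0.0.1", 0))
        return s.getsockname()[1]
```

Test code that imports this helper only needs the import path updated, as the diff above does; the call sites stay unchanged.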