update torch-npu to 2.5.1.post1.dev20250619 (#1347)
### What this PR does / why we need it?
This PR updates torch_npu to the newest release version, 2.5.1.post1.dev20250619.

### Does this PR introduce _any_ user-facing change?
No.

### How was this patch tested?
CI testing will validate the update.

Signed-off-by: ganyi <pleaplusone.gy@gmail.com>
@@ -38,7 +38,7 @@ By using vLLM Ascend plugin, popular open-source models, including Transformer-l
 - Software:
   * Python >= 3.9, < 3.12
   * CANN >= 8.1.RC1
-  * PyTorch >= 2.5.1, torch-npu >= 2.5.1.post1.dev20250528
+  * PyTorch >= 2.5.1, torch-npu >= 2.5.1.post1.dev20250619
   * vLLM (the same version as vllm-ascend)

 ## Getting Started
@@ -39,7 +39,7 @@ The vLLM Ascend plugin (`vllm-ascend`) is a community-maintained backend for running vLLM on Ascend NP
 - Software:
   * Python >= 3.9, < 3.12
   * CANN >= 8.1.RC1
-  * PyTorch >= 2.5.1, torch-npu >= 2.5.1.post1.dev20250528
+  * PyTorch >= 2.5.1, torch-npu >= 2.5.1.post1.dev20250619
   * vLLM (the same version as vllm-ascend)

 ## Getting Started
@@ -12,7 +12,7 @@ This document describes how to install vllm-ascend manually.
 | Software      | Supported version                | Note                                      |
 |---------------|----------------------------------|-------------------------------------------|
 | CANN          | >= 8.1.RC1                       | Required for vllm-ascend and torch-npu    |
-| torch-npu     | >= 2.5.1.post1.dev20250528       | Required for vllm-ascend                  |
+| torch-npu     | >= 2.5.1.post1.dev20250619       | Required for vllm-ascend                  |
 | torch         | >= 2.5.1                         | Required for torch-npu and vllm           |

 You have 2 way to install:
@@ -24,4 +24,4 @@ numba
 # Install torch_npu
 --pre
 --extra-index-url https://mirrors.huaweicloud.com/ascend/repos/pypi
-torch-npu==2.5.1.post1.dev20250528
+torch-npu==2.5.1.post1.dev20250619
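The only substantive change in every hunk is the torch-npu floor, 2.5.1.post1.dev20250528 → 2.5.1.post1.dev20250619. A minimal sketch of checking an installed build against the new floor, assuming torch-npu dev versions follow the `X.Y.Z.postN.devYYYYMMDD` pattern seen in the diff (the helper name `meets_minimum` is illustrative, not part of any real API):

```python
import re

# New floor pinned by this PR (taken from the requirements hunk above).
MIN_TORCH_NPU = "2.5.1.post1.dev20250619"

def meets_minimum(installed: str, minimum: str = MIN_TORCH_NPU) -> bool:
    """Compare two torch-npu dev versions numerically, dev date last.

    Assumes the X.Y.Z.postN.devYYYYMMDD format; raises on anything else.
    """
    def key(version: str):
        m = re.fullmatch(r"(\d+)\.(\d+)\.(\d+)\.post(\d+)\.dev(\d{8})", version)
        if m is None:
            raise ValueError(f"unexpected torch-npu version format: {version}")
        # Tuple comparison handles base version first, dev-build date last.
        return tuple(int(g) for g in m.groups())
    return key(installed) >= key(minimum)
```

In a real environment the installed version would come from `torch_npu.__version__`; for a strict PEP 440-correct comparison, `packaging.version.Version` is the more robust choice.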