[CI/Build] Bump torch_npu to dev20250307.3 (#265)
Update the torch-npu version to fix the accuracy of torch_npu's `exponential_`. With this update, the precision issue seen when setting `temperature > 0` is fixed.

Signed-off-by: Mengqing Cao <cmq0113@163.com>
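For context on why `exponential_` accuracy matters for `temperature > 0`: samplers commonly draw a token via an "exponential race" (argmax of `p_i / E_i` with `E_i ~ Exp(1)`), which is exactly a draw from the temperature-scaled softmax, so a low-precision exponential kernel skews the sampled distribution. Below is a pure-Python sketch of that trick, not vllm-ascend's actual sampler code:

```python
import math
import random

def sample_with_temperature(logits, temperature, rng=random):
    """Draw a token index from softmax(logits / temperature) using the
    exponential-race trick: argmax(p_i / E_i) with E_i ~ Exp(1).
    A precision bug in the exponential draw would bias these samples."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)                      # subtract max for numerical stability
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    noise = [rng.expovariate(1.0) for _ in probs]  # Exp(1) noise per token
    return max(range(len(probs)), key=lambda i: probs[i] / noise[i])
```

Over many draws, the empirical frequencies converge to the softmax probabilities, which is why an inaccurate `exponential_` shows up as a sampling-quality regression rather than a hard error.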
Changed files include `.github/workflows/vllm_ascend_test.yaml` (vendored, 4 lines changed); the remaining hunks come from the English and Chinese READMEs, the installation guide, the multi-node tutorial, and the PTA install script.
```diff
@@ -133,9 +133,9 @@ jobs:
       run: |
         mkdir pta
         cd pta
-        wget https://pytorch-package.obs.cn-north-4.myhuaweicloud.com/pta/Daily/v2.5.1/20250226.4/pytorch_v2.5.1_py310.tar.gz
+        wget https://pytorch-package.obs.cn-north-4.myhuaweicloud.com/pta/Daily/v2.5.1/20250307.3/pytorch_v2.5.1_py310.tar.gz
         tar -xvf pytorch_v2.5.1_py310.tar.gz
-        pip install ./torch_npu-2.5.1.dev20250226-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
+        pip install ./torch_npu-2.5.1.dev20250307-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
         cd ..
         rm -rf pta
```
```diff
@@ -36,7 +36,7 @@ By using vLLM Ascend plugin, popular open-source models, including Transformer-l
 - Software:
   * Python >= 3.9
   * CANN >= 8.0.0
-  * PyTorch >= 2.5.1, torch-npu >= 2.5.1.dev20250226
+  * PyTorch >= 2.5.1, torch-npu >= 2.5.1.dev20250307
   * vLLM (the same version as vllm-ascend)

 Find more about how to setup your environment step by step in [here](docs/source/installation.md).
```
```diff
@@ -36,7 +36,7 @@ The vLLM Ascend plugin (`vllm-ascend`) lets vLLM run seamlessly on Ascend NPU
 - Software:
   * Python >= 3.9
   * CANN >= 8.0.RC2
-  * PyTorch >= 2.5.1, torch-npu >= 2.5.1.dev20250226
+  * PyTorch >= 2.5.1, torch-npu >= 2.5.1.dev20250307
   * vLLM (same version as vllm-ascend)

 You can find out how to prepare your environment step by step [here](docs/source/installation.md).
```
```diff
@@ -12,7 +12,7 @@ This document describes how to install vllm-ascend manually.
 | Software     | Supported version    | Note |
 | ------------ | -------------------- | ---- |
 | CANN         | >= 8.0.0             | Required for vllm-ascend and torch-npu |
-| torch-npu    | >= 2.5.1.dev20250226 | Required for vllm-ascend |
+| torch-npu    | >= 2.5.1.dev20250307 | Required for vllm-ascend |
 | torch        | >= 2.5.1             | Required for torch-npu and vllm |

 You have 2 way to install:
```
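The requirement bump works because PEP 440 orders date-stamped `.dev` builds numerically by their suffix, and a `.devN` build sorts before the corresponding final release. A minimal sketch of that ordering (illustrative only; use `packaging.version.Version` for real checks):

```python
import re

def devkey(version):
    """Minimal PEP 440-ish sort key for versions like '2.5.1.dev20250307'.
    Illustrative helper, not a full PEP 440 parser."""
    m = re.fullmatch(r"(\d+)\.(\d+)\.(\d+)(?:\.dev(\d+))?", version)
    if not m:
        raise ValueError(f"unsupported version: {version}")
    major, minor, patch, dev = m.groups()
    # .devN sorts before the final release with the same base version;
    # dev builds with the same base compare by their date-stamped N.
    return (int(major), int(minor), int(patch),
            0 if dev is not None else 1, int(dev or 0))
```

So `2.5.1.dev20250307` satisfies `>= 2.5.1.dev20250226`, and a future final `2.5.1` release would satisfy both.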
````diff
@@ -150,15 +150,15 @@ Current version depends on a unreleased `torch-npu`, you need to install manuall
 #
 # Here we take python 3.10 on aarch64 as an example. Feel free to install the correct version for your environment. See:
 #
-# https://pytorch-package.obs.cn-north-4.myhuaweicloud.com/pta/Daily/v2.5.1/20250226.4/pytorch_v2.5.1_py39.tar.gz
-# https://pytorch-package.obs.cn-north-4.myhuaweicloud.com/pta/Daily/v2.5.1/20250226.4/pytorch_v2.5.1_py310.tar.gz
-# https://pytorch-package.obs.cn-north-4.myhuaweicloud.com/pta/Daily/v2.5.1/20250226.4/pytorch_v2.5.1_py311.tar.gz
+# https://pytorch-package.obs.cn-north-4.myhuaweicloud.com/pta/Daily/v2.5.1/20250307.3/pytorch_v2.5.1_py39.tar.gz
+# https://pytorch-package.obs.cn-north-4.myhuaweicloud.com/pta/Daily/v2.5.1/20250307.3/pytorch_v2.5.1_py310.tar.gz
+# https://pytorch-package.obs.cn-north-4.myhuaweicloud.com/pta/Daily/v2.5.1/20250307.3/pytorch_v2.5.1_py311.tar.gz
 #
 mkdir pta
 cd pta
-wget https://pytorch-package.obs.cn-north-4.myhuaweicloud.com/pta/Daily/v2.5.1/20250226.4/pytorch_v2.5.1_py310.tar.gz
+wget https://pytorch-package.obs.cn-north-4.myhuaweicloud.com/pta/Daily/v2.5.1/20250307.3/pytorch_v2.5.1_py310.tar.gz
 tar -xvf pytorch_v2.5.1_py310.tar.gz
-pip install ./torch_npu-2.5.1.dev20250226-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
+pip install ./torch_npu-2.5.1.dev20250307-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
 ```
 ::::
````
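The three tarball URLs in the hunk above differ only in the Python tag (`py39`, `py310`, `py311`) and the daily-build stamp. A small helper that builds the matching URL for a given Python version can be sketched as (the function name is hypothetical, the URL pattern is taken from the docs above):

```python
BASE = "https://pytorch-package.obs.cn-north-4.myhuaweicloud.com/pta/Daily/v2.5.1"

def pta_tarball(py_version, daily="20250307.3", base=BASE):
    """Build the nightly PTA tarball URL for a Python version such as '3.10',
    following the naming scheme shown in the installation docs
    (e.g. '3.10' -> 'pytorch_v2.5.1_py310.tar.gz')."""
    tag = "py" + py_version.replace(".", "")
    return f"{base}/{daily}/pytorch_v2.5.1_{tag}.tar.gz"
```

This makes the version bump a one-argument change (`daily="20250226.4"` to `daily="20250307.3"`) instead of editing several hardcoded URLs.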
````diff
@@ -212,7 +212,9 @@ Prompt: 'The future of AI is', Generated text: ' following you. As the technolog

 Run docker container on each machine:

-```shell
+```{code-block} bash
+   :substitutions:
+
 docker run \
 --name vllm-ascend \
 --device /dev/davinci0 \
@@ -233,7 +235,7 @@ docker run \
 -v /etc/ascend_install.info:/etc/ascend_install.info \
 -v /root/.cache:/root/.cache \
 -p 8000:8000 \
--it quay.io/ascend/vllm-ascend:v0.7.1rc1 bash
+-it quay.io/ascend/vllm-ascend:|vllm_ascend_version| bash
```

 Choose one machine as head node, the other are worker nodes, then start ray on each machine:
````
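The change above swaps the hardcoded `v0.7.1rc1` image tag for a `|vllm_ascend_version|` placeholder resolved at docs build time. A sketch of the kind of Sphinx `conf.py` configuration that enables this, assuming the docs use MyST plus the `sphinx-substitution-extensions` package (which provides the `:substitutions:` option on `{code-block}`); the names and value here are assumptions, not the project's actual config:

```python
# conf.py fragment (sketch; assumes sphinx-substitution-extensions is installed)
extensions = [
    "myst_parser",
    "sphinx_substitution_extensions",
]
myst_enable_extensions = ["substitution"]
myst_substitutions = {
    # Hypothetical value; the project would derive the real one from its release.
    "vllm_ascend_version": "v0.7.1rc1",
}
```

The payoff is that release bumps no longer require editing every tutorial that mentions the image tag.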
```diff
@@ -2,14 +2,14 @@
 set -ex
 mkdir pta
 cd pta || exit
-wget https://pytorch-package.obs.cn-north-4.myhuaweicloud.com/pta/Daily/v2.5.1/20250226.4/pytorch_v2.5.1_py310.tar.gz
+wget https://pytorch-package.obs.cn-north-4.myhuaweicloud.com/pta/Daily/v2.5.1/20250307.3/pytorch_v2.5.1_py310.tar.gz
 tar -zxvf pytorch_v2.5.1_py310.tar.gz

 if [ "$(uname -i)" == "aarch64" ]
 then
-    pip install ./torch_npu-2.5.1.dev20250226-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
+    pip install ./torch_npu-2.5.1.dev20250307-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
 else
-    pip install ./torch_npu-2.5.1.dev20250226-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl --extra-index https://download.pytorch.org/whl/cpu/
+    pip install ./torch_npu-2.5.1.dev20250307-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl --extra-index https://download.pytorch.org/whl/cpu/
 fi

 cd ..
```
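The script above picks between two wheel filenames that differ only in the architecture tag. The selection logic can be sketched as a helper (the function name is hypothetical; the filename pattern is taken from the script):

```python
import platform

def torch_npu_wheel(dev_stamp="20250307", py="cp310", machine=None):
    """Build the expected torch_npu nightly wheel filename, mirroring the
    aarch64/x86_64 branch in the install script (which hardcodes both names)."""
    machine = machine or platform.machine()
    arch = "aarch64" if machine == "aarch64" else "x86_64"
    return (f"torch_npu-2.5.1.dev{dev_stamp}-{py}-{py}-"
            f"manylinux_2_17_{arch}.manylinux2014_{arch}.whl")
```

As with the tarball URL, bumping the nightly then means changing a single `dev_stamp` value rather than two hardcoded filenames.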