[Doc] update readme (#147)

Fix doc issue in README

---------

Signed-off-by: wangxiyuan <wangxiyuan1007@gmail.com>
Co-authored-by: Yikun Jiang <yikunkero@gmail.com>
Author: wangxiyuan
Date: 2025-02-25 11:00:58 +08:00
Committed by: GitHub
Parent: 3a7882208f
Commit: 51ae37b22a
3 changed files with 10 additions and 7 deletions

--- a/README.md
+++ b/README.md

@@ -35,8 +35,8 @@ By using vLLM Ascend plugin, popular open-source models, including Transformer-l
 - Hardware: Atlas 800I A2 Inference series, Atlas A2 Training series
 - Software:
   * Python >= 3.9
-  * CANN >= 8.0.RC2
-  * PyTorch >= 2.4.0, torch-npu >= 2.4.0
+  * CANN >= 8.0.0
+  * PyTorch >= 2.5.1, torch-npu >= 2.5.1.dev20250218
   * vLLM (the same version as vllm-ascend)
 
 Find more about how to setup your environment step by step in [here](docs/source/installation.md).
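The minimum-version constraints above can be checked mechanically. A minimal sketch (hypothetical helper, not part of vllm-ascend; relies on GNU `sort -V` version ordering, available on the Linux hosts these docs target):

```shell
# meets_minimum INSTALLED MINIMUM
# Exit 0 iff INSTALLED >= MINIMUM under version sort, e.g. the
# pre-release "2.5.1.dev20250218" satisfies the ">= 2.5.1" floor.
meets_minimum() {
  [ "$(printf '%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

meets_minimum "2.5.1.dev20250218" "2.5.1" && echo "torch-npu ok"
meets_minimum "2.4.0" "2.5.1" || echo "torch too old"
```

Note that `sort -V` places `2.5.1` before `2.5.1.dev20250218`, so dev builds of a release pass the check; that matches the torch-npu pin used in this commit.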
@@ -83,13 +83,14 @@ We welcome and value any contributions and collaborations:
 vllm-ascend has a main branch and dev branches.
 - **main**: the main branch corresponds to the vLLM main branch, and is continuously monitored for quality through Ascend CI.
-- **vX.Y.Z-dev**: development branch, created with part of new releases of vLLM. For example, `v0.7.1-dev` is the dev branch for vLLM `v0.7.1` version.
+- **vX.Y.Z-dev**: development branch, created with part of new releases of vLLM. For example, `v0.7.3-dev` is the dev branch for vLLM `v0.7.3` version.
 
 Below are the maintained branches:
 
 | Branch     | Status       | Note                                 |
 |------------|--------------|--------------------------------------|
 | main       | Maintained   | CI commitment for vLLM main branch   |
+| v0.7.1-dev | Unmaintained | Only doc fixes are allowed           |
 | v0.7.3-dev | Maintained   | CI commitment for vLLM 0.7.3 version |
 
 Please refer to [Versioning policy](docs/source/developer_guide/versioning_policy.md) for more details.

--- a/README.zh.md
+++ b/README.zh.md

@@ -36,7 +36,7 @@ The vLLM Ascend plugin (`vllm-ascend`) lets vLLM run seamlessly on Ascend NPU
 - Software:
   * Python >= 3.9
   * CANN >= 8.0.RC2
-  * PyTorch >= 2.4.0, torch-npu >= 2.4.0
+  * PyTorch >= 2.5.1, torch-npu >= 2.5.1.dev20250218
   * vLLM (same version as vllm-ascend)
 Learn how to prepare your environment step by step [here](docs/source/installation.md).
@@ -76,13 +76,14 @@ curl http://localhost:8000/v1/models
 vllm-ascend has a main branch and dev branches.
 - **main**: the main branch corresponds to the vLLM main branch, and is continuously monitored for quality through Ascend CI.
-- **vX.Y.Z-dev**: development branches, created with some new releases of vLLM. For example, `v0.7.1-dev` is the vllm-ascend dev branch for vLLM `v0.7.1`.
+- **vX.Y.Z-dev**: development branches, created with some new releases of vLLM. For example, `v0.7.3-dev` is the vllm-ascend dev branch for vLLM `v0.7.3`.
 
 Below are the maintained branches:
 
 | Branch     | Status       | Note                                 |
 |------------|--------------|--------------------------------------|
 | main       | Maintained   | CI coverage for the vLLM main branch |
+| v0.7.1-dev | Unmaintained | Only doc fixes are allowed           |
 | v0.7.3-dev | Maintained   | CI coverage for vLLM v0.7.3          |
 
 Please refer to [Versioning policy](docs/source/developer_guide/versioning_policy.zh.md) for more details.

--- a/docs/source/installation.md
+++ b/docs/source/installation.md

@@ -12,7 +12,7 @@ This document describes how to install vllm-ascend manually.
 | Software     | Supported version    | Note                                   |
 | ------------ | -------------------- | -------------------------------------- |
 | CANN         | >= 8.0.0             | Required for vllm-ascend and torch-npu |
-| torch-npu    | >= 2.5.1rc1          | Required for vllm-ascend               |
+| torch-npu    | >= 2.5.1.dev20250218 | Required for vllm-ascend               |
 | torch        | >= 2.5.1             | Required for torch-npu and vllm        |
 
 You have 2 ways to install:
@@ -81,6 +81,8 @@ wget https://ascend-repo.obs.cn-east-2.myhuaweicloud.com/CANN/CANN%208.0.0/Ascen
 chmod +x ./Ascend-cann-toolkit_8.0.0_linux-aarch64.run
 ./Ascend-cann-toolkit_8.0.0_linux-aarch64.run --full
+
+source /usr/local/Ascend/ascend-toolkit/set_env.sh
 wget https://ascend-repo.obs.cn-east-2.myhuaweicloud.com/CANN/CANN%208.0.0/Ascend-cann-kernels-910b_8.0.0_linux-aarch64.run
 chmod +x ./Ascend-cann-kernels-910b_8.0.0_linux-aarch64.run
 ./Ascend-cann-kernels-910b_8.0.0_linux-aarch64.run --install
@@ -89,7 +91,6 @@ wget https://ascend-repo.obs.cn-east-2.myhuaweicloud.com/CANN/CANN%208.0.0/Ascen
 chmod +x ./Ascend-cann-nnal_8.0.0_linux-aarch64.run
 ./Ascend-cann-nnal_8.0.0_linux-aarch64.run --install
-source /usr/local/Ascend/ascend-toolkit/set_env.sh
 source /usr/local/Ascend/nnal/atb/set_env.sh
 ```
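Read together, the two installation hunks move `source /usr/local/Ascend/ascend-toolkit/set_env.sh` so that the toolkit environment is loaded before the dependent packages are installed, not after. A consolidated sketch of the resulting order (same installers as above; assumes the default install prefix and an aarch64 host, and that each `.run` file has already been downloaded and made executable):

```shell
# 1. Install the CANN toolkit, then load its environment immediately --
#    the kernels and nnal installers below depend on it being sourced.
./Ascend-cann-toolkit_8.0.0_linux-aarch64.run --full
source /usr/local/Ascend/ascend-toolkit/set_env.sh

# 2. Install the device kernels and the nnal package, then load the
#    ATB environment for nnal.
./Ascend-cann-kernels-910b_8.0.0_linux-aarch64.run --install
./Ascend-cann-nnal_8.0.0_linux-aarch64.run --install
source /usr/local/Ascend/nnal/atb/set_env.sh
```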