Followup fix on official doc update (#34)

### What this PR does / why we need it?
- Fix typos: vllm-ascned --> vllm-ascend
- Update version info

### Does this PR introduce _any_ user-facing change?
No


### How was this patch tested?
preview

Signed-off-by: Yikun Jiang <yikunkero@gmail.com>
Commit eb189aac81 (parent 51eadc68b9) by Yikun Jiang, committed via GitHub on 2025-02-11 14:28:26 +08:00.
3 changed files with 30 additions and 25 deletions


@@ -33,7 +33,11 @@ By using vLLM Ascend plugin, popular open-source models, including Transformer-l
## Prerequisites
- Hardware: Atlas 800I A2 Inference series, Atlas A2 Training series
-- Software: vLLM (the same version as vllm-ascned), Python >= 3.9, CANN >= 8.0.RC2, PyTorch >= 2.4.0, torch-npu >= 2.4.0
+- Software:
+  * Python >= 3.9
+  * CANN >= 8.0.RC2
+  * PyTorch >= 2.4.0, torch-npu >= 2.4.0
+  * vLLM (the same version as vllm-ascend)
Find more details on how to set up your environment step by step [here](docs/installation.md).
@@ -64,7 +68,7 @@ Run the following command to start the vLLM server with the [Qwen/Qwen2.5-0.5B-I
vllm serve Qwen/Qwen2.5-0.5B-Instruct
curl http://localhost:8000/v1/models
```
-**Please refer to [Official Docs](./docs/index.md) for more details.**
+**Please refer to [official docs](./docs/index.md) for more details.**
## Contributing
See [CONTRIBUTING](./CONTRIBUTING.md) for more details, which is a step-by-step guide to help you set up development environment, build and test.
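The prerequisite that vLLM be the same version as vllm-ascend can be sketched as a small runtime check. This is a hypothetical helper, not part of this PR; the installed distribution name `vllm_ascend` is an assumption:

```python
from importlib.metadata import PackageNotFoundError, version


def same_release(a: str, b: str) -> bool:
    """True if two version strings share the same base release.

    Local/build suffixes after '+' (e.g. '0.7.1+cpu') are ignored,
    since they do not affect release compatibility.
    """
    return a.split("+")[0] == b.split("+")[0]


def check_vllm_pair() -> bool:
    """Compare installed vllm and vllm-ascend versions, if both are present.

    The distribution name 'vllm_ascend' is an assumption here.
    """
    try:
        return same_release(version("vllm"), version("vllm_ascend"))
    except PackageNotFoundError:
        # One of the two packages is not installed.
        return False
```

For example, `same_release("0.7.1+cpu", "0.7.1")` is considered a match, while `same_release("0.7.1", "0.7.2")` is not.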


@@ -33,9 +33,13 @@ The vLLM Ascend plugin (`vllm-ascend`) enables vLLM to run seamlessly on Ascend NPU
## Prerequisites
- Hardware: Atlas 800I A2 Inference series, Atlas A2 Training series
-- Software: vLLM (the same version as vllm-ascned), Python >= 3.9, CANN >= 8.0.RC2, PyTorch >= 2.4.0, torch-npu >= 2.4.0
+- Software:
+  * Python >= 3.9
+  * CANN >= 8.0.RC2
+  * PyTorch >= 2.4.0, torch-npu >= 2.4.0
+  * vLLM (the same version as vllm-ascend)
-Find more information [here](docs/installation.md) on how to set up your environment step by step.
+[Here](docs/installation.md) you can learn how to prepare your environment step by step.
## Getting Started


@@ -1,26 +1,6 @@
# Installation
-## Building
-#### Build Python package from source
-```bash
-git clone https://github.com/vllm-project/vllm-ascend.git
-cd vllm-ascend
-pip install -e .
-```
-#### Build container image from source
-```bash
-git clone https://github.com/vllm-project/vllm-ascend.git
-cd vllm-ascend
-docker build -t vllm-ascend-dev-image -f ./Dockerfile .
-```
-### Prepare Ascend NPU environment
-### Dependencies
+### 1. Dependencies
| Requirement | Supported version | Recommended version | Note |
| ------------ | ------- | ----------- | ----------- |
| Python | >= 3.9 | [3.10](https://www.python.org/downloads/) | Required for vllm |
@@ -28,6 +8,7 @@ docker build -t vllm-ascend-dev-image -f ./Dockerfile .
| torch-npu | >= 2.4.0 | [2.5.1rc1](https://gitee.com/ascend/pytorch/releases/tag/v6.0.0.alpha001-pytorch2.5.1) | Required for vllm-ascend |
| torch | >= 2.4.0 | [2.5.1](https://github.com/pytorch/pytorch/releases/tag/v2.5.1) | Required for torch-npu and vllm |
+### 2. Prepare Ascend NPU environment
Below is a quick guide to installing the recommended software versions:
@@ -56,3 +37,19 @@ You do not need to install `torch` and `torch_npu` manually, they will be automa
Or follow the instructions provided in the [Ascend Installation Guide](https://ascend.github.io/docs/sources/ascend/quick_install.html) to set up the environment.
+### 3. Building
+#### Build Python package from source
+```bash
+git clone https://github.com/vllm-project/vllm-ascend.git
+cd vllm-ascend
+pip install -e .
+```
+#### Build container image from source
+```bash
+git clone https://github.com/vllm-project/vllm-ascend.git
+cd vllm-ascend
+docker build -t vllm-ascend-dev-image -f ./Dockerfile .
+```
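The dependency table above specifies minimum versions (Python >= 3.9, CANN >= 8.0.RC2, torch and torch-npu >= 2.4.0). A minimal sketch of how such floor checks could be automated, assuming a simple numeric comparison that ignores pre-release suffixes like `RC2` or `rc1` (hypothetical helper, not part of this patch):

```python
def parse_version(v: str) -> tuple:
    """Split a dotted version string into comparable integer parts.

    Leading digits of each dot-separated piece are kept; non-numeric
    suffixes (e.g. 'RC2' in '8.0.RC2', 'rc1' in '2.5.1rc1') are dropped,
    so this only checks the numeric release floor.
    """
    parts = []
    for piece in v.split("."):
        digits = ""
        for ch in piece:
            if ch.isdigit():
                digits += ch
            else:
                break
        if digits:
            parts.append(int(digits))
    return tuple(parts)


def meets_minimum(installed: str, minimum: str) -> bool:
    """True if the installed version satisfies the table's minimum."""
    return parse_version(installed) >= parse_version(minimum)
```

Under this scheme, `meets_minimum("2.5.1", "2.4.0")` holds while `meets_minimum("3.8.10", "3.9")` does not; note that `8.0.RC2` compares as `(8, 0)`, so pre-release ordering is deliberately not modeled.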