From cff08f9df8db2e369f1a83838f89af9463b36f35 Mon Sep 17 00:00:00 2001
From: Yikun Jiang
Date: Thu, 6 Mar 2025 10:42:42 +0800
Subject: [PATCH] [Doc] Add initial FAQs (#247)

### What this PR does / why we need it?
Add initial FAQs

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
Preview

Signed-off-by: Yikun Jiang
---
 docs/source/faqs.md  | 21 +++++++++++++++++++++
 docs/source/index.md |  1 +
 2 files changed, 22 insertions(+)
 create mode 100644 docs/source/faqs.md

diff --git a/docs/source/faqs.md b/docs/source/faqs.md
new file mode 100644
index 0000000..bb4ba5f
--- /dev/null
+++ b/docs/source/faqs.md
@@ -0,0 +1,21 @@
+# FAQs
+
+## Version Specific FAQs
+
+- [[v0.7.1rc1] FAQ & Feedback](https://github.com/vllm-project/vllm-ascend/issues/19)
+
+## General FAQs
+
+### 1. What devices are currently supported?
+
+Currently, **ONLY the Atlas A2 series** (Ascend-cann-kernels-910b) is supported:
+
+- Atlas A2 Training series (Atlas 800T A2, Atlas 900 A2 PoD, Atlas 200T A2 Box16, Atlas 300T A2)
+- Atlas 800I A2 Inference series (Atlas 800I A2)
+
+The following series are NOT supported yet:
+- Atlas 300I Duo, Atlas 300I Pro (Ascend-cann-kernels-310p): may be supported in 2025 Q2
+- Atlas 200I A2 (Ascend-cann-kernels-310b): not planned yet
+- Ascend 910, Ascend 910 Pro B (Ascend-cann-kernels-910): not planned yet
+From a technical point of view, vllm-ascend support is possible as long as torch-npu supports the device; otherwise, we would have to implement it with custom ops. Contributions are welcome; please join us to improve support together.
diff --git a/docs/source/index.md b/docs/source/index.md
index 05d53fe..e5f9b41 100644
--- a/docs/source/index.md
+++ b/docs/source/index.md
@@ -36,6 +36,7 @@ By using vLLM Ascend plugin, popular open-source models, including Transformer-l
 quick_start
 installation
 tutorials
+faqs
 :::

 % What does vLLM Ascend Plugin support?
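The FAQ added by this patch states that device support hinges on torch-npu. As a hedged illustration (not part of the patch itself), a runtime probe along the following lines can report whether the torch-npu backend is usable in the current environment. The `torch_npu` import and the `torch.npu` namespace are assumptions based on Huawei's Ascend plugin for PyTorch, which registers that namespace when imported.

```python
def ascend_npu_available() -> bool:
    """Best-effort probe for a usable Ascend NPU via torch-npu.

    Assumes the Huawei `torch_npu` plugin, which registers a
    `torch.npu` device namespace on import. Returns False when
    the plugin (or PyTorch itself) is not installed, so the check
    is safe to run on machines without Ascend hardware.
    """
    try:
        import torch           # PyTorch core
        import torch_npu       # noqa: F401  Ascend plugin for PyTorch (assumed)
    except ImportError:
        return False
    # Guard the attribute lookup in case the plugin did not register torch.npu.
    npu = getattr(torch, "npu", None)
    return bool(npu is not None and npu.is_available())


if __name__ == "__main__":
    print("Ascend NPU usable:", ascend_npu_available())
```

On a host without torch-npu installed this simply returns `False`, which matches the FAQ's point: without torch-npu support for a device, vllm-ascend cannot target it except through custom ops.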