From 065486820bb772f27823fbf82d569afa3ebb97f6 Mon Sep 17 00:00:00 2001
From: weiguihua2
Date: Mon, 29 Sep 2025 12:02:23 +0800
Subject: [PATCH] [Doc] add faqs:install vllm-ascend will overwrite existing
 torch-npu (#3245)

### What this PR does / why we need it?
Add a FAQ entry explaining that installing vllm-ascend will overwrite an existing torch-npu installation.

### Does this PR introduce _any_ user-facing change?

### How was this patch tested?
- vLLM version: v0.10.2
- vLLM main: https://github.com/vllm-project/vllm/commit/releases/v0.11.0

Signed-off-by: weiguihua2
---
 docs/source/faqs.md | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/docs/source/faqs.md b/docs/source/faqs.md
index 41ac033..bd2b479 100644
--- a/docs/source/faqs.md
+++ b/docs/source/faqs.md
@@ -211,3 +211,6 @@ Recommended mitigation strategies:
 
 Root cause analysis: The current stream requirement calculation for size captures only accounts for measurable factors including: data parallel size, tensor parallel size, expert parallel configuration, piece graph count, multistream overlap shared expert settings, and HCCL communication mode (AIV/AICPU). However, numerous unquantifiable elements - such as operator characteristics and specific hardware features - consume additional streams outside of this calculation framework, resulting in stream resource exhaustion during size capture operations.
+
+### 21. Does installing vllm-ascend overwrite an existing torch-npu package?
+Yes. Installing vllm-ascend overwrites the existing torch-npu package. If you need a specific version of torch-npu, install that version manually after installing vllm-ascend.
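The workaround in the new FAQ entry can be sketched as a short shell sequence. This is a minimal illustration only: the version pin shown is a placeholder you would replace with the torch-npu version you actually need, and `pip show` is used here merely to confirm which version ended up installed.

```shell
# Install vllm-ascend; this pulls in its own torch-npu and replaces
# any torch-npu that was previously installed in the environment.
pip install vllm-ascend

# Afterwards, reinstall the specific torch-npu version you require.
# <desired-version> is a placeholder, not a recommendation.
pip install torch-npu==<desired-version>

# Optionally verify which torch-npu version is now installed.
pip show torch-npu
```

Note that if vllm-ascend pins an incompatible torch-npu requirement, pip may warn about a dependency conflict after the manual reinstall; that warning is expected in this scenario.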