From c3c8c9317c9afa2b3ff52dd1cd3b34367fe32b45 Mon Sep 17 00:00:00 2001
From: yupeng <507435917@qq.com>
Date: Wed, 2 Jul 2025 14:41:31 +0800
Subject: [PATCH] [DOC] add LoRA user guide (#1265)

### What this PR does / why we need it?
Add a LoRA user guide to the docs. The content refers to [LoRA Adapters](https://docs.vllm.ai/en/latest/features/lora.html).

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
No

---------

Signed-off-by: paulyu12 <507435917@qq.com>
---
 docs/source/index.md           | 1 +
 docs/source/user_guide/lora.md | 8 ++++++++
 2 files changed, 9 insertions(+)
 create mode 100644 docs/source/user_guide/lora.md

diff --git a/docs/source/index.md b/docs/source/index.md
index 4030709..3842b1e 100644
--- a/docs/source/index.md
+++ b/docs/source/index.md
@@ -49,6 +49,7 @@ user_guide/env_vars
 user_guide/additional_config
 user_guide/sleep_mode
 user_guide/graph_mode.md
+user_guide/lora.md
 user_guide/quantization.md
 user_guide/release_notes
 user_guide/structured_output

diff --git a/docs/source/user_guide/lora.md b/docs/source/user_guide/lora.md
new file mode 100644
index 0000000..d13ba2b
--- /dev/null
+++ b/docs/source/user_guide/lora.md
@@ -0,0 +1,8 @@
+# LoRA Adapters
+
+Like vLLM, vllm-ascend supports LoRA as well. Usage and further details can be found in the [vLLM official documentation](https://docs.vllm.ai/en/latest/features/lora.html).
+
+You can also refer to [the list of supported models](https://docs.vllm.ai/en/latest/models/supported_models.html#list-of-text-only-language-models) to see which models support LoRA in vLLM.
+
+## Tips
+If vllm-ascend fails to run with LoRA, you can follow [this instruction](https://vllm-ascend.readthedocs.io/en/latest/user_guide/graph_mode.html#fallback-to-eager-mode) to disable graph mode and try again.
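
For reviewers, a minimal offline-inference sketch of the upstream vLLM LoRA API that this guide points to (the model name and adapter path are illustrative placeholders, not part of this patch; running it requires a working vllm-ascend install on an Ascend NPU):

```python
# Sketch of serving a LoRA adapter with vLLM's offline API, as described in
# the linked LoRA Adapters docs. Model and adapter path are placeholders.
from vllm import LLM, SamplingParams
from vllm.lora.request import LoRARequest

# enable_lora=True tells the engine to reserve capacity for LoRA adapters.
llm = LLM(model="meta-llama/Llama-2-7b-hf", enable_lora=True)
sampling_params = SamplingParams(temperature=0, max_tokens=64)

outputs = llm.generate(
    ["Hello, my name is"],
    sampling_params,
    # LoRARequest(adapter name, unique integer id, path to adapter weights)
    lora_request=LoRARequest("my_adapter", 1, "/path/to/lora/adapter"),
)
print(outputs[0].outputs[0].text)
```

If graph mode causes failures here, the Tips section above applies: disable graph mode per the linked fallback-to-eager-mode instruction and retry.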