diff --git a/docs/source/tutorials/multi_node_kimi.md b/docs/source/tutorials/multi_node_kimi.md
index 4848eaa..dfada85 100644
--- a/docs/source/tutorials/multi_node_kimi.md
+++ b/docs/source/tutorials/multi_node_kimi.md
@@ -5,7 +5,7 @@
 referring to [multi_node.md](https://vllm-ascend.readthedocs.io/en/latest/tutorials/multi_node.html#verification-process)
 
 ## Run with docker
-Assume you have two Atlas 800 A3(64G*16) nodes(or 4 *A2* 8), and want to deploy the `Kimi-K2-Instruct-W8A8` quantitative model across multi-node.
+Assume you have two Atlas 800 A3 (64G*16) nodes (or 4 * A2), and want to deploy the `Kimi-K2-Instruct-W8A8` quantized model across multiple nodes.
 
 ```{code-block} bash
 :substitutions:
diff --git a/docs/source/tutorials/single_npu_qwen3_quantization.md b/docs/source/tutorials/single_npu_qwen3_quantization.md
index 8f23d2c..56b4443 100644
--- a/docs/source/tutorials/single_npu_qwen3_quantization.md
+++ b/docs/source/tutorials/single_npu_qwen3_quantization.md
@@ -32,12 +32,15 @@
 see https://www.modelscope.cn/models/vllm-ascend/Qwen3-8B-W4A8
 :::
 ```bash
-# Optional, this commit has been verified
-git clone https://gitee.com/ascend/msit -b f8ab35a772a6c1ee7675368a2aa4bafba3bedd1a
-
+git clone https://gitee.com/ascend/msit
 cd msit/msmodelslim
+
+# Optional, this commit has been verified
+git checkout f8ab35a772a6c1ee7675368a2aa4bafba3bedd1a
+
 # Install by run this script
 bash install.sh
+pip install accelerate
 
 cd example/Qwen
 # Original weight path, Replace with your local model path
diff --git a/docs/source/user_guide/feature_guide/quantization.md b/docs/source/user_guide/feature_guide/quantization.md
index bde91ea..1caa491 100644
--- a/docs/source/user_guide/feature_guide/quantization.md
+++ b/docs/source/user_guide/feature_guide/quantization.md
@@ -12,10 +12,11 @@
 Install modelslim:
 
 ```bash
 git clone https://gitee.com/ascend/msit
+cd msit/msmodelslim
+
 # Optional, this commit has been verified
 git checkout f8ab35a772a6c1ee7675368a2aa4bafba3bedd1a
-cd msit/msmodelslim
 bash install.sh
 pip install accelerate
 ```