### What this PR does / why we need it?
Add a Qwen3-235B tutorial covering the following examples:
- Single-node online deployment for 128k-context inference
- Multi-node deployment with MP
- vLLM version: v0.12.0
- vLLM main:
ad32e3e19c
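The single-node 128k-context example can be sketched as an OpenAI-compatible serving command. This is a minimal illustration, not the tutorial's exact invocation: the model path, the tensor-parallel degree, and the context length below are assumptions to be adjusted per hardware.

```shell
# Hypothetical sketch of single-node online serving with a 128k context window.
# Model path and --tensor-parallel-size are illustrative assumptions.
vllm serve Qwen/Qwen3-235B-A22B \
  --tensor-parallel-size 8 \
  --max-model-len 131072
```

Once running, the server exposes the OpenAI-compatible API (port 8000 by default), so clients can target `/v1/chat/completions` as usual.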
---------
Signed-off-by: xuyexiong <xuyexiong@huawei.com>
Co-authored-by: wangxiyuan <wangxiyuan1007@gmail.com>
Tutorials
:::{toctree}
:caption: Deployment
:maxdepth: 1
single_npu
single_npu_qwen2.5_vl
single_npu_qwen2_audio
single_npu_qwen3_embedding
single_npu_qwen3_quantization
single_npu_qwen3_w4a4
single_node_pd_disaggregation_llmdatadist
multi_npu_qwen3_next
multi_npu
multi_npu_moge
multi_npu_qwen3_moe
multi_npu_quantization
single_node_300i
DeepSeek-V3.1.md
DeepSeek-V3.2-Exp.md
Qwen3-235B-A22B.md
Qwen3-Coder-30B-A3B
multi_node
multi_node_kimi
multi_node_qwen3vl
multi_node_pd_disaggregation_mooncake
multi_node_ray
Qwen2.5-Omni.md
:::