[v0.18.0][Doc] Translated Doc files 2026-04-14 (#8257)

## Auto-Translation Summary

Translated **102** files:

- <code>docs/source/locale/zh_CN/LC_MESSAGES/community/contributors.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/community/governance.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/community/user_stories/index.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/community/user_stories/llamafactory.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/community/versioning_policy.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/developer_guide/Design_Documents/patch.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/developer_guide/contribution/index.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/developer_guide/contribution/testing.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/developer_guide/evaluation/using_evalscope.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/developer_guide/evaluation/using_lm_eval.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/developer_guide/evaluation/using_opencompass.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/developer_guide/performance_and_debug/msprobe_guide.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/developer_guide/performance_and_debug/performance_benchmark.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/developer_guide/performance_and_debug/service_profiling_guide.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/faqs.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/index.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/installation.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/quick_start.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/configuration/additional_config.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/feature_guide/graph_mode.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/feature_guide/lora.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/feature_guide/quantization.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/feature_guide/sleep_mode.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/feature_guide/structured_output.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/release_notes.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/support_matrix/index.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/support_matrix/supported_features.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/support_matrix/supported_models.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/developer_guide/Design_Documents/ACL_Graph.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/developer_guide/Design_Documents/KV_Cache_Pool_Guide.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/developer_guide/Design_Documents/ModelRunner_prepare_inputs.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/developer_guide/Design_Documents/add_custom_aclnn_op.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/developer_guide/Design_Documents/context_parallel.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/developer_guide/Design_Documents/cpu_binding.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/developer_guide/Design_Documents/disaggregated_prefill.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/developer_guide/Design_Documents/eplb_swift_balancer.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/developer_guide/Design_Documents/npugraph_ex.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/developer_guide/Design_Documents/quantization.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/developer_guide/contribution/multi_node_test.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/developer_guide/evaluation/using_ais_bench.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/developer_guide/performance_and_debug/optimization_and_tuning.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/features/index.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/features/long_sequence_context_parallel_multi_node.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/features/long_sequence_context_parallel_single_node.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/features/pd_colocated_mooncake_multi_instance.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/features/pd_disaggregation_mooncake_multi_node.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/features/pd_disaggregation_mooncake_single_node.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/features/ray.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/features/suffix_speculative_decoding.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/hardwares/310p.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/hardwares/index.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/DeepSeek-R1.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/DeepSeek-V3.1.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/DeepSeek-V3.2.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/GLM4.x.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/GLM5.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/Kimi-K2-Thinking.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/Kimi-K2.5.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/MiniMax-M2.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/PaddleOCR-VL.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/Qwen-VL-Dense.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/Qwen2.5-7B.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/Qwen2.5-Omni.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/Qwen3-235B-A22B.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/Qwen3-30B-A3B.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/Qwen3-32B-W4A4.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/Qwen3-8B-W4A8.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/Qwen3-Coder-30B-A3B.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/Qwen3-Dense.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/Qwen3-Next.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/Qwen3-Omni-30B-A3B-Thinking.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/Qwen3-VL-235B-A22B-Instruct.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/Qwen3-VL-30B-A3B-Instruct.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/Qwen3-VL-Embedding.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/Qwen3-VL-Reranker.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/Qwen3.5-27B.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/Qwen3.5-397B-A17B.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/Qwen3_embedding.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/Qwen3_reranker.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/tutorials/models/index.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/deployment_guide/index.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/deployment_guide/using_volcano_kthena.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/feature_guide/Fine_grained_TP.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/feature_guide/Multi_Token_Prediction.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/feature_guide/batch_invariance.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/feature_guide/context_parallel.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/feature_guide/cpu_binding.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/feature_guide/dynamic_batch.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/feature_guide/epd_disaggregation.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/feature_guide/eplb_swift_balancer.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/feature_guide/external_dp.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/feature_guide/kv_pool.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/feature_guide/large_scale_ep.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/feature_guide/layer_sharding.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/feature_guide/lmcache_ascend_deployment.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/feature_guide/netloader.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/feature_guide/npugraph_ex.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/feature_guide/rfork.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/feature_guide/sequence_parallelism.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/feature_guide/speculative_decoding.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/feature_guide/ucm_deployment.po</code>
- <code>docs/source/locale/zh_CN/LC_MESSAGES/user_guide/feature_guide/weight_prefetch.po</code>
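The catalogs above follow Sphinx's standard gettext layout, so each `.po` path maps mechanically back to the English source page it localizes. A minimal sketch of that mapping, assuming the usual `locale/<lang>/LC_MESSAGES/` layout and Markdown sources (the `po_to_source_doc` helper name is hypothetical, not part of this repository):

```python
from pathlib import PurePosixPath

def po_to_source_doc(po_path: str) -> str:
    """Map a translated catalog under locale/<lang>/LC_MESSAGES/ back to
    the English source document it localizes (assumes Markdown sources)."""
    parts = PurePosixPath(po_path).parts
    i = parts.index("LC_MESSAGES")  # everything after this is the doc path
    rel = PurePosixPath(*parts[i + 1:]).with_suffix(".md")
    return str(PurePosixPath("docs/source") / rel)

print(po_to_source_doc(
    "docs/source/locale/zh_CN/LC_MESSAGES/user_guide/feature_guide/lora.po"))
# docs/source/user_guide/feature_guide/lora.md
```

This is why the `#:` reference comments in the catalogs point at `../../source/*.md`: the catalog tree mirrors the source tree one-to-one.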

---

[Workflow run](https://github.com/vllm-project/vllm-ascend/actions/runs/24390263284)

Signed-off-by: vllm-ascend-ci <vllm-ascend-ci@users.noreply.github.com>
Co-authored-by: vllm-ascend-ci <vllm-ascend-ci@users.noreply.github.com>
This commit is contained in: authored by vllm-ascend-ci on 2026-04-15 15:27:09 +08:00, committed by GitHub.
parent b6aa5bbdbf, commit 147b589f62
102 changed files with 41760 additions and 6023 deletions


```diff
--- a/docs/source/locale/zh_CN/LC_MESSAGES/quick_start.po
+++ b/docs/source/locale/zh_CN/LC_MESSAGES/quick_start.po
@@ -4,146 +4,150 @@
 # package.
 # FIRST AUTHOR <EMAIL@ADDRESS>, 2025.
 #
 #, fuzzy
 msgid ""
 msgstr ""
-"Project-Id-Version: vllm-ascend\n"
+"Project-Id-Version: vllm-ascend\n"
 "Report-Msgid-Bugs-To: \n"
-"POT-Creation-Date: 2025-07-18 09:01+0800\n"
+"POT-Creation-Date: 2026-04-14 09:08+0000\n"
 "PO-Revision-Date: 2025-07-18 10:09+0800\n"
 "Last-Translator: \n"
-"Language-Team: zh_CN <LL@li.org>\n"
 "Language: zh_CN\n"
+"Language-Team: zh_CN <LL@li.org>\n"
+"Plural-Forms: nplurals=1; plural=0;\n"
 "MIME-Version: 1.0\n"
 "Content-Type: text/plain; charset=utf-8\n"
 "Content-Transfer-Encoding: 8bit\n"
-"Plural-Forms: nplurals=1; plural=0;\n"
-"Generated-By: Babel 2.17.0\n"
-"X-Generator: Poedit 3.5\n"
+"Generated-By: Babel 2.18.0\n"
 
-#: ../../quick_start.md:1
+#: ../../source/quick_start.md:1
 msgid "Quickstart"
 msgstr "快速入门"
 
-#: ../../quick_start.md:3
+#: ../../source/quick_start.md:3
 msgid "Prerequisites"
 msgstr "先决条件"
 
-#: ../../quick_start.md:5
+#: ../../source/quick_start.md:5
 msgid "Supported Devices"
 msgstr "支持的设备"
 
-#: ../../quick_start.md:6
+#: ../../source/quick_start.md:7
 msgid ""
-"Atlas A2 Training series (Atlas 800T A2, Atlas 900 A2 PoD, Atlas 200T A2 "
+"Atlas A2 training series (Atlas 800T A2, Atlas 900 A2 PoD, Atlas 200T A2 "
 "Box16, Atlas 300T A2)"
 msgstr ""
-"Atlas A2 训练系列(Atlas 800T A2、Atlas 900 A2 PoD、Atlas 200T A2 Box16、"
-"Atlas 300T A2)"
+"Atlas A2 训练系列(Atlas 800T A2、Atlas 900 A2 PoD、Atlas 200T A2 Box16、Atlas "
+"300T A2)"
 
-#: ../../quick_start.md:7
-msgid "Atlas 800I A2 Inference series (Atlas 800I A2)"
+#: ../../source/quick_start.md:8
+msgid "Atlas 800I A2 inference series (Atlas 800I A2)"
 msgstr "Atlas 800I A2 推理系列(Atlas 800I A2)"
 
-#: ../../quick_start.md:9
+#: ../../source/quick_start.md:9
 msgid ""
 "Atlas A3 training series (Atlas 800T A3, Atlas 900 A3 SuperPoD, Atlas "
 "9000 A3 SuperPoD)"
 msgstr ""
 "Atlas A3 训练系列(Atlas 800T A3、Atlas 900 A3 SuperPoD、Atlas 9000 A3 SuperPoD)"
 
+#: ../../source/quick_start.md:10
+msgid "Atlas 800I A3 inference series (Atlas 800I A3)"
+msgstr "Atlas 800I A3 推理系列(Atlas 800I A3)"
+
+#: ../../source/quick_start.md:11
+msgid "[Experimental] Atlas 300I inference series (Atlas 300I Duo)"
+msgstr "[实验性] Atlas 300I 推理系列(Atlas 300I Duo)"
+
 #: ../../source/quick_start.md:13
 msgid "Setup environment using container"
 msgstr "使用容器设置环境"
 
-#: ../../quick_start.md
+#: ../../source/quick_start.md
 msgid "Ubuntu"
 msgstr "Ubuntu"
 
-#: ../../quick_start.md
+#: ../../source/quick_start.md
 msgid "openEuler"
 msgstr "openEuler"
 
-#: ../../quick_start.md:69
+#: ../../source/quick_start.md:85
 msgid ""
-"The default workdir is `/workspace`, vLLM and vLLM Ascend code are placed "
-"in `/vllm-workspace` and installed in [development mode](https://setuptools."
-"pypa.io/en/latest/userguide/development_mode.html)(`pip install -e`) to "
-"help developer immediately take place changes without requiring a new "
-"installation."
+"The default workdir is `/workspace`, vLLM and vLLM Ascend code are placed"
+" in `/vllm-workspace` and installed in [development "
+"mode](https://setuptools.pypa.io/en/latest/userguide/development_mode.html)"
+" (`pip install -e`) to help developers make changes effective immediately"
+" without requiring a new installation."
 msgstr ""
-"默认工作目录 `/workspace`,vLLM 和 vLLM Ascend 代码被放置在 `/vllm-"
-"workspace`,并以[开发模式](https://setuptools.pypa.io/en/latest/userguide/"
-"development_mode.html)(`pip install -e`)安装,以便开发者能够即时生效更改,"
-"而无需重新安装。"
+"默认工作目录 `/workspace`,vLLM 和 vLLM Ascend 代码位于 `/vllm-workspace` 目录下,并以[开发模式](https://setuptools.pypa.io/en/latest/userguide/development_mode.html)(`pip install -e`)安装,以便开发者能够即时生效更改,而无需重新安装。"
 
-#: ../../quick_start.md:71
+#: ../../source/quick_start.md:87
 msgid "Usage"
 msgstr "用法"
 
-#: ../../quick_start.md:73
-msgid "You can use Modelscope mirror to speed up download:"
-msgstr "可以使用 Modelscope 镜像来加速下载:"
+#: ../../source/quick_start.md:89
+msgid "You can use ModelScope mirror to speed up download:"
+msgstr "可以使用 ModelScope 镜像来加速下载:"
 
-#: ../../quick_start.md:80
+#: ../../source/quick_start.md:97
 msgid "There are two ways to start vLLM on Ascend NPU:"
 msgstr "在昇腾 NPU 上启动 vLLM 有两种方式:"
 
-#: ../../quick_start.md
+#: ../../source/quick_start.md
 msgid "Offline Batched Inference"
 msgstr "离线批量推理"
 
-#: ../../quick_start.md:86
+#: ../../source/quick_start.md:103
 msgid ""
 "With vLLM installed, you can start generating texts for list of input "
-"prompts (i.e. offline batch inferencing)."
-msgstr ""
-"安装了 vLLM 后,您可以开始为一系列输入提示生成文本(即离线批量推理)。"
+"prompts (i.e. offline batch inference)."
+msgstr "安装 vLLM 后,您可以开始为一系列输入提示生成文本(即离线批量推理)。"
 
-#: ../../quick_start.md:88
+#: ../../source/quick_start.md:105
 msgid ""
-"Try to run below Python script directly or use `python3` shell to generate "
-"texts:"
-msgstr ""
-"尝试直接运行下面的 Python 脚本,或者使用 `python3` 交互式命令行来生成文本:"
+"Try to run below Python script directly or use `python3` shell to "
+"generate texts:"
+msgstr "尝试直接运行下面的 Python 脚本,或者使用 `python3` 交互式环境来生成文本:"
 
-#: ../../quick_start.md
+#: ../../source/quick_start.md
 msgid "OpenAI Completions API"
 msgstr "OpenAI Completions API"
 
-#: ../../quick_start.md:114
+#: ../../source/quick_start.md:132
 msgid ""
 "vLLM can also be deployed as a server that implements the OpenAI API "
-"protocol. Run the following command to start the vLLM server with the [Qwen/"
-"Qwen2.5-0.5B-Instruct](https://huggingface.co/Qwen/Qwen2.5-0.5B-Instruct) "
-"model:"
+"protocol. Run the following command to start the vLLM server with the "
+"[Qwen/Qwen3-0.6B](https://huggingface.co/Qwen/Qwen3-0.6B) model:"
 msgstr ""
-"vLLM 也可以为实现 OpenAI API 协议的服务器进行部署。运行以下命令,使用 "
-"[Qwen/Qwen2.5-0.5B-Instruct](https://huggingface.co/Qwen/Qwen2.5-0.5B-"
-"Instruct) 模型启动 vLLM 服务器:"
+"vLLM 也可以部署为实现 OpenAI API 协议的服务器。运行以下命令,使用 [Qwen/Qwen3-0.6B](https://huggingface.co/Qwen/Qwen3-0.6B) 模型启动 vLLM 服务器:"
 
-#: ../../quick_start.md:124
-msgid "If you see log as below:"
-msgstr "如果看到如下日志:"
+#: ../../source/quick_start.md:143
+msgid "If you see a log as below:"
+msgstr "如果看到如下日志:"
 
-#: ../../quick_start.md:132
+#: ../../source/quick_start.md:152
 msgid "Congratulations, you have successfully started the vLLM server!"
-msgstr "恭喜,你已经成功启动 vLLM 服务器!"
+msgstr "恭喜,您已成功启动 vLLM 服务器!"
 
-#: ../../quick_start.md:134
-msgid "You can query the list the models:"
-msgstr "可以查询模型列表:"
+#: ../../source/quick_start.md:154
+msgid "You can query the list of models:"
+msgstr "可以查询模型列表:"
 
-#: ../../quick_start.md:141
+#: ../../source/quick_start.md:162
 msgid "You can also query the model with input prompts:"
-msgstr "也可以通过输入提示来查询模型:"
+msgstr "也可以通过输入提示来查询模型:"
 
-#: ../../quick_start.md:155
+#: ../../source/quick_start.md:177
 msgid ""
-"vLLM is serving as background process, you can use `kill -2 $VLLM_PID` to "
-"stop the background process gracefully, it's equal to `Ctrl-C` to stop "
-"foreground vLLM process:"
+"vLLM is serving as a background process, you can use `kill -2 $VLLM_PID` "
+"to stop the background process gracefully, which is similar to `Ctrl-C` "
+"for stopping the foreground vLLM process:"
 msgstr ""
-"vLLM 正作为后台进程运行,可以使用 `kill -2 $VLLM_PID` 来优雅地停止后台进"
-"程,这等同于使用 `Ctrl-C` 停止前台 vLLM 进程:"
+"vLLM 正作为后台进程运行,可以使用 `kill -2 $VLLM_PID` 来优雅地停止后台进程,这类似于使用 `Ctrl-C` 停止前台 vLLM 进程:"
 
-#: ../../quick_start.md:164
-msgid "You will see output as below:"
-msgstr "你将会看到如下输出:"
+#: ../../source/quick_start.md:186
+msgid "The output is as below:"
+msgstr "输出如下:"
 
-#: ../../quick_start.md:172
-msgid "Finally, you can exit container by using `ctrl-D`."
-msgstr "最后,可以通过按 `ctrl-D` 退出容器。"
+#: ../../source/quick_start.md:195
+msgid "Finally, you can exit the container by using `ctrl-D`."
+msgstr "最后,可以通过按 `ctrl-D` 退出容器。"
```
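The diff above shows the catalog header being regenerated: `POT-Creation-Date` is bumped and the generator moves from Babel 2.17.0/Poedit 3.5 to Babel 2.18.0. A minimal sketch of reading such a header with only the standard library (the `parse_po_header` helper is hypothetical; real tooling such as Babel parses catalogs far more robustly):

```python
# Sample .po metadata header, shaped like the one regenerated in the diff.
# r-string keeps the literal \n escapes that gettext headers contain.
PO_HEADER = r'''msgid ""
msgstr ""
"Project-Id-Version: vllm-ascend\n"
"POT-Creation-Date: 2026-04-14 09:08+0000\n"
"Language: zh_CN\n"
"Content-Type: text/plain; charset=utf-8\n"
"Generated-By: Babel 2.18.0\n"
'''

def parse_po_header(text: str) -> dict:
    """Collect 'Key: value' pairs from the quoted msgstr header lines."""
    fields = {}
    for raw in text.splitlines():
        line = raw.strip()
        if line.startswith('"') and ": " in line:
            # Drop the surrounding quotes and the trailing literal \n escape.
            body = line.strip('"').removesuffix("\\n")
            key, _, value = body.partition(": ")
            fields[key] = value
    return fields

meta = parse_po_header(PO_HEADER)
print(meta["POT-Creation-Date"])  # 2026-04-14 09:08+0000
```

A CI check along these lines could confirm that every regenerated catalog in a translation PR carries the expected creation date and generator version.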