From de49fb3deba8746384e0bb9b6fa7eb2012745fe6 Mon Sep 17 00:00:00 2001
From: Canlin Guo
Date: Mon, 10 Nov 2025 11:50:12 +0800
Subject: [PATCH] [Feature][Build] Upgrade the minimum version to 3.10 (#3926)

### What this PR does / why we need it?

Closes #3728, #3657.

The main branch is now aligned with the vllm `releases/v0.11.1` branch,
which no longer supports `Python 3.9`. Check it
[here](https://github.com/vllm-project/vllm/blob/releases/v0.11.1/pyproject.toml).

### Does this PR introduce _any_ user-facing change?

The newest version of vllm-ascend no longer supports Python 3.9.

### How was this patch tested?

- vLLM version: v0.11.0
- vLLM main: https://github.com/vllm-project/vllm/commit/83f478bb19489b41e9d208b47b4bb5a95ac171ac

Signed-off-by: gcanlin
---
 README.md                                     | 2 +-
 README.zh.md                                  | 2 +-
 docs/source/installation.md                   | 2 +-
 .../mooncake_connector_deployment_guide.md    | 2 +-
 setup.py                                      | 3 +--
 5 files changed, 5 insertions(+), 6 deletions(-)

diff --git a/README.md b/README.md
index 1d0529bf..281c6375 100644
--- a/README.md
+++ b/README.md
@@ -41,7 +41,7 @@ By using vLLM Ascend plugin, popular open-source models, including Transformer-l
 - Hardware: Atlas 800I A2 Inference series, Atlas A2 Training series, Atlas 800I A3 Inference series, Atlas A3 Training series, Atlas 300I Duo (Experimental)
 - OS: Linux
 - Software:
-  * Python >= 3.9, < 3.12
+  * Python >= 3.10, < 3.12
   * CANN >= 8.3.rc1 (Ascend HDK version refers to [here](https://www.hiascend.com/document/detail/zh/canncommercial/83RC1/releasenote/releasenote_0000.html))
   * PyTorch == 2.7.1, torch-npu == 2.7.1
   * vLLM (the same version as vllm-ascend)
diff --git a/README.zh.md b/README.zh.md
index a28056fa..77bf2363 100644
--- a/README.zh.md
+++ b/README.zh.md
@@ -42,7 +42,7 @@ vLLM 昇腾插件 (`vllm-ascend`) 是一个由社区维护的让vLLM在Ascend NP
 - 硬件:Atlas 800I A2 Inference系列、Atlas A2 Training系列、Atlas 800I A3 Inference系列、Atlas A3 Training系列、Atlas 300I Duo(实验性支持)
 - 操作系统:Linux
 - 软件:
-  * Python >= 3.9, < 3.12
+  * Python >= 3.10, < 3.12
   * CANN >= 8.3.rc1 (Ascend HDK 版本参考[这里](https://www.hiascend.com/document/detail/zh/canncommercial/83RC1/releasenote/releasenote_0000.html))
   * PyTorch == 2.7.1, torch-npu == 2.7.1
   * vLLM (与vllm-ascend版本一致)
diff --git a/docs/source/installation.md b/docs/source/installation.md
index fbc1c391..39f77d9c 100644
--- a/docs/source/installation.md
+++ b/docs/source/installation.md
@@ -5,7 +5,7 @@ This document describes how to install vllm-ascend manually.

 ## Requirements

 - OS: Linux
-- Python: >= 3.9, < 3.12
+- Python: >= 3.10, < 3.12
 - A hardware with Ascend NPU. It's usually the Atlas 800 A2 series.
 - Software:
diff --git a/examples/disaggregated_prefill_v1/mooncake_connector_deployment_guide.md b/examples/disaggregated_prefill_v1/mooncake_connector_deployment_guide.md
index 563357f8..45948a4d 100644
--- a/examples/disaggregated_prefill_v1/mooncake_connector_deployment_guide.md
+++ b/examples/disaggregated_prefill_v1/mooncake_connector_deployment_guide.md
@@ -3,7 +3,7 @@
 ## Environmental Dependencies

 * Software:
-  * Python >= 3.9, < 3.12
+  * Python >= 3.10, < 3.12
   * CANN >= 8.3.rc1
   * PyTorch == 2.7.1, torch-npu == 2.7.1
   * vLLM (same version as vllm-ascend)
diff --git a/setup.py b/setup.py
index 5a823e7a..8c49c96a 100644
--- a/setup.py
+++ b/setup.py
@@ -373,7 +373,6 @@ setup(
     },
     # TODO: Add 3.12 back when torch-npu support 3.12
     classifiers=[
-        "Programming Language :: Python :: 3.9",
         "Programming Language :: Python :: 3.10",
         "Programming Language :: Python :: 3.11",
         "License :: OSI Approved :: Apache Software License",
@@ -384,7 +383,7 @@ setup(
         "Topic :: Scientific/Engineering :: Information Analysis",
     ],
     packages=find_packages(exclude=("docs", "examples", "tests*", "csrc")),
-    python_requires=">=3.9",
+    python_requires=">=3.10",
     install_requires=get_requirements(),
     ext_modules=ext_modules,
     cmdclass=cmdclass,
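Not part of the patch, but to make the effect of the change concrete: `python_requires=">=3.10"` gates installation by interpreter version, and the docs additionally state a `< 3.12` upper bound (3.12 is excluded until torch-npu supports it). The sketch below models that combined range as a plain tuple comparison; the constants and the `is_supported` helper are illustrative names, not code from the repository.

```python
# The lower bound this PR raises (was (3, 9)) and the documented
# exclusive upper bound (3.12 pending torch-npu support).
MIN_PY = (3, 10)
MAX_PY = (3, 12)

def is_supported(major: int, minor: int) -> bool:
    """Return True if Python major.minor satisfies >= 3.10, < 3.12."""
    return MIN_PY <= (major, minor) < MAX_PY

print(is_supported(3, 9))   # False: dropped by this patch
print(is_supported(3, 10))  # True: new minimum
print(is_supported(3, 11))  # True
print(is_supported(3, 12))  # False: not yet supported by torch-npu
```

Tuple comparison is lexicographic, so this matches how pip evaluates the specifier `>=3.10,<3.12` against an interpreter's `(major, minor)` version.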