[misc] Add Kimi-K2 series to CI model list (#5656)

### What this PR does / why we need it?
Add the following models to CI so that nightly test cases can be added for them later:
- moonshotai/Kimi-K2-Thinking
- vllm-ascend/Kimi-K2-Instruct-W8A8

### How was this patch tested?

- vLLM version: v0.13.0
- vLLM main: 2f4e6548ef

---------

Signed-off-by: MrZ20 <2609716663@qq.com>
Signed-off-by: wangli <wangli858794774@gmail.com>
Co-authored-by: wangli <wangli858794774@gmail.com>
Commit: 1afbc01ed4 (parent: d6bb17f10e)
Author: SILONG ZENG
Date: 2026-01-07 11:32:48 +08:00
Committed by: GitHub
2 changed files with 14 additions and 7 deletions


@@ -19,26 +19,28 @@ jobs:
   download-models:
     if: contains(github.event.pull_request.labels.*.name, 'model-download')
     name: Download models from ModelScope
-    runs-on: linux-aarch64-a3-0
+    runs-on: linux-aarch64-a2-0
     container:
       image: swr.cn-southwest-2.myhuaweicloud.com/base_image/ascend-ci/vllm-ascend:nightly-cpu
     steps:
       - name: Install dependencies
         run: |
           apt-get update -y && apt-get install git jq -y
           pip config set global.index-url https://mirrors.tuna.tsinghua.edu.cn/pypi/web/simple
           pip install modelscope
       - name: Checkout PR branch
         uses: actions/checkout@v4
         with:
           fetch-depth: 0
       - name: Install dependencies
         run: |
           pip config set global.index-url https://mirrors.tuna.tsinghua.edu.cn/pypi/web/simple
           pip install modelscope jq
       - name: Extract new models from PR
         id: diff
         run: |
           set -euo pipefail
           git config --global --add safe.directory /__w/vllm-ascend/vllm-ascend
           JSON_PATH=".github/workflows/misc/model_list.json"
           git fetch origin main
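The extraction step is cut off at the hunk boundary above. A minimal runnable sketch of how such a step can compute the newly added models: diff the model list on the PR branch against main and keep entries unique to the PR. The stand-in file contents and the `comm`-based approach are assumptions for illustration, not the PR's exact script.

```shell
# Stand-ins for `jq -r '.[]' model_list.json` run on main and on the PR branch.
printf 'a/x\na/y\n' | sort > /tmp/main_models.txt
printf 'a/x\na/y\nvllm-ascend/Kimi-K2-Instruct-W8A8\n' | sort > /tmp/pr_models.txt
# comm -13 prints lines unique to the second file: models only on the PR branch.
comm -13 /tmp/main_models.txt /tmp/pr_models.txt > /tmp/new_models.txt
cat /tmp/new_models.txt   # the newly added model(s)
```

`comm` requires sorted input, hence the `sort` on both lists before diffing.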
@@ -59,9 +61,12 @@ jobs:
           cat /tmp/new_models.txt || true
       - name: Download new models (CLI)
         if: hashFiles('/tmp/new_models.txt') != ''
         run: |
           set -euo pipefail
           if [ ! -s /tmp/new_models.txt ]; then
             echo "No new models to download."
             exit 0
           fi
           while read -r model; do
             [ -z "$model" ] && continue
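The download loop's body is truncated at the end of the hunk. A sketch of its overall shape, with a stub function standing in for the real per-model download (the actual step presumably invokes the ModelScope CLI, but the exact command is elided here and the stub is an assumption):

```shell
# Stub standing in for the real download command (illustration only).
download() { echo "would download: $1"; }
# /tmp/new_models.txt as produced by the extraction step: one model per line,
# possibly containing blank lines.
printf 'moonshotai/Kimi-K2-Thinking\n\nvllm-ascend/Kimi-K2-Instruct-W8A8\n' > /tmp/new_models.txt
while read -r model; do
  [ -z "$model" ] && continue   # skip blank lines, as in the workflow
  download "$model"
done < /tmp/new_models.txt
```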

.github/workflows/misc/model_list.json

@@ -147,6 +147,7 @@
     "mistralai/Mistral-Small-3.1-24B-Instruct-2503",
     "mlx-community/DeepSeek-V3-3bit-bf16",
     "moonshotai/..__temp",
+    "moonshotai/Kimi-K2-Thinking",
     "moonshotai/Kimi-Linear-48B-A3B-Instruct",
     "neuralmagic/Qwen2.5-3B-quantized.w8a8",
     "nv-community/audio-flamingo-3",
@@ -180,6 +181,7 @@
     "vllm-ascend/DeepSeek-V3.2-W8A8-Pruning",
     "vllm-ascend/EAGLE-LLaMA3.1-Instruct-8B",
     "vllm-ascend/EAGLE3-LLaMA3.1-Instruct-8B",
+    "vllm-ascend/Kimi-K2-Instruct-W8A8",
     "vllm-ascend/Kimi-K2-Thinking-Pruning",
     "vllm-ascend/Llama-2-7b-hf",
     "vllm-ascend/Llama-3.2-3B-Instruct",
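Since the model list appears to be kept in alphabetical order within each namespace, a quick local check for duplicate entries can catch copy-paste mistakes before the CI diff step runs. This is a hypothetical helper, not part of the PR:

```shell
# Stand-in for the model names extracted from model_list.json
# (e.g. via `jq -r '.[]' .github/workflows/misc/model_list.json`).
printf 'moonshotai/Kimi-K2-Thinking\nvllm-ascend/Kimi-K2-Instruct-W8A8\n' > /tmp/model_names.txt
# `uniq -d` prints only repeated adjacent lines; on sorted input,
# empty output means the list has no duplicates.
dups=$(sort /tmp/model_names.txt | uniq -d)
if [ -n "$dups" ]; then
  echo "duplicate entries:"
  echo "$dups"
  exit 1
fi
echo "model list OK"
```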