[CI] enable custom ops build (#466)

### What this PR does / why we need it?
This PR enables the custom ops build by default.

### Does this PR introduce _any_ user-facing change?

Yes. Installing vllm-ascend from source will now trigger the custom ops
build step.

### How was this patch tested?
Verified via the image build and e2e CI.

---------

Signed-off-by: wangxiyuan <wangxiyuan1007@gmail.com>
Author: wangxiyuan
Date: 2025-04-12 10:24:53 +08:00 (committed by GitHub)
Parent: d05ea17427
Commit: 9c7428b3d5
22 changed files with 165 additions and 342 deletions


@@ -166,7 +166,7 @@ python -m vllm.entrypoints.openai.api_server \
```
:::{note}
-If you're running DeepSeek V3/R1, please remove `quantization_config` section in `config.json` file since it's not supported by vllm-ascend currentlly.
+If you're running DeepSeek V3/R1, please remove the `quantization_config` section in the `config.json` file since it's not supported by vllm-ascend currently.
:::
Once your server is started, you can query the model with input prompts:
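The note in the diff above asks DeepSeek V3/R1 users to delete the `quantization_config` section from the model's `config.json`. A minimal sketch of doing this programmatically, assuming a standard Hugging Face-style `config.json` (the helper name is illustrative, not part of vllm-ascend):

```python
import json

def strip_quantization_config(path: str) -> None:
    """Remove the unsupported `quantization_config` section from a model config.json."""
    with open(path) as f:
        config = json.load(f)
    # Drop the section if present; no-op if the model was never quantized.
    config.pop("quantization_config", None)
    with open(path, "w") as f:
        json.dump(config, f, indent=2)
```

Editing the file by hand and deleting the `"quantization_config": {...}` block works just as well; the script is only convenient when preparing many model directories.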