[CI] enable custom ops build (#466)
### What this PR does / why we need it?
This PR enables the custom ops build by default.

### Does this PR introduce _any_ user-facing change?
Yes. Installing vllm-ascend from source will now trigger the custom ops build step.

### How was this patch tested?
By image build and e2e CI.

---------

Signed-off-by: wangxiyuan <wangxiyuan1007@gmail.com>
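A minimal sketch of the user-facing change described above: a from-source install, which after this PR runs the custom ops build step by default. The exact clone URL and install flags are assumptions based on a standard pip-based source install; check the project README for the authoritative steps.

```shell
# Sketch of installing vllm-ascend from source (assumed standard layout).
# After this PR, the install triggers the custom ops build step by default.
git clone https://github.com/vllm-project/vllm-ascend.git
cd vllm-ascend
pip install -v -e .   # -v surfaces the custom ops build output in the log
```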
````diff
@@ -166,7 +166,7 @@ python -m vllm.entrypoints.openai.api_server \
 ```
 
 :::{note}
-If you're running DeepSeek V3/R1, please remove `quantization_config` section in `config.json` file since it's not supported by vllm-ascend currentlly.
+If you're running DeepSeek V3/R1, please remove `quantization_config` section in `config.json` file since it's not supported by vllm-ascend currently.
 :::
 
 Once your server is started, you can query the model with input prompts:
````
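The doc line touched by this diff ends with "you can query the model with input prompts:". As a hedged illustration of what such a query looks like, here is a sketch that builds a request for vLLM's standard OpenAI-compatible `/v1/completions` endpoint; the port (8000), model name, and `max_tokens` value are assumptions, not values from this PR.

```python
import json
import urllib.request

def build_completion_request(prompt: str,
                             model: str = "deepseek-v3",  # hypothetical model name
                             base_url: str = "http://localhost:8000"):
    """Build a POST request for vLLM's OpenAI-compatible completions endpoint."""
    payload = {"model": model, "prompt": prompt, "max_tokens": 64}
    return urllib.request.Request(
        f"{base_url}/v1/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_completion_request("Hello, my name is")
print(req.full_url)  # http://localhost:8000/v1/completions
```

Sending the request with `urllib.request.urlopen(req)` would return the server's JSON completion once the server from the command above is running.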