[Build] Add installation script of fused_infer_attention_score kernel with flash decoding (#5402)
### What this PR does / why we need it?
Add installation script of `fused_infer_attention_score` kernel with
flash decoding
### User-facing changes
Users can install the `fused_infer_attention_score` kernel with the flash
decoding feature by running `bash
tools/install_flash_infer_attention_score_ops_a2.sh` or `bash
tools/install_flash_infer_attention_score_ops_a3.sh`.
- vLLM version: release/v0.13.0
- vLLM main:
254f6b9867
---------
Signed-off-by: MengqingCao <cmq0113@163.com>
@@ -13,7 +13,7 @@ repos:
         args: [
           --toml, pyproject.toml,
           '--skip', 'csrc/**,tests/prompts/**,./benchmarks/sonnet.txt,*tests/lora/data/**,build/**,./vllm_ascend.egg-info/**,.github/**,typos.toml',
-          '-L', 'CANN,cann,NNAL,nnal,ASCEND,ascend,EnQue,CopyIn,ArchType,AND,ND'
+          '-L', 'CANN,cann,NNAL,nnal,ASCEND,ascend,EnQue,CopyIn,ArchType,AND,ND,tbe'
         ]
         additional_dependencies:
           - tomli
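For context, the diff only touches the codespell hook's `args`, appending `tbe` to the `-L` ignore-words list so codespell does not flag the Ascend TBE (Tensor Boost Engine) identifier as a misspelling. With the change applied, the hook section of `.pre-commit-config.yaml` would look roughly like the sketch below; the `repo` and `rev` values are assumptions, since they do not appear in this diff:

```yaml
repos:
  - repo: https://github.com/codespell-project/codespell
    rev: v2.3.0  # assumed pin; not shown in this diff
    hooks:
      - id: codespell
        args: [
          --toml, pyproject.toml,
          '--skip', 'csrc/**,tests/prompts/**,./benchmarks/sonnet.txt,*tests/lora/data/**,build/**,./vllm_ascend.egg-info/**,.github/**,typos.toml',
          '-L', 'CANN,cann,NNAL,nnal,ASCEND,ascend,EnQue,CopyIn,ArchType,AND,ND,tbe'
        ]
        additional_dependencies:
          - tomli
```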