Refactor AscendMultiHeadLatentAttention (#2826)

### What this PR does / why we need it?
Registers AscendMultiHeadLatentAttention as a CustomOp, following upstream vLLM changes.
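For context, a minimal sketch of the CustomOp registration pattern this PR adopts. This is a simplified, self-contained illustration, not vLLM's actual `CustomOp` class; the registry name `multi_head_latent_attention` and the method names here are assumptions for illustration only.

```python
# Simplified sketch of a CustomOp-style registry (illustrative, not vLLM's code).
class CustomOp:
    op_registry: dict = {}

    @classmethod
    def register(cls, name: str):
        """Decorator that registers a subclass under a string key."""
        def decorator(op_cls):
            assert name not in cls.op_registry, f"duplicate op: {name}"
            cls.op_registry[name] = op_cls
            return op_cls
        return decorator


# Hypothetical registration; the real op name in vllm-ascend may differ.
@CustomOp.register("multi_head_latent_attention")
class AscendMultiHeadLatentAttention(CustomOp):
    """Placeholder for the Ascend MLA implementation."""

    def forward_oot(self, x):
        # The Ascend-specific (out-of-tree) forward path would go here.
        return x


print(CustomOp.op_registry["multi_head_latent_attention"].__name__)
```

Registering the op through a shared registry lets the platform plugin dispatch to the Ascend implementation by name instead of patching vLLM internals directly.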

### Does this PR introduce _any_ user-facing change?
N/A

### How was this patch tested?
CI passed with newly added and existing tests.


- vLLM version: main
- vLLM main: b23fb78623

---------

Signed-off-by: Icey <1790571317@qq.com>
Author: Icey
Date: 2025-09-10 11:26:11 +08:00 (committed by GitHub)
Parent: 168ad600b5
Commit: aa4d2a91ed
4 changed files with 170 additions and 48 deletions