### What this PR does / why we need it?

Support the KV-sharing feature in CLA (cross-layer attention) models, which share the KV cache across some layers.

- vLLM version: v0.12.0
- vLLM main: ad32e3e19c

---------

Signed-off-by: MengqingCao <cmq0113@163.com>
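The idea behind CLA-style KV sharing can be sketched as follows: only "owner" attention layers allocate a KV cache, while later layers that share resolve to the owner's buffer instead of allocating their own. This is a minimal, hypothetical illustration (the layer names, map shape, and `build_kv_caches` helper are invented for this sketch and are not vLLM's actual implementation):

```python
# Hypothetical sketch of cross-layer KV-cache sharing: only "owner"
# layers allocate a cache; sharing layers reuse the owner's buffer.
from typing import Dict, List


def build_kv_caches(layers: List[str],
                    shared_kv_map: Dict[str, str]) -> Dict[str, list]:
    """Allocate one KV cache per owner layer; alias sharing layers to it.

    shared_kv_map maps a sharing layer's name to the earlier layer
    whose KV cache it reuses (names here are illustrative only).
    """
    caches: Dict[str, list] = {}
    for name in layers:
        if name in shared_kv_map:
            target = shared_kv_map[name]
            # Reuse the target layer's cache object: no new allocation.
            caches[name] = caches[target]
        else:
            caches[name] = []  # stand-in for a real KV-cache tensor
    return caches


layers = ["attn.0", "attn.1", "attn.2", "attn.3"]
# Layers 2 and 3 share the KV caches of layers 0 and 1, respectively,
# so only two caches are actually allocated.
shared = {"attn.2": "attn.0", "attn.3": "attn.1"}
caches = build_kv_caches(layers, shared)
assert caches["attn.2"] is caches["attn.0"]  # same underlying buffer
```

The memory saving comes from the aliasing: with half the layers sharing, only half the KV-cache blocks need to be allocated.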