[BugFix] Support redundant experts in EPLB (#3473)

This PR adds support for redundant experts in the EPLB (Expert Parallelism Load Balancer).

Key points:
- Use `global_num_experts = num_experts + num_redundant_experts` consistently (see the sketch below).
- Backward compatible when `num_redundant_experts=0`.
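
A minimal sketch of this sizing convention; the helper name below is hypothetical, while `num_experts` and `num_redundant_experts` mirror the fields named above:

```python
# Hypothetical helper illustrating the sizing convention this PR enforces.
def compute_global_num_experts(num_experts: int, num_redundant_experts: int = 0) -> int:
    """Total physical expert slots, including redundant copies.

    With num_redundant_experts == 0 this reduces to num_experts,
    which is what keeps the change backward compatible.
    """
    return num_experts + num_redundant_experts


assert compute_global_num_experts(256, 16) == 272  # redundant copies add physical slots
assert compute_global_num_experts(256) == 256      # old behavior preserved
```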

Tested
On a 16-rank setup (W8A8) with static EPLB and an expert_map_path, verifying that the router logits have the expected shape (see the sketch below) and that requests complete successfully.
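
A minimal sketch of the shape check, assuming the router emits one logit per physical expert slot; the tensor here is a stand-in, not the actual test code:

```python
import torch

num_tokens = 8
num_experts, num_redundant_experts = 256, 16
global_num_experts = num_experts + num_redundant_experts

# Stand-in for the model's router output on one rank.
router_logits = torch.randn(num_tokens, global_num_experts)

# With redundant experts enabled, the logits must cover every
# physical slot, not just the logical experts.
assert router_logits.shape == (num_tokens, global_num_experts)
```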

- vLLM version: v0.11.0rc3
- vLLM main: https://github.com/vllm-project/vllm/commit/v0.11.0

Signed-off-by: yechao237 <yechao20180411@gmail.com>

@@ -34,8 +34,8 @@ def test_determine_default_expert_map_multiple_worlds_with_redundant():
         rank_id=0,
         global_redundant_expert_num=1)
-    assert count == 3
-    assert torch.all(expert_map[0:3] >= 0)
+    assert count == 2
+    assert torch.all(expert_map[0:2] >= 0)


 def test_generate_log2phy_map_single_rank_holding():
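
For context on the updated asserts, here is a minimal sketch of what a default expert-map routine like `determine_default_expert_map` could compute. The contiguous-split policy and the example arguments (4 logical experts over 2 ranks) are assumptions chosen to reproduce the asserted values, not the repo's actual implementation:

```python
import torch


def determine_default_expert_map_sketch(
    global_expert_num: int,
    world_size: int,
    rank_id: int,
    global_redundant_expert_num: int = 0,
):
    """Illustrative only: contiguous split of logical experts across ranks.

    Returns (count, expert_map) where expert_map[e] is the local slot of
    logical expert e on this rank, or -1 if the rank does not host it.
    Redundant copies would occupy extra physical slots on top of this;
    their placement is up to the actual EPLB policy.
    """
    base = global_expert_num // world_size
    remainder = global_expert_num % world_size
    count = base + (1 if rank_id < remainder else 0)
    start = rank_id * base + min(rank_id, remainder)

    expert_map = torch.full((global_expert_num,), -1, dtype=torch.int32)
    expert_map[start:start + count] = torch.arange(count, dtype=torch.int32)
    return count, expert_map


# Assumed arguments: 4 logical experts, 2 ranks, rank 0, 1 redundant expert.
count, expert_map = determine_default_expert_map_sketch(4, 2, 0, 1)
assert count == 2
assert torch.all(expert_map[0:2] >= 0)
```

Under this reading, the per-rank count reflects only the logical experts hosted on the rank, while the redundant copy consumes an extra physical slot elsewhere, which would explain the count dropping from 3 to 2 in the updated test.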