[Quickfix] Fix dp+ep+tp error when sp chunked the hidden_states (#3246)
### What this PR does / why we need it?
Fix an in-place copy error under dp+ep+tp (data parallelism + expert parallelism + tensor parallelism) that occurs when sequence parallelism (sp) chunks the `hidden_states`.
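The failure mode can be sketched in plain NumPy (a hypothetical illustration of the shape mismatch, not the actual vLLM-Ascend code; buffer sizes and names are made up):

```python
import numpy as np

# With sp, each rank holds only a chunk of hidden_states, so an
# in-place copy into a buffer sized for the FULL sequence fails.
full_len, hidden = 8, 4
buffer = np.zeros((full_len, hidden))     # sized for the full sequence
chunk = np.ones((full_len // 2, hidden))  # this rank holds half the tokens

error = ""
try:
    buffer[:] = chunk  # in-place copy: shapes (8, 4) vs (4, 4) mismatch
except ValueError as e:
    error = str(e)

# The fix is to copy only into the slice this rank owns:
buffer[: chunk.shape[0]] = chunk
```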
### How was this patch tested?
Tested locally with the following script:
```bash
python examples/offline_data_parallel.py \
--model="Qwen/Qwen3-30B-A3B" \
--dp-size=2 \
--tp-size=2 \
--enable-expert-parallel
```
Signed-off-by: MengqingCao <cmq0113@163.com>
```diff
@@ -295,6 +295,7 @@ class AscendFusedMoE(FusedMoE):
             in_dtype=params_dtype,
         )
         self.moe_config = moe
+        # TODO: The self.moe_config.tp_size here is not correct, fixme soon

         if quant_config is None:
             self.quant_method = AscendUnquantizedFusedMoEMethod(moe)
```