EngineX/xc-llm-ascend
Directory: xc-llm-ascend/tests/ut/ops (at commit 973a7cfdf08b6e2e568f244f244c8006305900f76456)
Latest commit: 3f867ee708 by sherie, 2025-08-20 14:20:46 +08:00
refactor allgather/mc2-related fused_experts (#2369)

### What this PR does / why we need it?
Refactor the allgather/mc2-related fused_experts.

- vLLM version: v0.10.0
- vLLM main: de7b67a023

Signed-off-by: wangxiaoxin-sherie <wangxiaoxin7@huawei.com>
Co-authored-by: wangxiaoxin-sherie <wangxiaoxin7@huawei.com>
| File | Last commit | Date |
| --- | --- | --- |
| expert_map.json | Add unit test local cpu guide and enable base testcase (#1566) | 2025-07-06 10:42:27 +08:00 |
| test_activation.py | [1/N][CustomOp] Register activation customop instead of overwrite forward_oot (#1841) | 2025-07-18 23:07:14 +08:00 |
| test_expert_load_balancer.py | Add unit test local cpu guide and enable base testcase (#1566) | 2025-07-06 10:42:27 +08:00 |
| test_fused_ops.py | Fix some ci issue and refactor modelrunner (#2445) | 2025-08-20 09:01:04 +08:00 |
| test_layernorm.py | [CustomOp] Register RMSNorm instead of overwrite forward_oot (#2284) | 2025-08-14 17:18:30 +08:00 |
| test_rotary_embedding.py | [FOLLOWUP] Use base test to avoid patch everwhere (#1634) | 2025-07-22 09:03:40 +08:00 |
| test_token_dispatcher.py | refactor allgather/mc2-related fused_experts (#2369) | 2025-08-20 14:20:46 +08:00 |
| test_vocab_parallel_embedding.py | add ut for vocab_parallel_embedding (#2067) | 2025-07-30 14:35:45 +08:00 |