EngineX-Hygon/sglang
Path: sglang/sgl-kernel/csrc/moe
Commit: 180ff5eecc2da2231eb3ef29f70aa8d62fd8e168
Latest commit: 8a5480528d [Refactor] Rename n_share_experts_fusion as num_fused_shared_experts (#6735), Cheng Wan, 2025-06-03
| File | Last commit | Date |
|------|-------------|------|
| cutlass_moe_helper.cu | [1/2] Add FP8 Blockscale MoE CUTLASS kernel for Blackwell (#5281) | 2025-04-22 |
| ep_moe_reorder_kernel.cu | [EP] Add cuda kernel for moe_ep_pre_reorder (#6699) | 2025-06-01 |
| fp8_blockwise_moe_kernel.cu | [2/2] Add python wrapper for CUTLASS FP8 Blockscale MoE Kernel (#5694) | 2025-05-16 |
| moe_align_kernel.cu | reduce moe_align_block_size_kernel small batch mode overhead (#5086) | 2025-04-09 |
| moe_fused_gate.cu | [Refactor] Rename n_share_experts_fusion as num_fused_shared_experts (#6735) | 2025-06-03 |
| moe_topk_softmax_kernels.cu | Add moe topk softmax templated from vllm (#4302) | 2025-03-14 |
| nvfp4_blockwise_moe.cu | [1/2] Add Kernel support for Cutlass based Fused FP4 MoE (#6093) | 2025-06-02 |
| prepare_moe_input.cu | [1/2] Add Kernel support for Cutlass based Fused FP4 MoE (#6093) | 2025-06-02 |