EngineX / xc-llm-kunlun
Files at commit e8f4e1337ca645dc2e8344cf12aca4a70c9dc6c3
Path: xc-llm-kunlun/vllm_kunlun/v1/attention
Latest commit: 58c1db5073 by ldh2020, 2025-12-21 10:34:43 +08:00
  [Bugfix] fix the bug of the flash_attention in Qwen3-Next

backends/      [Bugfix] fix the bug of the flash_attention in Qwen3-Next    2025-12-21 10:34:43 +08:00
__init__.py    Initial commit for vLLM-Kunlun Plugin                        2025-12-10 12:05:39 +08:00