EngineX / xc-llm-kunlun
Files at commit ded24f50264a889c808971636e7a7d4206ae5563
xc-llm-kunlun / vllm_kunlun / v1
Latest commit b015bb76fd by hanhaowen: remove qwen2.py llama.py fix llama output (2025-12-31 11:39:37 +08:00)
attention      remove qwen2.py llama.py fix llama output                                        2025-12-31 11:39:37 +08:00
sample/ops     [Bug] Fixed the issue where an error occurred when the request included a seed.  2025-12-18 13:03:34 +08:00
worker         Commit the vllm 0.11.0 development branch                                        2025-12-10 17:51:24 +08:00
__init__.py    Initial commit for vLLM-Kunlun Plugin                                            2025-12-10 12:05:39 +08:00