EngineX/xc-llm-kunlun
ff7131678a7bffb707b0649f03375bb1787c4c83
xc-llm-kunlun/vllm_kunlun
chenyili0619 2e2933d217 [Bug] Fixed the issue where an error occurred when the request included a seed.
2025-12-18 13:03:34 +08:00
compilation            Commit vllm 0.11.0 development branch                2025-12-10 17:51:24 +08:00
csrc                   Initial commit for vLLM-Kunlun Plugin                2025-12-10 12:05:39 +08:00
distributed            Commit vllm 0.11.0 development branch                2025-12-10 17:51:24 +08:00
models                 [Model] Update llama.py                              2025-12-15 21:28:56 +08:00
ops                    [Kernel] Optimize the performance of causal_conv1d.  2025-12-12 17:22:35 +08:00
patches                Commit vllm 0.11.0 development branch                2025-12-10 17:51:24 +08:00
platforms              Commit vllm 0.11.0 development branch                2025-12-10 17:51:24 +08:00
tests                  Initial commit for vLLM-Kunlun Plugin                2025-12-10 12:05:39 +08:00
v1                     [Bug] Fixed the issue where an error occurred when the request included a seed.  2025-12-18 13:03:34 +08:00
worker                 Commit vllm 0.11.0 development branch                2025-12-10 17:51:24 +08:00
__init__.py            Commit vllm 0.11.0 development branch                2025-12-10 17:51:24 +08:00
utils.py               Commit vllm 0.11.0 development branch                2025-12-10 17:51:24 +08:00
vllm_utils_wrapper.py  Commit vllm 0.11.0 development branch                2025-12-10 17:51:24 +08:00