EngineX/xc-llm-kunlun
c91134fd09c2bf6bf9309202a9d1281d7d0f8bb1
xc-llm-kunlun/vllm_kunlun
hanhaowen a4b9e92ca1 [Kernel] Replace native torch solve_tril by solve_tril_fwd kernel op
2025-12-22 17:37:19 +08:00
compilation/           Submit vllm 0.11.0 development branch                                  2025-12-10 17:51:24 +08:00
csrc/                  Initial commit for vLLM-Kunlun Plugin                                  2025-12-10 12:05:39 +08:00
distributed/           Submit vllm 0.11.0 development branch                                  2025-12-10 17:51:24 +08:00
models/                [Model] Update llama.py                                                2025-12-15 21:28:56 +08:00
ops/                   [Kernel] Replace native torch solve_tril by solve_tril_fwd kernel op   2025-12-22 17:37:19 +08:00
patches/               Submit vllm 0.11.0 development branch                                  2025-12-10 17:51:24 +08:00
platforms/             Submit vllm 0.11.0 development branch                                  2025-12-10 17:51:24 +08:00
tests/                 Initial commit for vLLM-Kunlun Plugin                                  2025-12-10 12:05:39 +08:00
v1/                    [Bug] Fix error raised when a request includes a seed                  2025-12-18 13:03:34 +08:00
worker/                Submit vllm 0.11.0 development branch                                  2025-12-10 17:51:24 +08:00
__init__.py            Submit vllm 0.11.0 development branch                                  2025-12-10 17:51:24 +08:00
utils.py               Submit vllm 0.11.0 development branch                                  2025-12-10 17:51:24 +08:00
vllm_utils_wrapper.py  Submit vllm 0.11.0 development branch                                  2025-12-10 17:51:24 +08:00