EngineX-Ascend/enginex-ascend-910-llama.cpp
enginex-ascend-910-llama.cpp/.devops @ c250ecb3157f3bae0a45f44c3c953b5414d4c2f7

Latest commit: Rudi Servo, 7c0e285858 — devops : add docker-multi-stage builds (#10832), 2024-12-22 23:22:58 +01:00
nix/                       nix: allow to override rocm gpu targets (#10794)                                          2024-12-14 10:17:36 -08:00
cloud-v-pipeline           build: rename main → llama-cli, server → llama-server, llava-cli → llama-llava-cli, etc... (#7809)  2024-06-13 00:41:52 +01:00
cpu.Dockerfile             devops : add docker-multi-stage builds (#10832)                                           2024-12-22 23:22:58 +01:00
cuda.Dockerfile            devops : add docker-multi-stage builds (#10832)                                           2024-12-22 23:22:58 +01:00
intel.Dockerfile           devops : add docker-multi-stage builds (#10832)                                           2024-12-22 23:22:58 +01:00
llama-cli-cann.Dockerfile  docker: use GGML_NATIVE=OFF (#10368)                                                      2024-11-18 00:21:53 +01:00
llama-cpp-cuda.srpm.spec   devops : remove clblast + LLAMA_CUDA -> GGML_CUDA (#8139)                                 2024-06-26 19:32:07 +03:00
llama-cpp.srpm.spec        build: rename main → llama-cli, server → llama-server, llava-cli → llama-llava-cli, etc... (#7809)  2024-06-13 00:41:52 +01:00
musa.Dockerfile            devops : add docker-multi-stage builds (#10832)                                           2024-12-22 23:22:58 +01:00
rocm.Dockerfile            devops : add docker-multi-stage builds (#10832)                                           2024-12-22 23:22:58 +01:00
tools.sh                   fix: graceful shutdown for Docker images (#10815)                                         2024-12-13 18:23:50 +01:00
vulkan.Dockerfile          devops : add docker-multi-stage builds (#10832)                                           2024-12-22 23:22:58 +01:00