Initialize the project; model provided by the ModelHub XC community

Model: mradermacher/Geneva-12B-GCv2-500k-i1-GGUF
Source: Original Platform
Commit 51650a8fca by ModelHub XC, 2026-04-20 00:55:46 +08:00
27 changed files with 231 additions and 0 deletions

.gitattributes vendored Normal file (60 lines)

@@ -0,0 +1,60 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
imatrix.dat filter=lfs diff=lfs merge=lfs -text
Geneva-12B-GCv2-500k.i1-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
Geneva-12B-GCv2-500k.i1-IQ3_M.gguf filter=lfs diff=lfs merge=lfs -text
Geneva-12B-GCv2-500k.i1-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
Geneva-12B-GCv2-500k.i1-IQ3_XXS.gguf filter=lfs diff=lfs merge=lfs -text
Geneva-12B-GCv2-500k.i1-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
Geneva-12B-GCv2-500k.i1-IQ4_NL.gguf filter=lfs diff=lfs merge=lfs -text
Geneva-12B-GCv2-500k.i1-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
Geneva-12B-GCv2-500k.i1-IQ2_M.gguf filter=lfs diff=lfs merge=lfs -text
Geneva-12B-GCv2-500k.i1-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
Geneva-12B-GCv2-500k.i1-IQ4_XS.gguf filter=lfs diff=lfs merge=lfs -text
Geneva-12B-GCv2-500k.i1-Q2_K_S.gguf filter=lfs diff=lfs merge=lfs -text
Geneva-12B-GCv2-500k.i1-IQ1_M.gguf filter=lfs diff=lfs merge=lfs -text
Geneva-12B-GCv2-500k.i1-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
Geneva-12B-GCv2-500k.i1-IQ2_XXS.gguf filter=lfs diff=lfs merge=lfs -text
Geneva-12B-GCv2-500k.i1-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
Geneva-12B-GCv2-500k.i1-IQ2_XS.gguf filter=lfs diff=lfs merge=lfs -text
Geneva-12B-GCv2-500k.i1-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
Geneva-12B-GCv2-500k.i1-IQ2_S.gguf filter=lfs diff=lfs merge=lfs -text
Geneva-12B-GCv2-500k.i1-IQ1_S.gguf filter=lfs diff=lfs merge=lfs -text
Geneva-12B-GCv2-500k.i1-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
Geneva-12B-GCv2-500k.i1-Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
Geneva-12B-GCv2-500k.i1-IQ3_XS.gguf filter=lfs diff=lfs merge=lfs -text
Geneva-12B-GCv2-500k.i1-Q4_1.gguf filter=lfs diff=lfs merge=lfs -text
Geneva-12B-GCv2-500k.i1-IQ3_S.gguf filter=lfs diff=lfs merge=lfs -text


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:3d349294fb388b7ca629ae411cfc1d72d5808b7aa4d645f5243a61417ee576e6
size 3221629280
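Each of the large `.gguf` files tracked above is stored as a Git LFS pointer, a three-line text stub like the one shown (the real payload lives on the LFS server). As a minimal sketch, the format can be parsed like this; the `parse_lfs_pointer` helper is hypothetical, not part of Git or any tool in this repo:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file (version / oid / size lines) into a dict."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    algo, _, digest = fields["oid"].partition(":")
    return {
        "version": fields["version"],
        "oid_algo": algo,             # e.g. "sha256"
        "oid": digest,                # hex digest of the real file content
        "size": int(fields["size"]),  # payload size in bytes
    }

# The first pointer block from this commit:
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:3d349294fb388b7ca629ae411cfc1d72d5808b7aa4d645f5243a61417ee576e6
size 3221629280
"""
info = parse_lfs_pointer(pointer)
print(info["oid_algo"], info["size"])  # sha256 3221629280
```

The `size` field is the size of the actual model file, which is why the repo diff shows only 231 added lines while representing tens of gigabytes of weights.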


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2cda43cd60027ba151fedfeca7813c55db459c5d453c3f2bf7cae729ca073f7f
size 2999216480


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:373940d791d42c1423a2848fba309d83f71328efe2e3751ec064232c30a167d6
size 4435028320


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:bbb8809587b6cd7aed41ad73c87188e906bad3e8c06f90be0eacba68fb79425e
size 4138477920


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:95c56a40c7f59788164f64fd06233746d7d4646c0550c130125812a4cdc84657
size 3915082080


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d8acd9e9eada18006b1398734032660fec9321e4ca852cb74c9bbfdf9b2a620c
size 3592317280


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:be2055e6d42e301ec389d9ea55c15c7a677998b381dad865b655d0389982ca02
size 5722237280


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ffabc3f5a8bf223d62a3d99b77328aacdc72802dd738adb4b70545b02bd2a6af
size 5562083680


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:fa976fe7523644d09f37afb2d3527d6bad045ab74a873615e53f01c712c1a87a
size 5306493280


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:04bdc14ec34fdbfeca493395610d7848d72d24378e8e43e4015b596af9265bb5
size 4945389920


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:96f851520378df64005cb5d32462054206d37ceca0f3135478b1f9266be89750
size 7097919840


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:97b284fad1d287633c68596e5e811750d58495a03507ca66c6e6a861992dd9e0
size 6742714720


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:de386e008f24bbd9d719a83e3d71b00b142ff22fc43a35c97b1e768bd7aaa761
size 4791052640


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ec300755ddb3d074e57453a6d7e1c310ff187052c73c039d24618926796955c9
size 4493683040


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a7941fec64c3ba936094b4500ab4f10bcf6fbf232d63f2a1e706bbd8f74a2d31
size 6561507680


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:47f7cc1bd30c752220d8f37dccb1121ff4fa9eda4442077c1355fde92c3c3c57
size 6083094880


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d2347172833a1b7a5381328587b3b648453e2d1e83fa51f85a5a66d4cb4091eb
size 5534230880


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:33e85b87325c4fb9d79334ecff6fbce2f227eba91a62c0579efb89a93b32ba98
size 7094643040


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d7d7919b40299ded11d4a662e2654d6a166f31615ddaba9d82f7a9438abdb735
size 7795222880


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:1b87a182d21b056ef34bc6771ee22aa5cda6aa490c6c869daba7a7392905ea0f
size 7477209440


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:4d665c8e59b20d3de0d2ea05c088f7cbf8778b275dd4cfba6754bd4f9f0e63b5
size 7120202080


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0b56b92528ea984d70b453d7bff0baedb8bf957398feed382151573cc21eb076
size 8727636320


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0de8176d089d21729cdff0d93bdf64cfcd5948cfe7375d7c5b7ae7e6383a40cd
size 8518740320


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2dc0b9d1bab2dfe3310936b853e9ab4dc1e2f7fcc8a51a759052c7a2b608bb0e
size 10056214880

README.md Normal file (96 lines)

@@ -0,0 +1,96 @@
---
base_model: rubenroy/Geneva-12B-GCv2-500k
datasets:
- rubenroy/GammaCorpus-v2-500k
language:
- en
- fr
- de
- es
- it
- pt
- ru
- zh
- ja
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
tags:
- text-generation-inference
- transformers
- unsloth
- trl
- gammacorpus
- geneva
- chat
- mistral
- conversational
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/rubenroy/Geneva-12B-GCv2-500k
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/Geneva-12B-GCv2-500k-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
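None of the quants in this repo are split, but when a GGUF upload is split into parts, the parts are joined by simple byte-wise concatenation. A minimal sketch (the `model.gguf.part*` names are hypothetical, just for illustration):

```python
from pathlib import Path

def concat_parts(parts: list[Path], out: Path, chunk: int = 1 << 20) -> None:
    """Concatenate split model parts into one file, streaming in 1 MiB chunks
    so multi-gigabyte files never have to fit in memory."""
    with out.open("wb") as dst:
        for part in parts:
            with part.open("rb") as src:
                while block := src.read(chunk):
                    dst.write(block)

# Hypothetical usage, with parts named model.gguf.part1of2, model.gguf.part2of2:
# concat_parts(sorted(Path(".").glob("model.gguf.part*")), Path("model.gguf"))
```

Sorting the part paths before concatenating matters: the parts must be joined in order, or the resulting file will not be a valid GGUF.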
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Geneva-12B-GCv2-500k-i1-GGUF/resolve/main/Geneva-12B-GCv2-500k.i1-IQ1_S.gguf) | i1-IQ1_S | 3.1 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/Geneva-12B-GCv2-500k-i1-GGUF/resolve/main/Geneva-12B-GCv2-500k.i1-IQ1_M.gguf) | i1-IQ1_M | 3.3 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/Geneva-12B-GCv2-500k-i1-GGUF/resolve/main/Geneva-12B-GCv2-500k.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 3.7 | |
| [GGUF](https://huggingface.co/mradermacher/Geneva-12B-GCv2-500k-i1-GGUF/resolve/main/Geneva-12B-GCv2-500k.i1-IQ2_XS.gguf) | i1-IQ2_XS | 4.0 | |
| [GGUF](https://huggingface.co/mradermacher/Geneva-12B-GCv2-500k-i1-GGUF/resolve/main/Geneva-12B-GCv2-500k.i1-IQ2_S.gguf) | i1-IQ2_S | 4.2 | |
| [GGUF](https://huggingface.co/mradermacher/Geneva-12B-GCv2-500k-i1-GGUF/resolve/main/Geneva-12B-GCv2-500k.i1-IQ2_M.gguf) | i1-IQ2_M | 4.5 | |
| [GGUF](https://huggingface.co/mradermacher/Geneva-12B-GCv2-500k-i1-GGUF/resolve/main/Geneva-12B-GCv2-500k.i1-Q2_K_S.gguf) | i1-Q2_K_S | 4.6 | very low quality |
| [GGUF](https://huggingface.co/mradermacher/Geneva-12B-GCv2-500k-i1-GGUF/resolve/main/Geneva-12B-GCv2-500k.i1-Q2_K.gguf) | i1-Q2_K | 4.9 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/Geneva-12B-GCv2-500k-i1-GGUF/resolve/main/Geneva-12B-GCv2-500k.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 5.0 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Geneva-12B-GCv2-500k-i1-GGUF/resolve/main/Geneva-12B-GCv2-500k.i1-IQ3_XS.gguf) | i1-IQ3_XS | 5.4 | |
| [GGUF](https://huggingface.co/mradermacher/Geneva-12B-GCv2-500k-i1-GGUF/resolve/main/Geneva-12B-GCv2-500k.i1-Q3_K_S.gguf) | i1-Q3_K_S | 5.6 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/Geneva-12B-GCv2-500k-i1-GGUF/resolve/main/Geneva-12B-GCv2-500k.i1-IQ3_S.gguf) | i1-IQ3_S | 5.7 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/Geneva-12B-GCv2-500k-i1-GGUF/resolve/main/Geneva-12B-GCv2-500k.i1-IQ3_M.gguf) | i1-IQ3_M | 5.8 | |
| [GGUF](https://huggingface.co/mradermacher/Geneva-12B-GCv2-500k-i1-GGUF/resolve/main/Geneva-12B-GCv2-500k.i1-Q3_K_M.gguf) | i1-Q3_K_M | 6.2 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/Geneva-12B-GCv2-500k-i1-GGUF/resolve/main/Geneva-12B-GCv2-500k.i1-Q3_K_L.gguf) | i1-Q3_K_L | 6.7 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/Geneva-12B-GCv2-500k-i1-GGUF/resolve/main/Geneva-12B-GCv2-500k.i1-IQ4_XS.gguf) | i1-IQ4_XS | 6.8 | |
| [GGUF](https://huggingface.co/mradermacher/Geneva-12B-GCv2-500k-i1-GGUF/resolve/main/Geneva-12B-GCv2-500k.i1-Q4_0.gguf) | i1-Q4_0 | 7.2 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/Geneva-12B-GCv2-500k-i1-GGUF/resolve/main/Geneva-12B-GCv2-500k.i1-IQ4_NL.gguf) | i1-IQ4_NL | 7.2 | prefer IQ4_XS |
| [GGUF](https://huggingface.co/mradermacher/Geneva-12B-GCv2-500k-i1-GGUF/resolve/main/Geneva-12B-GCv2-500k.i1-Q4_K_S.gguf) | i1-Q4_K_S | 7.2 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/Geneva-12B-GCv2-500k-i1-GGUF/resolve/main/Geneva-12B-GCv2-500k.i1-Q4_K_M.gguf) | i1-Q4_K_M | 7.6 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Geneva-12B-GCv2-500k-i1-GGUF/resolve/main/Geneva-12B-GCv2-500k.i1-Q4_1.gguf) | i1-Q4_1 | 7.9 | |
| [GGUF](https://huggingface.co/mradermacher/Geneva-12B-GCv2-500k-i1-GGUF/resolve/main/Geneva-12B-GCv2-500k.i1-Q5_K_S.gguf) | i1-Q5_K_S | 8.6 | |
| [GGUF](https://huggingface.co/mradermacher/Geneva-12B-GCv2-500k-i1-GGUF/resolve/main/Geneva-12B-GCv2-500k.i1-Q5_K_M.gguf) | i1-Q5_K_M | 8.8 | |
| [GGUF](https://huggingface.co/mradermacher/Geneva-12B-GCv2-500k-i1-GGUF/resolve/main/Geneva-12B-GCv2-500k.i1-Q6_K.gguf) | i1-Q6_K | 10.2 | practically like static Q6_K |
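A rough way to compare the rows above is bits per weight: file size divided by parameter count. The ~12.25B parameter count used below is an assumption for a "12B" Mistral-family model, not a figure stated in this README:

```python
PARAMS = 12.25e9  # assumed parameter count for a "12B" model (not from this README)

def bits_per_weight(size_gb: float, params: float = PARAMS) -> float:
    """Approximate bits per weight from a file size given in decimal GB."""
    return size_gb * 1e9 * 8 / params

# Sizes taken from the table above:
for name, gb in [("i1-IQ1_S", 3.1), ("i1-Q4_K_M", 7.6), ("i1-Q6_K", 10.2)]:
    print(f"{name}: ~{bits_per_weight(gb):.2f} bpw")
```

Under this assumption, i1-IQ1_S lands around 2 bpw and i1-Q6_K around 6.7 bpw, which matches the usual naming convention where the leading digit loosely indicates bits per weight.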
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):
![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)
And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->

imatrix.dat Normal file (3 lines)

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:03275a5ddd03ec8a81d8e8dd5107496cd9be5cd7faef663818502fe356616aa1
size 7054405