Initialize the project; model provided by the ModelHub XC community

Model: mradermacher/EasyContext-256K-danube2-1.8b-i1-GGUF
Source: Original Platform
ModelHub XC
2026-05-03 10:20:16 +08:00
commit 5409a5b0fa
27 changed files with 210 additions and 0 deletions

.gitattributes vendored Normal file (60 lines)

@@ -0,0 +1,60 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
imatrix.dat filter=lfs diff=lfs merge=lfs -text
EasyContext-256K-danube2-1.8b.i1-IQ1_M.gguf filter=lfs diff=lfs merge=lfs -text
EasyContext-256K-danube2-1.8b.i1-IQ1_S.gguf filter=lfs diff=lfs merge=lfs -text
EasyContext-256K-danube2-1.8b.i1-IQ2_M.gguf filter=lfs diff=lfs merge=lfs -text
EasyContext-256K-danube2-1.8b.i1-IQ2_S.gguf filter=lfs diff=lfs merge=lfs -text
EasyContext-256K-danube2-1.8b.i1-IQ2_XS.gguf filter=lfs diff=lfs merge=lfs -text
EasyContext-256K-danube2-1.8b.i1-IQ2_XXS.gguf filter=lfs diff=lfs merge=lfs -text
EasyContext-256K-danube2-1.8b.i1-IQ3_M.gguf filter=lfs diff=lfs merge=lfs -text
EasyContext-256K-danube2-1.8b.i1-IQ3_S.gguf filter=lfs diff=lfs merge=lfs -text
EasyContext-256K-danube2-1.8b.i1-IQ3_XS.gguf filter=lfs diff=lfs merge=lfs -text
EasyContext-256K-danube2-1.8b.i1-IQ3_XXS.gguf filter=lfs diff=lfs merge=lfs -text
EasyContext-256K-danube2-1.8b.i1-IQ4_NL.gguf filter=lfs diff=lfs merge=lfs -text
EasyContext-256K-danube2-1.8b.i1-IQ4_XS.gguf filter=lfs diff=lfs merge=lfs -text
EasyContext-256K-danube2-1.8b.i1-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
EasyContext-256K-danube2-1.8b.i1-Q2_K_S.gguf filter=lfs diff=lfs merge=lfs -text
EasyContext-256K-danube2-1.8b.i1-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
EasyContext-256K-danube2-1.8b.i1-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
EasyContext-256K-danube2-1.8b.i1-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
EasyContext-256K-danube2-1.8b.i1-Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
EasyContext-256K-danube2-1.8b.i1-Q4_1.gguf filter=lfs diff=lfs merge=lfs -text
EasyContext-256K-danube2-1.8b.i1-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
EasyContext-256K-danube2-1.8b.i1-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
EasyContext-256K-danube2-1.8b.i1-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
EasyContext-256K-danube2-1.8b.i1-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
EasyContext-256K-danube2-1.8b.i1-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
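The `.gitattributes` rules above route matching files through Git LFS. As a rough sketch, the matching behaves like shell-style globbing against the file name; the snippet below checks a few names against a subset of the patterns listed above. Note this uses Python's `fnmatch`, which is a simplification of Git's actual wildmatch semantics.

```python
from fnmatch import fnmatch

# A subset of the LFS patterns from the .gitattributes above.
lfs_patterns = [
    "*.bin",
    "*.safetensors",
    "imatrix.dat",
    "EasyContext-256K-danube2-1.8b.i1-Q4_K_M.gguf",
]

def tracked_by_lfs(path: str) -> bool:
    """True if any LFS pattern matches the file name (simplified matching)."""
    return any(fnmatch(path, pat) for pat in lfs_patterns)

print(tracked_by_lfs("imatrix.dat"))   # True
print(tracked_by_lfs("README.md"))     # False
```

Note that each `.gguf` file is listed explicitly rather than covered by a `*.gguf` glob, which is why every quant file appears as its own line above.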


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:dc9e04de8cff78a1b9b5cbaa7d14d264c02c88c6a4e0b1df3526d4381aa0852a
size 474608160
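Each quant file in this commit is checked in as a Git LFS pointer in the three-line format shown above (`version`, `oid`, `size`). A minimal sketch of parsing such a pointer, using the first pointer from this commit; the `verify_blob` helper is an illustration of how the fields can validate a downloaded blob, not part of any LFS tooling:

```python
import hashlib

# The first LFS pointer from this commit, verbatim.
pointer_text = """version https://git-lfs.github.com/spec/v1
oid sha256:dc9e04de8cff78a1b9b5cbaa7d14d264c02c88c6a4e0b1df3526d4381aa0852a
size 474608160
"""

def parse_lfs_pointer(text: str) -> dict:
    """Split each 'key value' line, then break the oid into algorithm + digest."""
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    algo, digest = fields["oid"].split(":", 1)
    return {"version": fields["version"], "algo": algo,
            "digest": digest, "size": int(fields["size"])}

def verify_blob(blob: bytes, info: dict) -> bool:
    """Check a downloaded blob against the pointer's size and sha256 digest."""
    return (len(blob) == info["size"]
            and hashlib.sha256(blob).hexdigest() == info["digest"])

info = parse_lfs_pointer(pointer_text)
print(info["size"])   # 474608160
```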


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:4e33a04946970c4680d954edc8f649ac8ad7d7047e8b14f1863636562d8f2462
size 441384480


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0c35fd09c0ec987444c51211e157bc12d9ed04fc6b754f64768025d3085ad149
size 659322400


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:7ae272100cf74a34329285ec804f346f55add8fa9d987530e14c34c05e42b389
size 615024160


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a94285c924e61378bdb28ca83c04043dfee6cf02b0c7f226ceb9cc974e2ebada
size 579194400


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:1e8ed9ecad80a8429cd8a3bbe7f6d67a25c257969d491c7d600e76618f719a5a
size 529980960


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:9821364cb250f1e20f4b8606979acb8933bde363b6f76411ff860b762e80cd37
size 853186080


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:1ea754481ea9f19257247381a1dcf77adb317368a2fa5f2dcd892b116bfb1d05
size 825246240


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ae221eeeec2ae1de6aa2a6245b5287f1f52b1956866ccb96d20d32cc09058ba9
size 786170400


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:5eff94ca33eb62e2876a11a8d34c9f173c73ba5fb9e1e859a866ea7141293651
size 733142560


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:453c336b4761cf60acd9d584f21c577725b8427fb1a1068bdb7a3971dc23b5af
size 1057248800


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2814285fdbc80c9a6720ae92dcc45442da22b4d390302280aed3b17440ffae27
size 1003816480


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d549b0b86849b0d368abd119e56ef4af868b8feaf6bc3059c0d5bd46d59cc6b1
size 710696480


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a0764df5fcab0812d057f13aeda9b52955d94f240a9a2f27ab916e782e6e85b2
size 664032800


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:b187946befd76b8af9ea1017a9bc6e1b6bd3705c93999247046c649523ffbc59
size 980197920


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:22aa831c2de643b901dee2b63b56f423328378283cc5c4d9513324c835999ba0
size 905159200


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:51d0fef0aab709e3decb25b13787a0abf32850798456b376132bb00a08122d14
size 820023840


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a625f31eae6db53a34664c9ab0d9f8024f95e1f9e9270e09531e8e4321bae09a
size 1055651360


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:42d8d2c1aa5cc1b0fb2fb894b4fd82b6edd5dd03f9457388234dffb728caf77c
size 1161655840


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:8e9cdaba1e43fd9641cdecf859bcdbf1a39d13d995b70f1ee04bec1a6ddce2ca
size 1112145440


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:3ac6913667d3b02e76a39d2094fe18c1a87294bd047ef6a8e98a36b9b4a21ea5
size 1059788320


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:515aa170850f60f378be41875fc80f5c46b44873f7f0d03488a7192ef070c00a
size 1301790240


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:b55aacd762a63858d0f8d4cb631aa64c45d324744f0c620d61d3e8292c2ffaa1
size 1270978080


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:61d61530db096b960343b52393c6d8529919127b3f8187ba2cfd47288db6ea92
size 1503287840

README.md Normal file (75 lines)

@@ -0,0 +1,75 @@
---
base_model: PY007/EasyContext-256K-danube2-1.8b
language:
- en
library_name: transformers
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/PY007/EasyContext-256K-danube2-1.8b
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/EasyContext-256K-danube2-1.8b-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
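For models split across several parts, concatenation is simple byte-wise joining in part order. A minimal sketch, demonstrated on dummy data; the part file names below are illustrative (this particular repo's quants are single files), and newer llama.cpp split files should instead be merged with its `gguf-split` tool:

```python
import tempfile
from pathlib import Path

def concat_parts(parts, dest):
    """Byte-wise concatenation of split model parts, in the given order."""
    with open(dest, "wb") as out:
        for part in parts:
            out.write(Path(part).read_bytes())

# Demo with dummy data; the naming scheme is a hypothetical example.
tmp = Path(tempfile.mkdtemp())
(tmp / "model-00001-of-00002.gguf.part").write_bytes(b"hello ")
(tmp / "model-00002-of-00002.gguf.part").write_bytes(b"world")
concat_parts(sorted(tmp.glob("*.part")), tmp / "model.gguf")
print((tmp / "model.gguf").read_bytes())  # b'hello world'
```

Sorting the glob results works here because the `-NNNNN-of-NNNNN` convention orders parts lexicographically.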
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/EasyContext-256K-danube2-1.8b-i1-GGUF/resolve/main/EasyContext-256K-danube2-1.8b.i1-IQ1_S.gguf) | i1-IQ1_S | 0.5 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/EasyContext-256K-danube2-1.8b-i1-GGUF/resolve/main/EasyContext-256K-danube2-1.8b.i1-IQ1_M.gguf) | i1-IQ1_M | 0.6 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/EasyContext-256K-danube2-1.8b-i1-GGUF/resolve/main/EasyContext-256K-danube2-1.8b.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 0.6 | |
| [GGUF](https://huggingface.co/mradermacher/EasyContext-256K-danube2-1.8b-i1-GGUF/resolve/main/EasyContext-256K-danube2-1.8b.i1-IQ2_XS.gguf) | i1-IQ2_XS | 0.7 | |
| [GGUF](https://huggingface.co/mradermacher/EasyContext-256K-danube2-1.8b-i1-GGUF/resolve/main/EasyContext-256K-danube2-1.8b.i1-IQ2_S.gguf) | i1-IQ2_S | 0.7 | |
| [GGUF](https://huggingface.co/mradermacher/EasyContext-256K-danube2-1.8b-i1-GGUF/resolve/main/EasyContext-256K-danube2-1.8b.i1-IQ2_M.gguf) | i1-IQ2_M | 0.8 | |
| [GGUF](https://huggingface.co/mradermacher/EasyContext-256K-danube2-1.8b-i1-GGUF/resolve/main/EasyContext-256K-danube2-1.8b.i1-Q2_K_S.gguf) | i1-Q2_K_S | 0.8 | very low quality |
| [GGUF](https://huggingface.co/mradermacher/EasyContext-256K-danube2-1.8b-i1-GGUF/resolve/main/EasyContext-256K-danube2-1.8b.i1-Q2_K.gguf) | i1-Q2_K | 0.8 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/EasyContext-256K-danube2-1.8b-i1-GGUF/resolve/main/EasyContext-256K-danube2-1.8b.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 0.8 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/EasyContext-256K-danube2-1.8b-i1-GGUF/resolve/main/EasyContext-256K-danube2-1.8b.i1-IQ3_XS.gguf) | i1-IQ3_XS | 0.9 | |
| [GGUF](https://huggingface.co/mradermacher/EasyContext-256K-danube2-1.8b-i1-GGUF/resolve/main/EasyContext-256K-danube2-1.8b.i1-Q3_K_S.gguf) | i1-Q3_K_S | 0.9 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/EasyContext-256K-danube2-1.8b-i1-GGUF/resolve/main/EasyContext-256K-danube2-1.8b.i1-IQ3_S.gguf) | i1-IQ3_S | 0.9 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/EasyContext-256K-danube2-1.8b-i1-GGUF/resolve/main/EasyContext-256K-danube2-1.8b.i1-IQ3_M.gguf) | i1-IQ3_M | 1.0 | |
| [GGUF](https://huggingface.co/mradermacher/EasyContext-256K-danube2-1.8b-i1-GGUF/resolve/main/EasyContext-256K-danube2-1.8b.i1-Q3_K_M.gguf) | i1-Q3_K_M | 1.0 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/EasyContext-256K-danube2-1.8b-i1-GGUF/resolve/main/EasyContext-256K-danube2-1.8b.i1-Q3_K_L.gguf) | i1-Q3_K_L | 1.1 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/EasyContext-256K-danube2-1.8b-i1-GGUF/resolve/main/EasyContext-256K-danube2-1.8b.i1-IQ4_XS.gguf) | i1-IQ4_XS | 1.1 | |
| [GGUF](https://huggingface.co/mradermacher/EasyContext-256K-danube2-1.8b-i1-GGUF/resolve/main/EasyContext-256K-danube2-1.8b.i1-Q4_0.gguf) | i1-Q4_0 | 1.2 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/EasyContext-256K-danube2-1.8b-i1-GGUF/resolve/main/EasyContext-256K-danube2-1.8b.i1-IQ4_NL.gguf) | i1-IQ4_NL | 1.2 | prefer IQ4_XS |
| [GGUF](https://huggingface.co/mradermacher/EasyContext-256K-danube2-1.8b-i1-GGUF/resolve/main/EasyContext-256K-danube2-1.8b.i1-Q4_K_S.gguf) | i1-Q4_K_S | 1.2 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/EasyContext-256K-danube2-1.8b-i1-GGUF/resolve/main/EasyContext-256K-danube2-1.8b.i1-Q4_K_M.gguf) | i1-Q4_K_M | 1.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/EasyContext-256K-danube2-1.8b-i1-GGUF/resolve/main/EasyContext-256K-danube2-1.8b.i1-Q4_1.gguf) | i1-Q4_1 | 1.3 | |
| [GGUF](https://huggingface.co/mradermacher/EasyContext-256K-danube2-1.8b-i1-GGUF/resolve/main/EasyContext-256K-danube2-1.8b.i1-Q5_K_S.gguf) | i1-Q5_K_S | 1.4 | |
| [GGUF](https://huggingface.co/mradermacher/EasyContext-256K-danube2-1.8b-i1-GGUF/resolve/main/EasyContext-256K-danube2-1.8b.i1-Q5_K_M.gguf) | i1-Q5_K_M | 1.4 | |
| [GGUF](https://huggingface.co/mradermacher/EasyContext-256K-danube2-1.8b-i1-GGUF/resolve/main/EasyContext-256K-danube2-1.8b.i1-Q6_K.gguf) | i1-Q6_K | 1.6 | practically like static Q6_K |
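Since the table is sorted by size rather than quality, one practical way to read it is to pick the largest quant that fits a download or VRAM budget. A small sketch using a subset of the Size/GB column above (`best_fit` is an illustrative helper, not part of any tooling; size only roughly tracks quality, per the notes column):

```python
# (type, size in GB) pairs taken from the table above, size-sorted.
quants = [
    ("i1-IQ1_S", 0.5), ("i1-IQ2_M", 0.8), ("i1-IQ3_M", 1.0),
    ("i1-IQ4_XS", 1.1), ("i1-Q4_K_M", 1.2), ("i1-Q5_K_M", 1.4),
    ("i1-Q6_K", 1.6),
]

def best_fit(budget_gb: float) -> str:
    """Largest listed quant not exceeding the budget."""
    fitting = [(size, name) for name, size in quants if size <= budget_gb]
    return max(fitting)[1]

print(best_fit(1.3))  # i1-Q4_K_M
```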
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):
![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)
And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->

imatrix.dat Normal file (3 lines)

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:28448ccd7df44f1e54a984c802099db104fa1d9f22bdc0df2dd638b1e986cbb2
size 2143669