Initialize the project; model provided by the ModelHub XC community

Model: mradermacher/LFM2.5-350M-heretic-xagressive-i1-GGUF
Source: Original Platform
Author: ModelHub XC
Date: 2026-05-07 17:06:03 +08:00
Commit: 6c4d939a5b
27 changed files with 238 additions and 0 deletions

.gitattributes (vendored, new file, 60 lines)

@@ -0,0 +1,60 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
LFM2.5-350M-heretic-xagressive.imatrix.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-350M-heretic-xagressive.i1-IQ1_M.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-350M-heretic-xagressive.i1-IQ1_S.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-350M-heretic-xagressive.i1-IQ2_M.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-350M-heretic-xagressive.i1-IQ2_S.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-350M-heretic-xagressive.i1-IQ2_XS.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-350M-heretic-xagressive.i1-IQ2_XXS.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-350M-heretic-xagressive.i1-IQ3_M.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-350M-heretic-xagressive.i1-IQ3_S.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-350M-heretic-xagressive.i1-IQ3_XS.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-350M-heretic-xagressive.i1-IQ3_XXS.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-350M-heretic-xagressive.i1-IQ4_NL.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-350M-heretic-xagressive.i1-IQ4_XS.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-350M-heretic-xagressive.i1-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-350M-heretic-xagressive.i1-Q2_K_S.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-350M-heretic-xagressive.i1-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-350M-heretic-xagressive.i1-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-350M-heretic-xagressive.i1-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-350M-heretic-xagressive.i1-Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-350M-heretic-xagressive.i1-Q4_1.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-350M-heretic-xagressive.i1-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-350M-heretic-xagressive.i1-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-350M-heretic-xagressive.i1-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-350M-heretic-xagressive.i1-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-350M-heretic-xagressive.i1-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
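The patterns above route every matching path through the Git LFS filter, so the repository stores small pointer files instead of the multi-hundred-megabyte blobs. A minimal sketch of how such patterns select files, using Python's `fnmatch` as a rough stand-in for Git's matcher (Git's own wildmatch semantics differ in details such as `**` handling):

```python
from fnmatch import fnmatch

# Illustrative subset of the rules declared in the .gitattributes above.
LFS_PATTERNS = [
    "*.safetensors",
    "*.bin",
    "*.zip",
    "LFM2.5-350M-heretic-xagressive.i1-Q4_K_M.gguf",  # exact-name rule
]

def is_lfs_tracked(path: str) -> bool:
    """Return True if `path` matches any LFS pattern (fnmatch approximation)."""
    return any(fnmatch(path, pat) for pat in LFS_PATTERNS)

print(is_lfs_tracked("model.safetensors"))  # True
print(is_lfs_tracked("README.md"))          # False
```

Note that the GGUF files in this repository are listed by exact filename rather than via a `*.gguf` wildcard, so newly added quants would need their own rule (or `git lfs track`) to be covered.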


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:c02f4ea8e62fd568dd718616ef3229415547cbdcbbd9db50365144ebfd36ba42
size 113257088
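Each blob added in this commit is stored as a Git LFS pointer file like the one above: three `key value` lines giving the spec version, a `sha256:` object id, and the byte size. A small sketch that parses such a pointer and checks downloaded bytes against it (the helper names are ours, not part of git-lfs):

```python
import hashlib

def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into its version, hash algo, oid, and size."""
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    algo, digest = fields["oid"].split(":", 1)
    return {"version": fields["version"], "algo": algo,
            "oid": digest, "size": int(fields["size"])}

def matches_pointer(data: bytes, pointer: dict) -> bool:
    """True if `data` has exactly the size and sha256 digest the pointer promises."""
    return (len(data) == pointer["size"]
            and hashlib.sha256(data).hexdigest() == pointer["oid"])

# The first pointer added in this commit:
ptr = parse_lfs_pointer(
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:c02f4ea8e62fd568dd718616ef3229415547cbdcbbd9db50365144ebfd36ba42\n"
    "size 113257088\n"
)
print(ptr["size"])  # 113257088
```

The same check is what `git lfs fsck` performs against the local object store.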


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:edaf1a71cbf6b5065de1e8e7a70e34b8d18bc21ec84432fc8af42aae6ad58543
size 106965632


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:da65e1903ed613c65f7307f20e1cde2c2a4d397d8c09a40b97d411ed3a9de71c
size 142879360


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:cb1297aa216a00dceec4028ae9038cdd6fa6d37f308c185e707a62acb7493a7b
size 134490752


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d08e0cdbf7ba47ce9dbb8f401f3c03d2c398ec79c54ed62830cbb09c953b2dff
size 132328064


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:269244cd78613451f55b42c51578256c07e6678b337bb4a4bb133b93d3f4b6db
size 123742848


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:e868a874496a6132290ac64a2fb680392ff81c6c614afd98d8a34353783c6f28
size 183659136


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:6c13a059a18467a1de98e3c174e566ff04693b6dba0f3162aa0e84be2899ee13
size 181152384


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:aa8bbc40262cb4937e6587712b56b0e14629900b06800e6d161d685187cd3e3f
size 175401600


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:726c83a709f49f8f571ea083da70135a45e40e360a7f6e0031fa0a00b413cd2c
size 158476928


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2ef7576136449b250db54f1efeb5f96637b1288dfb346d59028981735c32e298
size 219310720


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:110e1611fb7661c40ae741b245d3b3061dcbb34945a09e3127171cd0f330288d
size 210332288


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:8094a40a018e69e66f2b0310348dfaf2f3e8f476acb13f7a14a0af8a22f07aee
size 160598656


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:c44eb234bbf7cff6afef460ed0d407c3326863a7bb0029417139043dac2ae4be
size 154184320


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:90bc5c833338e9a9a2a8265899a6ba1ce7ad0963c640c21dd2dbf28289c59eb6
size 203049600


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:3ce40f47c0631231cd315135c5ef3a7b63d4b4dacdd1b147ff1ddceb97a2fc52
size 193153664


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:129b937dfe430ef7c31e5d1d135a54749ae4eae612875e6c11c59a61ded3fdbb
size 181152384


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:95381fa76ee3371c9ed8cdb12918ecc1e8f152a5686e458dd3efd5c9a8fbe23a
size 219900544


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:235d86f13b337d61c13b3d8ebc2f0bbfd57b2c6030b9e717c30376b08a706251
size 237267584


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f84e9f1057e6dba69fe39f623c7bb9b94532029f15dfd5f37321abdab1d1d1ed
size 229313152


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0e2a76f5fdffa5924f76d4569d01492edab16e5e5739386e81e01ecfb93477f8
size 220752512


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:e620ca49dd97c20c25ad71bb04222d9cf9b455e78a84d594d56f6eff21fb3ca3
size 260377216


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0a6c7d78769fd0b3ed37c3792ba0cda8221e6b0574ffc7553489c9db94adeea5
size 255224448


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:9ab98384ff91ba65e9860f410b2fed349925e482206acee23cc9a7841b6942ce
size 293382784


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:c0c9bf1daac832c66fb70f56f6a62d6de32737a0e57a2a3b35fec7ec7b3b7707
size 620864

README.md (new file, 103 lines)

@@ -0,0 +1,103 @@
---
base_model: MihaiPopa-1/LFM2.5-350M-heretic-xagressive
language:
- en
- ar
- zh
- fr
- de
- ja
- ko
- es
- pt
library_name: transformers
license: other
license_link: LICENSE
license_name: lfm1.0
mradermacher:
readme_rev: 1
quantized_by: mradermacher
tags:
- liquid
- lfm2.5
- edge
- heretic
- uncensored
- decensored
- abliterated
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
<!-- ### quants: Q2_K IQ3_M Q4_K_S IQ3_XXS Q3_K_M small-IQ4_NL Q4_K_M IQ2_M Q6_K IQ4_XS Q2_K_S IQ1_M Q3_K_S IQ2_XXS Q3_K_L IQ2_XS Q5_K_S IQ2_S IQ1_S Q5_K_M Q4_0 IQ3_XS Q4_1 IQ3_S -->
<!-- ### quants_skip: -->
<!-- ### skip_mmproj: -->
weighted/imatrix quants of https://huggingface.co/MihaiPopa-1/LFM2.5-350M-heretic-xagressive
<!-- provided-files -->
***For a convenient overview and download list, visit our [model page for this model](https://hf.tst.eu/model#LFM2.5-350M-heretic-xagressive-i1-GGUF).***
static quants are available at https://huggingface.co/mradermacher/LFM2.5-350M-heretic-xagressive-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
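Concatenating multi-part files, mentioned above, is a plain byte-wise join of the parts in order. A hedged sketch (this repository's quants are single files; the part naming convention varies between uploaders, so treat the glob pattern in the usage note as an assumption):

```python
from pathlib import Path

def concatenate_parts(parts: list[Path], output: Path) -> int:
    """Join split GGUF parts, in the given order, into one file.

    Returns the total number of bytes written.
    """
    total = 0
    with output.open("wb") as out:
        for part in parts:
            data = part.read_bytes()
            out.write(data)
            total += len(data)
    return total
```

Typical usage would be `concatenate_parts(sorted(Path(".").glob("model.gguf.part*")), Path("model.gguf"))`; sorting the paths is what guarantees the parts land in the right order.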
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-350M-heretic-xagressive-i1-GGUF/resolve/main/LFM2.5-350M-heretic-xagressive.imatrix.gguf) | imatrix | 0.1 | imatrix file (for creating your own quants) |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-350M-heretic-xagressive-i1-GGUF/resolve/main/LFM2.5-350M-heretic-xagressive.i1-IQ1_S.gguf) | i1-IQ1_S | 0.2 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-350M-heretic-xagressive-i1-GGUF/resolve/main/LFM2.5-350M-heretic-xagressive.i1-IQ1_M.gguf) | i1-IQ1_M | 0.2 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-350M-heretic-xagressive-i1-GGUF/resolve/main/LFM2.5-350M-heretic-xagressive.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 0.2 | |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-350M-heretic-xagressive-i1-GGUF/resolve/main/LFM2.5-350M-heretic-xagressive.i1-IQ2_XS.gguf) | i1-IQ2_XS | 0.2 | |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-350M-heretic-xagressive-i1-GGUF/resolve/main/LFM2.5-350M-heretic-xagressive.i1-IQ2_S.gguf) | i1-IQ2_S | 0.2 | |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-350M-heretic-xagressive-i1-GGUF/resolve/main/LFM2.5-350M-heretic-xagressive.i1-IQ2_M.gguf) | i1-IQ2_M | 0.2 | |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-350M-heretic-xagressive-i1-GGUF/resolve/main/LFM2.5-350M-heretic-xagressive.i1-Q2_K_S.gguf) | i1-Q2_K_S | 0.3 | very low quality |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-350M-heretic-xagressive-i1-GGUF/resolve/main/LFM2.5-350M-heretic-xagressive.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 0.3 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-350M-heretic-xagressive-i1-GGUF/resolve/main/LFM2.5-350M-heretic-xagressive.i1-Q2_K.gguf) | i1-Q2_K | 0.3 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-350M-heretic-xagressive-i1-GGUF/resolve/main/LFM2.5-350M-heretic-xagressive.i1-IQ3_XS.gguf) | i1-IQ3_XS | 0.3 | |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-350M-heretic-xagressive-i1-GGUF/resolve/main/LFM2.5-350M-heretic-xagressive.i1-IQ3_S.gguf) | i1-IQ3_S | 0.3 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-350M-heretic-xagressive-i1-GGUF/resolve/main/LFM2.5-350M-heretic-xagressive.i1-Q3_K_S.gguf) | i1-Q3_K_S | 0.3 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-350M-heretic-xagressive-i1-GGUF/resolve/main/LFM2.5-350M-heretic-xagressive.i1-IQ3_M.gguf) | i1-IQ3_M | 0.3 | |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-350M-heretic-xagressive-i1-GGUF/resolve/main/LFM2.5-350M-heretic-xagressive.i1-Q3_K_M.gguf) | i1-Q3_K_M | 0.3 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-350M-heretic-xagressive-i1-GGUF/resolve/main/LFM2.5-350M-heretic-xagressive.i1-Q3_K_L.gguf) | i1-Q3_K_L | 0.3 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-350M-heretic-xagressive-i1-GGUF/resolve/main/LFM2.5-350M-heretic-xagressive.i1-IQ4_XS.gguf) | i1-IQ4_XS | 0.3 | |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-350M-heretic-xagressive-i1-GGUF/resolve/main/LFM2.5-350M-heretic-xagressive.i1-IQ4_NL.gguf) | i1-IQ4_NL | 0.3 | prefer IQ4_XS |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-350M-heretic-xagressive-i1-GGUF/resolve/main/LFM2.5-350M-heretic-xagressive.i1-Q4_0.gguf) | i1-Q4_0 | 0.3 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-350M-heretic-xagressive-i1-GGUF/resolve/main/LFM2.5-350M-heretic-xagressive.i1-Q4_K_S.gguf) | i1-Q4_K_S | 0.3 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-350M-heretic-xagressive-i1-GGUF/resolve/main/LFM2.5-350M-heretic-xagressive.i1-Q4_K_M.gguf) | i1-Q4_K_M | 0.3 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-350M-heretic-xagressive-i1-GGUF/resolve/main/LFM2.5-350M-heretic-xagressive.i1-Q4_1.gguf) | i1-Q4_1 | 0.3 | |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-350M-heretic-xagressive-i1-GGUF/resolve/main/LFM2.5-350M-heretic-xagressive.i1-Q5_K_S.gguf) | i1-Q5_K_S | 0.4 | |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-350M-heretic-xagressive-i1-GGUF/resolve/main/LFM2.5-350M-heretic-xagressive.i1-Q5_K_M.gguf) | i1-Q5_K_M | 0.4 | |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-350M-heretic-xagressive-i1-GGUF/resolve/main/LFM2.5-350M-heretic-xagressive.i1-Q6_K.gguf) | i1-Q6_K | 0.4 | practically like static Q6_K |
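Every row in the table above follows the same Hugging Face `resolve` URL scheme, so download links for scripted fetching can be generated rather than hand-written. A small sketch, with the repo and filename stem taken from the table:

```python
REPO = "mradermacher/LFM2.5-350M-heretic-xagressive-i1-GGUF"
STEM = "LFM2.5-350M-heretic-xagressive"

def quant_url(quant: str, repo: str = REPO, stem: str = STEM) -> str:
    """Build the huggingface.co 'resolve' download URL for one quant type."""
    return f"https://huggingface.co/{repo}/resolve/main/{stem}.{quant}.gguf"

print(quant_url("i1-Q4_K_M"))
```

The resulting URL for `i1-Q4_K_M` matches the link in the table row above, and the same pattern works for any other quant type listed.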
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):
![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)
And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->