commit 3c6791a66cb44ff638d86a6793d4e2ed2e35036d
Author: ModelHub XC
Date:   Sun Apr 12 03:46:58 2026 +0800

    Initialize project; model provided by the ModelHub XC community

    Model: mradermacher/G-Human-1B-i1-GGUF
    Source: Original Platform

diff --git a/.gitattributes b/.gitattributes
new file mode 100644
index 0000000..587e75c
--- /dev/null
+++ b/.gitattributes
@@ -0,0 +1,60 @@
+*.7z filter=lfs diff=lfs merge=lfs -text
+*.arrow filter=lfs diff=lfs merge=lfs -text
+*.bin filter=lfs diff=lfs merge=lfs -text
+*.bz2 filter=lfs diff=lfs merge=lfs -text
+*.ckpt filter=lfs diff=lfs merge=lfs -text
+*.ftz filter=lfs diff=lfs merge=lfs -text
+*.gz filter=lfs diff=lfs merge=lfs -text
+*.h5 filter=lfs diff=lfs merge=lfs -text
+*.joblib filter=lfs diff=lfs merge=lfs -text
+*.lfs.* filter=lfs diff=lfs merge=lfs -text
+*.mlmodel filter=lfs diff=lfs merge=lfs -text
+*.model filter=lfs diff=lfs merge=lfs -text
+*.msgpack filter=lfs diff=lfs merge=lfs -text
+*.npy filter=lfs diff=lfs merge=lfs -text
+*.npz filter=lfs diff=lfs merge=lfs -text
+*.onnx filter=lfs diff=lfs merge=lfs -text
+*.ot filter=lfs diff=lfs merge=lfs -text
+*.parquet filter=lfs diff=lfs merge=lfs -text
+*.pb filter=lfs diff=lfs merge=lfs -text
+*.pickle filter=lfs diff=lfs merge=lfs -text
+*.pkl filter=lfs diff=lfs merge=lfs -text
+*.pt filter=lfs diff=lfs merge=lfs -text
+*.pth filter=lfs diff=lfs merge=lfs -text
+*.rar filter=lfs diff=lfs merge=lfs -text
+*.safetensors filter=lfs diff=lfs merge=lfs -text
+saved_model/**/* filter=lfs diff=lfs merge=lfs -text
+*.tar.* filter=lfs diff=lfs merge=lfs -text
+*.tar filter=lfs diff=lfs merge=lfs -text
+*.tflite filter=lfs diff=lfs merge=lfs -text
+*.tgz filter=lfs diff=lfs merge=lfs -text
+*.wasm filter=lfs diff=lfs merge=lfs -text
+*.xz filter=lfs diff=lfs merge=lfs -text
+*.zip filter=lfs diff=lfs merge=lfs -text
+*.zst filter=lfs diff=lfs merge=lfs -text
+*tfevents* filter=lfs diff=lfs merge=lfs -text
+G-Human-1B.imatrix.gguf filter=lfs diff=lfs merge=lfs -text
+G-Human-1B.i1-IQ1_M.gguf filter=lfs diff=lfs merge=lfs -text
+G-Human-1B.i1-IQ1_S.gguf filter=lfs diff=lfs merge=lfs -text
+G-Human-1B.i1-IQ2_M.gguf filter=lfs diff=lfs merge=lfs -text
+G-Human-1B.i1-IQ2_S.gguf filter=lfs diff=lfs merge=lfs -text
+G-Human-1B.i1-IQ2_XS.gguf filter=lfs diff=lfs merge=lfs -text
+G-Human-1B.i1-IQ2_XXS.gguf filter=lfs diff=lfs merge=lfs -text
+G-Human-1B.i1-IQ3_M.gguf filter=lfs diff=lfs merge=lfs -text
+G-Human-1B.i1-IQ3_S.gguf filter=lfs diff=lfs merge=lfs -text
+G-Human-1B.i1-IQ3_XS.gguf filter=lfs diff=lfs merge=lfs -text
+G-Human-1B.i1-IQ3_XXS.gguf filter=lfs diff=lfs merge=lfs -text
+G-Human-1B.i1-IQ4_NL.gguf filter=lfs diff=lfs merge=lfs -text
+G-Human-1B.i1-IQ4_XS.gguf filter=lfs diff=lfs merge=lfs -text
+G-Human-1B.i1-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
+G-Human-1B.i1-Q2_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+G-Human-1B.i1-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
+G-Human-1B.i1-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+G-Human-1B.i1-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+G-Human-1B.i1-Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
+G-Human-1B.i1-Q4_1.gguf filter=lfs diff=lfs merge=lfs -text
+G-Human-1B.i1-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+G-Human-1B.i1-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+G-Human-1B.i1-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+G-Human-1B.i1-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+G-Human-1B.i1-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
diff --git a/G-Human-1B.i1-IQ1_M.gguf b/G-Human-1B.i1-IQ1_M.gguf
new file mode 100644
index 0000000..ba3b141
--- /dev/null
+++ b/G-Human-1B.i1-IQ1_M.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:3e1b45cf63e62d72ac9773bd86a3fffb1a2639de167a32347091c94bf439a121
+size 643485056
diff --git a/G-Human-1B.i1-IQ1_S.gguf b/G-Human-1B.i1-IQ1_S.gguf
new file mode 100644
index 0000000..47a8143
--- /dev/null
+++ b/G-Human-1B.i1-IQ1_S.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:7e3c8c0054b6cb4c26604400fe91e10e6921fd0e4181702080a3f6ab0e50a70c
+size 639192704
diff --git a/G-Human-1B.i1-IQ2_M.gguf b/G-Human-1B.i1-IQ2_M.gguf
new file mode 100644
index 0000000..cb9938e
--- /dev/null
+++ b/G-Human-1B.i1-IQ2_M.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:33c937d24367c90c3eaa984759e390d0c68ae6b91757b3774000e9d3f1bd9fe4
+size 669782912
diff --git a/G-Human-1B.i1-IQ2_S.gguf b/G-Human-1B.i1-IQ2_S.gguf
new file mode 100644
index 0000000..aa96af0
--- /dev/null
+++ b/G-Human-1B.i1-IQ2_S.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:dc60a1d8df012de9b39afeb92f7697f26049e3a4134e38ae0981a4257a55bc97
+size 664059776
diff --git a/G-Human-1B.i1-IQ2_XS.gguf b/G-Human-1B.i1-IQ2_XS.gguf
new file mode 100644
index 0000000..0dab398
--- /dev/null
+++ b/G-Human-1B.i1-IQ2_XS.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:5f7e2e0377ef5276e01dd74c8b29d397243b35e880c7fcfe9992b259cc3f52a3
+size 657320576
diff --git a/G-Human-1B.i1-IQ2_XXS.gguf b/G-Human-1B.i1-IQ2_XXS.gguf
new file mode 100644
index 0000000..bf10574
--- /dev/null
+++ b/G-Human-1B.i1-IQ2_XXS.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:c60c7fcf050478d95d5fb40503aa333ae691d72b6ba7c6ec3f4ed126ddc26c81
+size 650638976
diff --git a/G-Human-1B.i1-IQ3_M.gguf b/G-Human-1B.i1-IQ3_M.gguf
new file mode 100644
index 0000000..09482bf
--- /dev/null
+++ b/G-Human-1B.i1-IQ3_M.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:9bf0e1f8f966b233358b10f43ee0dd5d8f03153b1b1f318c3ac994749e55f1d1
+size 697059968
diff --git a/G-Human-1B.i1-IQ3_S.gguf b/G-Human-1B.i1-IQ3_S.gguf
new file mode 100644
index 0000000..3f24bad
--- /dev/null
+++ b/G-Human-1B.i1-IQ3_S.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:f1c5303da7abf6e02a134499c1effcbeef36d168fc29225c16c6c988efb2bd1d
+size 689813888
diff --git a/G-Human-1B.i1-IQ3_XS.gguf b/G-Human-1B.i1-IQ3_XS.gguf
new file mode 100644
index 0000000..0a1ffb3
--- /dev/null
+++ b/G-Human-1B.i1-IQ3_XS.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:c5ebc32916a2f05b908fc74b446a992fd402d1142297deff972738ce6809cf79
+size 689813888
diff --git a/G-Human-1B.i1-IQ3_XXS.gguf b/G-Human-1B.i1-IQ3_XXS.gguf
new file mode 100644
index 0000000..ad31015
--- /dev/null
+++ b/G-Human-1B.i1-IQ3_XXS.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:f9317ce1655f0218d0b79a29662841f5a1ec8d5ebbf7c7a14173f9784841efea
+size 680109440
diff --git a/G-Human-1B.i1-IQ4_NL.gguf b/G-Human-1B.i1-IQ4_NL.gguf
new file mode 100644
index 0000000..069d614
--- /dev/null
+++ b/G-Human-1B.i1-IQ4_NL.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ce245f1e01259ee23d3a7ebd6c0d81e16c85eb6d385bcb07ab6f973fe9700e19
+size 721862528
diff --git a/G-Human-1B.i1-IQ4_XS.gguf b/G-Human-1B.i1-IQ4_XS.gguf
new file mode 100644
index 0000000..850ff3f
--- /dev/null
+++ b/G-Human-1B.i1-IQ4_XS.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:5fe2813580353116c8b6927bca2d88757dabb97b164f99c546e6387a94d68b48
+size 714434432
diff --git a/G-Human-1B.i1-Q2_K.gguf b/G-Human-1B.i1-Q2_K.gguf
new file mode 100644
index 0000000..ab373d5
--- /dev/null
+++ b/G-Human-1B.i1-Q2_K.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:6a2e95391823453ac1eb66c12017f2acbee18a32d5cca0f262bb6ee08232a639
+size 689813888
diff --git a/G-Human-1B.i1-Q2_K_S.gguf b/G-Human-1B.i1-Q2_K_S.gguf
new file mode 100644
index 0000000..9a5a083
--- /dev/null
+++ b/G-Human-1B.i1-Q2_K_S.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:aceb603bca0b008347bcb5066efbd4ed9fa6b7cb9f48bbe48d0429c5a4b578ba
+size 671271296
diff --git a/G-Human-1B.i1-Q3_K_L.gguf b/G-Human-1B.i1-Q3_K_L.gguf
new file mode 100644
index 0000000..f08b756
--- /dev/null
+++ b/G-Human-1B.i1-Q3_K_L.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:24b640466b0aee7f736cb1455afb973d0b1c7d3e47a6f0bdac5f8c861daaf470
+size 751574912
diff --git a/G-Human-1B.i1-Q3_K_M.gguf b/G-Human-1B.i1-Q3_K_M.gguf
new file mode 100644
index 0000000..92acc85
--- /dev/null
+++ b/G-Human-1B.i1-Q3_K_M.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:cc8723458d9c6c560a162735fd5ddd0151b95bdd54771b5a3be3ebb3c817e3b1
+size 722415488
diff --git a/G-Human-1B.i1-Q3_K_S.gguf b/G-Human-1B.i1-Q3_K_S.gguf
new file mode 100644
index 0000000..51d844e
--- /dev/null
+++ b/G-Human-1B.i1-Q3_K_S.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:42b100a4d187a63c046e20b06d474ab8bf50555e2e20fb2bb2aa05374716698c
+size 688855424
diff --git a/G-Human-1B.i1-Q4_0.gguf b/G-Human-1B.i1-Q4_0.gguf
new file mode 100644
index 0000000..b0e0475
--- /dev/null
+++ b/G-Human-1B.i1-Q4_0.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:1e2ee56a155bd44749c84c6f95dd8c52cd511ec0f9fc413f2ac4da5057c99c63
+size 721917824
diff --git a/G-Human-1B.i1-Q4_1.gguf b/G-Human-1B.i1-Q4_1.gguf
new file mode 100644
index 0000000..100dd63
--- /dev/null
+++ b/G-Human-1B.i1-Q4_1.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:f5ce7ba26b1c83b89305daf978e5df0b7e833c048658c24661825c86177865fa
+size 764034944
diff --git a/G-Human-1B.i1-Q4_K_M.gguf b/G-Human-1B.i1-Q4_K_M.gguf
new file mode 100644
index 0000000..1e1c937
--- /dev/null
+++ b/G-Human-1B.i1-Q4_K_M.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:3e070c513238ae0212dd6dd495a20f9a2bc5204e9fdf0f7d2b7e263dc93a26e5
+size 806057600
diff --git a/G-Human-1B.i1-Q4_K_S.gguf b/G-Human-1B.i1-Q4_K_S.gguf
new file mode 100644
index 0000000..31d23ea
--- /dev/null
+++ b/G-Human-1B.i1-Q4_K_S.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:fa1c3b006d5617c3e1c6d00210e734b81c84b7e9182a9ea1615d01b9ab330a1e
+size 780992384
diff --git a/G-Human-1B.i1-Q5_K_M.gguf b/G-Human-1B.i1-Q5_K_M.gguf
new file mode 100644
index 0000000..0302930
--- /dev/null
+++ b/G-Human-1B.i1-Q5_K_M.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:358d8ea972992e801f63c67a36105c20a0cb54648f87edaff57508fb81471395
+size 851345024
diff --git a/G-Human-1B.i1-Q5_K_S.gguf b/G-Human-1B.i1-Q5_K_S.gguf
new file mode 100644
index 0000000..a4c1cf7
--- /dev/null
+++ b/G-Human-1B.i1-Q5_K_S.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:daea1352fe92c6886ebfc7725825893643f1943d63d6a5e409d47d49ff940842
+size 836398976
diff --git a/G-Human-1B.i1-Q6_K.gguf b/G-Human-1B.i1-Q6_K.gguf
new file mode 100644
index 0000000..7264683
--- /dev/null
+++ b/G-Human-1B.i1-Q6_K.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:57842016e5adc875615199f4263c03e16d30ee9159887b92f6083efc9cba6382
+size 1011737984
diff --git a/G-Human-1B.imatrix.gguf b/G-Human-1B.imatrix.gguf
new file mode 100644
index 0000000..35f1e7b
--- /dev/null
+++ b/G-Human-1B.imatrix.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:fe9108f20e2f1b82c3b78575cd644029602d55a64f24688508549f25a014cbec
+size 1452416
diff --git a/README.md b/README.md
new file mode 100644
index 0000000..d4964c1
--- /dev/null
+++ b/README.md
@@ -0,0 +1,87 @@
+---
+base_model: UmbrellaInc/G-Human-1B
+language:
+- en
+library_name: transformers
+mradermacher:
+  readme_rev: 1
+quantized_by: mradermacher
+tags:
+- mergekit
+- merge
+---
+## About
+
+weighted/imatrix quants of https://huggingface.co/UmbrellaInc/G-Human-1B
+
+***For a convenient overview and download list, visit our [model page for this model](https://hf.tst.eu/model#G-Human-1B-i1-GGUF).***
+
+static quants are available at https://huggingface.co/mradermacher/G-Human-1B-GGUF
+
+## Usage
+
+If you are unsure how to use GGUF files, refer to one of [TheBloke's
+READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
+more details, including how to concatenate multi-part files.
+
+## Provided Quants
+
+(sorted by size, not necessarily quality. IQ-quants are often preferable over similarly sized non-IQ quants)
+
+| Link | Type | Size/GB | Notes |
+|:-----|:-----|--------:|:------|
+| [GGUF](https://huggingface.co/mradermacher/G-Human-1B-i1-GGUF/resolve/main/G-Human-1B.imatrix.gguf) | imatrix | 0.1 | imatrix file (for creating your own quants) |
+| [GGUF](https://huggingface.co/mradermacher/G-Human-1B-i1-GGUF/resolve/main/G-Human-1B.i1-IQ1_S.gguf) | i1-IQ1_S | 0.7 | for the desperate |
+| [GGUF](https://huggingface.co/mradermacher/G-Human-1B-i1-GGUF/resolve/main/G-Human-1B.i1-IQ1_M.gguf) | i1-IQ1_M | 0.7 | mostly desperate |
+| [GGUF](https://huggingface.co/mradermacher/G-Human-1B-i1-GGUF/resolve/main/G-Human-1B.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 0.8 | |
+| [GGUF](https://huggingface.co/mradermacher/G-Human-1B-i1-GGUF/resolve/main/G-Human-1B.i1-IQ2_XS.gguf) | i1-IQ2_XS | 0.8 | |
+| [GGUF](https://huggingface.co/mradermacher/G-Human-1B-i1-GGUF/resolve/main/G-Human-1B.i1-IQ2_S.gguf) | i1-IQ2_S | 0.8 | |
+| [GGUF](https://huggingface.co/mradermacher/G-Human-1B-i1-GGUF/resolve/main/G-Human-1B.i1-IQ2_M.gguf) | i1-IQ2_M | 0.8 | |
+| [GGUF](https://huggingface.co/mradermacher/G-Human-1B-i1-GGUF/resolve/main/G-Human-1B.i1-Q2_K_S.gguf) | i1-Q2_K_S | 0.8 | very low quality |
+| [GGUF](https://huggingface.co/mradermacher/G-Human-1B-i1-GGUF/resolve/main/G-Human-1B.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 0.8 | lower quality |
+| [GGUF](https://huggingface.co/mradermacher/G-Human-1B-i1-GGUF/resolve/main/G-Human-1B.i1-Q3_K_S.gguf) | i1-Q3_K_S | 0.8 | IQ3_XS probably better |
+| [GGUF](https://huggingface.co/mradermacher/G-Human-1B-i1-GGUF/resolve/main/G-Human-1B.i1-IQ3_S.gguf) | i1-IQ3_S | 0.8 | beats Q3_K* |
+| [GGUF](https://huggingface.co/mradermacher/G-Human-1B-i1-GGUF/resolve/main/G-Human-1B.i1-IQ3_XS.gguf) | i1-IQ3_XS | 0.8 | |
+| [GGUF](https://huggingface.co/mradermacher/G-Human-1B-i1-GGUF/resolve/main/G-Human-1B.i1-Q2_K.gguf) | i1-Q2_K | 0.8 | IQ3_XXS probably better |
+| [GGUF](https://huggingface.co/mradermacher/G-Human-1B-i1-GGUF/resolve/main/G-Human-1B.i1-IQ3_M.gguf) | i1-IQ3_M | 0.8 | |
+| [GGUF](https://huggingface.co/mradermacher/G-Human-1B-i1-GGUF/resolve/main/G-Human-1B.i1-IQ4_XS.gguf) | i1-IQ4_XS | 0.8 | |
+| [GGUF](https://huggingface.co/mradermacher/G-Human-1B-i1-GGUF/resolve/main/G-Human-1B.i1-IQ4_NL.gguf) | i1-IQ4_NL | 0.8 | prefer IQ4_XS |
+| [GGUF](https://huggingface.co/mradermacher/G-Human-1B-i1-GGUF/resolve/main/G-Human-1B.i1-Q4_0.gguf) | i1-Q4_0 | 0.8 | fast, low quality |
+| [GGUF](https://huggingface.co/mradermacher/G-Human-1B-i1-GGUF/resolve/main/G-Human-1B.i1-Q3_K_M.gguf) | i1-Q3_K_M | 0.8 | IQ3_S probably better |
+| [GGUF](https://huggingface.co/mradermacher/G-Human-1B-i1-GGUF/resolve/main/G-Human-1B.i1-Q3_K_L.gguf) | i1-Q3_K_L | 0.9 | IQ3_M probably better |
+| [GGUF](https://huggingface.co/mradermacher/G-Human-1B-i1-GGUF/resolve/main/G-Human-1B.i1-Q4_1.gguf) | i1-Q4_1 | 0.9 | |
+| [GGUF](https://huggingface.co/mradermacher/G-Human-1B-i1-GGUF/resolve/main/G-Human-1B.i1-Q4_K_S.gguf) | i1-Q4_K_S | 0.9 | optimal size/speed/quality |
+| [GGUF](https://huggingface.co/mradermacher/G-Human-1B-i1-GGUF/resolve/main/G-Human-1B.i1-Q4_K_M.gguf) | i1-Q4_K_M | 0.9 | fast, recommended |
+| [GGUF](https://huggingface.co/mradermacher/G-Human-1B-i1-GGUF/resolve/main/G-Human-1B.i1-Q5_K_S.gguf) | i1-Q5_K_S | 0.9 | |
+| [GGUF](https://huggingface.co/mradermacher/G-Human-1B-i1-GGUF/resolve/main/G-Human-1B.i1-Q5_K_M.gguf) | i1-Q5_K_M | 1.0 | |
+| [GGUF](https://huggingface.co/mradermacher/G-Human-1B-i1-GGUF/resolve/main/G-Human-1B.i1-Q6_K.gguf) | i1-Q6_K | 1.1 | practically like static Q6_K |
+
+Here is a handy graph by ikawrakow comparing some lower-quality quant
+types (lower is better):
+
+![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)
+
+And here are Artefact2's thoughts on the matter:
+https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
+
+## FAQ / Model Request
+
+See https://huggingface.co/mradermacher/model_requests for some answers to
+questions you might have and/or if you want some other model quantized.
+
+## Thanks
+
+I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
+me use its servers and providing upgrades to my workstation to enable
+this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
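
Every model file in this commit is stored as a Git LFS pointer that records an `oid sha256:` and a `size` for the real blob. Those two fields can be used to check that a download completed intact. Below is a minimal sketch; `verify_lfs_pointer` is a made-up helper name, not part of git-lfs or any other tool, and the pointer-file layout it parses is the three-line format shown in the diffs above.

```shell
# Verify a downloaded blob against its Git LFS pointer file.
# Expected pointer format (as in the diffs above):
#   version https://git-lfs.github.com/spec/v1
#   oid sha256:<64-char hex digest>
#   size <bytes>
verify_lfs_pointer() {
  ptr="$1"
  blob="$2"
  # Extract the expected digest and size from the pointer.
  want_oid=$(sed -n 's/^oid sha256:\([0-9a-f]*\).*/\1/p' "$ptr")
  want_size=$(sed -n 's/^size \([0-9]*\).*/\1/p' "$ptr")
  # Compute the actual digest and size of the local blob.
  got_oid=$(sha256sum "$blob" | awk '{print $1}')
  got_size=$(wc -c < "$blob" | tr -d '[:space:]')
  # Succeed only if both match.
  [ "$want_oid" = "$got_oid" ] && [ "$want_size" = "$got_size" ]
}

# Example (after downloading the blob next to its pointer):
# verify_lfs_pointer G-Human-1B.i1-Q4_K_M.gguf.pointer G-Human-1B.i1-Q4_K_M.gguf
```

This relies only on POSIX shell plus `sha256sum`, `sed`, `awk`, and `wc`, so it works without git-lfs installed; `git lfs pull` performs the same oid check automatically.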
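
The Usage section above mentions concatenating multi-part files. None of this repo's quants are split, but for repos where they are, the parts are simply byte-concatenated in order to recover the original GGUF. A sketch, with hypothetical part names (`join_gguf_parts` and the `.partNofM` naming are illustrative, not a fixed convention):

```shell
# Join split GGUF parts into a single file by byte-concatenation,
# in the order the parts are given on the command line.
join_gguf_parts() {
  out="$1"
  shift
  cat "$@" > "$out"
}

# Hypothetical example (this repo's files are single-part):
# join_gguf_parts model.gguf model.gguf.part1of2 model.gguf.part2of2
```

After joining, the result can be loaded directly; tools such as llama.cpp read the merged file like any other GGUF.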