commit b9859f248e53b513db5497cb5d8885f593f3ffe4
Author: ModelHub XC
Date:   Fri May 1 12:43:13 2026 +0800

    Initialize the project; model provided by the ModelHub XC community
    Model: mradermacher/HumanFlow-i1-GGUF
    Source: Original Platform

diff --git a/.gitattributes b/.gitattributes
new file mode 100644
index 0000000..6794d19
--- /dev/null
+++ b/.gitattributes
@@ -0,0 +1,60 @@
+*.7z filter=lfs diff=lfs merge=lfs -text
+*.arrow filter=lfs diff=lfs merge=lfs -text
+*.bin filter=lfs diff=lfs merge=lfs -text
+*.bz2 filter=lfs diff=lfs merge=lfs -text
+*.ckpt filter=lfs diff=lfs merge=lfs -text
+*.ftz filter=lfs diff=lfs merge=lfs -text
+*.gz filter=lfs diff=lfs merge=lfs -text
+*.h5 filter=lfs diff=lfs merge=lfs -text
+*.joblib filter=lfs diff=lfs merge=lfs -text
+*.lfs.* filter=lfs diff=lfs merge=lfs -text
+*.mlmodel filter=lfs diff=lfs merge=lfs -text
+*.model filter=lfs diff=lfs merge=lfs -text
+*.msgpack filter=lfs diff=lfs merge=lfs -text
+*.npy filter=lfs diff=lfs merge=lfs -text
+*.npz filter=lfs diff=lfs merge=lfs -text
+*.onnx filter=lfs diff=lfs merge=lfs -text
+*.ot filter=lfs diff=lfs merge=lfs -text
+*.parquet filter=lfs diff=lfs merge=lfs -text
+*.pb filter=lfs diff=lfs merge=lfs -text
+*.pickle filter=lfs diff=lfs merge=lfs -text
+*.pkl filter=lfs diff=lfs merge=lfs -text
+*.pt filter=lfs diff=lfs merge=lfs -text
+*.pth filter=lfs diff=lfs merge=lfs -text
+*.rar filter=lfs diff=lfs merge=lfs -text
+*.safetensors filter=lfs diff=lfs merge=lfs -text
+saved_model/**/* filter=lfs diff=lfs merge=lfs -text
+*.tar.* filter=lfs diff=lfs merge=lfs -text
+*.tar filter=lfs diff=lfs merge=lfs -text
+*.tflite filter=lfs diff=lfs merge=lfs -text
+*.tgz filter=lfs diff=lfs merge=lfs -text
+*.wasm filter=lfs diff=lfs merge=lfs -text
+*.xz filter=lfs diff=lfs merge=lfs -text
+*.zip filter=lfs diff=lfs merge=lfs -text
+*.zst filter=lfs diff=lfs merge=lfs -text
+*tfevents* filter=lfs diff=lfs merge=lfs -text
+HumanFlow.imatrix.gguf filter=lfs diff=lfs merge=lfs -text
+HumanFlow.i1-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
+HumanFlow.i1-IQ3_M.gguf filter=lfs diff=lfs merge=lfs -text
+HumanFlow.i1-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+HumanFlow.i1-IQ3_XXS.gguf filter=lfs diff=lfs merge=lfs -text
+HumanFlow.i1-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+HumanFlow.i1-IQ4_NL.gguf filter=lfs diff=lfs merge=lfs -text
+HumanFlow.i1-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+HumanFlow.i1-IQ2_M.gguf filter=lfs diff=lfs merge=lfs -text
+HumanFlow.i1-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
+HumanFlow.i1-IQ4_XS.gguf filter=lfs diff=lfs merge=lfs -text
+HumanFlow.i1-Q2_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+HumanFlow.i1-IQ1_M.gguf filter=lfs diff=lfs merge=lfs -text
+HumanFlow.i1-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+HumanFlow.i1-IQ2_XXS.gguf filter=lfs diff=lfs merge=lfs -text
+HumanFlow.i1-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
+HumanFlow.i1-IQ2_XS.gguf filter=lfs diff=lfs merge=lfs -text
+HumanFlow.i1-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+HumanFlow.i1-IQ2_S.gguf filter=lfs diff=lfs merge=lfs -text
+HumanFlow.i1-IQ1_S.gguf filter=lfs diff=lfs merge=lfs -text
+HumanFlow.i1-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+HumanFlow.i1-Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
+HumanFlow.i1-IQ3_XS.gguf filter=lfs diff=lfs merge=lfs -text
+HumanFlow.i1-Q4_1.gguf filter=lfs diff=lfs merge=lfs -text
+HumanFlow.i1-IQ3_S.gguf filter=lfs diff=lfs merge=lfs -text
diff --git a/HumanFlow.i1-IQ1_M.gguf b/HumanFlow.i1-IQ1_M.gguf
new file mode 100644
index 0000000..a95f91e
--- /dev/null
+++ b/HumanFlow.i1-IQ1_M.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ca35747eddbf2a39cfef2bed2d74fcc9caa92c6bec1447fb0b29794b8e773185
+size 2161972992
diff --git a/HumanFlow.i1-IQ1_S.gguf b/HumanFlow.i1-IQ1_S.gguf
new file mode 100644
index 0000000..7a547fe
--- /dev/null
+++ b/HumanFlow.i1-IQ1_S.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:9d9a1bc47fbf3d9f0b1f6857738e27537f4b253f8e446e3a86aaf9ec91f40e73
+size 2019628800
diff --git a/HumanFlow.i1-IQ2_M.gguf b/HumanFlow.i1-IQ2_M.gguf
new file mode 100644
index 0000000..8bdf84a
--- /dev/null
+++ b/HumanFlow.i1-IQ2_M.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:2641b6ca07b7e10b5109bafd9f115c69b561dc62fb7a189bd061329ee5f3f8b3
+size 2948282112
diff --git a/HumanFlow.i1-IQ2_S.gguf b/HumanFlow.i1-IQ2_S.gguf
new file mode 100644
index 0000000..e7fd1e9
--- /dev/null
+++ b/HumanFlow.i1-IQ2_S.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:2409c99cd7e5f973ff72cbbc84344f62347584ab629c0fe5ca14a0dbdd7c4e4d
+size 2758489856
diff --git a/HumanFlow.i1-IQ2_XS.gguf b/HumanFlow.i1-IQ2_XS.gguf
new file mode 100644
index 0000000..cd095ce
--- /dev/null
+++ b/HumanFlow.i1-IQ2_XS.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:1c641dbbed0a61bf6c807dcc7829facdae700643788c609e2a2b2b68492b81af
+size 2605782784
diff --git a/HumanFlow.i1-IQ2_XXS.gguf b/HumanFlow.i1-IQ2_XXS.gguf
new file mode 100644
index 0000000..bc608bc
--- /dev/null
+++ b/HumanFlow.i1-IQ2_XXS.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:93e9f7cc6a64a6c1966d8acca925c5cbb34efddb301b591598c0b140e3dcd46d
+size 2399213312
diff --git a/HumanFlow.i1-IQ3_M.gguf b/HumanFlow.i1-IQ3_M.gguf
new file mode 100644
index 0000000..ed24c06
--- /dev/null
+++ b/HumanFlow.i1-IQ3_M.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:7865617903ba43fcb130d1bef658643b88fce0df0f3f11c99aa2cc1eb1e17564
+size 3784824576
diff --git a/HumanFlow.i1-IQ3_S.gguf b/HumanFlow.i1-IQ3_S.gguf
new file mode 100644
index 0000000..f68f93e
--- /dev/null
+++ b/HumanFlow.i1-IQ3_S.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:d7f627dc4dba55c1dd3a32cc34900426d7fe1539578bd2851a96e767f9bff998
+size 3682326272
diff --git a/HumanFlow.i1-IQ3_XS.gguf b/HumanFlow.i1-IQ3_XS.gguf
new file mode 100644
index 0000000..e581cb0
--- /dev/null
+++ b/HumanFlow.i1-IQ3_XS.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:454365b9291423703fdcd173a7f0e8e19551d607ff26ad74664a1ce04a29e0ec
+size 3518748416
diff --git a/HumanFlow.i1-IQ3_XXS.gguf b/HumanFlow.i1-IQ3_XXS.gguf
new file mode 100644
index 0000000..16793b1
--- /dev/null
+++ b/HumanFlow.i1-IQ3_XXS.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:63bdc5c16f17f81f3bc050161734fc806a801019f40adc93bb71632d464a5412
+size 3274913536
diff --git a/HumanFlow.i1-IQ4_NL.gguf b/HumanFlow.i1-IQ4_NL.gguf
new file mode 100644
index 0000000..c98b797
--- /dev/null
+++ b/HumanFlow.i1-IQ4_NL.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:8e1100e3b1afa090f4e662bfc542b76f41a70df3acb33457cc7302c89ee28cff
+size 4677990144
diff --git a/HumanFlow.i1-IQ4_XS.gguf b/HumanFlow.i1-IQ4_XS.gguf
new file mode 100644
index 0000000..fb4ffd7
--- /dev/null
+++ b/HumanFlow.i1-IQ4_XS.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:01c668d46fc9299ca89ff714444c7acb6bc596952b14526a7dc01d32ab76b748
+size 4447663872
diff --git a/HumanFlow.i1-Q2_K.gguf b/HumanFlow.i1-Q2_K.gguf
new file mode 100644
index 0000000..91f0f4a
--- /dev/null
+++ b/HumanFlow.i1-Q2_K.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:636d4185428f6d74dfffa9da02417f5419dd194dc0877e486a37be9345aaf936
+size 3179132672
diff --git a/HumanFlow.i1-Q2_K_S.gguf b/HumanFlow.i1-Q2_K_S.gguf
new file mode 100644
index 0000000..5fc37a2
--- /dev/null
+++ b/HumanFlow.i1-Q2_K_S.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:f033946011840e5a4fe3888def0d3fd0b13e29146b6f004946049e9d37c90732
+size 2988816128
diff --git a/HumanFlow.i1-Q3_K_L.gguf b/HumanFlow.i1-Q3_K_L.gguf
new file mode 100644
index 0000000..cba650c
--- /dev/null
+++ b/HumanFlow.i1-Q3_K_L.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:311b3bee4cfe2c8b91b34a84a49715d9e9e2423b2b6ad5f02901a2a58fa4328c
+size 4321957632
diff --git a/HumanFlow.i1-Q3_K_M.gguf b/HumanFlow.i1-Q3_K_M.gguf
new file mode 100644
index 0000000..b71db01
--- /dev/null
+++ b/HumanFlow.i1-Q3_K_M.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:9859ebcba1f32034189144e82bc64be19cbb0356bf85d8825a62af743a3ad3e4
+size 4018919168
diff --git a/HumanFlow.i1-Q3_K_S.gguf b/HumanFlow.i1-Q3_K_S.gguf
new file mode 100644
index 0000000..064661c
--- /dev/null
+++ b/HumanFlow.i1-Q3_K_S.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:5c205df920ba5f10870f885ade56f1f2259f54157f2f0bf39e4b70f05a71c7e6
+size 3664500480
diff --git a/HumanFlow.i1-Q4_0.gguf b/HumanFlow.i1-Q4_0.gguf
new file mode 100644
index 0000000..e42d73f
--- /dev/null
+++ b/HumanFlow.i1-Q4_0.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:d5ccc8024baf39eb1023315d2517a9162b5bf0a743fdceb5745774af51f7ece5
+size 4675892992
diff --git a/HumanFlow.i1-Q4_1.gguf b/HumanFlow.i1-Q4_1.gguf
new file mode 100644
index 0000000..66f8a55
--- /dev/null
+++ b/HumanFlow.i1-Q4_1.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:966ee55bf684077161fe6a2c52b141330dda38f2016a82ce3bcadebefc63e472
+size 5130254080
diff --git a/HumanFlow.i1-Q4_K_M.gguf b/HumanFlow.i1-Q4_K_M.gguf
new file mode 100644
index 0000000..e4c7ee8
--- /dev/null
+++ b/HumanFlow.i1-Q4_K_M.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:bf34ac96983dccb2d2cdb081f136ebb6f9331aff2b68ec8103ab055e972e9463
+size 4920735488
diff --git a/HumanFlow.i1-Q4_K_S.gguf b/HumanFlow.i1-Q4_K_S.gguf
new file mode 100644
index 0000000..64d21e2
--- /dev/null
+++ b/HumanFlow.i1-Q4_K_S.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:bc1f5164d932c263ad35e4e8c09473f7ac6500c74b6e5348bff77282c79e0578
+size 4692670208
diff --git a/HumanFlow.i1-Q5_K_M.gguf b/HumanFlow.i1-Q5_K_M.gguf
new file mode 100644
index 0000000..824c5f2
--- /dev/null
+++ b/HumanFlow.i1-Q5_K_M.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:b5111e373f4e75c74189c6b24fb807bbd7a8c30020b2a179b3258a17dea73ea4
+size 5732988672
diff --git a/HumanFlow.i1-Q5_K_S.gguf b/HumanFlow.i1-Q5_K_S.gguf
new file mode 100644
index 0000000..c87b16d
--- /dev/null
+++ b/HumanFlow.i1-Q5_K_S.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:b6e0d41fc27a54b74598db53d54e94f0558a11cc4839132840893dfea1519ebd
+size 5599295232
diff --git a/HumanFlow.i1-Q6_K.gguf b/HumanFlow.i1-Q6_K.gguf
new file mode 100644
index 0000000..d54fe49
--- /dev/null
+++ b/HumanFlow.i1-Q6_K.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:6527a6e9d2813e15f3caa825a06b1e20fe329f9f30d0a6c96cad8c5cff6da5ac
+size 6596007680
diff --git a/HumanFlow.imatrix.gguf b/HumanFlow.imatrix.gguf
new file mode 100644
index 0000000..3d2d28d
--- /dev/null
+++ b/HumanFlow.imatrix.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:2090347fb7cb2219884a6c24c5a2768d22bd5949b79645db82970c584e5e253a
+size 5015200
diff --git a/README.md b/README.md
new file mode 100644
index 0000000..c60ef01
--- /dev/null
+++ b/README.md
@@ -0,0 +1,94 @@
+---
+base_model: randhir302/HumanFlow
+language:
+- en
+library_name: transformers
+license: apache-2.0
+mradermacher:
+  readme_rev: 1
+quantized_by: mradermacher
+tags:
+- text-generation
+- llama3
+- humanizer
+- rewriting
+- conversational
+- merged
+- sft
+- editorial
+---
+## About
+
+weighted/imatrix quants of https://huggingface.co/randhir302/HumanFlow
+
+***For a convenient overview and download list, visit our [model page for this model](https://hf.tst.eu/model#HumanFlow-i1-GGUF).***
+
+Static quants are available at https://huggingface.co/mradermacher/HumanFlow-GGUF
+
+## Usage
+
+If you are unsure how to use GGUF files, refer to one of [TheBloke's
+READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
+more details, including on how to concatenate multi-part files.
+
+## Provided Quants
+
+(sorted by size, not necessarily quality; IQ-quants are often preferable over similar-sized non-IQ quants)
+
+| Link | Type | Size/GB | Notes |
+|:-----|:-----|--------:|:------|
+| [GGUF](https://huggingface.co/mradermacher/HumanFlow-i1-GGUF/resolve/main/HumanFlow.imatrix.gguf) | imatrix | 0.1 | imatrix file (for creating your own quants) |
+| [GGUF](https://huggingface.co/mradermacher/HumanFlow-i1-GGUF/resolve/main/HumanFlow.i1-IQ1_S.gguf) | i1-IQ1_S | 2.1 | for the desperate |
+| [GGUF](https://huggingface.co/mradermacher/HumanFlow-i1-GGUF/resolve/main/HumanFlow.i1-IQ1_M.gguf) | i1-IQ1_M | 2.3 | mostly desperate |
+| [GGUF](https://huggingface.co/mradermacher/HumanFlow-i1-GGUF/resolve/main/HumanFlow.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 2.5 | |
+| [GGUF](https://huggingface.co/mradermacher/HumanFlow-i1-GGUF/resolve/main/HumanFlow.i1-IQ2_XS.gguf) | i1-IQ2_XS | 2.7 | |
+| [GGUF](https://huggingface.co/mradermacher/HumanFlow-i1-GGUF/resolve/main/HumanFlow.i1-IQ2_S.gguf) | i1-IQ2_S | 2.9 | |
+| [GGUF](https://huggingface.co/mradermacher/HumanFlow-i1-GGUF/resolve/main/HumanFlow.i1-IQ2_M.gguf) | i1-IQ2_M | 3.0 | |
+| [GGUF](https://huggingface.co/mradermacher/HumanFlow-i1-GGUF/resolve/main/HumanFlow.i1-Q2_K_S.gguf) | i1-Q2_K_S | 3.1 | very low quality |
+| [GGUF](https://huggingface.co/mradermacher/HumanFlow-i1-GGUF/resolve/main/HumanFlow.i1-Q2_K.gguf) | i1-Q2_K | 3.3 | IQ3_XXS probably better |
+| [GGUF](https://huggingface.co/mradermacher/HumanFlow-i1-GGUF/resolve/main/HumanFlow.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 3.4 | lower quality |
+| [GGUF](https://huggingface.co/mradermacher/HumanFlow-i1-GGUF/resolve/main/HumanFlow.i1-IQ3_XS.gguf) | i1-IQ3_XS | 3.6 | |
+| [GGUF](https://huggingface.co/mradermacher/HumanFlow-i1-GGUF/resolve/main/HumanFlow.i1-Q3_K_S.gguf) | i1-Q3_K_S | 3.8 | IQ3_XS probably better |
+| [GGUF](https://huggingface.co/mradermacher/HumanFlow-i1-GGUF/resolve/main/HumanFlow.i1-IQ3_S.gguf) | i1-IQ3_S | 3.8 | beats Q3_K* |
+| [GGUF](https://huggingface.co/mradermacher/HumanFlow-i1-GGUF/resolve/main/HumanFlow.i1-IQ3_M.gguf) | i1-IQ3_M | 3.9 | |
+| [GGUF](https://huggingface.co/mradermacher/HumanFlow-i1-GGUF/resolve/main/HumanFlow.i1-Q3_K_M.gguf) | i1-Q3_K_M | 4.1 | IQ3_S probably better |
+| [GGUF](https://huggingface.co/mradermacher/HumanFlow-i1-GGUF/resolve/main/HumanFlow.i1-Q3_K_L.gguf) | i1-Q3_K_L | 4.4 | IQ3_M probably better |
+| [GGUF](https://huggingface.co/mradermacher/HumanFlow-i1-GGUF/resolve/main/HumanFlow.i1-IQ4_XS.gguf) | i1-IQ4_XS | 4.5 | |
+| [GGUF](https://huggingface.co/mradermacher/HumanFlow-i1-GGUF/resolve/main/HumanFlow.i1-Q4_0.gguf) | i1-Q4_0 | 4.8 | fast, low quality |
+| [GGUF](https://huggingface.co/mradermacher/HumanFlow-i1-GGUF/resolve/main/HumanFlow.i1-IQ4_NL.gguf) | i1-IQ4_NL | 4.8 | prefer IQ4_XS |
+| [GGUF](https://huggingface.co/mradermacher/HumanFlow-i1-GGUF/resolve/main/HumanFlow.i1-Q4_K_S.gguf) | i1-Q4_K_S | 4.8 | optimal size/speed/quality |
+| [GGUF](https://huggingface.co/mradermacher/HumanFlow-i1-GGUF/resolve/main/HumanFlow.i1-Q4_K_M.gguf) | i1-Q4_K_M | 5.0 | fast, recommended |
+| [GGUF](https://huggingface.co/mradermacher/HumanFlow-i1-GGUF/resolve/main/HumanFlow.i1-Q4_1.gguf) | i1-Q4_1 | 5.2 | |
+| [GGUF](https://huggingface.co/mradermacher/HumanFlow-i1-GGUF/resolve/main/HumanFlow.i1-Q5_K_S.gguf) | i1-Q5_K_S | 5.7 | |
+| [GGUF](https://huggingface.co/mradermacher/HumanFlow-i1-GGUF/resolve/main/HumanFlow.i1-Q5_K_M.gguf) | i1-Q5_K_M | 5.8 | |
+| [GGUF](https://huggingface.co/mradermacher/HumanFlow-i1-GGUF/resolve/main/HumanFlow.i1-Q6_K.gguf) | i1-Q6_K | 6.7 | practically like static Q6_K |
+
+Here is a handy graph by ikawrakow comparing some lower-quality quant
+types (lower is better):
+
+![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)
+
+And here are Artefact2's thoughts on the matter:
+https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
+
+## FAQ / Model Request
+
+See https://huggingface.co/mradermacher/model_requests for some answers to
+questions you might have and/or if you want some other model quantized.
+
+## Thanks
+
+I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
+me use its servers and providing upgrades to my workstation to enable
+this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss)
+for giving me access to his private supercomputer, enabling me to provide
+many more imatrix quants, at much higher quality, than I would otherwise
+be able to.
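
As a companion to the Provided Quants table in the README above, here is a minimal Python sketch of fetching one of the quant files programmatically. The `resolve_url` helper is illustrative (it mirrors the `resolve/main` links in the table), and the choice of `Q4_K_M` simply follows the table's "fast, recommended" note; `hf_hub_download` is the standard `huggingface_hub` API, but the actual ~5 GB download is left commented out.

```python
# Sketch: locate and (optionally) download one quant from this repo.
REPO_ID = "mradermacher/HumanFlow-i1-GGUF"
FILENAME = "HumanFlow.i1-Q4_K_M.gguf"  # "fast, recommended" per the table above


def resolve_url(repo_id: str, filename: str, revision: str = "main") -> str:
    """Direct 'resolve' URL for a file in a Hugging Face model repo."""
    return f"https://huggingface.co/{repo_id}/resolve/{revision}/{filename}"


if __name__ == "__main__":
    print(resolve_url(REPO_ID, FILENAME))
    # To actually fetch the file (~5 GB), use huggingface_hub instead:
    #   from huggingface_hub import hf_hub_download  # pip install huggingface_hub
    #   local_path = hf_hub_download(repo_id=REPO_ID, filename=FILENAME)
```

The resulting path can then be passed to any GGUF-capable runtime (for example, llama.cpp's `-m` flag).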
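
The README's Usage section also mentions concatenating multi-part files. All files in this repo are single-part, but larger repos in this series split oversized GGUFs into plain byte-level pieces (names like `*.gguf.part1of2` are hypothetical here); joining the parts in order reconstructs the original file. A minimal sketch, demonstrated on tiny stand-in files so it runs as-is:

```python
from pathlib import Path


def concatenate_parts(parts: list[str], output: str) -> None:
    """Join byte-split GGUF pieces (part1ofN ... partNofN) back into one file."""
    with open(output, "wb") as out:
        for part in parts:
            out.write(Path(part).read_bytes())


# Demo with dummy data; a real case would use the actual *.partXofY filenames.
Path("demo.gguf.part1of2").write_bytes(b"GGUF-head")
Path("demo.gguf.part2of2").write_bytes(b"-tail")
concatenate_parts(["demo.gguf.part1of2", "demo.gguf.part2of2"], "demo.gguf")
print(Path("demo.gguf").read_bytes())  # b'GGUF-head-tail'
```

On the command line the same operation is just `cat part1of2 part2of2 > whole.gguf`.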