From eba20e1d271fd098a1d55956fb9b335bb37de116 Mon Sep 17 00:00:00 2001 From: ModelHub XC Date: Sat, 25 Apr 2026 01:00:25 +0800 Subject: [PATCH] Initialize project; model provided by the ModelHub XC community MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Model: mradermacher/Deep-Reasoning-Llama-3.2-BlackSheep-3B-i1-GGUF Source: Original Platform --- .gitattributes | 60 +++++++++++++ ...ning-Llama-3.2-BlackSheep-3B.i1-IQ1_M.gguf | 3 + ...ning-Llama-3.2-BlackSheep-3B.i1-IQ1_S.gguf | 3 + ...ning-Llama-3.2-BlackSheep-3B.i1-IQ2_M.gguf | 3 + ...ning-Llama-3.2-BlackSheep-3B.i1-IQ2_S.gguf | 3 + ...ing-Llama-3.2-BlackSheep-3B.i1-IQ2_XS.gguf | 3 + ...ng-Llama-3.2-BlackSheep-3B.i1-IQ2_XXS.gguf | 3 + ...ning-Llama-3.2-BlackSheep-3B.i1-IQ3_M.gguf | 3 + ...ning-Llama-3.2-BlackSheep-3B.i1-IQ3_S.gguf | 3 + ...ing-Llama-3.2-BlackSheep-3B.i1-IQ3_XS.gguf | 3 + ...ng-Llama-3.2-BlackSheep-3B.i1-IQ3_XXS.gguf | 3 + ...ing-Llama-3.2-BlackSheep-3B.i1-IQ4_NL.gguf | 3 + ...ing-Llama-3.2-BlackSheep-3B.i1-IQ4_XS.gguf | 3 + ...oning-Llama-3.2-BlackSheep-3B.i1-Q2_K.gguf | 3 + ...ing-Llama-3.2-BlackSheep-3B.i1-Q2_K_S.gguf | 3 + ...ing-Llama-3.2-BlackSheep-3B.i1-Q3_K_L.gguf | 3 + ...ing-Llama-3.2-BlackSheep-3B.i1-Q3_K_M.gguf | 3 + ...ing-Llama-3.2-BlackSheep-3B.i1-Q3_K_S.gguf | 3 + ...oning-Llama-3.2-BlackSheep-3B.i1-Q4_0.gguf | 3 + ...oning-Llama-3.2-BlackSheep-3B.i1-Q4_1.gguf | 3 + ...ing-Llama-3.2-BlackSheep-3B.i1-Q4_K_M.gguf | 3 + ...ing-Llama-3.2-BlackSheep-3B.i1-Q4_K_S.gguf | 3 + ...ing-Llama-3.2-BlackSheep-3B.i1-Q5_K_M.gguf | 3 + ...ing-Llama-3.2-BlackSheep-3B.i1-Q5_K_S.gguf | 3 + ...oning-Llama-3.2-BlackSheep-3B.i1-Q6_K.gguf | 3 + README.md | 85 +++++++++++++++++++ imatrix.dat | 3 + 27 files changed, 220 insertions(+) create mode 100644 .gitattributes create mode 100644 Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ1_M.gguf create mode 100644 
Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ1_S.gguf create mode 100644 Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ2_M.gguf create mode 100644 Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ2_S.gguf create mode 100644 Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ2_XS.gguf create mode 100644 Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ2_XXS.gguf create mode 100644 Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ3_M.gguf create mode 100644 Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ3_S.gguf create mode 100644 Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ3_XS.gguf create mode 100644 Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ3_XXS.gguf create mode 100644 Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ4_NL.gguf create mode 100644 Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ4_XS.gguf create mode 100644 Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q2_K.gguf create mode 100644 Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q2_K_S.gguf create mode 100644 Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q3_K_L.gguf create mode 100644 Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q3_K_M.gguf create mode 100644 Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q3_K_S.gguf create mode 100644 Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q4_0.gguf create mode 100644 Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q4_1.gguf create mode 100644 Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q4_K_M.gguf create mode 100644 Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q4_K_S.gguf create mode 100644 Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q5_K_M.gguf create mode 100644 Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q5_K_S.gguf create mode 100644 Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q6_K.gguf create mode 100644 README.md create mode 100644 imatrix.dat diff --git a/.gitattributes b/.gitattributes new file mode 100644 index 0000000..ffda2d3 --- /dev/null +++ b/.gitattributes @@ -0,0 +1,60 @@ +*.7z filter=lfs diff=lfs merge=lfs -text +*.arrow filter=lfs diff=lfs merge=lfs -text +*.bin filter=lfs diff=lfs merge=lfs -text +*.bz2 filter=lfs 
diff=lfs merge=lfs -text +*.ckpt filter=lfs diff=lfs merge=lfs -text +*.ftz filter=lfs diff=lfs merge=lfs -text +*.gz filter=lfs diff=lfs merge=lfs -text +*.h5 filter=lfs diff=lfs merge=lfs -text +*.joblib filter=lfs diff=lfs merge=lfs -text +*.lfs.* filter=lfs diff=lfs merge=lfs -text +*.mlmodel filter=lfs diff=lfs merge=lfs -text +*.model filter=lfs diff=lfs merge=lfs -text +*.msgpack filter=lfs diff=lfs merge=lfs -text +*.npy filter=lfs diff=lfs merge=lfs -text +*.npz filter=lfs diff=lfs merge=lfs -text +*.onnx filter=lfs diff=lfs merge=lfs -text +*.ot filter=lfs diff=lfs merge=lfs -text +*.parquet filter=lfs diff=lfs merge=lfs -text +*.pb filter=lfs diff=lfs merge=lfs -text +*.pickle filter=lfs diff=lfs merge=lfs -text +*.pkl filter=lfs diff=lfs merge=lfs -text +*.pt filter=lfs diff=lfs merge=lfs -text +*.pth filter=lfs diff=lfs merge=lfs -text +*.rar filter=lfs diff=lfs merge=lfs -text +*.safetensors filter=lfs diff=lfs merge=lfs -text +saved_model/**/* filter=lfs diff=lfs merge=lfs -text +*.tar.* filter=lfs diff=lfs merge=lfs -text +*.tar filter=lfs diff=lfs merge=lfs -text +*.tflite filter=lfs diff=lfs merge=lfs -text +*.tgz filter=lfs diff=lfs merge=lfs -text +*.wasm filter=lfs diff=lfs merge=lfs -text +*.xz filter=lfs diff=lfs merge=lfs -text +*.zip filter=lfs diff=lfs merge=lfs -text +*.zst filter=lfs diff=lfs merge=lfs -text +*tfevents* filter=lfs diff=lfs merge=lfs -text +imatrix.dat filter=lfs diff=lfs merge=lfs -text +Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ1_M.gguf filter=lfs diff=lfs merge=lfs -text +Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ1_S.gguf filter=lfs diff=lfs merge=lfs -text +Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ2_M.gguf filter=lfs diff=lfs merge=lfs -text +Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ2_S.gguf filter=lfs diff=lfs merge=lfs -text +Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ2_XS.gguf filter=lfs diff=lfs merge=lfs -text +Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ2_XXS.gguf filter=lfs diff=lfs merge=lfs -text 
+Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ3_M.gguf filter=lfs diff=lfs merge=lfs -text +Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ3_S.gguf filter=lfs diff=lfs merge=lfs -text +Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ3_XS.gguf filter=lfs diff=lfs merge=lfs -text +Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ3_XXS.gguf filter=lfs diff=lfs merge=lfs -text +Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ4_NL.gguf filter=lfs diff=lfs merge=lfs -text +Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ4_XS.gguf filter=lfs diff=lfs merge=lfs -text +Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text +Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q2_K_S.gguf filter=lfs diff=lfs merge=lfs -text +Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text +Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text +Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text +Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q4_0.gguf filter=lfs diff=lfs merge=lfs -text +Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q4_1.gguf filter=lfs diff=lfs merge=lfs -text +Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text +Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text +Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text +Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text +Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text diff --git a/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ1_M.gguf b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ1_M.gguf new file mode 100644 index 0000000..6f03a24 --- /dev/null +++ b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ1_M.gguf @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid 
sha256:bff7a675d6ba9cd52fcb3475b2dc6556a15f2b0b45234ff011a7eaea03600550 +size 924191904 diff --git a/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ1_S.gguf b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ1_S.gguf new file mode 100644 index 0000000..9d6f4e0 --- /dev/null +++ b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ1_S.gguf @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:dd8212d7d8cfaa77426fc3fb0c92b639a811696d376dce69f089a5a7506ec7ab +size 868158624 diff --git a/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ2_M.gguf b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ2_M.gguf new file mode 100644 index 0000000..eac5c31 --- /dev/null +++ b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ2_M.gguf @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:8a4f2a6df86d126b987d130da9b416de617692a0c98694b22831f2f8b864244e +size 1229032608 diff --git a/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ2_S.gguf b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ2_S.gguf new file mode 100644 index 0000000..ae0581a --- /dev/null +++ b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ2_S.gguf @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:f7aba0f0093748d225a1f1b6ebe8b04b9dcafee4e576a60b26996254c1ab58f3 +size 1154321568 diff --git a/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ2_XS.gguf b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ2_XS.gguf new file mode 100644 index 0000000..8de13fb --- /dev/null +++ b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ2_XS.gguf @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:ec9008b1bd5f19ab6ac4ebb96cf3b3efee25ab01c0f21e7c41caf4697517451b +size 1100549280 diff --git a/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ2_XXS.gguf b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ2_XXS.gguf new file mode 100644 index 0000000..5c218dd --- /dev/null +++ b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ2_XXS.gguf @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid 
sha256:f1b5284a21f00064a20dec53d544fb485f01d6d89f13c93454d8dedb564abda5 +size 1017580704 diff --git a/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ3_M.gguf b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ3_M.gguf new file mode 100644 index 0000000..3cf9aed --- /dev/null +++ b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ3_M.gguf @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:de4b7e586090d6aea8466bf375fb0af12837bd11d22c98b79d36d6262ddaa020 +size 1599669408 diff --git a/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ3_S.gguf b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ3_S.gguf new file mode 100644 index 0000000..29a8d6b --- /dev/null +++ b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ3_S.gguf @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:f1fb695278d4ef7864fa47827b878c92b5855c7d9db741afbda31288c5c14a04 +size 1542849696 diff --git a/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ3_XS.gguf b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ3_XS.gguf new file mode 100644 index 0000000..8bc77f5 --- /dev/null +++ b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ3_XS.gguf @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:8af942f8da43cd4d164a095f8bbe0a485ced213a8483ada6a59533b34314736a +size 1476789408 diff --git a/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ3_XXS.gguf b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ3_XXS.gguf new file mode 100644 index 0000000..c78c45a --- /dev/null +++ b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ3_XXS.gguf @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:205f74764d3dfb1b945c4b3b39ee4bfcc2166f90e333b2990a27bdab984d51af +size 1348766880 diff --git a/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ4_NL.gguf b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ4_NL.gguf new file mode 100644 index 0000000..1403b33 --- /dev/null +++ b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ4_NL.gguf @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid 
sha256:23081f5a7d6ebfc7dfe632e6be99f9e9272461d7fd9da2504a2652e7ccfe184e +size 1917191328 diff --git a/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ4_XS.gguf b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ4_XS.gguf new file mode 100644 index 0000000..c26b99b --- /dev/null +++ b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ4_XS.gguf @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:117cc86e18024a68e8f0cf705f2028f9a791d8fa4b13de86a439167f0b2ee186 +size 1829110944 diff --git a/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q2_K.gguf b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q2_K.gguf new file mode 100644 index 0000000..f874468 --- /dev/null +++ b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q2_K.gguf @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:a2bee4d6ba0d99039e11abe1c92e4536724e306e29567c17daf01c8b62522791 +size 1363936416 diff --git a/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q2_K_S.gguf b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q2_K_S.gguf new file mode 100644 index 0000000..7bb712c --- /dev/null +++ b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q2_K_S.gguf @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:e9730f4fd6db3229f090e5fbe375416ef3eee28a764417f60e49666bb9f0b467 +size 1274283168 diff --git a/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q3_K_L.gguf b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q3_K_L.gguf new file mode 100644 index 0000000..3dbcde5 --- /dev/null +++ b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q3_K_L.gguf @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:3147edd72ae4a6347305d057a2b2ca0fa1753d0c4960b3f293405e7ed45a3503 +size 1815348384 diff --git a/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q3_K_M.gguf b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q3_K_M.gguf new file mode 100644 index 0000000..3e61a51 --- /dev/null +++ b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q3_K_M.gguf @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid 
sha256:108c00d8733e178c105723dddc852b4c58aff2e32b68a8a6242f0e19efcc9fe3 +size 1687159968 diff --git a/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q3_K_S.gguf b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q3_K_S.gguf new file mode 100644 index 0000000..e1c4fe9 --- /dev/null +++ b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q3_K_S.gguf @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:5db2f7b9d01e18cbc017c7771f3e868f96ca2d9ad15a35956fcadcedb8f7a6d6 +size 1542849696 diff --git a/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q4_0.gguf b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q4_0.gguf new file mode 100644 index 0000000..ee9442d --- /dev/null +++ b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q4_0.gguf @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:012c1754a740e621d3942363b421e2689ad99f31aabb677cef47bb30722dc0a4 +size 1921909920 diff --git a/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q4_1.gguf b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q4_1.gguf new file mode 100644 index 0000000..53cdadf --- /dev/null +++ b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q4_1.gguf @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:ec9e1d0c4762fac27d2b0253a74783a56d8d028f1f11966379ac287bcd56c5b2 +size 2093352096 diff --git a/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q4_K_M.gguf b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q4_K_M.gguf new file mode 100644 index 0000000..170e560 --- /dev/null +++ b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q4_K_M.gguf @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:bed4fbde6f11546cf6b2acf32bbd52d3f25a68e3a442357bcefae821e08f6498 +size 2019378336 diff --git a/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q4_K_S.gguf b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q4_K_S.gguf new file mode 100644 index 0000000..5f9d064 --- /dev/null +++ b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q4_K_S.gguf @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid 
sha256:5c40835936a0839e0f28bc296a121dc6d80a07612103e63da0d4e39ea3a43007 +size 1928201376 diff --git a/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q5_K_M.gguf b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q5_K_M.gguf new file mode 100644 index 0000000..638f91c --- /dev/null +++ b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q5_K_M.gguf @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:a20aba4aa5faa5cd71d7a521deb5f9c5d6307175904c7ddd0a9395feea9729bd +size 2322154656 diff --git a/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q5_K_S.gguf b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q5_K_S.gguf new file mode 100644 index 0000000..d66fee0 --- /dev/null +++ b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q5_K_S.gguf @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:67277001b3c565c6f5e7743e13d26c8dbd9cfadf49520c5e9db478e6bf7a1469 +size 2269512864 diff --git a/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q6_K.gguf b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q6_K.gguf new file mode 100644 index 0000000..c788873 --- /dev/null +++ b/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q6_K.gguf @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:99333d4a74b62d03d649ecc695d46c3352724c960dc16a996d74ec828d986286 +size 2643854496 diff --git a/README.md b/README.md new file mode 100644 index 0000000..8353b8f --- /dev/null +++ b/README.md @@ -0,0 +1,85 @@ +--- +base_model: DavidAU/Deep-Reasoning-Llama-3.2-BlackSheep-3B +language: +- en +library_name: transformers +quantized_by: mradermacher +tags: +- reasoning +- thinking +- cot +- deepseek +- Llama 3.2 +- 128k context +- fine tune +- llama-3 +- llama-3.2 +--- +## About + + + + + + +weighted/imatrix quants of https://huggingface.co/DavidAU/Deep-Reasoning-Llama-3.2-BlackSheep-3B + + +static quants are available at https://huggingface.co/mradermacher/Deep-Reasoning-Llama-3.2-BlackSheep-3B-GGUF +## Usage + +If you are unsure how to use GGUF files, refer to one of [TheBloke's 
+READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for +more details, including on how to concatenate multi-part files. + +## Provided Quants + +(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants) + +| Link | Type | Size/GB | Notes | +|:-----|:-----|--------:|:------| +| [GGUF](https://huggingface.co/mradermacher/Deep-Reasoning-Llama-3.2-BlackSheep-3B-i1-GGUF/resolve/main/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ1_S.gguf) | i1-IQ1_S | 1.0 | for the desperate | +| [GGUF](https://huggingface.co/mradermacher/Deep-Reasoning-Llama-3.2-BlackSheep-3B-i1-GGUF/resolve/main/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ1_M.gguf) | i1-IQ1_M | 1.0 | mostly desperate | +| [GGUF](https://huggingface.co/mradermacher/Deep-Reasoning-Llama-3.2-BlackSheep-3B-i1-GGUF/resolve/main/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 1.1 | | +| [GGUF](https://huggingface.co/mradermacher/Deep-Reasoning-Llama-3.2-BlackSheep-3B-i1-GGUF/resolve/main/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ2_XS.gguf) | i1-IQ2_XS | 1.2 | | +| [GGUF](https://huggingface.co/mradermacher/Deep-Reasoning-Llama-3.2-BlackSheep-3B-i1-GGUF/resolve/main/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ2_S.gguf) | i1-IQ2_S | 1.3 | | +| [GGUF](https://huggingface.co/mradermacher/Deep-Reasoning-Llama-3.2-BlackSheep-3B-i1-GGUF/resolve/main/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ2_M.gguf) | i1-IQ2_M | 1.3 | | +| [GGUF](https://huggingface.co/mradermacher/Deep-Reasoning-Llama-3.2-BlackSheep-3B-i1-GGUF/resolve/main/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q2_K_S.gguf) | i1-Q2_K_S | 1.4 | very low quality | +| [GGUF](https://huggingface.co/mradermacher/Deep-Reasoning-Llama-3.2-BlackSheep-3B-i1-GGUF/resolve/main/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 1.4 | lower quality | +| 
[GGUF](https://huggingface.co/mradermacher/Deep-Reasoning-Llama-3.2-BlackSheep-3B-i1-GGUF/resolve/main/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q2_K.gguf) | i1-Q2_K | 1.5 | IQ3_XXS probably better | +| [GGUF](https://huggingface.co/mradermacher/Deep-Reasoning-Llama-3.2-BlackSheep-3B-i1-GGUF/resolve/main/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ3_XS.gguf) | i1-IQ3_XS | 1.6 | | +| [GGUF](https://huggingface.co/mradermacher/Deep-Reasoning-Llama-3.2-BlackSheep-3B-i1-GGUF/resolve/main/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ3_S.gguf) | i1-IQ3_S | 1.6 | beats Q3_K* | +| [GGUF](https://huggingface.co/mradermacher/Deep-Reasoning-Llama-3.2-BlackSheep-3B-i1-GGUF/resolve/main/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q3_K_S.gguf) | i1-Q3_K_S | 1.6 | IQ3_XS probably better | +| [GGUF](https://huggingface.co/mradermacher/Deep-Reasoning-Llama-3.2-BlackSheep-3B-i1-GGUF/resolve/main/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ3_M.gguf) | i1-IQ3_M | 1.7 | | +| [GGUF](https://huggingface.co/mradermacher/Deep-Reasoning-Llama-3.2-BlackSheep-3B-i1-GGUF/resolve/main/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q3_K_M.gguf) | i1-Q3_K_M | 1.8 | IQ3_S probably better | +| [GGUF](https://huggingface.co/mradermacher/Deep-Reasoning-Llama-3.2-BlackSheep-3B-i1-GGUF/resolve/main/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q3_K_L.gguf) | i1-Q3_K_L | 1.9 | IQ3_M probably better | +| [GGUF](https://huggingface.co/mradermacher/Deep-Reasoning-Llama-3.2-BlackSheep-3B-i1-GGUF/resolve/main/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ4_XS.gguf) | i1-IQ4_XS | 1.9 | | +| [GGUF](https://huggingface.co/mradermacher/Deep-Reasoning-Llama-3.2-BlackSheep-3B-i1-GGUF/resolve/main/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-IQ4_NL.gguf) | i1-IQ4_NL | 2.0 | prefer IQ4_XS | +| [GGUF](https://huggingface.co/mradermacher/Deep-Reasoning-Llama-3.2-BlackSheep-3B-i1-GGUF/resolve/main/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q4_0.gguf) | i1-Q4_0 | 2.0 | fast, low quality | +| 
[GGUF](https://huggingface.co/mradermacher/Deep-Reasoning-Llama-3.2-BlackSheep-3B-i1-GGUF/resolve/main/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q4_K_S.gguf) | i1-Q4_K_S | 2.0 | optimal size/speed/quality | +| [GGUF](https://huggingface.co/mradermacher/Deep-Reasoning-Llama-3.2-BlackSheep-3B-i1-GGUF/resolve/main/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q4_K_M.gguf) | i1-Q4_K_M | 2.1 | fast, recommended | +| [GGUF](https://huggingface.co/mradermacher/Deep-Reasoning-Llama-3.2-BlackSheep-3B-i1-GGUF/resolve/main/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q4_1.gguf) | i1-Q4_1 | 2.2 | | +| [GGUF](https://huggingface.co/mradermacher/Deep-Reasoning-Llama-3.2-BlackSheep-3B-i1-GGUF/resolve/main/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q5_K_S.gguf) | i1-Q5_K_S | 2.4 | | +| [GGUF](https://huggingface.co/mradermacher/Deep-Reasoning-Llama-3.2-BlackSheep-3B-i1-GGUF/resolve/main/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q5_K_M.gguf) | i1-Q5_K_M | 2.4 | | +| [GGUF](https://huggingface.co/mradermacher/Deep-Reasoning-Llama-3.2-BlackSheep-3B-i1-GGUF/resolve/main/Deep-Reasoning-Llama-3.2-BlackSheep-3B.i1-Q6_K.gguf) | i1-Q6_K | 2.7 | practically like static Q6_K | + +Here is a handy graph by ikawrakow comparing some lower-quality quant +types (lower is better): + +![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png) + +And here are Artefact2's thoughts on the matter: +https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9 + +## FAQ / Model Request + +See https://huggingface.co/mradermacher/model_requests for some answers to +questions you might have and/or if you want some other model quantized. + +## Thanks + +I thank my company, [nethype GmbH](https://www.nethype.de/), for letting +me use its servers and providing upgrades to my workstation to enable +this work in my free time. 
Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to. + + diff --git a/imatrix.dat b/imatrix.dat new file mode 100644 index 0000000..b14e4ef --- /dev/null +++ b/imatrix.dat @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:cdc37f4720478a72dcf265fc7017e641446e946dacaed09d11a37e78be23526b +size 2988377
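The Usage section of the README above mentions concatenating multi-part GGUF files. The quants in this repo are single-part, so nothing needs joining here, but the mechanism is worth sketching: split GGUF downloads are plain byte-concatenated in part order before loading. The `.part1of2`/`.part2of2` names below are hypothetical stand-ins, and the `llama-cli` invocation in the comment is illustrative; exact flags depend on your llama.cpp build.

```shell
# Sketch only: this repo's quants are single-part; the part names below
# are hypothetical stand-ins for a split download.
printf 'AAA' > model.gguf.part1of2
printf 'BBB' > model.gguf.part2of2

# Multi-part GGUF files are joined by byte-concatenation, in part order:
cat model.gguf.part1of2 model.gguf.part2of2 > model.gguf

# The joined file is then loaded as usual, e.g. with llama.cpp
# (illustrative; flags vary by install):
#   ./llama-cli -m model.gguf -p "Hello" -n 128
cat model.gguf   # -> AAABBB
```

Part order matters: the GGUF header lives in the first part, so joining parts out of order produces a file the loader will reject.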