Initialize the project; model provided by the ModelHub XC community

Model: mradermacher/Marco-Nano-Base-i1-GGUF
Source: Original Platform
Commit: ca1a36563d
Author: ModelHub XC
Date: 2026-04-22 12:26:33 +08:00
27 changed files with 267 additions and 0 deletions

.gitattributes — vendored, new file (60 lines)

@@ -0,0 +1,60 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
Marco-Nano-Base.imatrix.gguf filter=lfs diff=lfs merge=lfs -text
Marco-Nano-Base.i1-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
Marco-Nano-Base.i1-IQ3_M.gguf filter=lfs diff=lfs merge=lfs -text
Marco-Nano-Base.i1-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
Marco-Nano-Base.i1-IQ3_XXS.gguf filter=lfs diff=lfs merge=lfs -text
Marco-Nano-Base.i1-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
Marco-Nano-Base.i1-IQ4_NL.gguf filter=lfs diff=lfs merge=lfs -text
Marco-Nano-Base.i1-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
Marco-Nano-Base.i1-IQ2_M.gguf filter=lfs diff=lfs merge=lfs -text
Marco-Nano-Base.i1-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
Marco-Nano-Base.i1-IQ4_XS.gguf filter=lfs diff=lfs merge=lfs -text
Marco-Nano-Base.i1-Q2_K_S.gguf filter=lfs diff=lfs merge=lfs -text
Marco-Nano-Base.i1-IQ1_M.gguf filter=lfs diff=lfs merge=lfs -text
Marco-Nano-Base.i1-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
Marco-Nano-Base.i1-IQ2_XXS.gguf filter=lfs diff=lfs merge=lfs -text
Marco-Nano-Base.i1-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
Marco-Nano-Base.i1-IQ2_XS.gguf filter=lfs diff=lfs merge=lfs -text
Marco-Nano-Base.i1-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
Marco-Nano-Base.i1-IQ2_S.gguf filter=lfs diff=lfs merge=lfs -text
Marco-Nano-Base.i1-IQ1_S.gguf filter=lfs diff=lfs merge=lfs -text
Marco-Nano-Base.i1-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
Marco-Nano-Base.i1-Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
Marco-Nano-Base.i1-IQ3_XS.gguf filter=lfs diff=lfs merge=lfs -text
Marco-Nano-Base.i1-Q4_1.gguf filter=lfs diff=lfs merge=lfs -text
Marco-Nano-Base.i1-IQ3_S.gguf filter=lfs diff=lfs merge=lfs -text

The remaining 25 added files are Git LFS pointer stubs (a `version` line, a `sha256` OID, and a byte size) for `Marco-Nano-Base.imatrix.gguf` and the 24 `Marco-Nano-Base.i1-*.gguf` quantizations declared in `.gitattributes` above. The diff viewer did not record which pointer belongs to which file, so the individual pointer contents are omitted here.

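Each of the pointer blobs in this commit follows the three-line Git LFS pointer format (spec v1): a version URL, the sha256 OID of the actual blob, and its size in bytes. For example, the first pointer added here reads:

```text
version https://git-lfs.github.com/spec/v1
oid sha256:374924fda568cb7cb84de7c136b435e6bb9019de16059779e6ee4bcec2d0cf92
size 2796093024
```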
README.md — new file (132 lines)

@@ -0,0 +1,132 @@
---
base_model: AIDC-AI/Marco-Nano-Base
datasets:
- nvidia/Nemotron-CC-v2
- nvidia/Nemotron-Pretraining-SFT-v1
- nvidia/Nemotron-Pretraining-Specialized-v1
- nvidia/Nemotron-CC-v2.1
- allenai/dolmino-mix-1124
- nvidia/Nemotron-CC-Math-v1
- nvidia/OpenMathInstruct-2
- HuggingFaceTB/finemath
- LLM360/MegaMath
- open-thoughts/OpenThoughts3-1.2M
- opencsg/Fineweb-Edu-Chinese-V2.1
- HuggingFaceFW/fineweb-2
- allenai/dolma3_dolmino_mix-100B-1125
language:
- en
- zh
- ar
- de
- es
- fr
- ko
- ja
- pt
- tr
- id
- it
- nl
- pl
- ru
- vi
- th
- he
- uk
- ms
- bn
- cs
- ur
- kk
- el
- ro
- hu
- ne
- az
library_name: transformers
license: apache-2.0
mradermacher:
readme_rev: 1
quantized_by: mradermacher
tags:
- moe
- mixture-of-experts
- multilingual
- upcycling
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
<!-- ### quants: Q2_K IQ3_M Q4_K_S IQ3_XXS Q3_K_M small-IQ4_NL Q4_K_M IQ2_M Q6_K IQ4_XS Q2_K_S IQ1_M Q3_K_S IQ2_XXS Q3_K_L IQ2_XS Q5_K_S IQ2_S IQ1_S Q5_K_M Q4_0 IQ3_XS Q4_1 IQ3_S -->
<!-- ### quants_skip: -->
<!-- ### skip_mmproj: -->
weighted/imatrix quants of https://huggingface.co/AIDC-AI/Marco-Nano-Base
<!-- provided-files -->
***For a convenient overview and download list, visit our [model page for this model](https://hf.tst.eu/model#Marco-Nano-Base-i1-GGUF).***
static quants are available at https://huggingface.co/mradermacher/Marco-Nano-Base-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
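As a concrete sketch (assuming `huggingface-cli` and a llama.cpp build are installed and on PATH; the quant choice is just an example from the table below):

```shell
# Download a single quant from this repo (filename matches the table below)
huggingface-cli download mradermacher/Marco-Nano-Base-i1-GGUF \
  Marco-Nano-Base.i1-Q4_K_M.gguf --local-dir .

# Multi-part quants (none in this repo) are split as *.partXofY and must be
# concatenated back into one file before use, e.g.:
# cat model.gguf.part1of2 model.gguf.part2of2 > model.gguf

# Run a quick completion with llama.cpp's CLI
llama-cli -m Marco-Nano-Base.i1-Q4_K_M.gguf -p "Hello" -n 64
```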
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similarly sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Marco-Nano-Base-i1-GGUF/resolve/main/Marco-Nano-Base.imatrix.gguf) | imatrix | 0.2 | imatrix file (for creating your own quants) |
| [GGUF](https://huggingface.co/mradermacher/Marco-Nano-Base-i1-GGUF/resolve/main/Marco-Nano-Base.i1-IQ1_S.gguf) | i1-IQ1_S | 2.8 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/Marco-Nano-Base-i1-GGUF/resolve/main/Marco-Nano-Base.i1-IQ1_M.gguf) | i1-IQ1_M | 2.9 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/Marco-Nano-Base-i1-GGUF/resolve/main/Marco-Nano-Base.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 3.1 | |
| [GGUF](https://huggingface.co/mradermacher/Marco-Nano-Base-i1-GGUF/resolve/main/Marco-Nano-Base.i1-IQ2_XS.gguf) | i1-IQ2_XS | 3.3 | |
| [GGUF](https://huggingface.co/mradermacher/Marco-Nano-Base-i1-GGUF/resolve/main/Marco-Nano-Base.i1-IQ2_S.gguf) | i1-IQ2_S | 3.3 | |
| [GGUF](https://huggingface.co/mradermacher/Marco-Nano-Base-i1-GGUF/resolve/main/Marco-Nano-Base.i1-IQ2_M.gguf) | i1-IQ2_M | 3.4 | |
| [GGUF](https://huggingface.co/mradermacher/Marco-Nano-Base-i1-GGUF/resolve/main/Marco-Nano-Base.i1-Q2_K.gguf) | i1-Q2_K | 3.5 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/Marco-Nano-Base-i1-GGUF/resolve/main/Marco-Nano-Base.i1-Q2_K_S.gguf) | i1-Q2_K_S | 3.5 | very low quality |
| [GGUF](https://huggingface.co/mradermacher/Marco-Nano-Base-i1-GGUF/resolve/main/Marco-Nano-Base.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 3.8 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Marco-Nano-Base-i1-GGUF/resolve/main/Marco-Nano-Base.i1-IQ3_XS.gguf) | i1-IQ3_XS | 3.9 | |
| [GGUF](https://huggingface.co/mradermacher/Marco-Nano-Base-i1-GGUF/resolve/main/Marco-Nano-Base.i1-IQ3_S.gguf) | i1-IQ3_S | 4.0 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/Marco-Nano-Base-i1-GGUF/resolve/main/Marco-Nano-Base.i1-Q3_K_S.gguf) | i1-Q3_K_S | 4.0 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/Marco-Nano-Base-i1-GGUF/resolve/main/Marco-Nano-Base.i1-IQ3_M.gguf) | i1-IQ3_M | 4.1 | |
| [GGUF](https://huggingface.co/mradermacher/Marco-Nano-Base-i1-GGUF/resolve/main/Marco-Nano-Base.i1-Q3_K_M.gguf) | i1-Q3_K_M | 4.4 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/Marco-Nano-Base-i1-GGUF/resolve/main/Marco-Nano-Base.i1-Q3_K_L.gguf) | i1-Q3_K_L | 4.5 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/Marco-Nano-Base-i1-GGUF/resolve/main/Marco-Nano-Base.i1-IQ4_XS.gguf) | i1-IQ4_XS | 4.6 | |
| [GGUF](https://huggingface.co/mradermacher/Marco-Nano-Base-i1-GGUF/resolve/main/Marco-Nano-Base.i1-IQ4_NL.gguf) | i1-IQ4_NL | 4.8 | prefer IQ4_XS |
| [GGUF](https://huggingface.co/mradermacher/Marco-Nano-Base-i1-GGUF/resolve/main/Marco-Nano-Base.i1-Q4_0.gguf) | i1-Q4_0 | 4.8 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/Marco-Nano-Base-i1-GGUF/resolve/main/Marco-Nano-Base.i1-Q4_K_S.gguf) | i1-Q4_K_S | 5.1 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/Marco-Nano-Base-i1-GGUF/resolve/main/Marco-Nano-Base.i1-Q4_1.gguf) | i1-Q4_1 | 5.3 | |
| [GGUF](https://huggingface.co/mradermacher/Marco-Nano-Base-i1-GGUF/resolve/main/Marco-Nano-Base.i1-Q4_K_M.gguf) | i1-Q4_K_M | 5.6 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Marco-Nano-Base-i1-GGUF/resolve/main/Marco-Nano-Base.i1-Q5_K_S.gguf) | i1-Q5_K_S | 5.9 | |
| [GGUF](https://huggingface.co/mradermacher/Marco-Nano-Base-i1-GGUF/resolve/main/Marco-Nano-Base.i1-Q5_K_M.gguf) | i1-Q5_K_M | 6.3 | |
| [GGUF](https://huggingface.co/mradermacher/Marco-Nano-Base-i1-GGUF/resolve/main/Marco-Nano-Base.i1-Q6_K.gguf) | i1-Q6_K | 7.4 | practically like static Q6_K |
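The imatrix file in the first table row can be fed to llama.cpp's quantizer to produce quant types not listed above. A minimal sketch, assuming a local llama.cpp build and an F16 GGUF of the base model (the `Marco-Nano-Base.f16.gguf` name is hypothetical; produce it first from the original model, e.g. with llama.cpp's `convert_hf_to_gguf.py`):

```shell
# Fetch the importance matrix from this repo
huggingface-cli download mradermacher/Marco-Nano-Base-i1-GGUF \
  Marco-Nano-Base.imatrix.gguf --local-dir .

# llama-quantize usage: [--imatrix <file>] <input> <output> <type>
llama-quantize --imatrix Marco-Nano-Base.imatrix.gguf \
  Marco-Nano-Base.f16.gguf Marco-Nano-Base.i1-IQ2_M.gguf IQ2_M
```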
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):
![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)
And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->