Initialize the project; model provided by the ModelHub XC community
Model: featherless-ai-quants/google-codegemma-1.1-2b-GGUF
Source: Original Platform
.gitattributes (vendored, new file, 47 lines)
@@ -0,0 +1,47 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
featherless-quants.png filter=lfs diff=lfs merge=lfs -text
google-codegemma-1.1-2b-IQ4_XS.gguf filter=lfs diff=lfs merge=lfs -text
google-codegemma-1.1-2b-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
google-codegemma-1.1-2b-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
google-codegemma-1.1-2b-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
google-codegemma-1.1-2b-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
google-codegemma-1.1-2b-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
google-codegemma-1.1-2b-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
google-codegemma-1.1-2b-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
google-codegemma-1.1-2b-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
google-codegemma-1.1-2b-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
google-codegemma-1.1-2b-Q8_0.gguf filter=lfs diff=lfs merge=lfs -text
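The `.gitattributes` patterns above route large binary artifacts (checkpoints, archives, GGUF weights) through Git LFS instead of regular Git storage. As a rough sketch, the matching can be approximated in Python (assumption: `fnmatch` handles these simple `*.ext` globs, though it does not reproduce Git's full pathspec semantics such as `**`):

```python
from fnmatch import fnmatch

# Subset of the LFS patterns from the .gitattributes above.
LFS_PATTERNS = ["*.bin", "*.safetensors", "*.gz", "*.tar.*", "*tfevents*"]

def tracked_by_lfs(path: str) -> bool:
    """Return True if the filename matches any LFS-tracked pattern."""
    name = path.rsplit("/", 1)[-1]  # match against the basename only
    return any(fnmatch(name, pattern) for pattern in LFS_PATTERNS)

print(tracked_by_lfs("weights.safetensors"))  # True
print(tracked_by_lfs("README.md"))            # False
```

Files matching these patterns are replaced in the repository by small LFS pointer files, as seen in the commit contents below.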
README.md (new file, 47 lines)
@@ -0,0 +1,47 @@
---
base_model: google/codegemma-1.1-2b
pipeline_tag: text-generation
quantized_by: featherless-ai-quants
---

# google/codegemma-1.1-2b GGUF Quantizations 🚀



*Optimized GGUF quantization files for enhanced model performance*

> Powered by [Featherless AI](https://featherless.ai) - run any model you'd like for a simple, small fee.

---

## Available Quantizations 📊

| Quantization Type | File | Size |
|-------------------|------|------|
| IQ4_XS | [google-codegemma-1.1-2b-IQ4_XS.gguf](https://huggingface.co/featherless-ai-quants/google-codegemma-1.1-2b-GGUF/blob/main/google-codegemma-1.1-2b-IQ4_XS.gguf) | 1431.67 MB |
| Q2_K | [google-codegemma-1.1-2b-Q2_K.gguf](https://huggingface.co/featherless-ai-quants/google-codegemma-1.1-2b-GGUF/blob/main/google-codegemma-1.1-2b-Q2_K.gguf) | 1104.28 MB |
| Q3_K_L | [google-codegemma-1.1-2b-Q3_K_L.gguf](https://huggingface.co/featherless-ai-quants/google-codegemma-1.1-2b-GGUF/blob/main/google-codegemma-1.1-2b-Q3_K_L.gguf) | 1397.70 MB |
| Q3_K_M | [google-codegemma-1.1-2b-Q3_K_M.gguf](https://huggingface.co/featherless-ai-quants/google-codegemma-1.1-2b-GGUF/blob/main/google-codegemma-1.1-2b-Q3_K_M.gguf) | 1319.70 MB |
| Q3_K_S | [google-codegemma-1.1-2b-Q3_K_S.gguf](https://huggingface.co/featherless-ai-quants/google-codegemma-1.1-2b-GGUF/blob/main/google-codegemma-1.1-2b-Q3_K_S.gguf) | 1228.31 MB |
| Q4_K_M | [google-codegemma-1.1-2b-Q4_K_M.gguf](https://huggingface.co/featherless-ai-quants/google-codegemma-1.1-2b-GGUF/blob/main/google-codegemma-1.1-2b-Q4_K_M.gguf) | 1554.74 MB |
| Q4_K_S | [google-codegemma-1.1-2b-Q4_K_S.gguf](https://huggingface.co/featherless-ai-quants/google-codegemma-1.1-2b-GGUF/blob/main/google-codegemma-1.1-2b-Q4_K_S.gguf) | 1487.58 MB |
| Q5_K_M | [google-codegemma-1.1-2b-Q5_K_M.gguf](https://huggingface.co/featherless-ai-quants/google-codegemma-1.1-2b-GGUF/blob/main/google-codegemma-1.1-2b-Q5_K_M.gguf) | 1754.43 MB |
| Q5_K_S | [google-codegemma-1.1-2b-Q5_K_S.gguf](https://huggingface.co/featherless-ai-quants/google-codegemma-1.1-2b-GGUF/blob/main/google-codegemma-1.1-2b-Q5_K_S.gguf) | 1715.58 MB |
| Q6_K | [google-codegemma-1.1-2b-Q6_K.gguf](https://huggingface.co/featherless-ai-quants/google-codegemma-1.1-2b-GGUF/blob/main/google-codegemma-1.1-2b-Q6_K.gguf) | 1966.59 MB |
| Q8_0 | [google-codegemma-1.1-2b-Q8_0.gguf](https://huggingface.co/featherless-ai-quants/google-codegemma-1.1-2b-GGUF/blob/main/google-codegemma-1.1-2b-Q8_0.gguf) | 2545.42 MB |

---

## ⚡ Powered by [Featherless AI](https://featherless.ai)

### Key Features

- 🔥 **Instant Hosting** - Deploy any Llama model on HuggingFace instantly
- 🛠️ **Zero Infrastructure** - No server setup or maintenance required
- 📚 **Vast Compatibility** - Support for 2400+ models and counting
- 💎 **Affordable Pricing** - Starting at just $10/month

---

**Links:**
[Get Started](https://featherless.ai) | [Documentation](https://featherless.ai/docs) | [Models](https://featherless.ai/models)
featherless-quants.png (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2e1b4d66c8306c7b0614089381fdf86ea4efb02dffb78d22767a084cb8b88d6b
size 1614532
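Each large file in this commit is stored as a three-line Git LFS pointer in the format shown above (`version`, `oid`, `size`, per the spec at git-lfs.github.com/spec/v1). A minimal parser sketch (the function name `parse_lfs_pointer` is illustrative, not part of any tool):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS v1 pointer file into its three fields."""
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    return {
        "version": fields["version"],
        "oid": fields["oid"].removeprefix("sha256:"),  # bare hex digest
        "size": int(fields["size"]),                   # bytes of the real file
    }

# The pointer for featherless-quants.png from this commit:
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:2e1b4d66c8306c7b0614089381fdf86ea4efb02dffb78d22767a084cb8b88d6b
size 1614532
"""
info = parse_lfs_pointer(pointer)
print(info["size"])  # 1614532
```

The `oid` is the SHA-256 of the actual file content, which is what the LFS server uses to locate the blob on checkout.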
google-codegemma-1.1-2b-IQ4_XS.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:947abc8822227025a171b1990650a61e10633605ebacf75e864154c839abd384
size 1501217312

google-codegemma-1.1-2b-Q2_K.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:7d963b4112651366a05fc1f401751a429832cc3f3178fb8ac59f19d72a6814a5
size 1157923360

google-codegemma-1.1-2b-Q3_K_L.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:41c39a46e10aec4fb82830ffd0dc228b3c97fcafad92f043e71608bfab32a755
size 1465590304

google-codegemma-1.1-2b-Q3_K_M.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:60e9f4e82aba88c97295b7f0138f6e39438d229c375e10cfa223068031e93e9a
size 1383801376

google-codegemma-1.1-2b-Q3_K_S.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:39000146132381e10d1582d9b28f7e18b9a6b00fb415202a088f30f657fc1611
size 1287979552

google-codegemma-1.1-2b-Q4_K_M.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2c4ebbd70b48eb26f09c1a0b156e4c99d1ca622e9d790597a7d96f9cdea72bb1
size 1630261792

google-codegemma-1.1-2b-Q4_K_S.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f6c0c41cecd5de3237eac07d822ececc2d3ac52e494be1554551465407c0e9a7
size 1559839264

google-codegemma-1.1-2b-Q5_K_M.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:7615c089ea6b70cbb4f1a4f3bd695f79e98129f921956ad81a88fffa09b780c5
size 1839649312

google-codegemma-1.1-2b-Q5_K_S.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:67f458fd2615965e5049d8021b4568b8d8f0259a3be857a02aa67847e55fa2b2
size 1798914592

google-codegemma-1.1-2b-Q6_K.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d4a432b83eced620da25a167cea70aaedd921398b998a7e4ac25805a2b63011a
size 2062123552

google-codegemma-1.1-2b-Q8_0.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:95b27298721dea3bfec97efb1928043b96a32434ed5729032e3d0d38953876b1
size 2669068832