# Compare commits

10 commits: 66f35c8ffc ... 3bc7339376
| Author | SHA1 | Date |
|---|---|---|
| | 3bc7339376 | |
| | 7b761548a6 | |
| | 3af1fae516 | |
| | 1be512e760 | |
| | d00fccf445 | |
| | 3575c5d4ac | |
| | cd6de8b8f9 | |
| | 683895ea29 | |
| | 08d8ef8a7f | |
| | 27b3b55108 | |
## .gitattributes (vendored, +9)
```diff
@@ -43,3 +43,12 @@ gemma-2b-mt-German-to-English.Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
 gemma-2b-mt-German-to-English.IQ4_NL.gguf filter=lfs diff=lfs merge=lfs -text
 gemma-2b-mt-German-to-English.Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
 gemma-2b-mt-German-to-English.Q4_K.gguf filter=lfs diff=lfs merge=lfs -text
+gemma-2b-mt-German-to-English.Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+gemma-2b-mt-German-to-English.Q4_1.gguf filter=lfs diff=lfs merge=lfs -text
+gemma-2b-mt-German-to-English.Q5_0.gguf filter=lfs diff=lfs merge=lfs -text
+gemma-2b-mt-German-to-English.Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+gemma-2b-mt-German-to-English.Q5_K.gguf filter=lfs diff=lfs merge=lfs -text
+gemma-2b-mt-German-to-English.Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+gemma-2b-mt-German-to-English.Q5_1.gguf filter=lfs diff=lfs merge=lfs -text
+gemma-2b-mt-German-to-English.Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
+gemma-2b-mt-German-to-English.Q8_0.gguf filter=lfs diff=lfs merge=lfs -text
```
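Each `.gitattributes` line maps a file pattern to Git LFS attributes; `filter=lfs` is what routes matching files through LFS. As a rough sketch of how such rules are interpreted (the helper functions and the wildcard rule below are ours for illustration; real gitattributes matching uses gitignore-style semantics, which `fnmatch` only approximates):

```python
from fnmatch import fnmatch

def parse_gitattributes(lines):
    """Parse .gitattributes lines into (pattern, attribute-list) pairs."""
    rules = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        pattern, *attrs = line.split()
        rules.append((pattern, attrs))
    return rules

def is_lfs_tracked(path, rules):
    """A file is LFS-tracked if any matching pattern carries filter=lfs."""
    return any(fnmatch(path, pat) and "filter=lfs" in attrs
               for pat, attrs in rules)

rules = parse_gitattributes([
    "gemma-2b-mt-German-to-English.Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text",
    "*.gguf filter=lfs diff=lfs merge=lfs -text",  # hypothetical wildcard rule
])
```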
## README.md (new file, 77 lines)
@@ -0,0 +1,77 @@

Quantization made by Richard Erkhov.

[Github](https://github.com/RichardErkhov)

[Discord](https://discord.gg/pvy7H8DZMG)

[Request more models](https://github.com/RichardErkhov/quant_request)

gemma-2b-mt-German-to-English - GGUF
- Model creator: https://huggingface.co/Samvardhan777/
- Original model: https://huggingface.co/Samvardhan777/gemma-2b-mt-German-to-English/

| Name | Quant method | Size |
| ---- | ---- | ---- |
| [gemma-2b-mt-German-to-English.Q2_K.gguf](https://huggingface.co/RichardErkhov/Samvardhan777_-_gemma-2b-mt-German-to-English-gguf/blob/main/gemma-2b-mt-German-to-English.Q2_K.gguf) | Q2_K | 1.08GB |
| [gemma-2b-mt-German-to-English.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/Samvardhan777_-_gemma-2b-mt-German-to-English-gguf/blob/main/gemma-2b-mt-German-to-English.Q3_K_S.gguf) | Q3_K_S | 1.2GB |
| [gemma-2b-mt-German-to-English.Q3_K.gguf](https://huggingface.co/RichardErkhov/Samvardhan777_-_gemma-2b-mt-German-to-English-gguf/blob/main/gemma-2b-mt-German-to-English.Q3_K.gguf) | Q3_K | 1.29GB |
| [gemma-2b-mt-German-to-English.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/Samvardhan777_-_gemma-2b-mt-German-to-English-gguf/blob/main/gemma-2b-mt-German-to-English.Q3_K_M.gguf) | Q3_K_M | 1.29GB |
| [gemma-2b-mt-German-to-English.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/Samvardhan777_-_gemma-2b-mt-German-to-English-gguf/blob/main/gemma-2b-mt-German-to-English.Q3_K_L.gguf) | Q3_K_L | 1.36GB |
| [gemma-2b-mt-German-to-English.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/Samvardhan777_-_gemma-2b-mt-German-to-English-gguf/blob/main/gemma-2b-mt-German-to-English.IQ4_XS.gguf) | IQ4_XS | 1.4GB |
| [gemma-2b-mt-German-to-English.Q4_0.gguf](https://huggingface.co/RichardErkhov/Samvardhan777_-_gemma-2b-mt-German-to-English-gguf/blob/main/gemma-2b-mt-German-to-English.Q4_0.gguf) | Q4_0 | 1.44GB |
| [gemma-2b-mt-German-to-English.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/Samvardhan777_-_gemma-2b-mt-German-to-English-gguf/blob/main/gemma-2b-mt-German-to-English.IQ4_NL.gguf) | IQ4_NL | 1.45GB |
| [gemma-2b-mt-German-to-English.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/Samvardhan777_-_gemma-2b-mt-German-to-English-gguf/blob/main/gemma-2b-mt-German-to-English.Q4_K_S.gguf) | Q4_K_S | 1.45GB |
| [gemma-2b-mt-German-to-English.Q4_K.gguf](https://huggingface.co/RichardErkhov/Samvardhan777_-_gemma-2b-mt-German-to-English-gguf/blob/main/gemma-2b-mt-German-to-English.Q4_K.gguf) | Q4_K | 1.52GB |
| [gemma-2b-mt-German-to-English.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/Samvardhan777_-_gemma-2b-mt-German-to-English-gguf/blob/main/gemma-2b-mt-German-to-English.Q4_K_M.gguf) | Q4_K_M | 1.52GB |
| [gemma-2b-mt-German-to-English.Q4_1.gguf](https://huggingface.co/RichardErkhov/Samvardhan777_-_gemma-2b-mt-German-to-English-gguf/blob/main/gemma-2b-mt-German-to-English.Q4_1.gguf) | Q4_1 | 1.56GB |
| [gemma-2b-mt-German-to-English.Q5_0.gguf](https://huggingface.co/RichardErkhov/Samvardhan777_-_gemma-2b-mt-German-to-English-gguf/blob/main/gemma-2b-mt-German-to-English.Q5_0.gguf) | Q5_0 | 1.68GB |
| [gemma-2b-mt-German-to-English.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/Samvardhan777_-_gemma-2b-mt-German-to-English-gguf/blob/main/gemma-2b-mt-German-to-English.Q5_K_S.gguf) | Q5_K_S | 1.68GB |
| [gemma-2b-mt-German-to-English.Q5_K.gguf](https://huggingface.co/RichardErkhov/Samvardhan777_-_gemma-2b-mt-German-to-English-gguf/blob/main/gemma-2b-mt-German-to-English.Q5_K.gguf) | Q5_K | 1.71GB |
| [gemma-2b-mt-German-to-English.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/Samvardhan777_-_gemma-2b-mt-German-to-English-gguf/blob/main/gemma-2b-mt-German-to-English.Q5_K_M.gguf) | Q5_K_M | 1.71GB |
| [gemma-2b-mt-German-to-English.Q5_1.gguf](https://huggingface.co/RichardErkhov/Samvardhan777_-_gemma-2b-mt-German-to-English-gguf/blob/main/gemma-2b-mt-German-to-English.Q5_1.gguf) | Q5_1 | 1.79GB |
| [gemma-2b-mt-German-to-English.Q6_K.gguf](https://huggingface.co/RichardErkhov/Samvardhan777_-_gemma-2b-mt-German-to-English-gguf/blob/main/gemma-2b-mt-German-to-English.Q6_K.gguf) | Q6_K | 1.92GB |
| [gemma-2b-mt-German-to-English.Q8_0.gguf](https://huggingface.co/RichardErkhov/Samvardhan777_-_gemma-2b-mt-German-to-English-gguf/blob/main/gemma-2b-mt-German-to-English.Q8_0.gguf) | Q8_0 | 2.49GB |
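The table above makes it easy to pick the largest quantization that fits a given memory budget. A minimal sketch (sizes transcribed from the table; the helper function is ours, not part of the repo):

```python
# Approximate file sizes in GB, transcribed from the quantization table above.
QUANT_SIZES_GB = {
    "Q2_K": 1.08, "Q3_K_S": 1.2, "Q3_K": 1.29, "Q3_K_M": 1.29, "Q3_K_L": 1.36,
    "IQ4_XS": 1.4, "Q4_0": 1.44, "IQ4_NL": 1.45, "Q4_K_S": 1.45, "Q4_K": 1.52,
    "Q4_K_M": 1.52, "Q4_1": 1.56, "Q5_0": 1.68, "Q5_K_S": 1.68, "Q5_K": 1.71,
    "Q5_K_M": 1.71, "Q5_1": 1.79, "Q6_K": 1.92, "Q8_0": 2.49,
}

def best_quant(budget_gb):
    """Return the largest quant file that fits the budget, or None."""
    fitting = {q: s for q, s in QUANT_SIZES_GB.items() if s <= budget_gb}
    return max(fitting, key=fitting.get) if fitting else None
```

Note that file size is only a proxy for runtime memory use; the loaded model also needs room for the KV cache and activations.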
Original model description:

---
license: mit
language:
- de
- en
pipeline_tag: translation
tags:
- text-generation-inference
---

# Description

## Gemma 2B German to English v0.1 Alpha [Experimental Release]

This is a German instruction-finetuned version of Google's Gemma 2B model. It is an experiment to see whether Gemma can translate German to English after expanding its vocabulary. While the responses may be rough at times, the model shows a lot of promise for a 2B-parameter model.

---

# Model description 🗄️:

Model type: A 2B-parameter GPT-like model finetuned on 100,000 samples consisting of an equal proportion of English and German samples.

Language(s): Bilingual: English and German.

License: Google Gemma Terms of Use

Finetuned from model: Samvardhan777/gemma-2b-mt-German-to-English

Training Precision: bfloat16

Training Hardware: Free Google Colab

Dataset: kaitchup/opus-German-to-English

---
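Each quantized file can also be fetched directly rather than through the links above. A sketch of constructing the direct-download URL (`resolve/main` is the standard Hugging Face Hub download path; the helper name is ours):

```python
REPO = "RichardErkhov/Samvardhan777_-_gemma-2b-mt-German-to-English-gguf"
BASE = "gemma-2b-mt-German-to-English"

def download_url(quant):
    """Direct-download URL for one quantized GGUF file in the repo."""
    return f"https://huggingface.co/{REPO}/resolve/main/{BASE}.{quant}.gguf"
```

The table links use `blob/main` (the web viewer); swapping in `resolve/main` yields the raw file, suitable for `curl` or `wget`.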
## gemma-2b-mt-German-to-English.Q4_1.gguf (new file, +3)

```diff
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:9a9a5df15645b0b863c5f3e834abea043fd8c6eabe9ada0ec8a7a8045778c027
+size 1675053280
```
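The three-line blocks in these `.gguf` diffs are Git LFS pointer files, not the model weights themselves: `oid` is the SHA-256 of the real file and `size` is its length in bytes. A minimal parser sketch (format per the git-lfs pointer spec; the function name is ours):

```python
def parse_lfs_pointer(text):
    """Parse a Git LFS pointer file into a {key: value} dict."""
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    # "oid" stays as "sha256:<hex>"; "size" is the true file size in bytes.
    fields["size"] = int(fields["size"])
    return fields

# The Q4_1 pointer shown above.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:9a9a5df15645b0b863c5f3e834abea043fd8c6eabe9ada0ec8a7a8045778c027
size 1675053280"""
```

So the Q4_1 pointer above records a 1,675,053,280-byte (~1.56 GB) file, matching the size listed in the README table.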
## gemma-2b-mt-German-to-English.Q4_K_M.gguf (new file, +3)

```diff
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:c507abfc3abf33eaa88ae51670b0c3da3feebb44f70cf585dd04fb0640d63fb0
+size 1630263520
```
## gemma-2b-mt-German-to-English.Q5_0.gguf (new file, +3)

```diff
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:caae2b0c30765b17b55100b8761de8ca4ba6e49488149ccdfc78782d3f3bc509
+size 1798916320
```
## gemma-2b-mt-German-to-English.Q5_1.gguf (new file, +3)

```diff
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:9dfb9b8013f0c52b5ac4cfd079f3f3dc8f3149ab0fae179fa49986cae642873b
+size 1922779360
```
## gemma-2b-mt-German-to-English.Q5_K.gguf (new file, +3)

```diff
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:77f1be2b4fd28a42df57af5afd917574559aa141935e6e78cdaddcfc99482b1f
+size 1839651040
```
## gemma-2b-mt-German-to-English.Q5_K_M.gguf (new file, +3)

```diff
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:77f1be2b4fd28a42df57af5afd917574559aa141935e6e78cdaddcfc99482b1f
+size 1839651040
```
## gemma-2b-mt-German-to-English.Q5_K_S.gguf (new file, +3)

```diff
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:2ba1c988e2c3e30ef6031cc4a9259b748fc203202554d4e0187effb03a5868de
+size 1798916320
```
## gemma-2b-mt-German-to-English.Q6_K.gguf (new file, +3)

```diff
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:64c35839f0bea6ac0337719a86dd798e0076689377f78cc63070fd9318b143bb
+size 2062125280
```
## gemma-2b-mt-German-to-English.Q8_0.gguf (new file, +3)

```diff
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:9935a3ba8cd98be821a658dc045bcbc079477a855bfbf779de8e6e872e61d47c
+size 2669070560
```