add citation and links to README
 README.md | 16 ++++++++++++++++
 1 file changed, 16 insertions(+)
@@ -18,3 +18,19 @@ This is a continued pre-train as well as an instruct fine-tune done using Unslot
 It has been trained on 15% of the German Wikipedia as well as the full German version of the Alpaca-GPT4 dataset (translated version).
 
 Even though a lot of training has been done, this is still a tiny model and is highly limited to its small size. Expect many hallucinations and do not use this in a demanding production workflow.
+
+# Links
+
+- [Ollama](https://ollama.com/jace-ai/SmolLM2-German-Instruct)
+- [Technical report paper](url)
+
+# Cite as
+
+```bibtex
+@misc{smollm2germaninstruct,
+    author = {Magnus Leonard Schlinsog},
+    title = {Enhancing Foreign Language Proficiency in SmolLM2-360M via Continued Pretraining and Instruction Fine-Tuning},
+    year = {2025},
+    url = {https://huggingface.co/mags0ft/SmolLM2-360m-German-Instruct},
+}
+```