---
license: apache-2.0
language:
- en
pipeline_tag: text-generation
library_name: transformers
base_model:
- Qwen/Qwen2-1.5B-Instruct
datasets:
- HuggingFaceTB/smollm-corpus
---
# 🐟 TAID-LLM-1.5B
🤗 [Models](https://huggingface.co/SakanaAI) | 📚 [Paper](https://arxiv.org/abs/2501.16937) | 📝 [Blog](https://sakana.ai/taid/) | 🐦 [Twitter](https://twitter.com/SakanaAILabs)

**TAID-LLM-1.5B** is an English language model created through TAID (Temporally Adaptive Interpolated Distillation), our new knowledge distillation method.
We used [Qwen2-72B-Instruct](https://huggingface.co/Qwen/Qwen2-72B-Instruct) as the teacher model and [Qwen2-1.5B-Instruct](https://huggingface.co/Qwen/Qwen2-1.5B-Instruct) as the student model.
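At its core, TAID trains the student against a time-dependent intermediate target that interpolates between the student's own (stop-gradient) distribution and the teacher's, shifting from the former toward the latter over the course of training. The sketch below is a rough illustration of that interpolated objective, not the released training code; it assumes probability-space interpolation and omits the adaptive schedule that updates the interpolation parameter `t` (see the paper for the exact formulation).

```python
import torch
import torch.nn.functional as F

def taid_objective(student_logits: torch.Tensor,
                   teacher_logits: torch.Tensor,
                   t: float) -> torch.Tensor:
    """Illustrative sketch of TAID's interpolated distillation loss.

    The target mixes the detached student distribution with the
    teacher's; `t` rises from ~0 toward 1 over training, so the
    target drifts from the student toward the teacher.
    Shapes: [num_tokens, vocab_size].
    """
    p_student = F.softmax(student_logits, dim=-1)
    p_teacher = F.softmax(teacher_logits, dim=-1)
    # Time-dependent intermediate target:
    # (1 - t) * stop_grad(student) + t * teacher.
    p_target = (1.0 - t) * p_student.detach() + t * p_teacher
    # KL(p_target || p_student); F.kl_div expects log-probs as input.
    log_q = F.log_softmax(student_logits, dim=-1)
    return F.kl_div(log_q, p_target, reduction="batchmean")
```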
## Model Details
- **Developed by:** [Sakana AI](https://sakana.ai/)
- **Model type:** Autoregressive Language Model
- **Language(s):** English
- **License:** [Apache License, Version 2.0](./LICENSE)
- **Paper:** https://arxiv.org/abs/2501.16937
- **Blog:** https://sakana.ai/taid/
## Uses
This model is provided for research and development purposes only and should be considered an experimental prototype.
It is not intended for commercial use or deployment in mission-critical environments.
Use of this model is at the user's own risk, and its performance and outcomes are not guaranteed.
Sakana AI shall not be liable for any direct, indirect, special, incidental, or consequential damages, or for any loss arising from the use of this model, regardless of the results obtained.
Users must fully understand the risks associated with the use of this model and use it at their own discretion.
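For research experimentation, the model can be loaded with 🤗 Transformers; the snippet below is a minimal sketch that assumes the repository id `SakanaAI/TAID-LLM-1.5B` and a chat template inherited from the Qwen2 student model.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repository id for this model card.
model_id = "SakanaAI/TAID-LLM-1.5B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # adjust dtype to your hardware
    device_map="auto",           # requires the `accelerate` package
)

# Build a chat prompt; the template is assumed to follow the Qwen2 student.
messages = [{"role": "user", "content": "Briefly explain knowledge distillation."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
# Decode only the newly generated tokens.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```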
## Acknowledgement
We would like to thank the developers of the source models for their contributions and for making their work available.
This model is based on results obtained from a project, JPNP20017, subsidized by the New Energy and Industrial Technology Development Organization (NEDO).
## Citation
```bibtex
@misc{sakana2025taid,
  title = {TAID: Temporally Adaptive Interpolated Distillation for Efficient Knowledge Transfer in Language Models},
  author = {Makoto Shing and Kou Misaki and Han Bao and Sho Yokoi and Takuya Akiba},
  year = {2025},
  eprint = {2501.16937},
  archivePrefix = {arXiv},
  primaryClass = {cs.LG},
  url = {https://arxiv.org/abs/2501.16937}
}
```