Update README.md

This commit is contained in:
GeneZC
2023-11-14 02:26:54 +00:00
parent 27e1a60346
commit bca6f7269f

@@ -8,7 +8,7 @@ tasks:
 ## MiniMA-3B
-📑 [arXiv]() | 👻 [GitHub](https://github.com/GeneZC/MiniMA) | 🤗 [HuggingFace-MiniMA](https://huggingface.co/GeneZC/MiniMA-3B) | 🤗 [HuggingFace-MiniChat](https://huggingface.co/GeneZC/MiniChat-3B) | 🤖 [ModelScope-MiniMA](https://modelscope.cn/models/GeneZC/MiniMA-3B) | 🤖 [ModelScope-MiniChat](https://modelscope.cn/models/GeneZC/MiniChat-3B)
+📑 [arXiv](https://arxiv.org/abs/2311.07052) | 👻 [GitHub](https://github.com/GeneZC/MiniMA) | 🤗 [HuggingFace-MiniMA](https://huggingface.co/GeneZC/MiniMA-3B) | 🤗 [HuggingFace-MiniChat](https://huggingface.co/GeneZC/MiniChat-3B) | 🤖 [ModelScope-MiniMA](https://modelscope.cn/models/GeneZC/MiniMA-3B) | 🤖 [ModelScope-MiniChat](https://modelscope.cn/models/GeneZC/MiniChat-3B)
 ❗ Must comply with LICENSE of LLaMA2 since it is derived from LLaMA2.
@@ -52,6 +52,6 @@ output = tokenizer.decode(output_ids, skip_special_tokens=True).strip()
 title={Towards the Law of Capacity Gap in Distilling Language Models},
 author={Zhang, Chen and Song, Dawei and Ye, Zheyu and Gao, Yan},
 year={2023},
-url={}
+url={https://arxiv.org/abs/2311.07052}
 }
 ```