Model: Sayan01/Llama-Flan-XL2base
| license | language | datasets |
|---|---|---|
| apache-2.0 | | |
This is a 230M-parameter small Llama model distilled from the original Llama model. The model was distilled on OpenOrca's FLAN dataset, over 160,000 random samples. It is free to download. It is a work in progress, so please use it at your own risk.
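A minimal usage sketch, assuming the model loads through the standard Hugging Face `transformers` auto classes; the prompt and generation settings below are illustrative, not specified by this card:

```python
# Hypothetical usage sketch for the distilled model. The model ID comes
# from this card; everything else (prompt, max_new_tokens) is an example.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Sayan01/Llama-Flan-XL2base"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Summarize: The FLAN collection aggregates many instruction-tuning tasks."
inputs = tokenizer(prompt, return_tensors="pt")
# Greedy decoding with a small budget, just to show the call shape.
outputs = model.generate(**inputs, max_new_tokens=50)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```

Since the card marks the model as a work in progress, output quality on arbitrary prompts is not guaranteed.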
Description