| base_model | language | license | tags | datasets | pipeline_tag | new_version |
|---|---|---|---|---|---|---|
| unsloth/Meta-Llama-3.2-1B-Instruct | | apache-2.0 | | | text2text-generation | ussipan/SipanGPT-0.3-Llama-3.2-1B-GGUF |
# SipánGPT 0.2 Llama 3.2 1B GGUF

- Pre-trained model to answer questions about the Universidad Señor de Sipán in Lambayeque, Peru.
## Testing the model

- Because it was trained on a small corpus of only 5,400 conversations, the model hallucinates frequently.
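When testing the model locally, a GGUF build is typically run with llama-cpp-python. A minimal sketch, assuming the standard Llama 3 instruct chat markup and a hypothetical local GGUF file name (neither is confirmed by this card):

```python
# Sketch: building a Llama 3 instruct-style prompt for SipánGPT.
# The chat markup below is the standard Llama 3 template; the model
# file name in the commented-out section is a hypothetical example.

def build_prompt(system: str, user: str) -> str:
    """Format a single-turn conversation using Llama 3 chat markup."""
    return (
        "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|><|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = build_prompt(
    "Eres SipánGPT, un asistente de la Universidad Señor de Sipán.",
    "¿Dónde queda la universidad?",
)

# With llama-cpp-python (hypothetical local file name):
# from llama_cpp import Llama
# llm = Llama(model_path="SipanGPT-0.2-Llama-3.2-1B.Q4_K_M.gguf")
# print(llm(prompt, max_tokens=128)["choices"][0]["text"])
```

Given the hallucination caveat above, answers should be checked against official university sources.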
## Uploaded model
- Developed by: jhangmez
- License: apache-2.0
- Finetuned from model: unsloth/Meta-Llama-3.2-1B-Instruct
This Llama model was trained 2x faster with Unsloth and Hugging Face's TRL library.

