---
language: en
license: apache-2.0
tags:
- home-assistant
- voice-assistant
- automation
- assistant
- home
pipeline_tag: text-generation
datasets:
- acon96/Home-Assistant-Requests
base_model:
- TinyLlama/TinyLlama-1.1B-Chat-v1.0
base_model_relation: finetune
---
# 🏠 TinyLlama-1.1B Home Assistant Voice Model
This model is a **fine-tuned version** of [TinyLlama/TinyLlama-1.1B-Chat-v1.0](https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v1.0), trained with [acon96/Home-Assistant-Requests](https://huggingface.co/datasets/acon96/Home-Assistant-Requests).
It is designed to act as a **voice-controlled smart home assistant** that takes natural language instructions and outputs **Home Assistant commands**.
---
## ✨ Features
- Converts **natural language voice commands** into Home Assistant automation calls.
- Produces **friendly confirmations** and **structured JSON service commands**.
- Lightweight (1.1B parameters): runs efficiently on CPUs and GPUs, and via **Ollama** with quantization.
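The card doesn't pin down the exact JSON schema the model emits. As an illustration only, assuming the output contains a Home Assistant-style service call such as `{"service": "light.turn_on", ...}` alongside the friendly confirmation, a minimal sketch of extracting and routing it (the sample output and the `extract_service_call` helper are assumptions, not part of this model card):

```python
import json
import re

# Hypothetical model output: a confirmation sentence followed by a JSON
# service command. The schema here is an assumption for illustration.
generated = (
    'Sure, turning on the kitchen lights. '
    '{"service": "light.turn_on", "target": {"entity_id": "light.kitchen"}}'
)

def extract_service_call(text):
    """Pull the first JSON object out of the generated text and split the
    Home Assistant service string into its domain and service parts."""
    match = re.search(r"\{.*\}", text, re.DOTALL)
    if match is None:
        return None
    call = json.loads(match.group(0))
    domain, service = call["service"].split(".", 1)
    # Home Assistant's REST API expects POST /api/services/<domain>/<service>
    return {
        "path": f"/api/services/{domain}/{service}",
        "payload": call.get("target", {}),
    }

print(extract_service_call(generated))
# → {'path': '/api/services/light/turn_on', 'payload': {'entity_id': 'light.kitchen'}}
```

The `path`/`payload` pair can then be POSTed to a Home Assistant instance's REST API with a long-lived access token.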
---
## 🔧 Example Usage (Transformers)
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the fine-tuned model and its tokenizer from the Hub
tokenizer = AutoTokenizer.from_pretrained("premrajreddy/tinyllama-1.1b-home-llm")
model = AutoModelForCausalLM.from_pretrained("premrajreddy/tinyllama-1.1b-home-llm")

# Natural-language command to convert into a Home Assistant action
query = "turn on the kitchen lights"
inputs = tokenizer(query, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=80)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))