| base_model | tags | license | language | pipeline_tag |
|---|---|---|---|---|
| mufeedh28/dictalm2-israeli-law-instruct-merged | | apache-2.0 | | text-generation |
# DictaLM 2.0 — Israeli Law Chat (GGUF)

Run the Hebrew legal chatbot locally with Ollama.

**F16 full precision | ~14.5 GB | requires 16 GB+ RAM**
## Quick Start

```shell
ollama run hf.co/mufeedh28/dictalm2-israeli-law-GGUF
```
Then ask questions in Hebrew:

```text
>>> מהן זכויות השוכר לפי חוק השכירות?
>>> מה קורה אם מעסיק לא משלם פיצויי פיטורים?
>>> האם ניתן לערער על החלטת בית משפט השלום?
```

(In English: "What are a tenant's rights under the Rental Law?", "What happens if an employer does not pay severance pay?", "Can a Magistrate's Court decision be appealed?")
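For scripted use, Ollama also serves a local REST API (by default at `http://localhost:11434`). A minimal sketch, assuming the model tag from the command above and Ollama's standard `/api/generate` endpoint:

```python
import json

# Request payload for Ollama's /api/generate endpoint.
# "model" must match the tag pulled by `ollama run` above.
payload = {
    "model": "hf.co/mufeedh28/dictalm2-israeli-law-GGUF",
    "prompt": "מהן זכויות השוכר לפי חוק השכירות?",  # "What are a tenant's rights under the Rental Law?"
    "stream": False,
}

def ask(payload: dict, host: str = "http://localhost:11434") -> str:
    """POST the prompt to a running Ollama server and return the response text."""
    import urllib.request
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

print(json.dumps(payload, ensure_ascii=False))
```

This requires the Ollama server to be running (`ollama serve`, or the desktop app) before `ask(payload)` is called.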
## About
This is the F16 (full precision) GGUF version of DictaLM 2.0 — Israeli Law Chat, a 7B Hebrew legal chatbot fine-tuned on 140K+ Israeli legal documents and 7,291 Q&A pairs.
For full model details, training data, and usage examples, see the main model card.
## File Details
| File | Precision | Size | Quality |
|---|---|---|---|
| `dictalm2-israeli-law.F16.gguf` | F16 | ~14.5 GB | Full precision — no quality loss |
## Requirements
- Ollama installed
- 16 GB+ RAM (GPU or CPU)
## Alternative Usage

### With llama.cpp directly
```shell
# Prompt asks: "What are an employee's rights upon dismissal?"
./llama-cli -m dictalm2-israeli-law.F16.gguf -p "[INST] מהן זכויות העובד בפיטורים? [/INST]" -n 512
```
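The `[INST] … [/INST]` wrapper in the command above is the model's chat template. A tiny helper for building it programmatically (the spacing here simply follows the examples on this card):

```python
def build_prompt(question: str) -> str:
    """Wrap a user question in the [INST] chat template used throughout
    this card (spacing matches the examples above)."""
    return f"[INST] {question} [/INST]"

# "What are an employee's rights upon dismissal?"
print(build_prompt("מהן זכויות העובד בפיטורים?"))
```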
### With llama-cpp-python
```python
from llama_cpp import Llama

llm = Llama(model_path="dictalm2-israeli-law.F16.gguf", n_ctx=2048)

# Prompt asks: "What are an employee's rights upon dismissal?"
output = llm("[INST] מהן זכויות העובד בפיטורים? [/INST]", max_tokens=512, temperature=0.7)
print(output["choices"][0]["text"])
```
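For multi-turn chat with llama-cpp-python, earlier turns are folded back into the prompt. A hedged sketch of one common Mistral-style convention (exact BOS/EOS token handling varies by runtime, so verify against your tokenizer):

```python
def build_chat_prompt(history: list[tuple[str, str]], question: str) -> str:
    """Assemble a multi-turn prompt from prior (question, answer) pairs plus
    a new question, using the [INST] template from the examples above.
    The </s> terminator after each answer is an assumption borrowed from the
    Mistral convention; confirm it matches how this GGUF was fine-tuned."""
    parts = [f"[INST] {q} [/INST] {a}</s>" for q, a in history]
    parts.append(f"[INST] {question} [/INST]")
    return "".join(parts)

prompt = build_chat_prompt(
    [("שאלה ראשונה", "תשובה ראשונה")],  # "first question" / "first answer"
    "שאלת המשך",                         # "follow-up question"
)
print(prompt)
```

The resulting string can be passed to `llm(...)` exactly like the single-turn prompt above; keep `n_ctx` large enough for the accumulated history.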
**Disclaimer:** This model may produce inaccurate legal information. Do not use it as a substitute for professional legal advice.
Made by Mufeed Hammud | Full Model | GitHub