---
base_model: mufeedh28/dictalm2-israeli-law-instruct-merged
tags:
- gguf
- mistral
- legal
- hebrew
- israel
- law
- ollama
license: apache-2.0
language:
- he
pipeline_tag: text-generation
---

DictaLM 2.0 — Israeli Law Chat (GGUF)

F16 full precision | ~14.5 GB | Requires 16GB+ RAM


Quick Start

ollama run hf.co/mufeedh28/dictalm2-israeli-law-GGUF

Then ask questions in Hebrew:

>>> מהן זכויות השוכר לפי חוק השכירות?
>>> מה קורה אם מעסיק לא משלם פיצויי פיטורים?
>>> האם ניתן לערער על החלטת בית משפט השלום?

(In English: What are a tenant's rights under the Rental Law? What happens if an employer does not pay severance pay? Can a Magistrate's Court decision be appealed?)

About

This is the F16 (full precision) GGUF version of DictaLM 2.0 — Israeli Law Chat, a 7B Hebrew legal chatbot fine-tuned on 140K+ Israeli legal documents and 7,291 Q&A pairs.

For full model details, training data, and usage examples, see the main model card.

File Details

| File | Precision | Size | Quality |
|------|-----------|------|---------|
| dictalm2-israeli-law.F16.gguf | F16 | ~14.5 GB | Full precision — no quality loss |
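The ~14.5 GB figure follows directly from the weights: an F16 model stores 2 bytes per parameter. A quick back-of-the-envelope check (the ~7.24B parameter count for Mistral-7B-class models is an assumption, and GGUF metadata adds a little on top):

```python
params = 7.24e9          # Mistral-7B-class parameter count (assumed, approximate)
bytes_per_weight = 2     # F16 = 16 bits = 2 bytes per weight
size_gb = params * bytes_per_weight / 1e9
print(f"{size_gb:.2f} GB")  # 14.48 GB — consistent with the ~14.5 GB file size
```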

Requirements

  • Ollama installed
  • 16 GB+ RAM (GPU or CPU)

Alternative Usage

With llama.cpp directly

./llama-cli -m dictalm2-israeli-law.F16.gguf -p "[INST] מהן זכויות העובד בפיטורים? [/INST]" -n 512

With llama-cpp-python

from llama_cpp import Llama

# Load the F16 GGUF with a 2048-token context window
llm = Llama(model_path="dictalm2-israeli-law.F16.gguf", n_ctx=2048)

# DictaLM 2.0 is Mistral-based, so prompts use the [INST] ... [/INST] format
output = llm("[INST] מהן זכויות העובד בפיטורים? [/INST]", max_tokens=512, temperature=0.7)
print(output["choices"][0]["text"])
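Both the llama.cpp and llama-cpp-python examples above wrap the question in Mistral's `[INST] ... [/INST]` instruction tags by hand. A tiny helper (hypothetical, not part of the model card) keeps that formatting in one place:

```python
def build_prompt(question: str) -> str:
    # Wrap a user question in the Mistral instruction format
    # that this fine-tune expects at inference time.
    return f"[INST] {question} [/INST]"

print(build_prompt("מהן זכויות העובד בפיטורים?"))
# [INST] מהן זכויות העובד בפיטורים? [/INST]
```

The resulting string can be passed directly as the prompt in either of the examples above.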

Disclaimer: This model may produce inaccurate legal information. Do not use as a substitute for professional legal advice.

Made by Mufeed Hammud | Full Model | GitHub