---
tags:
- chat
- mlx
base_model: Goekdeniz-Guelmez/Josie-r1-4b
pipeline_tag: text-generation
library_name: mlx
---

# mlx-community/Josie-r1-4b-bfloat16

This model `mlx-community/Josie-r1-4b-bfloat16` was converted to MLX format from `Goekdeniz-Guelmez/Josie-r1-4b` using mlx-lm version **0.28.3**.

## Use with mlx

```bash
pip install mlx-lm
```
```python
from mlx_lm import load, generate

# Download (if needed) and load the model weights and tokenizer.
model, tokenizer = load("mlx-community/Josie-r1-4b-bfloat16")

prompt = "hello"

# If the tokenizer ships a chat template, wrap the prompt as a user turn
# and append the assistant generation prompt before generating.
if tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True
    )

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```
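To see what the chat-template branch above does without downloading the model, here is a minimal, self-contained sketch. The `format_chat` helper and the `<|role|>` markers are hypothetical stand-ins; the real formatting is defined by the tokenizer's `chat_template` and applied by `tokenizer.apply_chat_template`.

```python
def format_chat(messages):
    # Hypothetical stand-in for tokenizer.apply_chat_template:
    # renders each message as a role-tagged turn, then appends an
    # empty assistant turn (the add_generation_prompt=True behavior)
    # so the model continues as the assistant.
    parts = [f"<|{m['role']}|>\n{m['content']}" for m in messages]
    parts.append("<|assistant|>\n")
    return "\n".join(parts)

messages = [{"role": "user", "content": "hello"}]
print(format_chat(messages))
```

The actual template shipped with the model may use different special tokens; only the structure (role-tagged turns plus a trailing generation prompt) carries over.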