---
language:
- en
license: llama3
tags:
- Llama-3
- instruct
- finetune
- chatml
- gpt4
- synthetic data
- distillation
- function calling
- json mode
- axolotl
- roleplaying
- chat
- mlx
base_model: NousResearch/Hermes-3-Llama-3.1-8B
widget:
- example_title: Hermes 3
  messages:
  - role: system
    content: You are a sentient, superintelligent artificial general intelligence,
      here to teach and assist me.
  - role: user
    content: What is the meaning of life?
library_name: mlx
pipeline_tag: text-generation
model-index:
- name: Hermes-3-Llama-3.1-70B
  results: []
---

# mlx-community/Hermes-3-Llama-3.1-8B-MLX

The model [mlx-community/Hermes-3-Llama-3.1-8B-MLX](https://huggingface.co/mlx-community/Hermes-3-Llama-3.1-8B-MLX) was converted to MLX format from [NousResearch/Hermes-3-Llama-3.1-8B](https://huggingface.co/NousResearch/Hermes-3-Llama-3.1-8B) using mlx-lm version 0.23.2.

## Use with mlx

```bash
pip install mlx-lm
```
```python
from mlx_lm import load, generate

# Download (on first use) and load the MLX weights and tokenizer.
model, tokenizer = load("mlx-community/Hermes-3-Llama-3.1-8B-MLX")

prompt = "hello"

# If the tokenizer ships a chat template, wrap the raw prompt in it
# so the model sees the conversation format it was trained on.
if tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True
    )

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```
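
Hermes models are tagged `chatml` above, and the tokenizer's chat template handles the formatting for you. As a rough illustration only (a sketch of the standard ChatML layout, not code taken from this model's template), the rendered prompt that `apply_chat_template` is expected to produce looks roughly like this; `render_chatml` is a hypothetical helper, not part of mlx-lm:

```python
# Sketch: manually rendering a ChatML-style prompt, assuming the standard
# <|im_start|>role ... <|im_end|> layout. Illustrative only -- prefer
# tokenizer.apply_chat_template in real use.

def render_chatml(messages, add_generation_prompt=True):
    """Join chat messages into a single ChatML-formatted string."""
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    if add_generation_prompt:
        # Open an assistant turn so the model continues from here.
        parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is the meaning of life?"},
]
print(render_chatml(messages))
```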