ModelHub XC cb638f2b34 project initialization; model provided by the ModelHub XC community
Model: louisbrulenaudet/Maxine-7B-0401-stock
Source: Original Platform
2026-05-07 06:38:53 +08:00

---
tags:
- merge
- mergekit
- MTSAIR/multi_verse_model
- rwitz/experiment26-truthy-iter-0
- MaziyarPanahi/Calme-7B-Instruct-v0.2
- chemistry
- biology
- math
base_model:
- MTSAIR/multi_verse_model
- rwitz/experiment26-truthy-iter-0
- MaziyarPanahi/Calme-7B-Instruct-v0.2
license: apache-2.0
language:
- en
library_name: transformers
pipeline_tag: text-generation
model-index:
- name: Maxine-7B-0401-stock
  results:
  - task:
      type: text-generation
    metrics:
    - name: Average
      type: Average
      value: 76.73
    - name: ARC
      type: ARC
      value: 73.12
    - name: GSM8K
      type: GSM8K
      value: 70.66
    - name: Winogrande
      type: Winogrande
      value: 85
    - name: TruthfulQA
      type: TruthfulQA
      value: 78.07
    - name: HellaSwag
      type: HellaSwag
      value: 89.13
    source:
      name: Open LLM Leaderboard
      url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
---

Maxine-7B-0401-stock, an xtraordinary 7B model

03-22-2024 - To date, louisbrulenaudet/Pearl-34B-ties is the "Best 🤝 base merges and moerges model of around 30B" on the Open LLM Leaderboard.

Configuration

models:
    - model: OpenPipe/mistral-ft-optimized-1227
    - model: MTSAIR/multi_verse_model
    - model: rwitz/experiment26-truthy-iter-0
    - model: MaziyarPanahi/Calme-7B-Instruct-v0.2
merge_method: model_stock
base_model: OpenPipe/mistral-ft-optimized-1227
dtype: bfloat16
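For readers curious how `model_stock` combines the checkpoints: the Model Stock method (Jang et al., 2024) interpolates between the average of the fine-tuned weights and the base weights, with an interpolation ratio derived from the angle between the fine-tuning deltas. Below is a minimal NumPy sketch of that rule for flat weight vectors; it is illustrative only, and mergekit's actual implementation operates per weight tensor and differs in details:

```python
import numpy as np

def model_stock_merge(base, finetuned):
    """Interpolate between the average of fine-tuned weights and the base,
    weighted by the mean pairwise cosine between fine-tuning deltas."""
    deltas = [w - base for w in finetuned]
    k = len(deltas)
    # Mean pairwise cosine similarity between the fine-tuning deltas.
    cosines = [
        float(np.dot(deltas[i], deltas[j])
              / (np.linalg.norm(deltas[i]) * np.linalg.norm(deltas[j])))
        for i in range(k) for j in range(i + 1, k)
    ]
    cos_theta = float(np.mean(cosines))
    # Interpolation ratio from the Model Stock paper.
    t = k * cos_theta / (1 + (k - 1) * cos_theta)
    w_avg = np.mean(np.stack(finetuned), axis=0)
    return t * w_avg + (1 - t) * base

# Toy example: checkpoints that agree exactly keep their average,
# while orthogonal deltas collapse back toward the base weights.
base = np.zeros(2)
print(model_stock_merge(base, [np.array([1.0, 1.0]), np.array([1.0, 1.0])]))
```

Intuitively, the more the fine-tuned models agree on a direction away from the base, the more weight their average receives.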

Usage

!pip install -qU transformers accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "louisbrulenaudet/Maxine-7B-0401-stock"
messages = [{"role": "user", "content": "What is a large language model?"}]

tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)

outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
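`apply_chat_template` handles the prompt formatting for you. For reference, Mistral-7B-derived models typically use the `[INST] ... [/INST]` instruct format; the hand-rolled single-turn sketch below assumes that template (the tokenizer's `apply_chat_template` is authoritative for this particular merge):

```python
# Hand-rolled Mistral-style single-turn chat template (assumed format;
# prefer tokenizer.apply_chat_template for the real model).
def format_prompt(messages):
    prompt = "<s>"
    for message in messages:
        if message["role"] == "user":
            prompt += f"[INST] {message['content']} [/INST]"
        else:  # assistant turn
            prompt += f" {message['content']}</s>"
    return prompt

print(format_prompt([{"role": "user", "content": "What is a large language model?"}]))
```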

Citing & Authors

If you use this code in your research, please use the following BibTeX entry.

@misc{louisbrulenaudet2024,
  author =       {Louis Brulé Naudet},
  title =        {Maxine-7B-0401-stock, an xtraordinary 7B model},
  year =         {2024},
  howpublished = {\url{https://huggingface.co/louisbrulenaudet/Maxine-7B-0401-stock}},
}

Feedback

If you have any feedback, please reach out at louisbrulenaudet@icloud.com.
