---
base_model:
- jaspionjader/bh-58
- suayptalha/Maestro-R1-Llama-8B
library_name: transformers
tags:
- mergekit
- merge
---

# merge

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the SLERP merge method, with jaspionjader/bh-58 as the base model.

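SLERP (spherical linear interpolation) blends each pair of weight tensors along the arc between them rather than along a straight line, which preserves tensor magnitude better than plain weighted averaging. A minimal NumPy sketch of the per-tensor operation (illustrative only, not mergekit's actual implementation; the epsilon fallback is an assumption):

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherically interpolate between two weight tensors; t=0 -> v0, t=1 -> v1."""
    a, b = v0.ravel(), v1.ravel()
    # Angle between the tensors, treated as high-dimensional vectors.
    cos_omega = np.clip(
        np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + eps), -1.0, 1.0
    )
    omega = np.arccos(cos_omega)
    if np.sin(omega) < eps:
        # Nearly parallel tensors: fall back to ordinary linear interpolation.
        return (1.0 - t) * v0 + t * v1
    return (np.sin((1.0 - t) * omega) * v0 + np.sin(t * omega) * v1) / np.sin(omega)
```

With the small `t` values used in the configuration below (0.05–0.09), the result stays close to the base model jaspionjader/bh-58 and mixes in only a modest contribution from suayptalha/Maestro-R1-Llama-8B.
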
### Models Merged

The following models were included in the merge:

- jaspionjader/bh-58
- suayptalha/Maestro-R1-Llama-8B

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
- sources:
  - model: jaspionjader/bh-58
    layer_range:
    - 0
    - 32
  - model: suayptalha/Maestro-R1-Llama-8B
    layer_range:
    - 0
    - 32
merge_method: slerp
base_model: jaspionjader/bh-58
parameters:
  t:
  - filter: self_attn
    value:
    - 0.09
    - 0.05
    - 0.07
    - 0.08
    - 0.06
  - filter: mlp
    value:
    - 0.06
    - 0.08
    - 0.07
    - 0.05
    - 0.09
  - value: 0.07
dtype: bfloat16
```
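
Each `value` list above defines a gradient of interpolation factors across the 32 merged layers: mergekit spreads the listed anchor values over the layer range, with the `self_attn` and `mlp` filters applying to attention and feed-forward tensors and the bare `value: 0.07` serving as the default for all remaining tensors. A rough sketch of that per-layer expansion (hypothetical helper; mergekit's exact interpolation may differ):

```python
import numpy as np

def expand_gradient(anchors: list[float], num_layers: int = 32) -> np.ndarray:
    """Spread a short list of t anchors evenly across num_layers layers."""
    anchor_positions = np.linspace(0, num_layers - 1, num=len(anchors))
    return np.interp(np.arange(num_layers), anchor_positions, anchors)

# Per-layer t for self_attn tensors under this configuration.
print(expand_gradient([0.09, 0.05, 0.07, 0.08, 0.06]))
```

Given this configuration saved as `config.yaml`, the merge should be reproducible with mergekit's CLI, e.g. `mergekit-yaml config.yaml ./merged-model`.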

## Description

Model synced from source: jaspionjader/Kosmos-EVAA-immersive-mix-v45.1-8B
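
Since the card declares `library_name: transformers`, the merged model should load with the standard transformers API. A usage sketch (the repo id is taken from the sync source above; substitute this repository's own id if different):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "jaspionjader/Kosmos-EVAA-immersive-mix-v45.1-8B"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

inputs = tokenizer("Hello, world!", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```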