---
base_model:
- liminerity/M7-7b
- chihoonlee10/T3Q-Mistral-Orca-Math-DPO
library_name: transformers
tags:
- mergekit
- merge
license: apache-2.0
---

# merged

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
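Since the card lists `transformers` as the library, a standard loading snippet applies. A minimal usage sketch, assuming the merged weights are published under the synced repo id `nlpguy/T3QM7` noted at the bottom of this card (swap in the actual repo id if it differs):

```python
# Minimal usage sketch with Hugging Face transformers.
# Assumption: the weights are available under the repo id "nlpguy/T3QM7".
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "nlpguy/T3QM7"  # assumed repo id; replace if needed
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype="auto")

prompt = "Solve: 12 * 7 ="
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```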

## Merge Details

### Merge Method

This model was merged using the [SLERP](https://en.wikipedia.org/wiki/Slerp) merge method, with [chihoonlee10/T3Q-Mistral-Orca-Math-DPO](https://huggingface.co/chihoonlee10/T3Q-Mistral-Orca-Math-DPO) as the base model.
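For reference, SLERP interpolates along the great-circle arc between two weight vectors rather than along the straight line between them, which preserves the magnitude of the weights better than plain averaging. A minimal PyTorch sketch of the idea (the standard SLERP formulation, not mergekit's exact implementation):

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two flattened weight tensors."""
    v0_u = v0 / (v0.norm() + eps)  # unit directions
    v1_u = v1 / (v1.norm() + eps)
    dot = torch.clamp((v0_u * v1_u).sum(), -1.0, 1.0)
    omega = torch.arccos(dot)      # angle between the two weight vectors
    if omega.abs() < 1e-4:         # nearly parallel: fall back to linear interpolation
        return (1.0 - t) * v0 + t * v1
    sin_omega = torch.sin(omega)
    return (torch.sin((1.0 - t) * omega) / sin_omega) * v0 \
         + (torch.sin(t * omega) / sin_omega) * v1
```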

### Models Merged

The following models were included in the merge:

* [liminerity/M7-7b](https://huggingface.co/liminerity/M7-7b)
* [chihoonlee10/T3Q-Mistral-Orca-Math-DPO](https://huggingface.co/chihoonlee10/T3Q-Mistral-Orca-Math-DPO)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model:
  model:
    path: chihoonlee10/T3Q-Mistral-Orca-Math-DPO
dtype: float16
merge_method: slerp
parameters:
  t:
  - filter: self_attn
    value: [0.0, 0.5, 0.3, 0.7, 1.0]
  - filter: mlp
    value: [1.0, 0.5, 0.7, 0.3, 0.0]
  - value: 0.4
slices:
- sources:
  - layer_range: [0, 32]
    model:
      model:
        path: liminerity/M7-7b
  - layer_range: [0, 32]
    model:
      model:
        path: chihoonlee10/T3Q-Mistral-Orca-Math-DPO
```
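The `t` parameter controls the blend for each tensor: each `value` list is treated as a gradient whose anchor points are spread evenly across the 32 layers and linearly interpolated in between, with the filters routing attention tensors to one schedule, MLP tensors to the other, and all remaining tensors to the constant 0.4. A rough sketch of that interpolation logic (my reading of mergekit's gradient behavior, not its actual code):

```python
import numpy as np

def t_for_layer(anchors: list[float], layer: int, num_layers: int = 32) -> float:
    """Linearly interpolate gradient anchor values across the layer range."""
    xs = np.linspace(0, num_layers - 1, num=len(anchors))
    return float(np.interp(layer, xs, anchors))

def t_for_tensor(name: str, layer: int) -> float:
    if "self_attn" in name:
        return t_for_layer([0.0, 0.5, 0.3, 0.7, 1.0], layer)
    if "mlp" in name:
        return t_for_layer([1.0, 0.5, 0.7, 0.3, 0.0], layer)
    return 0.4  # default for all other tensors (embeddings, norms, lm_head)
```

Note that the two gradients are mirrored: wherever the attention blend leans toward one parent model, the MLP blend leans toward the other. Which parent `t = 0` maps to follows mergekit's source ordering, so treat the direction here as indicative.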
## Description

Model synced from source: [nlpguy/T3QM7](https://huggingface.co/nlpguy/T3QM7)