Cypher-7B
Model: aloobun/Cypher-7B
base_model:
- NousResearch/Nous-Hermes-2-Mistral-7B-DPO
- cognitivecomputations/samantha-1.1-westlake-7b-laser
library_name: transformers
tags: mergekit, merge, mistral, nous, westlake, samantha
license: cc

Quants by @mradermacher: https://huggingface.co/mradermacher/Cypher-7B-GGUF

Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
  - sources:
      - model: "NousResearch/Nous-Hermes-2-Mistral-7B-DPO"
        layer_range: [0, 32]
      - model: "cognitivecomputations/samantha-1.1-westlake-7b-laser"
        layer_range: [0, 32]
merge_method: slerp
base_model: "NousResearch/Nous-Hermes-2-Mistral-7B-DPO"
parameters:
  t:
    - filter: lm_head
      value: [0.55]
    - filter: embed_tokens
      value: [0.7]
    - filter: self_attn
      value: [0.65, 0.35]
    - filter: mlp
      value: [0.35, 0.65]
    - filter: layernorm
      value: [0.4, 0.6]
    - filter: modelnorm
      value: [0.6]
    - value: 0.5
dtype: bfloat16
```
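The multi-value `t` entries (e.g. `[0.65, 0.35]` for `self_attn`) act as gradient anchors: mergekit spreads them linearly across the model's 32 layers, so early layers lean toward one endpoint and later layers toward the other. A minimal sketch of that expansion, assuming mergekit's linear-gradient semantics (`layer_schedule` is a hypothetical helper, not mergekit's API):

```python
def layer_schedule(anchors, num_layers):
    """Expand a list of anchor values into one t per layer.

    A single anchor applies uniformly; multiple anchors are treated as
    evenly spaced points and interpolated linearly across the layers
    (an assumption mirroring mergekit's gradient behavior).
    """
    if len(anchors) == 1:
        return [anchors[0]] * num_layers
    out = []
    for i in range(num_layers):
        # Position of this layer along the anchor axis, in [0, len-1]
        pos = i / (num_layers - 1) * (len(anchors) - 1)
        lo = int(pos)
        hi = min(lo + 1, len(anchors) - 1)
        frac = pos - lo
        out.append(anchors[lo] * (1 - frac) + anchors[hi] * frac)
    return out
```

For the `self_attn` filter above, layer 0 would use t = 0.65, layer 31 would use t = 0.35, and intermediate layers blend smoothly between them.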

Merge Method

This model was merged using the SLERP merge method.
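SLERP (spherical linear interpolation) blends two weight tensors along the arc between them rather than along a straight line, which preserves the magnitude characteristics of the parent weights better than plain averaging. A minimal sketch of the underlying math (not mergekit's actual implementation, which operates tensor-by-tensor over the checkpoints):

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t = 0 returns v0, t = 1 returns v1; intermediate t values follow
    the great-circle arc between the two (normalized) directions.
    """
    norm0 = math.sqrt(sum(x * x for x in v0))
    norm1 = math.sqrt(sum(x * x for x in v1))
    # Angle between the two vectors, clamped for numerical safety
    dot = sum(a * b for a, b in zip(v0, v1)) / (norm0 * norm1)
    dot = max(-1.0, min(1.0, dot))
    omega = math.acos(dot)
    if abs(math.sin(omega)) < eps:
        # Nearly parallel vectors: fall back to linear interpolation
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]
```

With the config above, t ≈ 0.35 for a given tensor means the result sits closer to the base model (Nous-Hermes-2) along that arc, while t ≈ 0.65 pulls it toward samantha-1.1-westlake.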

Models Merged

The following models were included in the merge: