Initialize project; model provided by the ModelHub XC community

Model: MrRikyz/StarlightMoon-Foxfire-12B
Source: Original Platform
ModelHub XC
2026-05-01 05:41:30 +08:00
commit 50540a2d27
14 changed files with 8655 additions and 0 deletions

.gitattributes vendored Normal file

@@ -0,0 +1,36 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
tokenizer.json filter=lfs diff=lfs merge=lfs -text

README.md Normal file

@@ -0,0 +1,116 @@
---
base_model:
- PygmalionAI/Pygmalion-3-12B
- PygmalionAI/Eleusis-12B
- SicariusSicariiStuff/Impish_Bloodmoon_12B_Abliterated
- DreadPoor/Famino-12B-Model_Stock
- Vortex5/Azure-Starlight-12B
- MrRikyz/Foxfire_Bloom
- SicariusSicariiStuff/Impish_Bloodmoon_12B
library_name: transformers
tags:
- mergekit
- merge
- roleplay
- rp
- nsfw
- model_stock
language:
- en
---
<div style="width: 100%; text-align: center; padding: 40px 20px; background: radial-gradient(circle, rgba(43,0,72,0.05) 0%, rgba(10,10,35,0.02) 100%); border-radius: 20px; margin-bottom: 30px; border: 1px solid rgba(106, 17, 203, 0.1);">
<h1 style="font-size: 3.5em; font-weight: 900; background: linear-gradient(135deg, #b19cd9 0%, #6a11cb 25%, #2575fc 70%, #33ccff 100%); -webkit-background-clip: text; -webkit-text-fill-color: transparent; filter: drop-shadow(0px 0px 12px rgba(106, 17, 203, 0.4)); margin: 0; padding: 10px 0;">
StarlightMoon-Foxfire-12B
</h1>
<p style="font-size: 1.4em; color: #5d5d81; font-family: 'Georgia', serif; font-style: italic; margin-top: 10px; letter-spacing: 2px;">
A blend of 12B Roleplay models
</p>
<div style="width: 60%; height: 2px; background: linear-gradient(to right, transparent, #6a11cb, #33ccff, transparent); margin: 25px auto;"></div>
</div>
## 🌟 Overview
**StarlightMoon-Foxfire-12B** is a merge built on top of `Impish_Bloodmoon_12B` by `SicariusSicariiStuff`, using the **model_stock** merge method.
### 🧩 Models Merged
This model results from a merge between:
* **SicariusSicariiStuff/Impish_Bloodmoon_12B** (Base)
* **SicariusSicariiStuff/Impish_Bloodmoon_12B_Abliterated**
* **DreadPoor/Famino-12B-Model_Stock**
* **PygmalionAI/Eleusis-12B**
* **PygmalionAI/Pygmalion-3-12B**
* **MrRikyz/Foxfire_Bloom**
* **Vortex5/Azure-Starlight-12B**
---
## 🛠️ Merge Details
### Method: MODEL_STOCK
The merge was performed using `mergekit` with the following parameters:
- **Base Model:** Impish_Bloodmoon_12B
- **Dtype:** Float32
- **Out Dtype:** BFloat16
- **lambda:** 0.82
---
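For intuition, here is a heavily simplified per-tensor sketch of a model_stock-style merge. This is not mergekit's implementation: it ignores the per-model weights from the config below, uses a mean pairwise cosine as the agreement term, and the way `lambda` scales the step is an assumption.
```python
# Illustrative sketch only -- NOT mergekit's actual model_stock code.
import torch

def model_stock_tensor(base: torch.Tensor,
                       finetuned: list[torch.Tensor],
                       lam: float = 0.82) -> torch.Tensor:
    """Pull `base` toward the average fine-tuned weight, scaled by how
    much the task vectors agree (after the Model Stock paper)."""
    deltas = torch.stack([ft - base for ft in finetuned])  # task vectors, k >= 2
    k = deltas.shape[0]
    flat = deltas.flatten(1)
    cos = torch.nn.functional.cosine_similarity(
        flat.unsqueeze(1), flat.unsqueeze(0), dim=-1)      # pairwise agreement
    mean_cos = (cos.sum() - k) / (k * (k - 1))             # drop self-similarity terms
    t = k * mean_cos / (1 + (k - 1) * mean_cos)            # paper's interpolation factor
    return base + lam * t * deltas.mean(dim=0)             # lambda scaling is an assumption
```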
<details>
<summary><b>View Merge Configuration (YAML)</b></summary>
```yaml
base_model: SicariusSicariiStuff/Impish_Bloodmoon_12B
dtype: float32
merge_method: model_stock
modules:
  default:
    slices:
    - sources:
      - layer_range: [0, 40]
        model: SicariusSicariiStuff/Impish_Bloodmoon_12B_Abliterated
        parameters:
          weight: 0.54
      - layer_range: [0, 40]
        model: DreadPoor/Famino-12B-Model_Stock
        parameters:
          weight: 0.38
      - layer_range: [0, 40]
        model: PygmalionAI/Eleusis-12B
        parameters:
          weight: 0.12
      - layer_range: [0, 40]
        model: PygmalionAI/Pygmalion-3-12B
        parameters:
          weight: 0.12
      - layer_range: [0, 40]
        model: MrRikyz/Foxfire_Bloom
        parameters:
          weight: 0.22
      - layer_range: [0, 40]
        model: Vortex5/Azure-Starlight-12B
        parameters:
          weight: 0.1
      - layer_range: [0, 40]
        model: SicariusSicariiStuff/Impish_Bloodmoon_12B
out_dtype: bfloat16
parameters:
  lambda: 0.82
  normalize: 1.0
tokenizer:
  source: MrRikyz/Foxfire_Bloom
```
</details>
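A minimal usage sketch (assuming the standard `transformers` API and the bundled ChatML chat template; the sampling settings are just a starting point):
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MrRikyz/StarlightMoon-Foxfire-12B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto")

messages = [{"role": "user", "content": "Write an opening scene in a moonlit forest."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
output = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```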
## ✨ Acknowledgements
Thanks to the authors of the original models for their incredible work:
- SicariusSicariiStuff for `Impish_Bloodmoon_12B` and `Impish_Bloodmoon_12B_Abliterated`
- DreadPoor for `Famino-12B-Model_Stock`
- PygmalionAI for `Eleusis-12B` and `Pygmalion-3-12B`
- Vortex5 for `Azure-Starlight-12B`

chat_template.jinja Normal file

@@ -0,0 +1,4 @@
{% if not add_generation_prompt is defined %}{% set add_generation_prompt = false %}{% endif %}{% for message in messages %}{{'<|im_start|>' + message['role'] + '
' + message['content'] + '<|im_end|>' + '
'}}{% endfor %}{% if add_generation_prompt %}{{ '<|im_start|>assistant
' }}{% endif %}
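The template above emits ChatML formatting; a quick standalone render with `jinja2` (a hypothetical check; in practice `tokenizer.apply_chat_template` handles this):
```python
# Render the template above directly to see the ChatML layout it produces.
from jinja2 import Template

template = Template(open("chat_template.jinja").read())
print(template.render(
    messages=[{"role": "user", "content": "Hi"}],
    add_generation_prompt=True,
))
# <|im_start|>user
# Hi<|im_end|>
# <|im_start|>assistant
```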

config.json Normal file

@@ -0,0 +1,26 @@
{
  "architectures": [
    "MistralForCausalLM"
  ],
  "attention_dropout": 0.0,
  "bos_token_id": 1,
  "dtype": "bfloat16",
  "eos_token_id": 2,
  "head_dim": 128,
  "hidden_act": "silu",
  "hidden_size": 5120,
  "initializer_range": 0.02,
  "intermediate_size": 14336,
  "max_position_embeddings": 131072,
  "model_type": "mistral",
  "num_attention_heads": 32,
  "num_hidden_layers": 40,
  "num_key_value_heads": 8,
  "rms_norm_eps": 1e-05,
  "rope_theta": 1000000.0,
  "sliding_window": null,
  "tie_word_embeddings": false,
  "transformers_version": "4.57.3",
  "use_cache": true,
  "vocab_size": 131072
}
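As a sanity check, the parameter count implied by this config can be computed by hand (a back-of-the-envelope sketch using only the values above):
```python
# Parameter count implied by config.json above (untied embeddings, GQA).
hidden, inter, n_layers = 5120, 14336, 40
n_heads, n_kv_heads, head_dim = 32, 8, 128
vocab = 131072

attn = hidden * n_heads * head_dim           # q_proj
attn += 2 * hidden * n_kv_heads * head_dim   # k_proj + v_proj (grouped-query)
attn += n_heads * head_dim * hidden          # o_proj
mlp = 3 * hidden * inter                     # gate, up, down projections
norms = 2 * hidden                           # input + post-attention layernorms
per_layer = attn + mlp + norms

total = n_layers * per_layer + 2 * vocab * hidden + hidden  # + embed, lm_head, final norm
print(f"{total:,} params")    # 12,247,782,400  (~12.2B)
print(f"{total * 2:,} bytes") # 24,495,564,800  -- matches total_size in the index below
```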

mergekit_config.yml Normal file

@@ -0,0 +1,39 @@
base_model: SicariusSicariiStuff/Impish_Bloodmoon_12B
dtype: float32
merge_method: model_stock
modules:
  default:
    slices:
    - sources:
      - layer_range: [0, 40]
        model: SicariusSicariiStuff/Impish_Bloodmoon_12B_Abliterated
        parameters:
          weight: 0.54
      - layer_range: [0, 40]
        model: DreadPoor/Famino-12B-Model_Stock
        parameters:
          weight: 0.38
      - layer_range: [0, 40]
        model: PygmalionAI/Eleusis-12B
        parameters:
          weight: 0.12
      - layer_range: [0, 40]
        model: PygmalionAI/Pygmalion-3-12B
        parameters:
          weight: 0.12
      - layer_range: [0, 40]
        model: MrRikyz/Foxfire_Bloom
        parameters:
          weight: 0.22
      - layer_range: [0, 40]
        model: Vortex5/Azure-Starlight-12B
        parameters:
          weight: 0.1
      - layer_range: [0, 40]
        model: SicariusSicariiStuff/Impish_Bloodmoon_12B
out_dtype: bfloat16
parameters:
  lambda: 0.82
  normalize: 1.0
tokenizer:
  source: MrRikyz/Foxfire_Bloom
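To reproduce the merge from this file, mergekit's Python API can be driven roughly as follows (a sketch based on mergekit's documented entry points; the options shown are assumptions):
```python
import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the mergekit_config.yml shown above into a validated config object.
with open("mergekit_config.yml", encoding="utf-8") as f:
    config = MergeConfiguration.model_validate(yaml.safe_load(f))

run_merge(
    config,
    out_path="./StarlightMoon-Foxfire-12B",
    options=MergeOptions(cuda=torch.cuda.is_available(), copy_tokenizer=True),
)
```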

model-00001-of-00005.safetensors Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:515e55b3b3331e82abdd620c2b8d9c7d558a3bd44d5df413f5db5d8bdf3d9359
size 4865489336

model-00002-of-00005.safetensors Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:3ce3b10ea8e9f054addf928867962cff53abd1281403c54000e678e4d713a19f
size 4907529456

model-00003-of-00005.safetensors Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:26dd7bbab1f8a7681ef888975311dcdd49414cc8a0bf46a6fe757b43af89d923
size 4907529464

model-00004-of-00005.safetensors Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:571b9647f6ed889c83537c0d7a0f068598b75982251bed5227dd2eb965ca1801
size 4907529456

model-00005-of-00005.safetensors Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2cea272f4007a751f6c10fb277667d5a700ab4a5488ea3d26444c1714f4836b2
size 4907529392

model.safetensors.index.json Normal file

@@ -0,0 +1,371 @@
{
"metadata": {
"total_size": 24495564800,
"mergekit_version": "0.1.4"
},
"weight_map": {
"lm_head.weight": "model-00001-of-00005.safetensors",
"model.embed_tokens.weight": "model-00001-of-00005.safetensors",
"model.layers.0.input_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.0.mlp.down_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.0.mlp.gate_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.0.mlp.up_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.0.post_attention_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.0.self_attn.k_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.0.self_attn.o_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.0.self_attn.q_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.0.self_attn.v_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.1.input_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.1.mlp.down_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.1.mlp.gate_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.1.mlp.up_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.1.post_attention_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.1.self_attn.k_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.1.self_attn.o_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.1.self_attn.q_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.1.self_attn.v_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.10.input_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.10.mlp.down_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.10.mlp.gate_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.10.mlp.up_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.10.post_attention_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.10.self_attn.k_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.10.self_attn.o_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.10.self_attn.q_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.10.self_attn.v_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.11.input_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.11.mlp.down_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.11.mlp.gate_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.11.mlp.up_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.11.post_attention_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.11.self_attn.k_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.11.self_attn.o_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.11.self_attn.q_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.11.self_attn.v_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.12.input_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.12.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.12.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.12.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.12.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.12.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.12.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.12.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.12.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.13.input_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.13.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.13.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.13.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.13.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.13.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.13.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.13.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.13.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.14.input_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.14.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.14.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.14.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.14.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.14.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.14.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.14.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.14.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.15.input_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.15.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.15.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.15.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.15.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.15.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.15.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.15.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.15.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.16.input_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.16.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.16.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.16.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.16.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.16.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.16.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.16.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.16.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.17.input_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.17.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.17.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.17.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.17.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.17.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.17.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.17.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.17.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.18.input_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.18.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.18.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.18.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.18.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.18.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.18.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.18.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.18.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.19.input_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.19.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.19.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.19.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.19.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.19.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.19.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.19.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.19.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.2.input_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.2.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.2.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.2.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.2.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.2.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.2.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.2.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.2.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.20.input_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.20.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.20.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.20.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.20.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.20.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.20.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.20.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.20.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.21.input_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.21.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.21.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.21.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.21.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.21.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.21.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.21.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.21.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.22.input_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.22.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.22.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.22.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.22.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.22.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.22.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.22.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.22.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.23.input_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.23.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.23.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.23.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.23.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.23.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.23.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.23.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.23.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.24.input_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.24.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.24.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.24.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.24.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.24.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.24.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.24.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.24.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.25.input_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.25.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.25.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.25.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.25.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.25.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.25.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.25.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.25.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.26.input_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.26.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.26.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.26.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.26.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.26.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.26.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.26.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.26.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.27.input_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.27.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.27.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.27.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.27.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.27.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.27.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.27.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.27.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.28.input_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.28.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.28.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.28.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.28.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.28.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.28.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.28.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.28.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.29.input_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.29.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.29.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.29.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.29.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.29.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.29.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.29.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.29.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.3.input_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.3.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.3.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.3.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.3.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.3.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.3.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.3.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.3.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.30.input_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.30.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.30.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.30.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.30.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.30.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.30.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.30.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.30.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.31.input_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.31.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.31.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.31.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.31.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.31.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.31.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.31.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.31.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.32.input_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.32.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.32.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.32.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.32.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.32.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.32.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.32.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.32.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.33.input_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.33.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.33.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.33.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.33.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.33.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.33.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.33.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.33.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.34.input_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.34.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.34.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.34.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.34.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.34.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.34.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.34.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.34.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.35.input_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.35.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.35.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.35.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.35.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.35.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.35.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.35.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.35.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.36.input_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.36.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.36.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.36.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.36.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.36.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.36.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.36.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.36.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.37.input_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.37.mlp.down_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.37.mlp.gate_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.37.mlp.up_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.37.post_attention_layernorm.weight": "model-00005-of-00005.safetensors",
"model.layers.37.self_attn.k_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.37.self_attn.o_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.37.self_attn.q_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.37.self_attn.v_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.38.input_layernorm.weight": "model-00005-of-00005.safetensors",
"model.layers.38.mlp.down_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.38.mlp.gate_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.38.mlp.up_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.38.post_attention_layernorm.weight": "model-00005-of-00005.safetensors",
"model.layers.38.self_attn.k_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.38.self_attn.o_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.38.self_attn.q_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.38.self_attn.v_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.39.input_layernorm.weight": "model-00005-of-00005.safetensors",
"model.layers.39.mlp.down_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.39.mlp.gate_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.39.mlp.up_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.39.post_attention_layernorm.weight": "model-00005-of-00005.safetensors",
"model.layers.39.self_attn.k_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.39.self_attn.o_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.39.self_attn.q_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.39.self_attn.v_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.4.input_layernorm.weight": "model-00005-of-00005.safetensors",
"model.layers.4.mlp.down_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.4.mlp.gate_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.4.mlp.up_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.4.post_attention_layernorm.weight": "model-00005-of-00005.safetensors",
"model.layers.4.self_attn.k_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.4.self_attn.o_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.4.self_attn.q_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.4.self_attn.v_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.5.input_layernorm.weight": "model-00005-of-00005.safetensors",
"model.layers.5.mlp.down_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.5.mlp.gate_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.5.mlp.up_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.5.post_attention_layernorm.weight": "model-00005-of-00005.safetensors",
"model.layers.5.self_attn.k_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.5.self_attn.o_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.5.self_attn.q_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.5.self_attn.v_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.6.input_layernorm.weight": "model-00005-of-00005.safetensors",
"model.layers.6.mlp.down_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.6.mlp.gate_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.6.mlp.up_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.6.post_attention_layernorm.weight": "model-00005-of-00005.safetensors",
"model.layers.6.self_attn.k_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.6.self_attn.o_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.6.self_attn.q_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.6.self_attn.v_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.7.input_layernorm.weight": "model-00005-of-00005.safetensors",
"model.layers.7.mlp.down_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.7.mlp.gate_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.7.mlp.up_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.7.post_attention_layernorm.weight": "model-00005-of-00005.safetensors",
"model.layers.7.self_attn.k_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.7.self_attn.o_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.7.self_attn.q_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.7.self_attn.v_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.8.input_layernorm.weight": "model-00005-of-00005.safetensors",
"model.layers.8.mlp.down_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.8.mlp.gate_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.8.mlp.up_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.8.post_attention_layernorm.weight": "model-00005-of-00005.safetensors",
"model.layers.8.self_attn.k_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.8.self_attn.o_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.8.self_attn.q_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.8.self_attn.v_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.9.input_layernorm.weight": "model-00005-of-00005.safetensors",
"model.layers.9.mlp.down_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.9.mlp.gate_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.9.mlp.up_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.9.post_attention_layernorm.weight": "model-00005-of-00005.safetensors",
"model.layers.9.self_attn.k_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.9.self_attn.o_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.9.self_attn.q_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.9.self_attn.v_proj.weight": "model-00005-of-00005.safetensors",
"model.norm.weight": "model-00005-of-00005.safetensors"
}
}
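The index's `weight_map` can be queried directly, for example to locate the shard that stores a given tensor (a minimal sketch):
```python
import json

# Load the model.safetensors.index.json shown above.
with open("model.safetensors.index.json") as f:
    index = json.load(f)

print(index["metadata"]["total_size"])           # 24495564800 bytes (~24.5 GB in bf16)
print(index["weight_map"]["model.norm.weight"])  # model-00005-of-00005.safetensors
```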

special_tokens_map.json Normal file

@@ -0,0 +1,30 @@
{
  "bos_token": {
    "content": "<s>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "eos_token": {
    "content": "<|im_end|>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "pad_token": {
    "content": "<pad>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "unk_token": {
    "content": "<unk>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  }
}
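A quick check that a loaded tokenizer exposes these special tokens (assuming this repo's id; a sketch, not part of the upload):
```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("MrRikyz/StarlightMoon-Foxfire-12B")
print(tok.bos_token, tok.eos_token, tok.pad_token, tok.unk_token)
# <s> <|im_end|> <pad> <unk>
```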

tokenizer.json Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0de30a5895b1aff2e470e085b9898756108ab32944ee156370ad93c2bc373b40
size 17078343

tokenizer_config.json Normal file

File diff suppressed because it is too large