Initialize the project; model provided by the ModelHub XC community

Model: vicgalle/NeuralBeagle-11B
Source: Original Platform
This commit is contained in:
ModelHub XC
2026-04-11 19:19:11 +08:00
commit 8667255515
14 changed files with 91848 additions and 0 deletions

.gitattributes vendored Normal file

@@ -0,0 +1,35 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text

README.md Normal file

@@ -0,0 +1,150 @@
---
license: apache-2.0
tags:
- dpo
- 11B
- merge
datasets:
- argilla/distilabel-intel-orca-dpo-pairs
base_model:
- vicgalle/franken-Beagle-11B
model-index:
- name: NeuralBeagle-11B
results:
- task:
type: text-generation
name: Text Generation
dataset:
name: AI2 Reasoning Challenge (25-Shot)
type: ai2_arc
config: ARC-Challenge
split: test
args:
num_few_shot: 25
metrics:
- type: acc_norm
value: 73.29
name: normalized accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=vicgalle/NeuralBeagle-11B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: HellaSwag (10-Shot)
type: hellaswag
split: validation
args:
num_few_shot: 10
metrics:
- type: acc_norm
value: 87.61
name: normalized accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=vicgalle/NeuralBeagle-11B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: MMLU (5-Shot)
type: cais/mmlu
config: all
split: test
args:
num_few_shot: 5
metrics:
- type: acc
value: 63.8
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=vicgalle/NeuralBeagle-11B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: TruthfulQA (0-shot)
type: truthful_qa
config: multiple_choice
split: validation
args:
num_few_shot: 0
metrics:
- type: mc2
value: 71.36
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=vicgalle/NeuralBeagle-11B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: Winogrande (5-shot)
type: winogrande
config: winogrande_xl
split: validation
args:
num_few_shot: 5
metrics:
- type: acc
value: 82.64
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=vicgalle/NeuralBeagle-11B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: GSM8k (5-shot)
type: gsm8k
config: main
split: test
args:
num_few_shot: 5
metrics:
- type: acc
value: 58.98
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=vicgalle/NeuralBeagle-11B
name: Open LLM Leaderboard
---
# NeuralBeagle-11B
NeuralBeagle-11B is derived from vicgalle/franken-Beagle-11B, a Beagle-like model upscaled to 11B parameters. The base is a frankenmerge created with mergekit; DPO was then applied over a high-quality preference dataset (argilla/distilabel-intel-orca-dpo-pairs).
![image/png](https://cdn-uploads.huggingface.co/production/uploads/5fad8602b8423e1d80b8a965/6u4L-v7GHZWSJq2CT40TS.png)
## Sample generations
**Generate a movie review about the Pokémon 1 movie, focusing on BDSM.**
> Before delving into the realm of BDSOMatic interpretations within Pokémon: The First Movie (aka "Pokémon: Mewtwo's Revenge" in Japan), it is important to acknowledge that this animated feature is primarily intended for a younger audience. It follows the adventures of Ash Ketchum, Pikachu, and their friends as they encounter an artificially created Pokémon named Mewtwo. While certain aspects of the storyline and visuals could be stretched to include various subtexts or symbolism, our analysis will focus on a more playful and metaphorical exploration of BDSM concepts within this film's context.
> Participants: Warning: The following review may contain light-hearted and imaginative interpretations that are not intended as serious critical analysis and should not be taken literally.
> Within the Pokémon universe, Ash Ketchum and his loyal Pikachu find themselves thrust into a high-stakes scenario reminiscent of a BDSM "Domestic Discipline" (DD) fantasy. As Mewtwo, a genetically modified Pokémon with immense power and intellect, seeks vengeance against the scientists who created him as a tool for battle entertainment, he operates as an archetypal "Dominant" figure.
> Mewtwo, born from scientific manipulation, is an embodiment of control, both over his own creation and his obedient army of Pokémon. He exercises mastery over his subjects, mirroring the role of a "Dom," or Dominant, in a BDSM dynamic.
> Ash Ketchum, initially unaware of the manipulations that have brought him into this situation, can be seen as a submissive protagonist. His humility and naivety parallel those within the DD community who seek guidance from an experienced Dominant to help them understand and embrace their desires for submission.
> In his encounter with Mewtwo, Ash is subjected to tests designed to exhibit his strength, loyalty, and unwavering determination - elements crucial in establishing trust between a submissive and a Dominant within a BDSM dynamic. This psychological battle symbolizes the process of earning a "sub's" consent and trust through displays of control and power.
## Prompt format
The chat template is included in `tokenizer_config.json`.
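A minimal sketch of how a chat template is typically applied. The ChatML-style markers below are an assumption used as a stand-in; the authoritative template is the one shipped in this repo's `tokenizer_config.json` (in `transformers` it is applied with `tokenizer.apply_chat_template`):

```python
# Hypothetical ChatML-style rendering; the model's real template lives
# in tokenizer_config.json and may differ.
def render_chatml(messages):
    """Render a list of {role, content} dicts into a single prompt string."""
    parts = []
    for msg in messages:
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>\n")
    parts.append("<|im_start|>assistant\n")  # cue the model to respond
    return "".join(parts)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Write a haiku about beagles."},
]
prompt = render_chatml(messages)
print(prompt)
```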
# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalle__NeuralBeagle-11B).
| Metric |Value|
|---------------------------------|----:|
|Avg. |72.95|
|AI2 Reasoning Challenge (25-Shot)|73.29|
|HellaSwag (10-Shot) |87.61|
|MMLU (5-Shot) |63.80|
|TruthfulQA (0-shot) |71.36|
|Winogrande (5-shot) |82.64|
|GSM8k (5-shot) |58.98|
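As a sanity check, the reported average is simply the mean of the six benchmark scores, rounded to two decimals:

```python
# Reproduce the Avg. row of the leaderboard table above.
scores = {
    "ARC (25-shot)": 73.29,
    "HellaSwag (10-shot)": 87.61,
    "MMLU (5-shot)": 63.80,
    "TruthfulQA (0-shot)": 71.36,
    "Winogrande (5-shot)": 82.64,
    "GSM8k (5-shot)": 58.98,
}
avg = round(sum(scores.values()) / len(scores), 2)
print(avg)  # 72.95, matching the table
```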

config.json Normal file

@@ -0,0 +1,26 @@
{
"_name_or_path": "vicgalle/franken-Beagle-11B",
"architectures": [
"MistralForCausalLM"
],
"attention_dropout": 0.0,
"bos_token_id": 1,
"eos_token_id": 2,
"hidden_act": "silu",
"hidden_size": 4096,
"initializer_range": 0.02,
"intermediate_size": 14336,
"max_position_embeddings": 32768,
"model_type": "mistral",
"num_attention_heads": 32,
"num_hidden_layers": 48,
"num_key_value_heads": 8,
"rms_norm_eps": 1e-05,
"rope_theta": 10000.0,
"sliding_window": 4096,
"tie_word_embeddings": false,
"torch_dtype": "float16",
"transformers_version": "4.36.2",
"use_cache": false,
"vocab_size": 32000
}
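The config above is enough to estimate the parameter count. A back-of-the-envelope sketch for this Mistral-style architecture (48 layers, grouped-query attention with 8 KV heads, untied embeddings); in float16 the result matches the `total_size` reported in the safetensors index exactly:

```python
# Estimate parameter count from the config.json fields above.
hidden = 4096
inter = 14336
layers = 48
vocab = 32000
heads, kv_heads = 32, 8
head_dim = hidden // heads        # 128
kv_dim = kv_heads * head_dim      # 1024 (grouped-query attention)

attn = hidden * hidden * 2 + hidden * kv_dim * 2  # q/o + k/v projections
mlp = hidden * inter * 3                          # gate, up, down
norms = hidden * 2                                # two RMSNorms per layer
per_layer = attn + mlp + norms

# + input embeddings, untied lm_head, final norm
total = layers * per_layer + vocab * hidden * 2 + hidden
print(f"{total:,} parameters")  # 10,731,524,096 -> ~10.7B, i.e. the "11B"

# float16 stores 2 bytes per parameter; this equals the safetensors
# index metadata total_size of 21463048192.
assert total * 2 == 21463048192
```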

generation_config.json Normal file

@@ -0,0 +1,7 @@
{
"_from_model_config": true,
"bos_token_id": 1,
"eos_token_id": 2,
"transformers_version": "4.36.2",
"use_cache": false
}


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d2ca038b4d25041d66ad8aac5d340df086320df5969c541b70ca70268cf10461
size 4943162240
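Git LFS pointer files like the one above can be checked against a downloaded shard using only the standard library. A sketch (the shard filename passed to `verify` is hypothetical; substitute whichever local file the pointer belongs to):

```python
import hashlib

def parse_lfs_pointer(text):
    """Parse a git-lfs pointer file into its algorithm, digest, and size."""
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    algo, digest = fields["oid"].split(":", 1)
    return {"algo": algo, "digest": digest, "size": int(fields["size"])}

def verify(shard_path, pointer_text, chunk=1 << 20):
    """Stream-hash the local file and compare length and digest to the pointer."""
    meta = parse_lfs_pointer(pointer_text)
    h = hashlib.new(meta["algo"])
    n = 0
    with open(shard_path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
            n += len(block)
    return n == meta["size"] and h.hexdigest() == meta["digest"]
```

Called as, e.g., `verify("model-00001-of-00005.safetensors", pointer_text)`, where `pointer_text` is the three-line pointer content shown above.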


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:4f37acb9cb1f684398a681988bd7cf930b3289edb8f8f68ef97087352c138863
size 4999819232


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:8adc72e0984f06a4c22ae6a60531f14865669f72a73e7716ae25c8547f8c3738
size 4915916080


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f5a51457e6225b178e655750c4142912c36aacb2a1de14f3897cbd78fcfa37c2
size 4915916080


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:9d77f1fa01c220bcc8e0732f9e3550b1a937cb30b605c56f15f609b98bca09e2
size 1688284744


@@ -0,0 +1,442 @@
{
"metadata": {
"total_size": 21463048192
},
"weight_map": {
"lm_head.weight": "model-00005-of-00005.safetensors",
"model.embed_tokens.weight": "model-00001-of-00005.safetensors",
"model.layers.0.input_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.0.mlp.down_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.0.mlp.gate_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.0.mlp.up_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.0.post_attention_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.0.self_attn.k_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.0.self_attn.o_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.0.self_attn.q_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.0.self_attn.v_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.1.input_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.1.mlp.down_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.1.mlp.gate_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.1.mlp.up_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.1.post_attention_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.1.self_attn.k_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.1.self_attn.o_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.1.self_attn.q_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.1.self_attn.v_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.10.input_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.10.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.10.mlp.gate_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.10.mlp.up_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.10.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.10.self_attn.k_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.10.self_attn.o_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.10.self_attn.q_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.10.self_attn.v_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.11.input_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.11.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.11.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.11.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.11.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.11.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.11.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.11.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.11.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.12.input_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.12.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.12.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.12.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.12.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.12.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.12.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.12.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.12.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.13.input_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.13.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.13.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.13.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.13.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.13.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.13.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.13.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.13.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.14.input_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.14.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.14.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.14.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.14.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.14.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.14.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.14.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.14.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.15.input_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.15.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.15.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.15.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.15.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.15.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.15.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.15.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.15.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.16.input_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.16.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.16.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.16.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.16.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.16.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.16.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.16.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.16.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.17.input_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.17.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.17.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.17.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.17.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.17.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.17.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.17.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.17.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.18.input_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.18.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.18.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.18.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.18.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.18.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.18.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.18.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.18.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.19.input_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.19.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.19.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.19.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.19.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.19.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.19.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.19.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.19.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.2.input_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.2.mlp.down_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.2.mlp.gate_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.2.mlp.up_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.2.post_attention_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.2.self_attn.k_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.2.self_attn.o_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.2.self_attn.q_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.2.self_attn.v_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.20.input_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.20.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.20.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.20.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.20.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.20.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.20.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.20.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.20.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.21.input_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.21.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.21.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.21.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.21.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
"model.layers.21.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.21.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.21.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.21.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.22.input_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.22.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.22.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.22.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.22.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.22.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.22.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.22.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.22.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
"model.layers.23.input_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.23.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.23.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.23.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.23.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.23.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.23.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.23.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.23.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.24.input_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.24.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.24.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.24.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.24.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.24.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.24.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.24.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.24.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.25.input_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.25.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.25.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.25.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.25.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.25.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.25.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.25.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.25.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.26.input_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.26.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.26.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.26.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.26.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.26.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.26.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.26.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.26.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.27.input_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.27.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.27.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.27.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.27.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.27.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.27.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.27.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.27.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.28.input_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.28.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.28.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.28.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.28.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.28.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.28.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.28.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.28.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.29.input_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.29.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.29.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.29.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.29.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.29.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.29.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.29.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.29.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.3.input_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.3.mlp.down_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.3.mlp.gate_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.3.mlp.up_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.3.post_attention_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.3.self_attn.k_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.3.self_attn.o_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.3.self_attn.q_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.3.self_attn.v_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.30.input_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.30.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.30.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.30.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.30.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.30.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.30.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.30.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.30.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.31.input_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.31.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.31.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.31.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.31.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.31.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.31.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.31.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.31.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.32.input_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.32.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.32.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.32.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.32.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
"model.layers.32.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.32.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.32.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.32.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.33.input_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.33.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.33.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.33.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.33.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.33.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.33.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.33.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.33.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
"model.layers.34.input_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.34.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.34.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.34.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.34.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.34.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.34.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.34.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.34.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.35.input_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.35.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.35.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.35.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.35.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.35.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.35.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.35.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.35.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.36.input_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.36.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.36.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.36.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.36.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.36.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.36.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.36.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.36.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.37.input_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.37.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.37.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.37.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.37.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.37.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.37.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.37.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.37.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.38.input_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.38.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.38.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.38.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.38.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.38.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.38.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.38.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.38.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.39.input_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.39.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.39.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.39.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.39.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.39.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.39.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.39.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.39.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.4.input_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.4.mlp.down_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.4.mlp.gate_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.4.mlp.up_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.4.post_attention_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.4.self_attn.k_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.4.self_attn.o_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.4.self_attn.q_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.4.self_attn.v_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.40.input_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.40.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.40.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.40.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.40.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.40.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.40.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.40.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.40.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.41.input_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.41.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.41.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.41.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.41.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.41.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.41.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.41.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.41.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.42.input_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.42.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.42.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.42.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.42.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.42.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.42.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.42.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.42.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.43.input_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.43.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.43.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.43.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.43.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
"model.layers.43.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.43.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.43.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.43.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.44.input_layernorm.weight": "model-00005-of-00005.safetensors",
"model.layers.44.mlp.down_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.44.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.44.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.44.post_attention_layernorm.weight": "model-00005-of-00005.safetensors",
"model.layers.44.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.44.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.44.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.44.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
"model.layers.45.input_layernorm.weight": "model-00005-of-00005.safetensors",
"model.layers.45.mlp.down_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.45.mlp.gate_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.45.mlp.up_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.45.post_attention_layernorm.weight": "model-00005-of-00005.safetensors",
"model.layers.45.self_attn.k_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.45.self_attn.o_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.45.self_attn.q_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.45.self_attn.v_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.46.input_layernorm.weight": "model-00005-of-00005.safetensors",
"model.layers.46.mlp.down_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.46.mlp.gate_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.46.mlp.up_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.46.post_attention_layernorm.weight": "model-00005-of-00005.safetensors",
"model.layers.46.self_attn.k_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.46.self_attn.o_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.46.self_attn.q_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.46.self_attn.v_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.47.input_layernorm.weight": "model-00005-of-00005.safetensors",
"model.layers.47.mlp.down_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.47.mlp.gate_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.47.mlp.up_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.47.post_attention_layernorm.weight": "model-00005-of-00005.safetensors",
"model.layers.47.self_attn.k_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.47.self_attn.o_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.47.self_attn.q_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.47.self_attn.v_proj.weight": "model-00005-of-00005.safetensors",
"model.layers.5.input_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.5.mlp.down_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.5.mlp.gate_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.5.mlp.up_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.5.post_attention_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.5.self_attn.k_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.5.self_attn.o_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.5.self_attn.q_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.5.self_attn.v_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.6.input_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.6.mlp.down_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.6.mlp.gate_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.6.mlp.up_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.6.post_attention_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.6.self_attn.k_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.6.self_attn.o_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.6.self_attn.q_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.6.self_attn.v_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.7.input_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.7.mlp.down_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.7.mlp.gate_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.7.mlp.up_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.7.post_attention_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.7.self_attn.k_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.7.self_attn.o_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.7.self_attn.q_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.7.self_attn.v_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.8.input_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.8.mlp.down_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.8.mlp.gate_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.8.mlp.up_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.8.post_attention_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.8.self_attn.k_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.8.self_attn.o_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.8.self_attn.q_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.8.self_attn.v_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.9.input_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.9.mlp.down_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.9.mlp.gate_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.9.mlp.up_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.9.post_attention_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.9.self_attn.k_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.9.self_attn.o_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.9.self_attn.q_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.9.self_attn.v_proj.weight": "model-00001-of-00005.safetensors",
"model.norm.weight": "model-00005-of-00005.safetensors"
}
}
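For reference, the `weight_map` shown above (from `model.safetensors.index.json`) maps each tensor name to the shard file that stores it. A minimal sketch, using only the standard library and a few entries copied from the index above, shows how to group tensors by shard — useful for checking which shard files a given set of weights requires:

```python
import json
from collections import defaultdict

# A few entries copied from the weight_map above; the real index file
# contains an entry for every tensor in the model.
index_json = """
{
  "weight_map": {
    "model.layers.44.mlp.down_proj.weight": "model-00005-of-00005.safetensors",
    "model.layers.44.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
    "model.layers.44.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
    "model.norm.weight": "model-00005-of-00005.safetensors"
  }
}
"""

weight_map = json.loads(index_json)["weight_map"]

# Group tensor names by the shard file that holds them.
shards = defaultdict(list)
for tensor_name, shard_file in weight_map.items():
    shards[shard_file].append(tensor_name)

for shard_file in sorted(shards):
    print(shard_file, "->", len(shards[shard_file]), "tensors")
```

Note that a single layer's tensors can span a shard boundary — layer 44 above has weights in both `model-00004-of-00005.safetensors` and `model-00005-of-00005.safetensors` — so loading code must consult the index rather than assume one shard per layer range.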

5
special_tokens_map.json Normal file

@@ -0,0 +1,5 @@
{
"bos_token": "<s>",
"eos_token": "</s>",
"unk_token": "<unk>"
}

91122
tokenizer.json Normal file

File diff suppressed because it is too large

BIN
tokenizer.model (Stored with Git LFS) Normal file

Binary file not shown.

43
tokenizer_config.json Normal file

@@ -0,0 +1,43 @@
{
"add_bos_token": true,
"add_eos_token": false,
"added_tokens_decoder": {
"0": {
"content": "<unk>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"1": {
"content": "<s>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"2": {
"content": "</s>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
}
},
"additional_special_tokens": [],
"bos_token": "<s>",
"chat_template": "{% for message in messages %}{% if message['role'] == 'system' %}{% if message['content']%}{{'### System:\n' + message['content']+'\n\n'}}{% endif %}{% elif message['role'] == 'user' %}{{'### User:\n' + message['content']+'\n\n'}}{% elif message['role'] == 'assistant' %}{{'### Assistant:\n' + message['content']}}{% endif %}{% if loop.last and add_generation_prompt %}{{ '### Assistant:\n' }}{% endif %}{% endfor %}",
"clean_up_tokenization_spaces": false,
"eos_token": "</s>",
"legacy": true,
"model_max_length": 1000000000000000019884624838656,
"pad_token": null,
"sp_model_kwargs": {},
"spaces_between_special_tokens": false,
"tokenizer_class": "LlamaTokenizer",
"unk_token": "<unk>",
"use_default_system_prompt": false
}
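The `chat_template` field above is a Jinja template that `transformers` renders via `tokenizer.apply_chat_template`. As a rough illustration only (a plain-Python re-implementation of the template's logic, not the actual Jinja engine), the prompt format it produces can be sketched like this:

```python
def render_chat(messages, add_generation_prompt=False):
    """Plain-Python approximation of the Jinja chat_template above."""
    out = []
    for i, message in enumerate(messages):
        role, content = message["role"], message["content"]
        if role == "system":
            # The template only emits a system block when content is non-empty.
            if content:
                out.append("### System:\n" + content + "\n\n")
        elif role == "user":
            out.append("### User:\n" + content + "\n\n")
        elif role == "assistant":
            out.append("### Assistant:\n" + content)
        # After the last message, the template optionally opens an
        # assistant turn for the model to complete.
        if i == len(messages) - 1 and add_generation_prompt:
            out.append("### Assistant:\n")
    return "".join(out)

prompt = render_chat(
    [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
    add_generation_prompt=True,
)
print(prompt)
# ### System:
# You are a helpful assistant.
#
# ### User:
# Hello!
#
# ### Assistant:
```

In practice, prefer `tokenizer.apply_chat_template(messages, add_generation_prompt=True)` from `transformers`, which renders this exact template from the tokenizer config.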