Initialize project; model provided by the ModelHub XC community

Model: north/llama3_north_llama3_edu_above_1_lr1e5_8192_60000
Source: Original Platform
ModelHub XC
2026-04-13 03:50:00 +08:00
commit dd3060c24d
15 changed files with 413137 additions and 0 deletions

35
.gitattributes vendored Normal file

@@ -0,0 +1,35 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
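Every pattern above routes matching files through Git LFS, so large binaries (weights, archives, TensorBoard event files) live in LFS storage while Git history keeps only small pointer files. As an illustration only, here is a minimal Python sketch of how such globs select files; note that Git's attribute-matching rules differ slightly from Python's `fnmatch` (e.g. for `saved_model/**/*`), so this is an approximation:

```python
# Illustrative sketch only (not part of the repository): how glob patterns
# like the ones above decide which files are stored via Git LFS.
from fnmatch import fnmatch

LFS_PATTERNS = [  # abridged subset of the .gitattributes entries above
    "*.safetensors", "*.bin", "*.pt", "*.pth",
    "*.tar", "*.tar.*", "*.zip", "*tfevents*",
]

def is_lfs_tracked(filename: str) -> bool:
    """True if the filename matches any LFS-tracked pattern."""
    return any(fnmatch(filename, pattern) for pattern in LFS_PATTERNS)

print(is_lfs_tracked("model-00001-of-00007.safetensors"))  # True
print(is_lfs_tracked("config.json"))                       # False
```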

117
LICENSE Normal file

@@ -0,0 +1,117 @@
META LLAMA 3 COMMUNITY LICENSE AGREEMENT
Meta Llama 3 Version Release Date: April 18, 2024
“Agreement” means the terms and conditions for use, reproduction, distribution and modification of the
Llama Materials set forth herein.
“Documentation” means the specifications, manuals and documentation accompanying Meta Llama 3
distributed by Meta at https://llama.meta.com/get-started/.
“Licensee” or “you” means you, or your employer or any other person or entity (if you are entering into
this Agreement on such person or entity's behalf), of the age required under applicable laws, rules or
regulations to provide legal consent and that has legal authority to bind your employer or such other
person or entity if you are entering in this Agreement on their behalf.
“Meta Llama 3” means the foundational large language models and software and algorithms, including
machine-learning model code, trained model weights, inference-enabling code, training-enabling code,
fine-tuning enabling code and other elements of the foregoing distributed by Meta at
https://llama.meta.com/llama-downloads.
“Llama Materials” means, collectively, Meta's proprietary Meta Llama 3 and Documentation (and any
portion thereof) made available under this Agreement.
“Meta” or “we” means Meta Platforms Ireland Limited (if you are located in or, if you are an entity, your
principal place of business is in the EEA or Switzerland) and Meta Platforms, Inc. (if you are located
outside of the EEA or Switzerland).
By clicking “I Accept” below or by using or distributing any portion or element of the Llama Materials,
you agree to be bound by this Agreement.
1. License Rights and Redistribution.
a. Grant of Rights. You are granted a non-exclusive, worldwide, non-transferable and royalty-free
limited license under Meta's intellectual property or other rights owned by Meta embodied in the Llama
Materials to use, reproduce, distribute, copy, create derivative works of, and make modifications to the
Llama Materials.
b. Redistribution and Use.
i. If you distribute or make available the Llama Materials (or any derivative works
thereof), or a product or service that uses any of them, including another AI model, you shall (A) provide
a copy of this Agreement with any such Llama Materials; and (B) prominently display “Built with Meta
Llama 3” on a related website, user interface, blogpost, about page, or product documentation. If you
use the Llama Materials to create, train, fine tune, or otherwise improve an AI model, which is
distributed or made available, you shall also include “Llama 3” at the beginning of any such AI model
name.
ii. If you receive Llama Materials, or any derivative works thereof, from a Licensee as part
of an integrated end user product, then Section 2 of this Agreement will not apply to you.
iii. You must retain in all copies of the Llama Materials that you distribute the following
attribution notice within a “Notice” text file distributed as a part of such copies: “Meta Llama 3 is
licensed under the Meta Llama 3 Community License, Copyright © Meta Platforms, Inc. All Rights
Reserved.”
iv. Your use of the Llama Materials must comply with applicable laws and regulations
(including trade compliance laws and regulations) and adhere to the Acceptable Use Policy for the Llama
Materials (available at https://llama.meta.com/llama3/use-policy), which is hereby incorporated by
reference into this Agreement.
v. You will not use the Llama Materials or any output or results of the Llama Materials to
improve any other large language model (excluding Meta Llama 3 or derivative works thereof).
2. Additional Commercial Terms. If, on the Meta Llama 3 version release date, the monthly active users
of the products or services made available by or for Licensee, or Licensees affiliates, is greater than 700
million monthly active users in the preceding calendar month, you must request a license from Meta,
which Meta may grant to you in its sole discretion, and you are not authorized to exercise any of the
rights under this Agreement unless or until Meta otherwise expressly grants you such rights.
3. Disclaimer of Warranty. UNLESS REQUIRED BY APPLICABLE LAW, THE LLAMA MATERIALS AND ANY
OUTPUT AND RESULTS THEREFROM ARE PROVIDED ON AN “AS IS” BASIS, WITHOUT WARRANTIES OF
ANY KIND, AND META DISCLAIMS ALL WARRANTIES OF ANY KIND, BOTH EXPRESS AND IMPLIED,
INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES OF TITLE, NON-INFRINGEMENT,
MERCHANTABILITY, OR FITNESS FOR A PARTICULAR PURPOSE. YOU ARE SOLELY RESPONSIBLE FOR
DETERMINING THE APPROPRIATENESS OF USING OR REDISTRIBUTING THE LLAMA MATERIALS AND
ASSUME ANY RISKS ASSOCIATED WITH YOUR USE OF THE LLAMA MATERIALS AND ANY OUTPUT AND
RESULTS.
4. Limitation of Liability. IN NO EVENT WILL META OR ITS AFFILIATES BE LIABLE UNDER ANY THEORY OF
LIABILITY, WHETHER IN CONTRACT, TORT, NEGLIGENCE, PRODUCTS LIABILITY, OR OTHERWISE, ARISING
OUT OF THIS AGREEMENT, FOR ANY LOST PROFITS OR ANY INDIRECT, SPECIAL, CONSEQUENTIAL,
INCIDENTAL, EXEMPLARY OR PUNITIVE DAMAGES, EVEN IF META OR ITS AFFILIATES HAVE BEEN ADVISED
OF THE POSSIBILITY OF ANY OF THE FOREGOING.
5. Intellectual Property.
a. No trademark licenses are granted under this Agreement, and in connection with the Llama
Materials, neither Meta nor Licensee may use any name or mark owned by or associated with the other
or any of its affiliates, except as required for reasonable and customary use in describing and
redistributing the Llama Materials or as set forth in this Section 5(a). Meta hereby grants you a license to
use “Llama 3” (the “Mark”) solely as required to comply with the last sentence of Section 1.b.i. You will
comply with Meta's brand guidelines (currently accessible at
https://about.meta.com/brand/resources/meta/company-brand/ ). All goodwill arising out of your use
of the Mark will inure to the benefit of Meta.
b. Subject to Meta's ownership of Llama Materials and derivatives made by or for Meta, with
respect to any derivative works and modifications of the Llama Materials that are made by you, as
between you and Meta, you are and will be the owner of such derivative works and modifications.
c. If you institute litigation or other proceedings against Meta or any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Llama Materials or Meta Llama 3 outputs or
results, or any portion of any of the foregoing, constitutes infringement of intellectual property or other
rights owned or licensable by you, then any licenses granted to you under this Agreement shall
terminate as of the date such litigation or claim is filed or instituted. You will indemnify and hold
harmless Meta from and against any claim by any third party arising out of or related to your use or
distribution of the Llama Materials.
6. Term and Termination. The term of this Agreement will commence upon your acceptance of this
Agreement or access to the Llama Materials and will continue in full force and effect until terminated in
accordance with the terms and conditions herein. Meta may terminate this Agreement if you are in
breach of any term or condition of this Agreement. Upon termination of this Agreement, you shall delete
and cease use of the Llama Materials. Sections 3, 4 and 7 shall survive the termination of this
Agreement.
7. Governing Law and Jurisdiction. This Agreement will be governed and construed under the laws of
the State of California without regard to choice of law principles, and the UN Convention on Contracts
for the International Sale of Goods does not apply to this Agreement. The courts of California shall have
exclusive jurisdiction of any dispute arising out of this Agreement.

29
config.json Normal file

@@ -0,0 +1,29 @@
{
"_name_or_path": "meta-llama/Meta-Llama-3-8B",
"architectures": [
"LlamaForCausalLM"
],
"attention_bias": false,
"attention_dropout": 0.0,
"bos_token_id": 128000,
"eos_token_id": 128001,
"hidden_act": "silu",
"hidden_size": 4096,
"initializer_range": 0.02,
"intermediate_size": 14336,
"max_position_embeddings": 8192,
"mlp_bias": false,
"model_type": "llama",
"num_attention_heads": 32,
"num_hidden_layers": 32,
"num_key_value_heads": 8,
"pretraining_tp": 1,
"rms_norm_eps": 1e-05,
"rope_scaling": null,
"rope_theta": 500000.0,
"tie_word_embeddings": false,
"torch_dtype": "float32",
"transformers_version": "4.41.2",
"use_cache": true,
"vocab_size": 128256
}
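The config describes a standard 8B-parameter Llama 3 architecture: 32 transformer layers, hidden size 4096, grouped-query attention (32 query heads sharing 8 key/value heads), an 8192-token context window, and float32 weights, which is why the shards below total roughly 32 GB. A hedged sketch, assuming the `transformers` library and a local copy of this config.json:

```python
# A hedged sketch (assumes `transformers`; "." is a placeholder for the
# directory holding this config.json).
from transformers import LlamaConfig

config = LlamaConfig.from_pretrained(".")

# 32 query heads share 8 key/value heads -> grouped-query attention,
# with each KV head serving a group of 4 query heads.
print(config.num_attention_heads // config.num_key_value_heads)  # 4

# float32 weights at ~8B parameters explain the ~32 GB of shards below.
print(config.max_position_embeddings)  # 8192
print(config.torch_dtype)              # torch.float32 (as stored; can be cast on load)
```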

9
generation_config.json Normal file

@@ -0,0 +1,9 @@
{
"bos_token_id": 128000,
"do_sample": true,
"eos_token_id": 128001,
"max_length": 4096,
"temperature": 0.6,
"top_p": 0.9,
"transformers_version": "4.41.2"
}
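These defaults enable nucleus sampling (temperature 0.6, top_p 0.9) capped at 4096 total tokens, and `model.generate()` picks them up automatically when no overrides are passed. A minimal sketch, assuming `transformers` and a local copy of the file:

```python
# A minimal sketch (assumes `transformers`; "." is a placeholder for the
# directory holding generation_config.json).
from transformers import GenerationConfig

gen_config = GenerationConfig.from_pretrained(".")
print(gen_config.do_sample, gen_config.temperature, gen_config.top_p)
# -> True 0.6 0.9

# model.generate(generation_config=gen_config, ...) would then apply
# nucleus sampling at temperature 0.6, capped at 4096 total tokens.
```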

3
model-00001-of-00007.safetensors Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:8639193ffebedc3357a4710ef186feaed111805c3293b292f61a6f1ca6ea6631
size 2973795480

3
model-00002-of-00007.safetensors Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:10db135e4f79b9559ec99c93c1cf6d1ae1558063b909948ac4d4fa032e98e2cc
size 4886482632

3
model-00003-of-00007.safetensors Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:171bc897acab036977b9ba9ddea1a725f57c80e9cc33f81194bd53e3e9160506
size 4832007448

3
model-00004-of-00007.safetensors Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:8b79e97ce333dbc7feadeaa2e33d6e078e3241229ef6e87ebc19fe0cbb5b2d64
size 4999813120

3
model-00005-of-00007.safetensors Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:32b779e126b7def36106339e0083a474f69a0cb2452eb62057df5c79beeb8ee1
size 4999813128

3
model-00006-of-00007.safetensors Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:e0ccfcd3cf1bba55884877269db174e2ddc2217a68e4cd0310aefbba30b38d17
size 4832007496

3
model-00007-of-00007.safetensors Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d208e072f5d98ede30b988343300532cd833b151f1cbac09632ae7eaa2f823bd
size 4597159360
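Each of the seven weight shards above is stored as a three-line Git LFS pointer (spec version, SHA-256 object id, byte size); the actual tensors live in LFS storage. A stdlib-only sketch, illustrative and not shipped with the repo, that parses this format:

```python
# Stdlib-only sketch: parse the three-line Git LFS pointer format above.
def parse_lfs_pointer(text: str) -> dict:
    """Split 'key value' lines of a Git LFS pointer into a dict."""
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    algo, digest = fields["oid"].split(":", 1)
    return {
        "version": fields["version"],
        "hash_algo": algo,            # "sha256"
        "digest": digest,             # hex digest of the real file contents
        "size": int(fields["size"]),  # byte size of the real file
    }

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:8639193ffebedc3357a4710ef186feaed111805c3293b292f61a6f1ca6ea6631
size 2973795480
"""
print(parse_lfs_pointer(pointer)["size"])  # 2973795480 (~2.97 GB)
```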

298
model.safetensors.index.json Normal file

@@ -0,0 +1,298 @@
{
"metadata": {
"total_size": 32121044992
},
"weight_map": {
"lm_head.weight": "model-00002-of-00007.safetensors",
"model.embed_tokens.weight": "model-00001-of-00007.safetensors",
"model.layers.0.input_layernorm.weight": "model-00001-of-00007.safetensors",
"model.layers.0.mlp.down_proj.weight": "model-00001-of-00007.safetensors",
"model.layers.0.mlp.gate_proj.weight": "model-00001-of-00007.safetensors",
"model.layers.0.mlp.up_proj.weight": "model-00001-of-00007.safetensors",
"model.layers.0.post_attention_layernorm.weight": "model-00001-of-00007.safetensors",
"model.layers.0.self_attn.k_proj.weight": "model-00001-of-00007.safetensors",
"model.layers.0.self_attn.o_proj.weight": "model-00001-of-00007.safetensors",
"model.layers.0.self_attn.q_proj.weight": "model-00001-of-00007.safetensors",
"model.layers.0.self_attn.v_proj.weight": "model-00001-of-00007.safetensors",
"model.layers.1.input_layernorm.weight": "model-00002-of-00007.safetensors",
"model.layers.1.mlp.down_proj.weight": "model-00002-of-00007.safetensors",
"model.layers.1.mlp.gate_proj.weight": "model-00002-of-00007.safetensors",
"model.layers.1.mlp.up_proj.weight": "model-00002-of-00007.safetensors",
"model.layers.1.post_attention_layernorm.weight": "model-00002-of-00007.safetensors",
"model.layers.1.self_attn.k_proj.weight": "model-00002-of-00007.safetensors",
"model.layers.1.self_attn.o_proj.weight": "model-00002-of-00007.safetensors",
"model.layers.1.self_attn.q_proj.weight": "model-00002-of-00007.safetensors",
"model.layers.1.self_attn.v_proj.weight": "model-00002-of-00007.safetensors",
"model.layers.10.input_layernorm.weight": "model-00004-of-00007.safetensors",
"model.layers.10.mlp.down_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.10.mlp.gate_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.10.mlp.up_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.10.post_attention_layernorm.weight": "model-00004-of-00007.safetensors",
"model.layers.10.self_attn.k_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.10.self_attn.o_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.10.self_attn.q_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.10.self_attn.v_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.11.input_layernorm.weight": "model-00004-of-00007.safetensors",
"model.layers.11.mlp.down_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.11.mlp.gate_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.11.mlp.up_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.11.post_attention_layernorm.weight": "model-00004-of-00007.safetensors",
"model.layers.11.self_attn.k_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.11.self_attn.o_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.11.self_attn.q_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.11.self_attn.v_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.12.input_layernorm.weight": "model-00004-of-00007.safetensors",
"model.layers.12.mlp.down_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.12.mlp.gate_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.12.mlp.up_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.12.post_attention_layernorm.weight": "model-00004-of-00007.safetensors",
"model.layers.12.self_attn.k_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.12.self_attn.o_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.12.self_attn.q_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.12.self_attn.v_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.13.input_layernorm.weight": "model-00004-of-00007.safetensors",
"model.layers.13.mlp.down_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.13.mlp.gate_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.13.mlp.up_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.13.post_attention_layernorm.weight": "model-00004-of-00007.safetensors",
"model.layers.13.self_attn.k_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.13.self_attn.o_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.13.self_attn.q_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.13.self_attn.v_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.14.input_layernorm.weight": "model-00004-of-00007.safetensors",
"model.layers.14.mlp.down_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.14.mlp.gate_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.14.mlp.up_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.14.post_attention_layernorm.weight": "model-00004-of-00007.safetensors",
"model.layers.14.self_attn.k_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.14.self_attn.o_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.14.self_attn.q_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.14.self_attn.v_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.15.input_layernorm.weight": "model-00005-of-00007.safetensors",
"model.layers.15.mlp.down_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.15.mlp.gate_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.15.mlp.up_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.15.post_attention_layernorm.weight": "model-00005-of-00007.safetensors",
"model.layers.15.self_attn.k_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.15.self_attn.o_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.15.self_attn.q_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.15.self_attn.v_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.16.input_layernorm.weight": "model-00005-of-00007.safetensors",
"model.layers.16.mlp.down_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.16.mlp.gate_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.16.mlp.up_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.16.post_attention_layernorm.weight": "model-00005-of-00007.safetensors",
"model.layers.16.self_attn.k_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.16.self_attn.o_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.16.self_attn.q_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.16.self_attn.v_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.17.input_layernorm.weight": "model-00005-of-00007.safetensors",
"model.layers.17.mlp.down_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.17.mlp.gate_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.17.mlp.up_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.17.post_attention_layernorm.weight": "model-00005-of-00007.safetensors",
"model.layers.17.self_attn.k_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.17.self_attn.o_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.17.self_attn.q_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.17.self_attn.v_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.18.input_layernorm.weight": "model-00005-of-00007.safetensors",
"model.layers.18.mlp.down_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.18.mlp.gate_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.18.mlp.up_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.18.post_attention_layernorm.weight": "model-00005-of-00007.safetensors",
"model.layers.18.self_attn.k_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.18.self_attn.o_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.18.self_attn.q_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.18.self_attn.v_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.19.input_layernorm.weight": "model-00005-of-00007.safetensors",
"model.layers.19.mlp.down_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.19.mlp.gate_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.19.mlp.up_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.19.post_attention_layernorm.weight": "model-00005-of-00007.safetensors",
"model.layers.19.self_attn.k_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.19.self_attn.o_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.19.self_attn.q_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.19.self_attn.v_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.2.input_layernorm.weight": "model-00002-of-00007.safetensors",
"model.layers.2.mlp.down_proj.weight": "model-00002-of-00007.safetensors",
"model.layers.2.mlp.gate_proj.weight": "model-00002-of-00007.safetensors",
"model.layers.2.mlp.up_proj.weight": "model-00002-of-00007.safetensors",
"model.layers.2.post_attention_layernorm.weight": "model-00002-of-00007.safetensors",
"model.layers.2.self_attn.k_proj.weight": "model-00002-of-00007.safetensors",
"model.layers.2.self_attn.o_proj.weight": "model-00002-of-00007.safetensors",
"model.layers.2.self_attn.q_proj.weight": "model-00002-of-00007.safetensors",
"model.layers.2.self_attn.v_proj.weight": "model-00002-of-00007.safetensors",
"model.layers.20.input_layernorm.weight": "model-00005-of-00007.safetensors",
"model.layers.20.mlp.down_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.20.mlp.gate_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.20.mlp.up_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.20.post_attention_layernorm.weight": "model-00005-of-00007.safetensors",
"model.layers.20.self_attn.k_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.20.self_attn.o_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.20.self_attn.q_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.20.self_attn.v_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.21.input_layernorm.weight": "model-00006-of-00007.safetensors",
"model.layers.21.mlp.down_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.21.mlp.gate_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.21.mlp.up_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.21.post_attention_layernorm.weight": "model-00006-of-00007.safetensors",
"model.layers.21.self_attn.k_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.21.self_attn.o_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.21.self_attn.q_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.21.self_attn.v_proj.weight": "model-00005-of-00007.safetensors",
"model.layers.22.input_layernorm.weight": "model-00006-of-00007.safetensors",
"model.layers.22.mlp.down_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.22.mlp.gate_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.22.mlp.up_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.22.post_attention_layernorm.weight": "model-00006-of-00007.safetensors",
"model.layers.22.self_attn.k_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.22.self_attn.o_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.22.self_attn.q_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.22.self_attn.v_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.23.input_layernorm.weight": "model-00006-of-00007.safetensors",
"model.layers.23.mlp.down_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.23.mlp.gate_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.23.mlp.up_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.23.post_attention_layernorm.weight": "model-00006-of-00007.safetensors",
"model.layers.23.self_attn.k_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.23.self_attn.o_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.23.self_attn.q_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.23.self_attn.v_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.24.input_layernorm.weight": "model-00006-of-00007.safetensors",
"model.layers.24.mlp.down_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.24.mlp.gate_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.24.mlp.up_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.24.post_attention_layernorm.weight": "model-00006-of-00007.safetensors",
"model.layers.24.self_attn.k_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.24.self_attn.o_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.24.self_attn.q_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.24.self_attn.v_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.25.input_layernorm.weight": "model-00006-of-00007.safetensors",
"model.layers.25.mlp.down_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.25.mlp.gate_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.25.mlp.up_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.25.post_attention_layernorm.weight": "model-00006-of-00007.safetensors",
"model.layers.25.self_attn.k_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.25.self_attn.o_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.25.self_attn.q_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.25.self_attn.v_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.26.input_layernorm.weight": "model-00007-of-00007.safetensors",
"model.layers.26.mlp.down_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.26.mlp.gate_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.26.mlp.up_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.26.post_attention_layernorm.weight": "model-00007-of-00007.safetensors",
"model.layers.26.self_attn.k_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.26.self_attn.o_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.26.self_attn.q_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.26.self_attn.v_proj.weight": "model-00006-of-00007.safetensors",
"model.layers.27.input_layernorm.weight": "model-00007-of-00007.safetensors",
"model.layers.27.mlp.down_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.27.mlp.gate_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.27.mlp.up_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.27.post_attention_layernorm.weight": "model-00007-of-00007.safetensors",
"model.layers.27.self_attn.k_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.27.self_attn.o_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.27.self_attn.q_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.27.self_attn.v_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.28.input_layernorm.weight": "model-00007-of-00007.safetensors",
"model.layers.28.mlp.down_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.28.mlp.gate_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.28.mlp.up_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.28.post_attention_layernorm.weight": "model-00007-of-00007.safetensors",
"model.layers.28.self_attn.k_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.28.self_attn.o_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.28.self_attn.q_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.28.self_attn.v_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.29.input_layernorm.weight": "model-00007-of-00007.safetensors",
"model.layers.29.mlp.down_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.29.mlp.gate_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.29.mlp.up_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.29.post_attention_layernorm.weight": "model-00007-of-00007.safetensors",
"model.layers.29.self_attn.k_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.29.self_attn.o_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.29.self_attn.q_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.29.self_attn.v_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.3.input_layernorm.weight": "model-00002-of-00007.safetensors",
"model.layers.3.mlp.down_proj.weight": "model-00002-of-00007.safetensors",
"model.layers.3.mlp.gate_proj.weight": "model-00002-of-00007.safetensors",
"model.layers.3.mlp.up_proj.weight": "model-00002-of-00007.safetensors",
"model.layers.3.post_attention_layernorm.weight": "model-00002-of-00007.safetensors",
"model.layers.3.self_attn.k_proj.weight": "model-00002-of-00007.safetensors",
"model.layers.3.self_attn.o_proj.weight": "model-00002-of-00007.safetensors",
"model.layers.3.self_attn.q_proj.weight": "model-00002-of-00007.safetensors",
"model.layers.3.self_attn.v_proj.weight": "model-00002-of-00007.safetensors",
"model.layers.30.input_layernorm.weight": "model-00007-of-00007.safetensors",
"model.layers.30.mlp.down_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.30.mlp.gate_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.30.mlp.up_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.30.post_attention_layernorm.weight": "model-00007-of-00007.safetensors",
"model.layers.30.self_attn.k_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.30.self_attn.o_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.30.self_attn.q_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.30.self_attn.v_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.31.input_layernorm.weight": "model-00007-of-00007.safetensors",
"model.layers.31.mlp.down_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.31.mlp.gate_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.31.mlp.up_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.31.post_attention_layernorm.weight": "model-00007-of-00007.safetensors",
"model.layers.31.self_attn.k_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.31.self_attn.o_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.31.self_attn.q_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.31.self_attn.v_proj.weight": "model-00007-of-00007.safetensors",
"model.layers.4.input_layernorm.weight": "model-00003-of-00007.safetensors",
"model.layers.4.mlp.down_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.4.mlp.gate_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.4.mlp.up_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.4.post_attention_layernorm.weight": "model-00003-of-00007.safetensors",
"model.layers.4.self_attn.k_proj.weight": "model-00002-of-00007.safetensors",
"model.layers.4.self_attn.o_proj.weight": "model-00002-of-00007.safetensors",
"model.layers.4.self_attn.q_proj.weight": "model-00002-of-00007.safetensors",
"model.layers.4.self_attn.v_proj.weight": "model-00002-of-00007.safetensors",
"model.layers.5.input_layernorm.weight": "model-00003-of-00007.safetensors",
"model.layers.5.mlp.down_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.5.mlp.gate_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.5.mlp.up_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.5.post_attention_layernorm.weight": "model-00003-of-00007.safetensors",
"model.layers.5.self_attn.k_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.5.self_attn.o_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.5.self_attn.q_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.5.self_attn.v_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.6.input_layernorm.weight": "model-00003-of-00007.safetensors",
"model.layers.6.mlp.down_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.6.mlp.gate_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.6.mlp.up_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.6.post_attention_layernorm.weight": "model-00003-of-00007.safetensors",
"model.layers.6.self_attn.k_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.6.self_attn.o_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.6.self_attn.q_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.6.self_attn.v_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.7.input_layernorm.weight": "model-00003-of-00007.safetensors",
"model.layers.7.mlp.down_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.7.mlp.gate_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.7.mlp.up_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.7.post_attention_layernorm.weight": "model-00003-of-00007.safetensors",
"model.layers.7.self_attn.k_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.7.self_attn.o_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.7.self_attn.q_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.7.self_attn.v_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.8.input_layernorm.weight": "model-00003-of-00007.safetensors",
"model.layers.8.mlp.down_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.8.mlp.gate_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.8.mlp.up_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.8.post_attention_layernorm.weight": "model-00003-of-00007.safetensors",
"model.layers.8.self_attn.k_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.8.self_attn.o_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.8.self_attn.q_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.8.self_attn.v_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.9.input_layernorm.weight": "model-00004-of-00007.safetensors",
"model.layers.9.mlp.down_proj.weight": "model-00004-of-00007.safetensors",
"model.layers.9.mlp.gate_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.9.mlp.up_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.9.post_attention_layernorm.weight": "model-00004-of-00007.safetensors",
"model.layers.9.self_attn.k_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.9.self_attn.o_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.9.self_attn.q_proj.weight": "model-00003-of-00007.safetensors",
"model.layers.9.self_attn.v_proj.weight": "model-00003-of-00007.safetensors",
"model.norm.weight": "model-00002-of-00007.safetensors"
}
}
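The index maps every tensor name to the shard that contains it (`total_size` counts tensor bytes across all seven files), so loaders can open only the shards they need instead of reading all ~32 GB. A minimal sketch, assuming the index file is available locally under its standard name:

```python
# A minimal sketch showing how the weight_map routes tensors to shards.
import json
from collections import Counter

with open("model.safetensors.index.json") as f:
    index = json.load(f)

weight_map = index["weight_map"]
print(weight_map["lm_head.weight"])     # model-00002-of-00007.safetensors
print(index["metadata"]["total_size"])  # 32121044992 tensor bytes (~32 GB)

# How many tensors each shard holds:
for shard, count in sorted(Counter(weight_map.values()).items()):
    print(shard, count)
```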

4
special_tokens_map.json Normal file

@@ -0,0 +1,4 @@
{
"bos_token": "<|begin_of_text|>",
"eos_token": "<|end_of_text|>"
}

410563
tokenizer.json Normal file

File diff suppressed because it is too large

2061
tokenizer_config.json Normal file

File diff suppressed because it is too large
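Putting the pieces together, a hedged end-to-end usage sketch: it assumes `transformers` is installed and that the repository is reachable by `from_pretrained` (locally or via a hub). Note the special tokens in special_tokens_map.json correspond to the bos/eos ids 128000/128001 in config.json.

```python
# Illustrative end-to-end sketch; the model id is this repository's path
# and may need to be swapped for a local checkout. Assumes `transformers`
# (and enough RAM for ~32 GB of float32 weights).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "north/llama3_north_llama3_edu_above_1_lr1e5_8192_60000"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# <|begin_of_text|> / <|end_of_text|> map to ids 128000 / 128001,
# matching bos_token_id / eos_token_id in config.json.
inputs = tokenizer("Deep learning is", return_tensors="pt")

# Sampling defaults come from generation_config.json unless overridden.
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```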