Initialize project; model provided by the ModelHub XC community
Model: omrisap/LMMS_RSFT_verify  Source: Original Platform
This commit is contained in:

.gitattributes (vendored, new file, 36 lines)
@@ -0,0 +1,36 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
tokenizer.json filter=lfs diff=lfs merge=lfs -text
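Not part of the commit, but to illustrate how the rules above behave: each line routes paths matching a glob pattern through the LFS filter. A rough pure-Python sketch of that matching (Git's real attribute matching has more rules, e.g. for `saved_model/**/*`):

```python
from fnmatch import fnmatch

# A few of the 36 patterns from the .gitattributes above.
lfs_patterns = ["*.safetensors", "*.bin", "*.gz", "tokenizer.json", "*tfevents*"]

def is_lfs_tracked(path: str) -> bool:
    """Rough check: does any LFS pattern match the path's basename?"""
    name = path.rsplit("/", 1)[-1]
    return any(fnmatch(name, pat) for pat in lfs_patterns)

print(is_lfs_tracked("model-00001-of-00004.safetensors"))  # True
print(is_lfs_tracked("config.json"))                       # False
```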
README.md (new file, 199 lines)
@@ -0,0 +1,199 @@
---
library_name: transformers
tags: []
---

# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.

- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]

### Model Sources [optional]

<!-- Provide the basic links for the model. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

### Direct Use

<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->

[More Information Needed]

### Downstream Use [optional]

<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information needed for further recommendations.

## How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]
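The "How to Get Started" section above is still a placeholder. A standard transformers loading sketch for this checkpoint might look as follows (untested against this specific repo; dtype and device choices are assumptions):

```python
MODEL_ID = "omrisap/LMMS_RSFT_verify"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    # Heavy: the first call downloads the four safetensors shards (~15 GB).
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    messages = [{"role": "user", "content": prompt}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(input_ids, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True)

# generate("Hello")  # uncomment to run once the weights are available
```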
## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

[More Information Needed]

### Training Procedure

<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->

#### Preprocessing [optional]

[More Information Needed]

#### Training Hyperparameters

- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->

#### Speeds, Sizes, Times [optional]

<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->

[More Information Needed]

## Evaluation

<!-- This section describes the evaluation protocols and provides the results. -->

### Testing Data, Factors & Metrics

#### Testing Data

<!-- This should link to a Dataset Card if possible. -->

[More Information Needed]

#### Factors

<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->

[More Information Needed]

#### Metrics

<!-- These are the evaluation metrics being used, ideally with a description of why. -->

[More Information Needed]

### Results

[More Information Needed]

#### Summary

## Model Examination [optional]

<!-- Relevant interpretability work for the model goes here -->

[More Information Needed]

## Environmental Impact

<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]

## Technical Specifications [optional]

### Model Architecture and Objective

[More Information Needed]

### Compute Infrastructure

[More Information Needed]

#### Hardware

[More Information Needed]

#### Software

[More Information Needed]

## Citation [optional]

<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Model Card Authors [optional]

[More Information Needed]

## Model Card Contact

[More Information Needed]
added_tokens.json (new file, 539 lines)
@@ -0,0 +1,539 @@
{
  "</tool_call>": 151658,
  "<ANSWER>": 152177,
  "<FINALIZE>": 152178,
  "<RETRY>": 152179,
  "<tool_call>": 151657,
  "<z_0>": 151665,
  "<z_100>": 151765,
  "<z_101>": 151766,
  "<z_102>": 151767,
  "<z_103>": 151768,
  "<z_104>": 151769,
  "<z_105>": 151770,
  "<z_106>": 151771,
  "<z_107>": 151772,
  "<z_108>": 151773,
  "<z_109>": 151774,
  "<z_10>": 151675,
  "<z_110>": 151775,
  "<z_111>": 151776,
  "<z_112>": 151777,
  "<z_113>": 151778,
  "<z_114>": 151779,
  "<z_115>": 151780,
  "<z_116>": 151781,
  "<z_117>": 151782,
  "<z_118>": 151783,
  "<z_119>": 151784,
  "<z_11>": 151676,
  "<z_120>": 151785,
  "<z_121>": 151786,
  "<z_122>": 151787,
  "<z_123>": 151788,
  "<z_124>": 151789,
  "<z_125>": 151790,
  "<z_126>": 151791,
  "<z_127>": 151792,
  "<z_128>": 151793,
  "<z_129>": 151794,
  "<z_12>": 151677,
  "<z_130>": 151795,
  "<z_131>": 151796,
  "<z_132>": 151797,
  "<z_133>": 151798,
  "<z_134>": 151799,
  "<z_135>": 151800,
  "<z_136>": 151801,
  "<z_137>": 151802,
  "<z_138>": 151803,
  "<z_139>": 151804,
  "<z_13>": 151678,
  "<z_140>": 151805,
  "<z_141>": 151806,
  "<z_142>": 151807,
  "<z_143>": 151808,
  "<z_144>": 151809,
  "<z_145>": 151810,
  "<z_146>": 151811,
  "<z_147>": 151812,
  "<z_148>": 151813,
  "<z_149>": 151814,
  "<z_14>": 151679,
  "<z_150>": 151815,
  "<z_151>": 151816,
  "<z_152>": 151817,
  "<z_153>": 151818,
  "<z_154>": 151819,
  "<z_155>": 151820,
  "<z_156>": 151821,
  "<z_157>": 151822,
  "<z_158>": 151823,
  "<z_159>": 151824,
  "<z_15>": 151680,
  "<z_160>": 151825,
  "<z_161>": 151826,
  "<z_162>": 151827,
  "<z_163>": 151828,
  "<z_164>": 151829,
  "<z_165>": 151830,
  "<z_166>": 151831,
  "<z_167>": 151832,
  "<z_168>": 151833,
  "<z_169>": 151834,
  "<z_16>": 151681,
  "<z_170>": 151835,
  "<z_171>": 151836,
  "<z_172>": 151837,
  "<z_173>": 151838,
  "<z_174>": 151839,
  "<z_175>": 151840,
  "<z_176>": 151841,
  "<z_177>": 151842,
  "<z_178>": 151843,
  "<z_179>": 151844,
  "<z_17>": 151682,
  "<z_180>": 151845,
  "<z_181>": 151846,
  "<z_182>": 151847,
  "<z_183>": 151848,
  "<z_184>": 151849,
  "<z_185>": 151850,
  "<z_186>": 151851,
  "<z_187>": 151852,
  "<z_188>": 151853,
  "<z_189>": 151854,
  "<z_18>": 151683,
  "<z_190>": 151855,
  "<z_191>": 151856,
  "<z_192>": 151857,
  "<z_193>": 151858,
  "<z_194>": 151859,
  "<z_195>": 151860,
  "<z_196>": 151861,
  "<z_197>": 151862,
  "<z_198>": 151863,
  "<z_199>": 151864,
  "<z_19>": 151684,
  "<z_1>": 151666,
  "<z_200>": 151865,
  "<z_201>": 151866,
  "<z_202>": 151867,
  "<z_203>": 151868,
  "<z_204>": 151869,
  "<z_205>": 151870,
  "<z_206>": 151871,
  "<z_207>": 151872,
  "<z_208>": 151873,
  "<z_209>": 151874,
  "<z_20>": 151685,
  "<z_210>": 151875,
  "<z_211>": 151876,
  "<z_212>": 151877,
  "<z_213>": 151878,
  "<z_214>": 151879,
  "<z_215>": 151880,
  "<z_216>": 151881,
  "<z_217>": 151882,
  "<z_218>": 151883,
  "<z_219>": 151884,
  "<z_21>": 151686,
  "<z_220>": 151885,
  "<z_221>": 151886,
  "<z_222>": 151887,
  "<z_223>": 151888,
  "<z_224>": 151889,
  "<z_225>": 151890,
  "<z_226>": 151891,
  "<z_227>": 151892,
  "<z_228>": 151893,
  "<z_229>": 151894,
  "<z_22>": 151687,
  "<z_230>": 151895,
  "<z_231>": 151896,
  "<z_232>": 151897,
  "<z_233>": 151898,
  "<z_234>": 151899,
  "<z_235>": 151900,
  "<z_236>": 151901,
  "<z_237>": 151902,
  "<z_238>": 151903,
  "<z_239>": 151904,
  "<z_23>": 151688,
  "<z_240>": 151905,
  "<z_241>": 151906,
  "<z_242>": 151907,
  "<z_243>": 151908,
  "<z_244>": 151909,
  "<z_245>": 151910,
  "<z_246>": 151911,
  "<z_247>": 151912,
  "<z_248>": 151913,
  "<z_249>": 151914,
  "<z_24>": 151689,
  "<z_250>": 151915,
  "<z_251>": 151916,
  "<z_252>": 151917,
  "<z_253>": 151918,
  "<z_254>": 151919,
  "<z_255>": 151920,
  "<z_256>": 151921,
  "<z_257>": 151922,
  "<z_258>": 151923,
  "<z_259>": 151924,
  "<z_25>": 151690,
  "<z_260>": 151925,
  "<z_261>": 151926,
  "<z_262>": 151927,
  "<z_263>": 151928,
  "<z_264>": 151929,
  "<z_265>": 151930,
  "<z_266>": 151931,
  "<z_267>": 151932,
  "<z_268>": 151933,
  "<z_269>": 151934,
  "<z_26>": 151691,
  "<z_270>": 151935,
  "<z_271>": 151936,
  "<z_272>": 151937,
  "<z_273>": 151938,
  "<z_274>": 151939,
  "<z_275>": 151940,
  "<z_276>": 151941,
  "<z_277>": 151942,
  "<z_278>": 151943,
  "<z_279>": 151944,
  "<z_27>": 151692,
  "<z_280>": 151945,
  "<z_281>": 151946,
  "<z_282>": 151947,
  "<z_283>": 151948,
  "<z_284>": 151949,
  "<z_285>": 151950,
  "<z_286>": 151951,
  "<z_287>": 151952,
  "<z_288>": 151953,
  "<z_289>": 151954,
  "<z_28>": 151693,
  "<z_290>": 151955,
  "<z_291>": 151956,
  "<z_292>": 151957,
  "<z_293>": 151958,
  "<z_294>": 151959,
  "<z_295>": 151960,
  "<z_296>": 151961,
  "<z_297>": 151962,
  "<z_298>": 151963,
  "<z_299>": 151964,
  "<z_29>": 151694,
  "<z_2>": 151667,
  "<z_300>": 151965,
  "<z_301>": 151966,
  "<z_302>": 151967,
  "<z_303>": 151968,
  "<z_304>": 151969,
  "<z_305>": 151970,
  "<z_306>": 151971,
  "<z_307>": 151972,
  "<z_308>": 151973,
  "<z_309>": 151974,
  "<z_30>": 151695,
  "<z_310>": 151975,
  "<z_311>": 151976,
  "<z_312>": 151977,
  "<z_313>": 151978,
  "<z_314>": 151979,
  "<z_315>": 151980,
  "<z_316>": 151981,
  "<z_317>": 151982,
  "<z_318>": 151983,
  "<z_319>": 151984,
  "<z_31>": 151696,
  "<z_320>": 151985,
  "<z_321>": 151986,
  "<z_322>": 151987,
  "<z_323>": 151988,
  "<z_324>": 151989,
  "<z_325>": 151990,
  "<z_326>": 151991,
  "<z_327>": 151992,
  "<z_328>": 151993,
  "<z_329>": 151994,
  "<z_32>": 151697,
  "<z_330>": 151995,
  "<z_331>": 151996,
  "<z_332>": 151997,
  "<z_333>": 151998,
  "<z_334>": 151999,
  "<z_335>": 152000,
  "<z_336>": 152001,
  "<z_337>": 152002,
  "<z_338>": 152003,
  "<z_339>": 152004,
  "<z_33>": 151698,
  "<z_340>": 152005,
  "<z_341>": 152006,
  "<z_342>": 152007,
  "<z_343>": 152008,
  "<z_344>": 152009,
  "<z_345>": 152010,
  "<z_346>": 152011,
  "<z_347>": 152012,
  "<z_348>": 152013,
  "<z_349>": 152014,
  "<z_34>": 151699,
  "<z_350>": 152015,
  "<z_351>": 152016,
  "<z_352>": 152017,
  "<z_353>": 152018,
  "<z_354>": 152019,
  "<z_355>": 152020,
  "<z_356>": 152021,
  "<z_357>": 152022,
  "<z_358>": 152023,
  "<z_359>": 152024,
  "<z_35>": 151700,
  "<z_360>": 152025,
  "<z_361>": 152026,
  "<z_362>": 152027,
  "<z_363>": 152028,
  "<z_364>": 152029,
  "<z_365>": 152030,
  "<z_366>": 152031,
  "<z_367>": 152032,
  "<z_368>": 152033,
  "<z_369>": 152034,
  "<z_36>": 151701,
  "<z_370>": 152035,
  "<z_371>": 152036,
  "<z_372>": 152037,
  "<z_373>": 152038,
  "<z_374>": 152039,
  "<z_375>": 152040,
  "<z_376>": 152041,
  "<z_377>": 152042,
  "<z_378>": 152043,
  "<z_379>": 152044,
  "<z_37>": 151702,
  "<z_380>": 152045,
  "<z_381>": 152046,
  "<z_382>": 152047,
  "<z_383>": 152048,
  "<z_384>": 152049,
  "<z_385>": 152050,
  "<z_386>": 152051,
  "<z_387>": 152052,
  "<z_388>": 152053,
  "<z_389>": 152054,
  "<z_38>": 151703,
  "<z_390>": 152055,
  "<z_391>": 152056,
  "<z_392>": 152057,
  "<z_393>": 152058,
  "<z_394>": 152059,
  "<z_395>": 152060,
  "<z_396>": 152061,
  "<z_397>": 152062,
  "<z_398>": 152063,
  "<z_399>": 152064,
  "<z_39>": 151704,
  "<z_3>": 151668,
  "<z_400>": 152065,
  "<z_401>": 152066,
  "<z_402>": 152067,
  "<z_403>": 152068,
  "<z_404>": 152069,
  "<z_405>": 152070,
  "<z_406>": 152071,
  "<z_407>": 152072,
  "<z_408>": 152073,
  "<z_409>": 152074,
  "<z_40>": 151705,
  "<z_410>": 152075,
  "<z_411>": 152076,
  "<z_412>": 152077,
  "<z_413>": 152078,
  "<z_414>": 152079,
  "<z_415>": 152080,
  "<z_416>": 152081,
  "<z_417>": 152082,
  "<z_418>": 152083,
  "<z_419>": 152084,
  "<z_41>": 151706,
  "<z_420>": 152085,
  "<z_421>": 152086,
  "<z_422>": 152087,
  "<z_423>": 152088,
  "<z_424>": 152089,
  "<z_425>": 152090,
  "<z_426>": 152091,
  "<z_427>": 152092,
  "<z_428>": 152093,
  "<z_429>": 152094,
  "<z_42>": 151707,
  "<z_430>": 152095,
  "<z_431>": 152096,
  "<z_432>": 152097,
  "<z_433>": 152098,
  "<z_434>": 152099,
  "<z_435>": 152100,
  "<z_436>": 152101,
  "<z_437>": 152102,
  "<z_438>": 152103,
  "<z_439>": 152104,
  "<z_43>": 151708,
  "<z_440>": 152105,
  "<z_441>": 152106,
  "<z_442>": 152107,
  "<z_443>": 152108,
  "<z_444>": 152109,
  "<z_445>": 152110,
  "<z_446>": 152111,
  "<z_447>": 152112,
  "<z_448>": 152113,
  "<z_449>": 152114,
  "<z_44>": 151709,
  "<z_450>": 152115,
  "<z_451>": 152116,
  "<z_452>": 152117,
  "<z_453>": 152118,
  "<z_454>": 152119,
  "<z_455>": 152120,
  "<z_456>": 152121,
  "<z_457>": 152122,
  "<z_458>": 152123,
  "<z_459>": 152124,
  "<z_45>": 151710,
  "<z_460>": 152125,
  "<z_461>": 152126,
  "<z_462>": 152127,
  "<z_463>": 152128,
  "<z_464>": 152129,
  "<z_465>": 152130,
  "<z_466>": 152131,
  "<z_467>": 152132,
  "<z_468>": 152133,
  "<z_469>": 152134,
  "<z_46>": 151711,
  "<z_470>": 152135,
  "<z_471>": 152136,
  "<z_472>": 152137,
  "<z_473>": 152138,
  "<z_474>": 152139,
  "<z_475>": 152140,
  "<z_476>": 152141,
  "<z_477>": 152142,
  "<z_478>": 152143,
  "<z_479>": 152144,
  "<z_47>": 151712,
  "<z_480>": 152145,
  "<z_481>": 152146,
  "<z_482>": 152147,
  "<z_483>": 152148,
  "<z_484>": 152149,
  "<z_485>": 152150,
  "<z_486>": 152151,
  "<z_487>": 152152,
  "<z_488>": 152153,
  "<z_489>": 152154,
  "<z_48>": 151713,
  "<z_490>": 152155,
  "<z_491>": 152156,
  "<z_492>": 152157,
  "<z_493>": 152158,
  "<z_494>": 152159,
  "<z_495>": 152160,
  "<z_496>": 152161,
  "<z_497>": 152162,
  "<z_498>": 152163,
  "<z_499>": 152164,
  "<z_49>": 151714,
  "<z_4>": 151669,
  "<z_500>": 152165,
  "<z_501>": 152166,
  "<z_502>": 152167,
  "<z_503>": 152168,
  "<z_504>": 152169,
  "<z_505>": 152170,
  "<z_506>": 152171,
  "<z_507>": 152172,
  "<z_508>": 152173,
  "<z_509>": 152174,
  "<z_50>": 151715,
  "<z_510>": 152175,
  "<z_511>": 152176,
  "<z_51>": 151716,
  "<z_52>": 151717,
  "<z_53>": 151718,
  "<z_54>": 151719,
  "<z_55>": 151720,
  "<z_56>": 151721,
  "<z_57>": 151722,
  "<z_58>": 151723,
  "<z_59>": 151724,
  "<z_5>": 151670,
  "<z_60>": 151725,
  "<z_61>": 151726,
  "<z_62>": 151727,
  "<z_63>": 151728,
  "<z_64>": 151729,
  "<z_65>": 151730,
  "<z_66>": 151731,
  "<z_67>": 151732,
  "<z_68>": 151733,
  "<z_69>": 151734,
  "<z_6>": 151671,
  "<z_70>": 151735,
  "<z_71>": 151736,
  "<z_72>": 151737,
  "<z_73>": 151738,
  "<z_74>": 151739,
  "<z_75>": 151740,
  "<z_76>": 151741,
  "<z_77>": 151742,
  "<z_78>": 151743,
  "<z_79>": 151744,
  "<z_7>": 151672,
  "<z_80>": 151745,
  "<z_81>": 151746,
  "<z_82>": 151747,
  "<z_83>": 151748,
  "<z_84>": 151749,
  "<z_85>": 151750,
  "<z_86>": 151751,
  "<z_87>": 151752,
  "<z_88>": 151753,
  "<z_89>": 151754,
  "<z_8>": 151673,
  "<z_90>": 151755,
  "<z_91>": 151756,
  "<z_92>": 151757,
  "<z_93>": 151758,
  "<z_94>": 151759,
  "<z_95>": 151760,
  "<z_96>": 151761,
  "<z_97>": 151762,
  "<z_98>": 151763,
  "<z_99>": 151764,
  "<z_9>": 151674,
  "<|box_end|>": 151649,
  "<|box_start|>": 151648,
  "<|endoftext|>": 151643,
  "<|file_sep|>": 151664,
  "<|fim_middle|>": 151660,
  "<|fim_pad|>": 151662,
  "<|fim_prefix|>": 151659,
  "<|fim_suffix|>": 151661,
  "<|im_end|>": 151645,
  "<|im_start|>": 151644,
  "<|image_pad|>": 151655,
  "<|object_ref_end|>": 151647,
  "<|object_ref_start|>": 151646,
  "<|quad_end|>": 151651,
  "<|quad_start|>": 151650,
  "<|repo_name|>": 151663,
  "<|video_pad|>": 151656,
  "<|vision_end|>": 151653,
  "<|vision_pad|>": 151654,
  "<|vision_start|>": 151652
}
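The added tokens follow a regular layout: 512 `<z_N>` tokens at ids `151665 + N`, followed by three control tokens (`<ANSWER>`, `<FINALIZE>`, `<RETRY>`). A small sanity check of that layout, using values from the file above:

```python
# Id layout of the added tokens (values from added_tokens.json above).
Z_BASE = 151665   # id of <z_0>
NUM_Z = 512       # <z_0> ... <z_511>

def z_token_id(n: int) -> int:
    assert 0 <= n < NUM_Z
    return Z_BASE + n

assert z_token_id(0) == 151665
assert z_token_id(100) == 151765
assert z_token_id(511) == 152176

# The three control tokens come right after the <z_*> block, and
# config.json's vocab_size (152180) is exactly the maximum id plus one.
control = {"<ANSWER>": 152177, "<FINALIZE>": 152178, "<RETRY>": 152179}
assert max(control.values()) + 1 == 152180
```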
chat_template.jinja (new file, 20 lines)
@@ -0,0 +1,20 @@
{%- if messages[0]['role'] == 'system' %}
{{- '<|im_start|>system
' + messages[0]['content'] + '<|im_end|>
' }}
{%- else %}
{{- '<|im_start|>system
<|im_end|>
' }}
{%- endif %}
{%- for message in messages %}
{%- if (message.role == 'user') or (message.role == 'system' and not loop.first) or (message.role == 'assistant') %}
{{- '<|im_start|>' + message.role + '
' + message.content + '<|im_end|>' + '
' }}
{%- endif %}
{%- endfor %}
{%- if add_generation_prompt %}
{{- '<|im_start|>assistant
' }}
{%- endif %}
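The template's behavior can be mirrored in plain Python for illustration (this is a re-implementation, not the tokenizer's own `apply_chat_template`): it always emits a system block first (empty if none was supplied), then the conversation turns, then an optional open assistant header.

```python
def render_chat(messages, add_generation_prompt=False):
    """Pure-Python mirror of the chat_template.jinja above (illustrative only)."""
    out = []
    # A system block is always emitted, empty when no system message was given.
    if messages and messages[0]["role"] == "system":
        out.append(f"<|im_start|>system\n{messages[0]['content']}<|im_end|>\n")
    else:
        out.append("<|im_start|>system\n<|im_end|>\n")
    for i, m in enumerate(messages):
        # The first system message was already emitted above.
        if m["role"] in ("user", "assistant") or (m["role"] == "system" and i > 0):
            out.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    if add_generation_prompt:
        out.append("<|im_start|>assistant\n")
    return "".join(out)

print(render_chat([{"role": "user", "content": "hi"}], add_generation_prompt=True))
```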
config.json (new file, 58 lines)
@@ -0,0 +1,58 @@
{
  "architectures": [
    "Qwen2ForCausalLM"
  ],
  "attention_dropout": 0.0,
  "bos_token_id": 151643,
  "dtype": "bfloat16",
  "eos_token_id": 151645,
  "hidden_act": "silu",
  "hidden_size": 3584,
  "initializer_range": 0.02,
  "intermediate_size": 18944,
  "layer_types": [
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention"
  ],
  "max_position_embeddings": 131072,
  "max_window_layers": 28,
  "model_type": "qwen2",
  "num_attention_heads": 28,
  "num_hidden_layers": 28,
  "num_key_value_heads": 4,
  "rms_norm_eps": 1e-06,
  "rope_scaling": null,
  "rope_theta": 500000.0,
  "sliding_window": null,
  "tie_word_embeddings": false,
  "transformers_version": "4.57.6",
  "use_cache": false,
  "use_sliding_window": false,
  "vocab_size": 152180
}
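A few quantities follow directly from the config above and are worth checking: the per-head dimension, and the grouped-query-attention ratio between query heads and key/value heads.

```python
# Derived attention geometry from the config.json values above.
hidden_size = 3584
num_attention_heads = 28
num_key_value_heads = 4

head_dim = hidden_size // num_attention_heads
assert head_dim == 128

# Grouped-query attention: each KV head is shared by 7 query heads.
assert num_attention_heads % num_key_value_heads == 0
assert num_attention_heads // num_key_value_heads == 7

# KV projections are therefore 4 * 128 = 512-dimensional, vs 3584 for Q.
assert num_key_value_heads * head_dim == 512
```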
generation_config.json (new file, 6 lines)
@@ -0,0 +1,6 @@
{
  "_from_model_config": true,
  "bos_token_id": 151643,
  "eos_token_id": 151645,
  "transformers_version": "4.57.6"
}
merges.txt (new file, 151388 lines)
File diff suppressed because it is too large.
model-00001-of-00004.safetensors (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:180e00f221e3ca1bbfae87367ec551e119ca8e989c085fffd78138b5a28bb675
size 4878492264
model-00002-of-00004.safetensors (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:4bf80ee973ccb39f0af7ec8ca0d6358635a536ef1ae9d506012a492cf76834e8
size 4932751008
model-00003-of-00004.safetensors (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:6532ead0baabd9a13e076a4f3e5738e74798e471ad4fb21db8fcbba21d2e7f3d
size 4330865200
model-00004-of-00004.safetensors (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:e594b0b100a8450c3a8caaad36c3c54a6597c0f21303e2ea778a2cec07ea222f
size 1090826368
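The four `.safetensors` files above are Git LFS pointer files, not the weights themselves: three key/value lines (`version`, `oid`, `size`) in the LFS spec v1 format. A minimal parser for that format:

```python
# Minimal parser for Git LFS pointer files (spec v1: "key value" lines).
def parse_lfs_pointer(text: str) -> dict:
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    assert fields["version"].startswith("https://git-lfs.github.com/spec/v1")
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:180e00f221e3ca1bbfae87367ec551e119ca8e989c085fffd78138b5a28bb675
size 4878492264
"""
p = parse_lfs_pointer(pointer)
print(p["oid"], int(p["size"]))
```

The four `size` fields sum to 15,232,934,840 bytes, slightly above the 15,232,896,000-byte `total_size` in the index file below; the difference is per-file safetensors header/metadata overhead.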
model.safetensors.index.json (new file, 347 lines)
@@ -0,0 +1,347 @@
{
  "metadata": {
    "total_parameters": 7616448000,
    "total_size": 15232896000
  },
  "weight_map": {
    "lm_head.weight": "model-00004-of-00004.safetensors",
    "model.embed_tokens.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.input_layernorm.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.self_attn.k_proj.bias": "model-00001-of-00004.safetensors",
    "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
    "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
    "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.input_layernorm.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.self_attn.k_proj.bias": "model-00001-of-00004.safetensors",
    "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
    "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
    "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.10.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.self_attn.k_proj.bias": "model-00002-of-00004.safetensors",
    "model.layers.10.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.self_attn.q_proj.bias": "model-00002-of-00004.safetensors",
    "model.layers.10.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.self_attn.v_proj.bias": "model-00002-of-00004.safetensors",
    "model.layers.10.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.self_attn.k_proj.bias": "model-00002-of-00004.safetensors",
    "model.layers.11.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.self_attn.q_proj.bias": "model-00002-of-00004.safetensors",
    "model.layers.11.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.self_attn.v_proj.bias": "model-00002-of-00004.safetensors",
    "model.layers.11.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.self_attn.k_proj.bias": "model-00002-of-00004.safetensors",
    "model.layers.12.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.self_attn.q_proj.bias": "model-00002-of-00004.safetensors",
    "model.layers.12.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.self_attn.v_proj.bias": "model-00002-of-00004.safetensors",
    "model.layers.12.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.self_attn.k_proj.bias": "model-00002-of-00004.safetensors",
    "model.layers.13.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.self_attn.q_proj.bias": "model-00002-of-00004.safetensors",
    "model.layers.13.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.self_attn.v_proj.bias": "model-00002-of-00004.safetensors",
    "model.layers.13.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.self_attn.k_proj.bias": "model-00002-of-00004.safetensors",
    "model.layers.14.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.self_attn.q_proj.bias": "model-00002-of-00004.safetensors",
    "model.layers.14.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.self_attn.v_proj.bias": "model-00002-of-00004.safetensors",
    "model.layers.14.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.input_layernorm.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.15.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.15.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.15.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.15.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.15.self_attn.k_proj.bias": "model-00002-of-00004.safetensors",
|
||||
"model.layers.15.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.15.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.15.self_attn.q_proj.bias": "model-00002-of-00004.safetensors",
|
||||
"model.layers.15.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.15.self_attn.v_proj.bias": "model-00002-of-00004.safetensors",
|
||||
"model.layers.15.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.16.input_layernorm.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.16.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.16.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.16.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.16.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.16.self_attn.k_proj.bias": "model-00002-of-00004.safetensors",
|
||||
"model.layers.16.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.16.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.16.self_attn.q_proj.bias": "model-00002-of-00004.safetensors",
|
||||
"model.layers.16.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.16.self_attn.v_proj.bias": "model-00002-of-00004.safetensors",
|
||||
"model.layers.16.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.17.input_layernorm.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.17.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.17.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.17.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.17.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.17.self_attn.k_proj.bias": "model-00002-of-00004.safetensors",
|
||||
"model.layers.17.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.17.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.17.self_attn.q_proj.bias": "model-00002-of-00004.safetensors",
|
||||
"model.layers.17.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.17.self_attn.v_proj.bias": "model-00002-of-00004.safetensors",
|
||||
"model.layers.17.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.18.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.18.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.18.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.18.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.18.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.18.self_attn.k_proj.bias": "model-00002-of-00004.safetensors",
|
||||
"model.layers.18.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.18.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.18.self_attn.q_proj.bias": "model-00002-of-00004.safetensors",
|
||||
"model.layers.18.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.18.self_attn.v_proj.bias": "model-00002-of-00004.safetensors",
|
||||
"model.layers.18.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.19.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.19.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.19.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.19.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.19.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.19.self_attn.k_proj.bias": "model-00003-of-00004.safetensors",
|
||||
"model.layers.19.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.19.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.19.self_attn.q_proj.bias": "model-00003-of-00004.safetensors",
|
||||
"model.layers.19.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.19.self_attn.v_proj.bias": "model-00003-of-00004.safetensors",
|
||||
"model.layers.19.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.2.input_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.2.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.2.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.2.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.2.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.2.self_attn.k_proj.bias": "model-00001-of-00004.safetensors",
|
||||
"model.layers.2.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.2.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.2.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
|
||||
"model.layers.2.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.2.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
|
||||
"model.layers.2.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.20.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.20.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.20.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.20.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.20.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.20.self_attn.k_proj.bias": "model-00003-of-00004.safetensors",
|
||||
"model.layers.20.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.20.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.20.self_attn.q_proj.bias": "model-00003-of-00004.safetensors",
|
||||
"model.layers.20.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.20.self_attn.v_proj.bias": "model-00003-of-00004.safetensors",
|
||||
"model.layers.20.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.21.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.21.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.21.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.21.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.21.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.21.self_attn.k_proj.bias": "model-00003-of-00004.safetensors",
|
||||
"model.layers.21.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.21.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.21.self_attn.q_proj.bias": "model-00003-of-00004.safetensors",
|
||||
"model.layers.21.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.21.self_attn.v_proj.bias": "model-00003-of-00004.safetensors",
|
||||
"model.layers.21.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.22.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.22.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.22.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.22.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.22.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.22.self_attn.k_proj.bias": "model-00003-of-00004.safetensors",
|
||||
"model.layers.22.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.22.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.22.self_attn.q_proj.bias": "model-00003-of-00004.safetensors",
|
||||
"model.layers.22.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.22.self_attn.v_proj.bias": "model-00003-of-00004.safetensors",
|
||||
"model.layers.22.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.23.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.23.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.23.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.23.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.23.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.23.self_attn.k_proj.bias": "model-00003-of-00004.safetensors",
|
||||
"model.layers.23.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.23.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.23.self_attn.q_proj.bias": "model-00003-of-00004.safetensors",
|
||||
"model.layers.23.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.23.self_attn.v_proj.bias": "model-00003-of-00004.safetensors",
|
||||
"model.layers.23.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.24.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.24.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.24.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.24.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.24.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.24.self_attn.k_proj.bias": "model-00003-of-00004.safetensors",
|
||||
"model.layers.24.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.24.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.24.self_attn.q_proj.bias": "model-00003-of-00004.safetensors",
|
||||
"model.layers.24.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.24.self_attn.v_proj.bias": "model-00003-of-00004.safetensors",
|
||||
"model.layers.24.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.25.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.25.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.25.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.25.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.25.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.25.self_attn.k_proj.bias": "model-00003-of-00004.safetensors",
|
||||
"model.layers.25.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.25.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.25.self_attn.q_proj.bias": "model-00003-of-00004.safetensors",
|
||||
"model.layers.25.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.25.self_attn.v_proj.bias": "model-00003-of-00004.safetensors",
|
||||
"model.layers.25.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.26.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.26.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.26.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.26.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.26.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.26.self_attn.k_proj.bias": "model-00003-of-00004.safetensors",
|
||||
"model.layers.26.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.26.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.26.self_attn.q_proj.bias": "model-00003-of-00004.safetensors",
|
||||
"model.layers.26.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.26.self_attn.v_proj.bias": "model-00003-of-00004.safetensors",
|
||||
"model.layers.26.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.27.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.27.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.27.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.27.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.27.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.27.self_attn.k_proj.bias": "model-00003-of-00004.safetensors",
|
||||
"model.layers.27.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.27.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.27.self_attn.q_proj.bias": "model-00003-of-00004.safetensors",
|
||||
"model.layers.27.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.27.self_attn.v_proj.bias": "model-00003-of-00004.safetensors",
|
||||
"model.layers.27.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.3.input_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.self_attn.k_proj.bias": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.input_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.self_attn.k_proj.bias": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.input_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.self_attn.k_proj.bias": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.input_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.self_attn.k_proj.bias": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.input_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.self_attn.k_proj.bias": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.input_layernorm.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.8.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.8.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.8.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.8.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.8.self_attn.k_proj.bias": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.9.input_layernorm.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.9.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.9.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.9.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.9.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.9.self_attn.k_proj.bias": "model-00002-of-00004.safetensors",
|
||||
"model.layers.9.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.9.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.9.self_attn.q_proj.bias": "model-00002-of-00004.safetensors",
|
||||
"model.layers.9.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.9.self_attn.v_proj.bias": "model-00002-of-00004.safetensors",
|
||||
"model.layers.9.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.norm.weight": "model-00003-of-00004.safetensors"
|
||||
}
|
||||
}
|
||||
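The `weight_map` above is what a loader consults to find which shard file holds each tensor; note that layer 18 straddles a shard boundary (its `mlp.gate_proj` sits on shard 00002 while `mlp.down_proj` sits on 00003). A minimal stdlib-only sketch of that lookup, using an abridged excerpt of the map rather than the full file:

```python
import json
from collections import defaultdict

# Abridged excerpt of the weight_map from model.safetensors.index.json above.
INDEX_JSON = """
{
  "weight_map": {
    "model.layers.11.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
    "model.norm.weight": "model-00003-of-00004.safetensors"
  }
}
"""

weight_map = json.loads(INDEX_JSON)["weight_map"]

def shard_for(param_name: str) -> str:
    """Return the shard file that stores the given parameter."""
    return weight_map[param_name]

def params_by_shard(wm: dict) -> dict:
    """Group parameter names by shard so each file is opened only once."""
    groups = defaultdict(list)
    for name, shard in wm.items():
        groups[shard].append(name)
    return dict(groups)

print(shard_for("model.norm.weight"))   # model-00003-of-00004.safetensors
print(sorted(params_by_shard(weight_map)))
```

In practice `transformers`/`safetensors` perform this resolution internally when loading a sharded checkpoint; the sketch only illustrates the role the index file plays.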
20
special_tokens_map.json
Normal file
@@ -0,0 +1,20 @@
{
"additional_special_tokens": [
"<FINALIZE>",
"<RETRY>"
],
"eos_token": {
"content": "<|im_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false
},
"pad_token": {
"content": "<|endoftext|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false
}
}
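Besides the usual eos/pad tokens, this checkpoint registers `<FINALIZE>` and `<RETRY>` as additional special tokens (presumably control tokens for the RSFT verification scheme; their exact role is not stated here). `AutoTokenizer.from_pretrained` picks this file up automatically; a stdlib-only sketch of reading the same structure:

```python
import json

# The special_tokens_map.json contents shown above, inlined for the sketch.
SPECIAL_TOKENS_JSON = """
{
  "additional_special_tokens": ["<FINALIZE>", "<RETRY>"],
  "eos_token": {"content": "<|im_end|>", "lstrip": false, "normalized": false,
                "rstrip": false, "single_word": false},
  "pad_token": {"content": "<|endoftext|>", "lstrip": false, "normalized": false,
                "rstrip": false, "single_word": false}
}
"""

tokens = json.loads(SPECIAL_TOKENS_JSON)
extra = tokens["additional_special_tokens"]          # ["<FINALIZE>", "<RETRY>"]
eos = tokens["eos_token"]["content"]                 # "<|im_end|>"
pad = tokens["pad_token"]["content"]                 # "<|endoftext|>"
print(extra, eos, pad)
```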
3
tokenizer.json
Normal file
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:029ecc1187816d6508c4c93e653ee6a4589ded82784c1722a103493acb34dca0
size 11516550
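`tokenizer.json` is committed as a Git LFS pointer rather than the file itself; the real ~11.5 MB blob is fetched by `git lfs pull` (the `.gitattributes` rules above route it through LFS). The pointer format is just `key value` lines, which a small stdlib sketch can parse:

```python
# The Git LFS pointer file shown above for tokenizer.json.
POINTER = """\
version https://git-lfs.github.com/spec/v1
oid sha256:029ecc1187816d6508c4c93e653ee6a4589ded82784c1722a103493acb34dca0
size 11516550
"""

def parse_lfs_pointer(text: str) -> dict:
    """Split each 'key value' line of a Git LFS pointer into a dict."""
    return dict(line.split(" ", 1) for line in text.strip().splitlines())

ptr = parse_lfs_pointer(POINTER)
print(ptr["oid"])   # sha256:029e...
print(ptr["size"])  # 11516550
```

The `oid` is the SHA-256 of the stored blob, so after download a client can verify integrity by hashing the fetched file and comparing.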
4316
tokenizer_config.json
Normal file
File diff suppressed because it is too large
1
vocab.json
Normal file
File diff suppressed because one or more lines are too long