Initialize the project; model provided by the ModelHub XC community

Model: laion/nemotron-terminal-data_processing__Qwen3-8B
Source: Original Platform
This commit is contained in:
ModelHub XC
2026-05-12 05:15:30 +08:00
commit 754d4ef06b
23 changed files with 154088 additions and 0 deletions

36
.gitattributes vendored Normal file

@@ -0,0 +1,36 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
tokenizer.json filter=lfs diff=lfs merge=lfs -text

61
README.md Normal file

@@ -0,0 +1,61 @@
---
library_name: transformers
license: other
base_model: Qwen/Qwen3-8B
tags:
- llama-factory
- full
- generated_from_trainer
model-index:
- name: nemotron-data-processing__Qwen3-8B
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# nemotron-data-processing__Qwen3-8B
This model is a fine-tuned version of [Qwen/Qwen3-8B](https://huggingface.co/Qwen/Qwen3-8B) on the /e/data1/datasets/playground/ot/hf_hub/datasets--laion--nemotron-terminal-data_processing/snapshots/78e341b1c482ae93ac8ef8d3f560eafd7afd5406_thinking_preprocessed dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 4e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 32
- gradient_accumulation_steps: 3
- total_train_batch_size: 96
- total_eval_batch_size: 256
- optimizer: OptimizerNames.ADAMW_TORCH_FUSED with betas=(0.9, 0.98) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 7.0
### Training results
### Framework versions
- Transformers 4.57.6
- Pytorch 2.9.1+cu130
- Datasets 4.7.0
- Tokenizers 0.22.2
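
A minimal usage sketch (not part of the generated card), assuming the checkpoint loads with the stock `transformers` API; the repo id is taken from the commit metadata above, and the generation flow follows the Qwen3 chat template shipped in this commit.

```python
# Hedged example: load the fine-tuned checkpoint and run one chat turn.
# Assumes a recent transformers release with Qwen3 support and enough GPU memory for bf16 weights.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "laion/nemotron-terminal-data_processing__Qwen3-8B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

messages = [{"role": "user", "content": "Deduplicate the rows of data.csv by the 'id' column."}]
# enable_thinking toggles the <think> block defined in chat_template.jinja below.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, enable_thinking=True, return_tensors="pt"
).to(model.device)

out = model.generate(input_ids, max_new_tokens=512)
print(tokenizer.decode(out[0][input_ids.shape[-1]:], skip_special_tokens=True))
```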

28
added_tokens.json Normal file

@@ -0,0 +1,28 @@
{
"</think>": 151668,
"</tool_call>": 151658,
"</tool_response>": 151666,
"<think>": 151667,
"<tool_call>": 151657,
"<tool_response>": 151665,
"<|box_end|>": 151649,
"<|box_start|>": 151648,
"<|endoftext|>": 151643,
"<|file_sep|>": 151664,
"<|fim_middle|>": 151660,
"<|fim_pad|>": 151662,
"<|fim_prefix|>": 151659,
"<|fim_suffix|>": 151661,
"<|im_end|>": 151645,
"<|im_start|>": 151644,
"<|image_pad|>": 151655,
"<|object_ref_end|>": 151647,
"<|object_ref_start|>": 151646,
"<|quad_end|>": 151651,
"<|quad_start|>": 151650,
"<|repo_name|>": 151663,
"<|video_pad|>": 151656,
"<|vision_end|>": 151653,
"<|vision_pad|>": 151654,
"<|vision_start|>": 151652
}
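
As a quick sanity check (illustrative, assuming the tokenizer shipped in this repo), the added token ids above can be verified directly:

```python
# Hedged example: confirm the added special tokens resolve to the ids listed in added_tokens.json.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("laion/nemotron-terminal-data_processing__Qwen3-8B")
for token in ["<think>", "</think>", "<tool_call>", "<|im_end|>"]:
    print(token, tok.convert_tokens_to_ids(token))
# Expected per the file above: 151667, 151668, 151657, 151645
```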

16
all_results.json Normal file

@@ -0,0 +1,16 @@
{
"achieved_tflops_per_gpu": 55493.00432659637,
"achieved_tflops_per_gpu_theoretical": 1813433.511628669,
"epoch": 7.0,
"loss_nan_ranks": 0,
"loss_rank_avg": 0.11563363671302795,
"mfu_percent": 3921.7670902188247,
"mfu_percent_theoretical": 128157.84534478225,
"total_flos": 2.2525720316251996e+18,
"train_loss": 0.0,
"train_runtime": 1.2685,
"train_samples_per_second": 51247.723,
"train_steps_per_second": 535.267,
"valid_targets_mean": 7221.4,
"valid_targets_min": 1533
}

89
chat_template.jinja Normal file

@@ -0,0 +1,89 @@
{%- if tools %}
{{- '<|im_start|>system\n' }}
{%- if messages[0].role == 'system' %}
{{- messages[0].content + '\n\n' }}
{%- endif %}
{{- "# Tools\n\nYou may call one or more functions to assist with the user query.\n\nYou are provided with function signatures within <tools></tools> XML tags:\n<tools>" }}
{%- for tool in tools %}
{{- "\n" }}
{{- tool | tojson }}
{%- endfor %}
{{- "\n</tools>\n\nFor each function call, return a json object with function name and arguments within <tool_call></tool_call> XML tags:\n<tool_call>\n{\"name\": <function-name>, \"arguments\": <args-json-object>}\n</tool_call><|im_end|>\n" }}
{%- else %}
{%- if messages[0].role == 'system' %}
{{- '<|im_start|>system\n' + messages[0].content + '<|im_end|>\n' }}
{%- endif %}
{%- endif %}
{%- set ns = namespace(multi_step_tool=true, last_query_index=messages|length - 1) %}
{%- for message in messages[::-1] %}
{%- set index = (messages|length - 1) - loop.index0 %}
{%- if ns.multi_step_tool and message.role == "user" and message.content is string and not(message.content.startswith('<tool_response>') and message.content.endswith('</tool_response>')) %}
{%- set ns.multi_step_tool = false %}
{%- set ns.last_query_index = index %}
{%- endif %}
{%- endfor %}
{%- for message in messages %}
{%- if message.content is string %}
{%- set content = message.content %}
{%- else %}
{%- set content = '' %}
{%- endif %}
{%- if (message.role == "user") or (message.role == "system" and not loop.first) %}
{{- '<|im_start|>' + message.role + '\n' + content + '<|im_end|>' + '\n' }}
{%- elif message.role == "assistant" %}
{%- set reasoning_content = '' %}
{%- if message.reasoning_content is string %}
{%- set reasoning_content = message.reasoning_content %}
{%- else %}
{%- if '</think>' in content %}
{%- set reasoning_content = content.split('</think>')[0].rstrip('\n').split('<think>')[-1].lstrip('\n') %}
{%- set content = content.split('</think>')[-1].lstrip('\n') %}
{%- endif %}
{%- endif %}
{%- if loop.index0 > ns.last_query_index %}
{%- if loop.last or (not loop.last and reasoning_content) %}
{{- '<|im_start|>' + message.role + '\n<think>\n' + reasoning_content.strip('\n') + '\n</think>\n\n' + content.lstrip('\n') }}
{%- else %}
{{- '<|im_start|>' + message.role + '\n' + content }}
{%- endif %}
{%- else %}
{{- '<|im_start|>' + message.role + '\n' + content }}
{%- endif %}
{%- if message.tool_calls %}
{%- for tool_call in message.tool_calls %}
{%- if (loop.first and content) or (not loop.first) %}
{{- '\n' }}
{%- endif %}
{%- if tool_call.function %}
{%- set tool_call = tool_call.function %}
{%- endif %}
{{- '<tool_call>\n{"name": "' }}
{{- tool_call.name }}
{{- '", "arguments": ' }}
{%- if tool_call.arguments is string %}
{{- tool_call.arguments }}
{%- else %}
{{- tool_call.arguments | tojson }}
{%- endif %}
{{- '}\n</tool_call>' }}
{%- endfor %}
{%- endif %}
{{- '<|im_end|>\n' }}
{%- elif message.role == "tool" %}
{%- if loop.first or (messages[loop.index0 - 1].role != "tool") %}
{{- '<|im_start|>user' }}
{%- endif %}
{{- '\n<tool_response>\n' }}
{{- content }}
{{- '\n</tool_response>' }}
{%- if loop.last or (messages[loop.index0 + 1].role != "tool") %}
{{- '<|im_end|>\n' }}
{%- endif %}
{%- endif %}
{%- endfor %}
{%- if add_generation_prompt %}
{{- '<|im_start|>assistant\n' }}
{%- if enable_thinking is defined and enable_thinking is false %}
{{- '<think>\n\n</think>\n\n' }}
{%- endif %}
{%- endif %}
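
A sketch of how this template is typically exercised through the tokenizer (the tool schema below is made up for illustration; the `tools` and `enable_thinking` behavior follows the branches above):

```python
# Hedged example: render the chat template with a tool definition, without tokenizing.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("laion/nemotron-terminal-data_processing__Qwen3-8B")

tools = [{
    "type": "function",
    "function": {
        "name": "list_files",  # hypothetical tool, for illustration only
        "description": "List files in a directory",
        "parameters": {"type": "object", "properties": {"path": {"type": "string"}}},
    },
}]
messages = [
    {"role": "system", "content": "You are a terminal data-processing assistant."},
    {"role": "user", "content": "What is in /tmp?"},
]

prompt = tok.apply_chat_template(
    messages,
    tools=tools,
    add_generation_prompt=True,
    enable_thinking=False,  # per the template, this appends an empty <think>...</think> block
    tokenize=False,
)
print(prompt)
```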

68
config.json Normal file

@@ -0,0 +1,68 @@
{
"architectures": [
"Qwen3ForCausalLM"
],
"attention_bias": false,
"attention_dropout": 0.0,
"dtype": "bfloat16",
"eos_token_id": 151645,
"head_dim": 128,
"hidden_act": "silu",
"hidden_size": 4096,
"initializer_range": 0.02,
"intermediate_size": 12288,
"layer_types": [
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention"
],
"max_position_embeddings": 40960,
"max_window_layers": 36,
"model_type": "qwen3",
"num_attention_heads": 32,
"num_hidden_layers": 36,
"num_key_value_heads": 8,
"pad_token_id": 151643,
"rms_norm_eps": 1e-06,
"rope_scaling": null,
"rope_theta": 1000000,
"sliding_window": null,
"tie_word_embeddings": false,
"transformers_version": "4.57.6",
"use_cache": false,
"use_sliding_window": false,
"vocab_size": 151936
}
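
A back-of-the-envelope check (a sketch, not the loader's own accounting) that the architecture fields above account for the checkpoint size: with untied embeddings and bf16 weights, the count below lands on the 16,381,470,720 bytes reported in the safetensors index further down.

```python
# Hedged example: estimate the parameter count from the config.json fields above.
h, inter, layers = 4096, 12288, 36
n_heads, n_kv, head_dim = 32, 8, 128
vocab = 151936

q_dim, kv_dim = n_heads * head_dim, n_kv * head_dim
per_layer = (
    h * q_dim + 2 * h * kv_dim + q_dim * h  # q/k/v/o projections (attention_bias is false)
    + 3 * h * inter                          # gate/up/down MLP projections
    + 2 * head_dim + 2 * h                   # q_norm, k_norm, and the two RMSNorms
)
total = layers * per_layer + 2 * vocab * h + h  # embed_tokens + untied lm_head + final norm
print(total, total * 2)  # ~8.19e9 parameters, 16_381_470_720 bytes in bf16
```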

12
generation_config.json Normal file

@@ -0,0 +1,12 @@
{
"do_sample": true,
"eos_token_id": [
151645,
151643
],
"pad_token_id": 151643,
"temperature": 0.6,
"top_k": 20,
"top_p": 0.95,
"transformers_version": "4.57.6"
}
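
These sampling defaults are what `generate()` falls back to when no overrides are passed; an explicit equivalent (a sketch) looks like this:

```python
# Hedged example: the generation_config.json defaults spelled out as a GenerationConfig.
from transformers import GenerationConfig

gen_cfg = GenerationConfig(
    do_sample=True,
    temperature=0.6,
    top_p=0.95,
    top_k=20,
    eos_token_id=[151645, 151643],  # <|im_end|>, <|endoftext|>
    pad_token_id=151643,
)
# out = model.generate(input_ids, generation_config=gen_cfg, max_new_tokens=512)
```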

151388
merges.txt Normal file

File diff suppressed because it is too large


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:711c78932164bc1ec5ea2105e36014329ae026f933e492e44947d8fd10b9da9a
size 4902257696


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0cc43894b4433ac5abc70434bd6c19b68c94fa7710c970a8aa0cc79d3666ea89
size 4915960368


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:6c7097f2fa1e69dda974a0982c6535d36d524499163d24253c2ea2b1fb83af81
size 4983068496


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:263b1339c1cf9314b20ff67cf8ee33617c1d7dbaa088551a5b5b9a40b7161b02
size 1580230264


@@ -0,0 +1,407 @@
{
"metadata": {
"total_parameters": 308224,
"total_size": 16381470720
},
"weight_map": {
"lm_head.weight": "model-00004-of-00004.safetensors",
"model.embed_tokens.weight": "model-00001-of-00004.safetensors",
"model.layers.0.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.0.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.0.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.0.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.0.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.0.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.0.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.0.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.0.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.0.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.0.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.1.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.1.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.1.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.1.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.10.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.10.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.10.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.10.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.10.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.10.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.10.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.10.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.10.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.10.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.10.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.11.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.11.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.11.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.11.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.12.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.12.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.12.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.12.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.13.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.13.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.13.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.13.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.14.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.14.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.14.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.14.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.15.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.15.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.15.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.15.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.16.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.16.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.16.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.16.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.17.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.17.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.17.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.17.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.18.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.18.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.18.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.18.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.19.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.19.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.19.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.19.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.2.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.2.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.2.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.2.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.2.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.2.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.2.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.2.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.2.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.2.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.2.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.20.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.20.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.20.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.20.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.20.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.20.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.20.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.20.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.20.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.20.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.20.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.21.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.21.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.21.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.21.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.22.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.22.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.22.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.22.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.22.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.22.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.22.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.22.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.22.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.22.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.22.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.23.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.23.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.23.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.23.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.23.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.23.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.23.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.23.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.23.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.23.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.23.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.24.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.24.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.24.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.24.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.25.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.25.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.25.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.25.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.26.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.26.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.26.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.26.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.27.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.27.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.27.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.27.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.28.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.28.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.28.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.28.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.29.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.29.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.29.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.29.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.3.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.3.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.3.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.3.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.3.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.3.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.3.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.3.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.3.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.3.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.3.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.30.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.30.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.30.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.30.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.30.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.30.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.30.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.30.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.30.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.30.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.30.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.31.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.31.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.31.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.31.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.32.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.32.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.32.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.32.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.33.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.33.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.33.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.33.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.34.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.34.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.34.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.34.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.35.input_layernorm.weight": "model-00004-of-00004.safetensors",
"model.layers.35.mlp.down_proj.weight": "model-00004-of-00004.safetensors",
"model.layers.35.mlp.gate_proj.weight": "model-00004-of-00004.safetensors",
"model.layers.35.mlp.up_proj.weight": "model-00004-of-00004.safetensors",
"model.layers.35.post_attention_layernorm.weight": "model-00004-of-00004.safetensors",
"model.layers.35.self_attn.k_norm.weight": "model-00004-of-00004.safetensors",
"model.layers.35.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.35.self_attn.o_proj.weight": "model-00004-of-00004.safetensors",
"model.layers.35.self_attn.q_norm.weight": "model-00004-of-00004.safetensors",
"model.layers.35.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.35.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.4.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.4.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.4.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.4.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.4.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.4.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.4.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.4.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.4.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.4.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.4.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.5.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.5.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.5.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.5.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.6.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.6.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.6.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.6.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.7.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.7.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.7.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.7.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.8.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.8.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.8.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.8.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.9.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.9.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.9.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.9.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.9.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.9.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.9.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.9.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.9.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.9.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.9.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.norm.weight": "model-00004-of-00004.safetensors"
}
}
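
The `weight_map` above tells the loader which shard each tensor lives in; below is a small sketch (assuming the shards and the index file are available locally) of locating and reading a single tensor without loading the whole model.

```python
# Hedged example: look up a tensor's shard via the index and read just that tensor.
import json
from safetensors import safe_open

with open("model.safetensors.index.json") as f:
    index = json.load(f)

name = "model.layers.35.mlp.down_proj.weight"
shard = index["weight_map"][name]  # e.g. "model-00004-of-00004.safetensors"
with safe_open(shard, framework="pt") as fh:
    tensor = fh.get_tensor(name)
print(shard, tuple(tensor.shape))
```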

12
run_summary.json Normal file

@@ -0,0 +1,12 @@
{
"agent_name": "78e341b1c482ae93ac8ef8d3f560eafd7afd5406_thinking_preprocessed",
"training_start": null,
"training_end": null,
"created_by": "DCAgent",
"base_model_name": "Qwen/Qwen3-8B",
"dataset_name": "/e/data1/datasets/playground/ot/hf_hub/datasets--laion--nemotron-terminal-data_processing/snapshots/78e341b1c482ae93ac8ef8d3f560eafd7afd5406_thinking_preprocessed",
"training_type": "SFT",
"training_parameters": "https://huggingface.co/laion/nemotron-terminal-data_processing__Qwen3-8B/blob/main/config.json",
"wandb_link": null,
"traces_location_s3": null
}

31
special_tokens_map.json Normal file

@@ -0,0 +1,31 @@
{
"additional_special_tokens": [
"<|im_start|>",
"<|im_end|>",
"<|object_ref_start|>",
"<|object_ref_end|>",
"<|box_start|>",
"<|box_end|>",
"<|quad_start|>",
"<|quad_end|>",
"<|vision_start|>",
"<|vision_end|>",
"<|vision_pad|>",
"<|image_pad|>",
"<|video_pad|>"
],
"eos_token": {
"content": "<|im_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false
},
"pad_token": {
"content": "<|endoftext|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false
}
}

3
tokenizer.json Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:aeb13307a71acd8fe81861d94ad54ab689df773318809eed3cbe794b4492dae4
size 11422654

240
tokenizer_config.json Normal file

@@ -0,0 +1,240 @@
{
"add_bos_token": false,
"add_prefix_space": false,
"added_tokens_decoder": {
"151643": {
"content": "<|endoftext|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151644": {
"content": "<|im_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151645": {
"content": "<|im_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151646": {
"content": "<|object_ref_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151647": {
"content": "<|object_ref_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151648": {
"content": "<|box_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151649": {
"content": "<|box_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151650": {
"content": "<|quad_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151651": {
"content": "<|quad_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151652": {
"content": "<|vision_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151653": {
"content": "<|vision_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151654": {
"content": "<|vision_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151655": {
"content": "<|image_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151656": {
"content": "<|video_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151657": {
"content": "<tool_call>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151658": {
"content": "</tool_call>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151659": {
"content": "<|fim_prefix|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151660": {
"content": "<|fim_middle|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151661": {
"content": "<|fim_suffix|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151662": {
"content": "<|fim_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151663": {
"content": "<|repo_name|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151664": {
"content": "<|file_sep|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151665": {
"content": "<tool_response>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151666": {
"content": "</tool_response>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151667": {
"content": "<think>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151668": {
"content": "</think>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
}
},
"additional_special_tokens": [
"<|im_start|>",
"<|im_end|>",
"<|object_ref_start|>",
"<|object_ref_end|>",
"<|box_start|>",
"<|box_end|>",
"<|quad_start|>",
"<|quad_end|>",
"<|vision_start|>",
"<|vision_end|>",
"<|vision_pad|>",
"<|image_pad|>",
"<|video_pad|>"
],
"bos_token": null,
"clean_up_tokenization_spaces": false,
"eos_token": "<|im_end|>",
"errors": "replace",
"extra_special_tokens": {},
"model_max_length": 32768,
"pad_token": "<|endoftext|>",
"padding_side": "right",
"split_special_tokens": false,
"tokenizer_class": "Qwen2Tokenizer",
"unk_token": null
}
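
The eos/pad choices above (`<|im_end|>` to stop, `<|endoftext|>` for right-side padding) show up directly when batching prompts; a brief sketch assuming this repo's tokenizer:

```python
# Hedged example: right padding with <|endoftext|> (id 151643) per tokenizer_config.json.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("laion/nemotron-terminal-data_processing__Qwen3-8B")
batch = tok(["ls -la", "grep -rn TODO src/"], padding=True, return_tensors="pt")
print(tok.pad_token, tok.pad_token_id)            # <|endoftext|> 151643
print(batch["input_ids"].shape, batch["attention_mask"][0])
```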

12
train_results.json Normal file

@@ -0,0 +1,12 @@
{
"achieved_tflops_per_gpu": 55493.00432659637,
"achieved_tflops_per_gpu_theoretical": 1813433.511628669,
"epoch": 7.0,
"mfu_percent": 3921.7670902188247,
"mfu_percent_theoretical": 128157.84534478225,
"total_flos": 2.2525720316251996e+18,
"train_loss": 0.0,
"train_runtime": 1.2685,
"train_samples_per_second": 51247.723,
"train_steps_per_second": 535.267
}

141
trainer_log.jsonl Normal file

@@ -0,0 +1,141 @@
{"current_steps": 5, "total_steps": 679, "loss": 0.8719, "lr": 2.3529411764705885e-06, "epoch": 0.05154639175257732, "percentage": 0.74, "elapsed_time": "0:01:48", "remaining_time": "4:02:48"}
{"current_steps": 10, "total_steps": 679, "loss": 0.8419, "lr": 5.294117647058824e-06, "epoch": 0.10309278350515463, "percentage": 1.47, "elapsed_time": "0:03:38", "remaining_time": "4:03:51"}
{"current_steps": 15, "total_steps": 679, "loss": 0.7585, "lr": 8.23529411764706e-06, "epoch": 0.15463917525773196, "percentage": 2.21, "elapsed_time": "0:05:12", "remaining_time": "3:50:28"}
{"current_steps": 20, "total_steps": 679, "loss": 0.7092, "lr": 1.1176470588235295e-05, "epoch": 0.20618556701030927, "percentage": 2.95, "elapsed_time": "0:06:51", "remaining_time": "3:46:04"}
{"current_steps": 25, "total_steps": 679, "loss": 0.6704, "lr": 1.4117647058823532e-05, "epoch": 0.25773195876288657, "percentage": 3.68, "elapsed_time": "0:08:29", "remaining_time": "3:42:04"}
{"current_steps": 30, "total_steps": 679, "loss": 0.6411, "lr": 1.7058823529411767e-05, "epoch": 0.30927835051546393, "percentage": 4.42, "elapsed_time": "0:10:04", "remaining_time": "3:38:04"}
{"current_steps": 35, "total_steps": 679, "loss": 0.602, "lr": 2e-05, "epoch": 0.36082474226804123, "percentage": 5.15, "elapsed_time": "0:11:49", "remaining_time": "3:37:38"}
{"current_steps": 40, "total_steps": 679, "loss": 0.584, "lr": 2.2941176470588237e-05, "epoch": 0.41237113402061853, "percentage": 5.89, "elapsed_time": "0:13:32", "remaining_time": "3:36:25"}
{"current_steps": 45, "total_steps": 679, "loss": 0.5619, "lr": 2.5882352941176475e-05, "epoch": 0.4639175257731959, "percentage": 6.63, "elapsed_time": "0:15:23", "remaining_time": "3:36:47"}
{"current_steps": 50, "total_steps": 679, "loss": 0.5199, "lr": 2.8823529411764707e-05, "epoch": 0.5154639175257731, "percentage": 7.36, "elapsed_time": "0:17:01", "remaining_time": "3:34:11"}
{"current_steps": 55, "total_steps": 679, "loss": 0.5072, "lr": 3.1764705882352945e-05, "epoch": 0.5670103092783505, "percentage": 8.1, "elapsed_time": "0:18:47", "remaining_time": "3:33:10"}
{"current_steps": 60, "total_steps": 679, "loss": 0.497, "lr": 3.470588235294118e-05, "epoch": 0.6185567010309279, "percentage": 8.84, "elapsed_time": "0:20:30", "remaining_time": "3:31:34"}
{"current_steps": 65, "total_steps": 679, "loss": 0.4782, "lr": 3.7647058823529415e-05, "epoch": 0.6701030927835051, "percentage": 9.57, "elapsed_time": "0:22:19", "remaining_time": "3:30:48"}
{"current_steps": 70, "total_steps": 679, "loss": 0.4671, "lr": 3.999973562744509e-05, "epoch": 0.7216494845360825, "percentage": 10.31, "elapsed_time": "0:24:04", "remaining_time": "3:29:23"}
{"current_steps": 75, "total_steps": 679, "loss": 0.4527, "lr": 3.999048332187732e-05, "epoch": 0.7731958762886598, "percentage": 11.05, "elapsed_time": "0:25:40", "remaining_time": "3:26:44"}
{"current_steps": 80, "total_steps": 679, "loss": 0.4505, "lr": 3.996801937701812e-05, "epoch": 0.8247422680412371, "percentage": 11.78, "elapsed_time": "0:27:17", "remaining_time": "3:24:17"}
{"current_steps": 85, "total_steps": 679, "loss": 0.4436, "lr": 3.9932358639208713e-05, "epoch": 0.8762886597938144, "percentage": 12.52, "elapsed_time": "0:28:50", "remaining_time": "3:21:35"}
{"current_steps": 90, "total_steps": 679, "loss": 0.4405, "lr": 3.988352467650382e-05, "epoch": 0.9278350515463918, "percentage": 13.25, "elapsed_time": "0:30:28", "remaining_time": "3:19:28"}
{"current_steps": 95, "total_steps": 679, "loss": 0.4406, "lr": 3.9821549763095606e-05, "epoch": 0.979381443298969, "percentage": 13.99, "elapsed_time": "0:32:15", "remaining_time": "3:18:20"}
{"current_steps": 100, "total_steps": 679, "loss": 0.4333, "lr": 3.9746474857983815e-05, "epoch": 1.0309278350515463, "percentage": 14.73, "elapsed_time": "0:34:01", "remaining_time": "3:16:59"}
{"current_steps": 105, "total_steps": 679, "loss": 0.4253, "lr": 3.965834957790608e-05, "epoch": 1.0824742268041236, "percentage": 15.46, "elapsed_time": "0:35:44", "remaining_time": "3:15:23"}
{"current_steps": 110, "total_steps": 679, "loss": 0.4246, "lr": 3.9557232164546405e-05, "epoch": 1.134020618556701, "percentage": 16.2, "elapsed_time": "0:37:35", "remaining_time": "3:14:27"}
{"current_steps": 115, "total_steps": 679, "loss": 0.4175, "lr": 3.944318944604347e-05, "epoch": 1.1855670103092784, "percentage": 16.94, "elapsed_time": "0:39:14", "remaining_time": "3:12:28"}
{"current_steps": 120, "total_steps": 679, "loss": 0.4075, "lr": 3.9316296792824096e-05, "epoch": 1.2371134020618557, "percentage": 17.67, "elapsed_time": "0:40:54", "remaining_time": "3:10:36"}
{"current_steps": 125, "total_steps": 679, "loss": 0.4208, "lr": 3.917663806779125e-05, "epoch": 1.2886597938144329, "percentage": 18.41, "elapsed_time": "0:42:36", "remaining_time": "3:08:50"}
{"current_steps": 130, "total_steps": 679, "loss": 0.4144, "lr": 3.9024305570899345e-05, "epoch": 1.3402061855670104, "percentage": 19.15, "elapsed_time": "0:44:23", "remaining_time": "3:07:29"}
{"current_steps": 135, "total_steps": 679, "loss": 0.4105, "lr": 3.885939997815349e-05, "epoch": 1.3917525773195876, "percentage": 19.88, "elapsed_time": "0:46:06", "remaining_time": "3:05:47"}
{"current_steps": 140, "total_steps": 679, "loss": 0.406, "lr": 3.8682030275073125e-05, "epoch": 1.443298969072165, "percentage": 20.62, "elapsed_time": "0:47:41", "remaining_time": "3:03:37"}
{"current_steps": 145, "total_steps": 679, "loss": 0.4086, "lr": 3.849231368466383e-05, "epoch": 1.4948453608247423, "percentage": 21.35, "elapsed_time": "0:49:20", "remaining_time": "3:01:42"}
{"current_steps": 150, "total_steps": 679, "loss": 0.4011, "lr": 3.829037558994513e-05, "epoch": 1.5463917525773194, "percentage": 22.09, "elapsed_time": "0:51:09", "remaining_time": "3:00:23"}
{"current_steps": 155, "total_steps": 679, "loss": 0.3975, "lr": 3.807634945108521e-05, "epoch": 1.597938144329897, "percentage": 22.83, "elapsed_time": "0:52:39", "remaining_time": "2:58:00"}
{"current_steps": 160, "total_steps": 679, "loss": 0.3997, "lr": 3.785037671719763e-05, "epoch": 1.6494845360824741, "percentage": 23.56, "elapsed_time": "0:54:15", "remaining_time": "2:55:58"}
{"current_steps": 165, "total_steps": 679, "loss": 0.3994, "lr": 3.76126067328581e-05, "epoch": 1.7010309278350515, "percentage": 24.3, "elapsed_time": "1:02:00", "remaining_time": "3:13:11"}
{"current_steps": 170, "total_steps": 679, "loss": 0.3986, "lr": 3.736319663940316e-05, "epoch": 1.7525773195876289, "percentage": 25.04, "elapsed_time": "1:03:39", "remaining_time": "3:10:36"}
{"current_steps": 175, "total_steps": 679, "loss": 0.3985, "lr": 3.710231127107606e-05, "epoch": 1.8041237113402062, "percentage": 25.77, "elapsed_time": "1:05:20", "remaining_time": "3:08:10"}
{"current_steps": 180, "total_steps": 679, "loss": 0.401, "lr": 3.683012304608837e-05, "epoch": 1.8556701030927836, "percentage": 26.51, "elapsed_time": "1:06:58", "remaining_time": "3:05:41"}
{"current_steps": 185, "total_steps": 679, "loss": 0.3945, "lr": 3.654681185266939e-05, "epoch": 1.9072164948453607, "percentage": 27.25, "elapsed_time": "1:08:36", "remaining_time": "3:03:11"}
{"current_steps": 190, "total_steps": 679, "loss": 0.3976, "lr": 3.6252564930178705e-05, "epoch": 1.9587628865979383, "percentage": 27.98, "elapsed_time": "1:10:11", "remaining_time": "3:00:38"}
{"current_steps": 195, "total_steps": 679, "loss": 0.392, "lr": 3.59475767453603e-05, "epoch": 2.0103092783505154, "percentage": 28.72, "elapsed_time": "1:11:47", "remaining_time": "2:58:12"}
{"current_steps": 200, "total_steps": 679, "loss": 0.3861, "lr": 3.563204886382024e-05, "epoch": 2.0618556701030926, "percentage": 29.46, "elapsed_time": "1:13:25", "remaining_time": "2:55:51"}
{"current_steps": 205, "total_steps": 679, "loss": 0.3852, "lr": 3.530618981681261e-05, "epoch": 2.11340206185567, "percentage": 30.19, "elapsed_time": "1:14:56", "remaining_time": "2:53:16"}
{"current_steps": 210, "total_steps": 679, "loss": 0.3826, "lr": 3.497021496342203e-05, "epoch": 2.1649484536082473, "percentage": 30.93, "elapsed_time": "1:16:34", "remaining_time": "2:51:00"}
{"current_steps": 215, "total_steps": 679, "loss": 0.375, "lr": 3.4624346348233526e-05, "epoch": 2.216494845360825, "percentage": 31.66, "elapsed_time": "1:18:07", "remaining_time": "2:48:36"}
{"current_steps": 220, "total_steps": 679, "loss": 0.3844, "lr": 3.426881255458411e-05, "epoch": 2.268041237113402, "percentage": 32.4, "elapsed_time": "1:19:49", "remaining_time": "2:46:31"}
{"current_steps": 225, "total_steps": 679, "loss": 0.3832, "lr": 3.390384855349285e-05, "epoch": 2.319587628865979, "percentage": 33.14, "elapsed_time": "1:21:33", "remaining_time": "2:44:33"}
{"current_steps": 230, "total_steps": 679, "loss": 0.3846, "lr": 3.352969554836933e-05, "epoch": 2.3711340206185567, "percentage": 33.87, "elapsed_time": "1:23:26", "remaining_time": "2:42:53"}
{"current_steps": 235, "total_steps": 679, "loss": 0.3852, "lr": 3.314660081560323e-05, "epoch": 2.422680412371134, "percentage": 34.61, "elapsed_time": "1:25:08", "remaining_time": "2:40:51"}
{"current_steps": 240, "total_steps": 679, "loss": 0.3791, "lr": 3.2754817541140185e-05, "epoch": 2.4742268041237114, "percentage": 35.35, "elapsed_time": "1:26:44", "remaining_time": "2:38:40"}
{"current_steps": 245, "total_steps": 679, "loss": 0.3866, "lr": 3.235460465315213e-05, "epoch": 2.5257731958762886, "percentage": 36.08, "elapsed_time": "1:28:26", "remaining_time": "2:36:39"}
{"current_steps": 250, "total_steps": 679, "loss": 0.3772, "lr": 3.194622665091258e-05, "epoch": 2.5773195876288657, "percentage": 36.82, "elapsed_time": "1:30:08", "remaining_time": "2:34:40"}
{"current_steps": 255, "total_steps": 679, "loss": 0.3809, "lr": 3.152995342999002e-05, "epoch": 2.6288659793814433, "percentage": 37.56, "elapsed_time": "1:31:51", "remaining_time": "2:32:44"}
{"current_steps": 260, "total_steps": 679, "loss": 0.3868, "lr": 3.110606010387483e-05, "epoch": 2.680412371134021, "percentage": 38.29, "elapsed_time": "1:33:39", "remaining_time": "2:30:56"}
{"current_steps": 265, "total_steps": 679, "loss": 0.3782, "lr": 3.067482682215783e-05, "epoch": 2.731958762886598, "percentage": 39.03, "elapsed_time": "1:35:12", "remaining_time": "2:28:43"}
{"current_steps": 270, "total_steps": 679, "loss": 0.3757, "lr": 3.023653858538037e-05, "epoch": 2.783505154639175, "percentage": 39.76, "elapsed_time": "1:36:50", "remaining_time": "2:26:41"}
{"current_steps": 275, "total_steps": 679, "loss": 0.3823, "lr": 2.9791485056678478e-05, "epoch": 2.8350515463917527, "percentage": 40.5, "elapsed_time": "1:38:29", "remaining_time": "2:24:40"}
{"current_steps": 280, "total_steps": 679, "loss": 0.3747, "lr": 2.933996037034556e-05, "epoch": 2.88659793814433, "percentage": 41.24, "elapsed_time": "1:40:10", "remaining_time": "2:22:45"}
{"current_steps": 285, "total_steps": 679, "loss": 0.3789, "lr": 2.8882262937440076e-05, "epoch": 2.9381443298969074, "percentage": 41.97, "elapsed_time": "1:41:49", "remaining_time": "2:20:45"}
{"current_steps": 290, "total_steps": 679, "loss": 0.3714, "lr": 2.8418695248566703e-05, "epoch": 2.9896907216494846, "percentage": 42.71, "elapsed_time": "1:43:23", "remaining_time": "2:18:41"}
{"current_steps": 295, "total_steps": 679, "loss": 0.3692, "lr": 2.7949563673961425e-05, "epoch": 3.0412371134020617, "percentage": 43.45, "elapsed_time": "1:44:59", "remaining_time": "2:16:40"}
{"current_steps": 300, "total_steps": 679, "loss": 0.3701, "lr": 2.7475178261012492e-05, "epoch": 3.0927835051546393, "percentage": 44.18, "elapsed_time": "1:46:37", "remaining_time": "2:14:42"}
{"current_steps": 305, "total_steps": 679, "loss": 0.3644, "lr": 2.6995852529351227e-05, "epoch": 3.1443298969072164, "percentage": 44.92, "elapsed_time": "1:48:22", "remaining_time": "2:12:53"}
{"current_steps": 310, "total_steps": 679, "loss": 0.3665, "lr": 2.6511903263648008e-05, "epoch": 3.195876288659794, "percentage": 45.66, "elapsed_time": "1:49:58", "remaining_time": "2:10:53"}
{"current_steps": 315, "total_steps": 679, "loss": 0.366, "lr": 2.602365030425041e-05, "epoch": 3.247422680412371, "percentage": 46.39, "elapsed_time": "1:51:35", "remaining_time": "2:08:57"}
{"current_steps": 320, "total_steps": 679, "loss": 0.3719, "lr": 2.553141633580185e-05, "epoch": 3.2989690721649483, "percentage": 47.13, "elapsed_time": "1:53:21", "remaining_time": "2:07:10"}
{"current_steps": 325, "total_steps": 679, "loss": 0.3645, "lr": 2.5035526673980463e-05, "epoch": 3.350515463917526, "percentage": 47.86, "elapsed_time": "1:54:59", "remaining_time": "2:05:15"}
{"current_steps": 330, "total_steps": 679, "loss": 0.3742, "lr": 2.453630905049913e-05, "epoch": 3.402061855670103, "percentage": 48.6, "elapsed_time": "1:56:39", "remaining_time": "2:03:22"}
{"current_steps": 335, "total_steps": 679, "loss": 0.3687, "lr": 2.4034093396508752e-05, "epoch": 3.4536082474226806, "percentage": 49.34, "elapsed_time": "1:58:20", "remaining_time": "2:01:30"}
{"current_steps": 340, "total_steps": 679, "loss": 0.3689, "lr": 2.3529211624547914e-05, "epoch": 3.5051546391752577, "percentage": 50.07, "elapsed_time": "2:00:05", "remaining_time": "1:59:44"}
{"current_steps": 345, "total_steps": 679, "loss": 0.3664, "lr": 2.3021997409183086e-05, "epoch": 3.556701030927835, "percentage": 50.81, "elapsed_time": "2:01:59", "remaining_time": "1:58:06"}
{"current_steps": 350, "total_steps": 679, "loss": 0.3688, "lr": 2.2512785966484286e-05, "epoch": 3.6082474226804124, "percentage": 51.55, "elapsed_time": "2:03:40", "remaining_time": "1:56:15"}
{"current_steps": 355, "total_steps": 679, "loss": 0.3704, "lr": 2.200191383248197e-05, "epoch": 3.6597938144329896, "percentage": 52.28, "elapsed_time": "2:05:29", "remaining_time": "1:54:32"}
{"current_steps": 360, "total_steps": 679, "loss": 0.3668, "lr": 2.148971864075156e-05, "epoch": 3.711340206185567, "percentage": 53.02, "elapsed_time": "2:07:06", "remaining_time": "1:52:38"}
{"current_steps": 365, "total_steps": 679, "loss": 0.3645, "lr": 2.0976538899272632e-05, "epoch": 3.7628865979381443, "percentage": 53.76, "elapsed_time": "2:08:38", "remaining_time": "1:50:40"}
{"current_steps": 370, "total_steps": 679, "loss": 0.3662, "lr": 2.0462713766710184e-05, "epoch": 3.8144329896907214, "percentage": 54.49, "elapsed_time": "2:10:12", "remaining_time": "1:48:44"}
{"current_steps": 375, "total_steps": 679, "loss": 0.3641, "lr": 1.994858282826588e-05, "epoch": 3.865979381443299, "percentage": 55.23, "elapsed_time": "2:11:54", "remaining_time": "1:46:56"}
{"current_steps": 380, "total_steps": 679, "loss": 0.3724, "lr": 1.9434485871247414e-05, "epoch": 3.917525773195876, "percentage": 55.96, "elapsed_time": "2:13:29", "remaining_time": "1:45:01"}
{"current_steps": 385, "total_steps": 679, "loss": 0.3626, "lr": 1.8920762660504254e-05, "epoch": 3.9690721649484537, "percentage": 56.7, "elapsed_time": "2:15:08", "remaining_time": "1:43:12"}
{"current_steps": 390, "total_steps": 679, "loss": 0.3648, "lr": 1.840775271387831e-05, "epoch": 4.020618556701031, "percentage": 57.44, "elapsed_time": "2:16:43", "remaining_time": "1:41:18"}
{"current_steps": 395, "total_steps": 679, "loss": 0.3571, "lr": 1.7895795077817766e-05, "epoch": 4.072164948453608, "percentage": 58.17, "elapsed_time": "2:18:27", "remaining_time": "1:39:32"}
{"current_steps": 400, "total_steps": 679, "loss": 0.363, "lr": 1.7385228103302486e-05, "epoch": 4.123711340206185, "percentage": 58.91, "elapsed_time": "2:20:05", "remaining_time": "1:37:42"}
{"current_steps": 405, "total_steps": 679, "loss": 0.3653, "lr": 1.6876389222229095e-05, "epoch": 4.175257731958763, "percentage": 59.65, "elapsed_time": "2:21:46", "remaining_time": "1:35:55"}
{"current_steps": 410, "total_steps": 679, "loss": 0.3585, "lr": 1.6369614724403374e-05, "epoch": 4.22680412371134, "percentage": 60.38, "elapsed_time": "2:23:18", "remaining_time": "1:34:01"}
{"current_steps": 415, "total_steps": 679, "loss": 0.3584, "lr": 1.5865239535287603e-05, "epoch": 4.278350515463917, "percentage": 61.12, "elapsed_time": "2:24:53", "remaining_time": "1:32:10"}
{"current_steps": 420, "total_steps": 679, "loss": 0.3616, "lr": 1.5363596994649433e-05, "epoch": 4.329896907216495, "percentage": 61.86, "elapsed_time": "2:26:32", "remaining_time": "1:30:22"}
{"current_steps": 425, "total_steps": 679, "loss": 0.3599, "lr": 1.4865018636258902e-05, "epoch": 4.381443298969073, "percentage": 62.59, "elapsed_time": "2:28:13", "remaining_time": "1:28:35"}
{"current_steps": 430, "total_steps": 679, "loss": 0.3585, "lr": 1.4369833968778868e-05, "epoch": 4.43298969072165, "percentage": 63.33, "elapsed_time": "2:29:53", "remaining_time": "1:26:47"}
{"current_steps": 435, "total_steps": 679, "loss": 0.3572, "lr": 1.3878370257993954e-05, "epoch": 4.484536082474227, "percentage": 64.06, "elapsed_time": "2:31:29", "remaining_time": "1:24:58"}
{"current_steps": 440, "total_steps": 679, "loss": 0.3623, "lr": 1.3390952310521765e-05, "epoch": 4.536082474226804, "percentage": 64.8, "elapsed_time": "2:33:08", "remaining_time": "1:23:11"}
{"current_steps": 445, "total_steps": 679, "loss": 0.3655, "lr": 1.2907902259149287e-05, "epoch": 4.587628865979381, "percentage": 65.54, "elapsed_time": "2:34:52", "remaining_time": "1:21:26"}
{"current_steps": 450, "total_steps": 679, "loss": 0.3624, "lr": 1.2429539349936567e-05, "epoch": 4.639175257731958, "percentage": 66.27, "elapsed_time": "2:36:34", "remaining_time": "1:19:40"}
{"current_steps": 455, "total_steps": 679, "loss": 0.356, "lr": 1.1956179731228033e-05, "epoch": 4.690721649484536, "percentage": 67.01, "elapsed_time": "2:38:11", "remaining_time": "1:17:52"}
{"current_steps": 460, "total_steps": 679, "loss": 0.3608, "lr": 1.1488136244711254e-05, "epoch": 4.742268041237113, "percentage": 67.75, "elapsed_time": "2:39:41", "remaining_time": "1:16:01"}
{"current_steps": 465, "total_steps": 679, "loss": 0.3555, "lr": 1.1025718218660947e-05, "epoch": 4.793814432989691, "percentage": 68.48, "elapsed_time": "2:41:17", "remaining_time": "1:14:13"}
{"current_steps": 470, "total_steps": 679, "loss": 0.3538, "lr": 1.056923126350499e-05, "epoch": 4.845360824742268, "percentage": 69.22, "elapsed_time": "2:42:47", "remaining_time": "1:12:23"}
{"current_steps": 475, "total_steps": 679, "loss": 0.3563, "lr": 1.0118977069847661e-05, "epoch": 4.896907216494846, "percentage": 69.96, "elapsed_time": "2:44:29", "remaining_time": "1:10:38"}
{"current_steps": 480, "total_steps": 679, "loss": 0.3625, "lr": 9.675253209083315e-06, "epoch": 4.948453608247423, "percentage": 70.69, "elapsed_time": "2:46:10", "remaining_time": "1:08:53"}
{"current_steps": 485, "total_steps": 679, "loss": 0.3567, "lr": 9.238352936732549e-06, "epoch": 5.0, "percentage": 71.43, "elapsed_time": "2:47:44", "remaining_time": "1:07:05"}
{"current_steps": 490, "total_steps": 679, "loss": 0.3568, "lr": 8.808564998630639e-06, "epoch": 5.051546391752577, "percentage": 72.16, "elapsed_time": "2:49:35", "remaining_time": "1:05:24"}
{"current_steps": 495, "total_steps": 679, "loss": 0.3537, "lr": 8.386173440096499e-06, "epoch": 5.103092783505154, "percentage": 72.9, "elapsed_time": "2:51:05", "remaining_time": "1:03:35"}
{"current_steps": 500, "total_steps": 679, "loss": 0.3592, "lr": 7.971457418208081e-06, "epoch": 5.154639175257732, "percentage": 73.64, "elapsed_time": "2:52:43", "remaining_time": "1:01:49"}
{"current_steps": 505, "total_steps": 679, "loss": 0.355, "lr": 7.56469101730847e-06, "epoch": 5.206185567010309, "percentage": 74.37, "elapsed_time": "2:54:27", "remaining_time": "1:00:06"}
{"current_steps": 510, "total_steps": 679, "loss": 0.3535, "lr": 7.1661430678645635e-06, "epoch": 5.257731958762887, "percentage": 75.11, "elapsed_time": "2:55:59", "remaining_time": "0:58:19"}
{"current_steps": 515, "total_steps": 679, "loss": 0.3516, "lr": 6.776076968797906e-06, "epoch": 5.309278350515464, "percentage": 75.85, "elapsed_time": "2:57:38", "remaining_time": "0:56:34"}
{"current_steps": 520, "total_steps": 679, "loss": 0.3524, "lr": 6.394750513405336e-06, "epoch": 5.360824742268041, "percentage": 76.58, "elapsed_time": "2:59:14", "remaining_time": "0:54:48"}
{"current_steps": 525, "total_steps": 679, "loss": 0.3514, "lr": 6.022415718984262e-06, "epoch": 5.412371134020619, "percentage": 77.32, "elapsed_time": "3:00:52", "remaining_time": "0:53:03"}
{"current_steps": 530, "total_steps": 679, "loss": 0.3573, "lr": 5.659318660275386e-06, "epoch": 5.463917525773196, "percentage": 78.06, "elapsed_time": "3:02:36", "remaining_time": "0:51:20"}
{"current_steps": 535, "total_steps": 679, "loss": 0.3525, "lr": 5.305699306832708e-06, "epoch": 5.515463917525773, "percentage": 78.79, "elapsed_time": "3:04:05", "remaining_time": "0:49:33"}
{"current_steps": 540, "total_steps": 679, "loss": 0.3554, "lr": 4.961791364428572e-06, "epoch": 5.56701030927835, "percentage": 79.53, "elapsed_time": "3:05:42", "remaining_time": "0:47:48"}
{"current_steps": 545, "total_steps": 679, "loss": 0.3511, "lr": 4.627822120598327e-06, "epoch": 5.618556701030927, "percentage": 80.27, "elapsed_time": "3:07:20", "remaining_time": "0:46:03"}
{"current_steps": 550, "total_steps": 679, "loss": 0.3533, "lr": 4.304012294426781e-06, "epoch": 5.670103092783505, "percentage": 81.0, "elapsed_time": "3:08:59", "remaining_time": "0:44:19"}
{"current_steps": 555, "total_steps": 679, "loss": 0.3531, "lr": 3.990575890675787e-06, "epoch": 5.721649484536083, "percentage": 81.74, "elapsed_time": "3:10:34", "remaining_time": "0:42:34"}
{"current_steps": 560, "total_steps": 679, "loss": 0.3555, "lr": 3.6877200583492202e-06, "epoch": 5.77319587628866, "percentage": 82.47, "elapsed_time": "3:12:10", "remaining_time": "0:40:50"}
{"current_steps": 565, "total_steps": 679, "loss": 0.3577, "lr": 3.3956449537889545e-06, "epoch": 5.824742268041237, "percentage": 83.21, "elapsed_time": "3:13:51", "remaining_time": "0:39:06"}
{"current_steps": 570, "total_steps": 679, "loss": 0.3563, "lr": 3.114543608392242e-06, "epoch": 5.876288659793815, "percentage": 83.95, "elapsed_time": "3:15:38", "remaining_time": "0:37:24"}
{"current_steps": 575, "total_steps": 679, "loss": 0.3572, "lr": 2.8446018010379584e-06, "epoch": 5.927835051546392, "percentage": 84.68, "elapsed_time": "3:17:23", "remaining_time": "0:35:42"}
{"current_steps": 580, "total_steps": 679, "loss": 0.3611, "lr": 2.585997935306004e-06, "epoch": 5.979381443298969, "percentage": 85.42, "elapsed_time": "3:19:10", "remaining_time": "0:33:59"}
{"current_steps": 585, "total_steps": 679, "loss": 0.3577, "lr": 2.3389029215709935e-06, "epoch": 6.030927835051546, "percentage": 86.16, "elapsed_time": "3:20:42", "remaining_time": "0:32:15"}
{"current_steps": 590, "total_steps": 679, "loss": 0.3508, "lr": 2.1034800640482266e-06, "epoch": 6.082474226804123, "percentage": 86.89, "elapsed_time": "3:22:10", "remaining_time": "0:30:29"}
{"current_steps": 595, "total_steps": 679, "loss": 0.3502, "lr": 1.8798849528664864e-06, "epoch": 6.134020618556701, "percentage": 87.63, "elapsed_time": "3:23:35", "remaining_time": "0:28:44"}
{"current_steps": 600, "total_steps": 679, "loss": 0.36, "lr": 1.668265361239092e-06, "epoch": 6.185567010309279, "percentage": 88.37, "elapsed_time": "3:25:25", "remaining_time": "0:27:02"}
{"current_steps": 605, "total_steps": 679, "loss": 0.358, "lr": 1.4687611478010943e-06, "epoch": 6.237113402061856, "percentage": 89.1, "elapsed_time": "3:27:15", "remaining_time": "0:25:20"}
{"current_steps": 610, "total_steps": 679, "loss": 0.3541, "lr": 1.281504164177232e-06, "epoch": 6.288659793814433, "percentage": 89.84, "elapsed_time": "3:28:57", "remaining_time": "0:23:38"}
{"current_steps": 615, "total_steps": 679, "loss": 0.3555, "lr": 1.1066181678416266e-06, "epoch": 6.34020618556701, "percentage": 90.57, "elapsed_time": "3:30:30", "remaining_time": "0:21:54"}
{"current_steps": 620, "total_steps": 679, "loss": 0.3495, "lr": 9.442187403269187e-07, "epoch": 6.391752577319588, "percentage": 91.31, "elapsed_time": "3:32:13", "remaining_time": "0:20:11"}
{"current_steps": 625, "total_steps": 679, "loss": 0.3638, "lr": 7.94413210836864e-07, "epoch": 6.443298969072165, "percentage": 92.05, "elapsed_time": "3:33:53", "remaining_time": "0:18:28"}
{"current_steps": 630, "total_steps": 679, "loss": 0.3461, "lr": 6.573005853128145e-07, "epoch": 6.494845360824742, "percentage": 92.78, "elapsed_time": "3:35:31", "remaining_time": "0:16:45"}
{"current_steps": 635, "total_steps": 679, "loss": 0.3555, "lr": 5.3297148100107e-07, "epoch": 6.546391752577319, "percentage": 93.52, "elapsed_time": "3:37:18", "remaining_time": "0:15:03"}
{"current_steps": 640, "total_steps": 679, "loss": 0.3533, "lr": 4.2150806656424014e-07, "epoch": 6.597938144329897, "percentage": 94.26, "elapsed_time": "3:39:01", "remaining_time": "0:13:20"}
{"current_steps": 645, "total_steps": 679, "loss": 0.3534, "lr": 3.2298400777629027e-07, "epoch": 6.649484536082475, "percentage": 94.99, "elapsed_time": "3:40:49", "remaining_time": "0:11:38"}
{"current_steps": 650, "total_steps": 679, "loss": 0.352, "lr": 2.3746441883707006e-07, "epoch": 6.701030927835052, "percentage": 95.73, "elapsed_time": "3:42:19", "remaining_time": "0:09:55"}
{"current_steps": 655, "total_steps": 679, "loss": 0.3499, "lr": 1.6500581933859284e-07, "epoch": 6.752577319587629, "percentage": 96.47, "elapsed_time": "3:43:54", "remaining_time": "0:08:12"}
{"current_steps": 660, "total_steps": 679, "loss": 0.3521, "lr": 1.0565609691142398e-07, "epoch": 6.804123711340206, "percentage": 97.2, "elapsed_time": "3:45:38", "remaining_time": "0:06:29"}
{"current_steps": 665, "total_steps": 679, "loss": 0.3479, "lr": 5.9454475575917434e-08, "epoch": 6.855670103092783, "percentage": 97.94, "elapsed_time": "3:47:15", "remaining_time": "0:04:47"}
{"current_steps": 670, "total_steps": 679, "loss": 0.3466, "lr": 2.6431489819207512e-08, "epoch": 6.907216494845361, "percentage": 98.67, "elapsed_time": "3:48:51", "remaining_time": "0:03:04"}
{"current_steps": 675, "total_steps": 679, "loss": 0.3576, "lr": 6.608964415066865e-09, "epoch": 6.958762886597938, "percentage": 99.41, "elapsed_time": "3:50:25", "remaining_time": "0:01:21"}
{"current_steps": 679, "total_steps": 679, "epoch": 7.0, "percentage": 100.0, "elapsed_time": "3:51:49", "remaining_time": "0:00:00"}
{"current_steps": 679, "total_steps": 679, "epoch": 7.0, "percentage": 100.0, "elapsed_time": "0:00:00", "remaining_time": "0:00:00"}
{"current_steps": 679, "total_steps": 679, "epoch": 7.0, "percentage": 100.0, "elapsed_time": "0:00:00", "remaining_time": "0:00:00"}
{"current_steps": 679, "total_steps": 679, "epoch": 7.0, "percentage": 100.0, "elapsed_time": "0:00:00", "remaining_time": "0:00:00"}
{"current_steps": 679, "total_steps": 679, "epoch": 7.0, "percentage": 100.0, "elapsed_time": "0:00:00", "remaining_time": "0:00:00"}
{"current_steps": 679, "total_steps": 679, "epoch": 7.0, "percentage": 100.0, "elapsed_time": "0:00:00", "remaining_time": "0:00:00"}

1528
trainer_state.json Normal file

File diff suppressed because it is too large

3
training_args.bin Normal file
View File

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:b00bf2b3a43870aade158aed1eb1859b059b5c47e4e55f2922be3b504d423465
size 8721

BIN
training_loss.png Normal file

Binary file not shown.

Size: 37 KiB

1
vocab.json Normal file

File diff suppressed because one or more lines are too long
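
The JSONL log entries above can be replayed offline. A minimal sketch, assuming the log file is saved locally as trainer_log.jsonl and matplotlib is available, that reconstructs a loss curve similar to training_loss.png (the output name training_loss_replot.png is arbitrary):

import json
import matplotlib.pyplot as plt

steps, losses = [], []
with open("trainer_log.jsonl", encoding="utf-8") as fh:
    for line in fh:
        line = line.strip()
        if not line:
            continue
        record = json.loads(line)
        # Final summary entries carry no "loss" key; skip them.
        if "loss" in record:
            steps.append(record["current_steps"])
            losses.append(record["loss"])

plt.plot(steps, losses)
plt.xlabel("step")
plt.ylabel("training loss")
plt.title("nemotron-data-processing__Qwen3-8B")
plt.savefig("training_loss_replot.png", dpi=120)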