Initialize project; model provided by the ModelHub XC community

Model: laion/nemotron-terminal-dependency_management__Qwen3-8B
Source: Original Platform
ModelHub XC
2026-04-23 16:38:11 +08:00
commit 6b631eb202
23 changed files with 154208 additions and 0 deletions

.gitattributes vendored Normal file (36 lines)

@@ -0,0 +1,36 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
tokenizer.json filter=lfs diff=lfs merge=lfs -text

README.md Normal file (61 lines)

@@ -0,0 +1,61 @@
---
library_name: transformers
license: other
base_model: Qwen/Qwen3-8B
tags:
- llama-factory
- full
- generated_from_trainer
model-index:
- name: nemotron-dependency-management__Qwen3-8B
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# nemotron-dependency-management__Qwen3-8B

This model is a fine-tuned version of [Qwen/Qwen3-8B](https://huggingface.co/Qwen/Qwen3-8B) on the /e/data1/datasets/playground/ot/hf_hub/datasets--laion--nemotron-terminal-dependency_management/snapshots/e9ee8860828fcf38e70569d00375ff8427186562_thinking_preprocessed dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 4e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 32
- gradient_accumulation_steps: 3
- total_train_batch_size: 96
- total_eval_batch_size: 256
- optimizer: AdamW (ADAMW_TORCH_FUSED) with betas=(0.9, 0.98), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 7.0
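
These values map onto Hugging Face `TrainingArguments` roughly as in the sketch below; the output directory and the 32-GPU launch (which turns 1 × 32 × 3 into the total train batch size of 96) are assumptions, not the exact launch configuration.

```python
# Minimal sketch only; paths and the distributed launcher are assumptions.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="outputs/nemotron-dependency-management__Qwen3-8B",  # hypothetical
    learning_rate=4e-5,
    per_device_train_batch_size=1,   # train_batch_size
    per_device_eval_batch_size=8,    # eval_batch_size
    gradient_accumulation_steps=3,   # 1 sample x 32 GPUs x 3 steps = 96 total
    seed=42,
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    num_train_epochs=7.0,
    bf16=True,  # matches "dtype": "bfloat16" in config.json
)
```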
### Training results

### Framework versions

- Transformers 4.57.6
- Pytorch 2.9.1+cu130
- Datasets 4.7.0
- Tokenizers 0.22.2

added_tokens.json Normal file (28 lines)

@@ -0,0 +1,28 @@
{
"</think>": 151668,
"</tool_call>": 151658,
"</tool_response>": 151666,
"<think>": 151667,
"<tool_call>": 151657,
"<tool_response>": 151665,
"<|box_end|>": 151649,
"<|box_start|>": 151648,
"<|endoftext|>": 151643,
"<|file_sep|>": 151664,
"<|fim_middle|>": 151660,
"<|fim_pad|>": 151662,
"<|fim_prefix|>": 151659,
"<|fim_suffix|>": 151661,
"<|im_end|>": 151645,
"<|im_start|>": 151644,
"<|image_pad|>": 151655,
"<|object_ref_end|>": 151647,
"<|object_ref_start|>": 151646,
"<|quad_end|>": 151651,
"<|quad_start|>": 151650,
"<|repo_name|>": 151663,
"<|video_pad|>": 151656,
"<|vision_end|>": 151653,
"<|vision_pad|>": 151654,
"<|vision_start|>": 151652
}
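
As a quick sanity check, the IDs above can be verified against the shipped tokenizer; a minimal sketch, assuming the repo is reachable (or already downloaded locally):

```python
# Sketch: round-trip a few of the added tokens through the tokenizer.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("laion/nemotron-terminal-dependency_management__Qwen3-8B")
assert tok.convert_tokens_to_ids("<think>") == 151667
assert tok.convert_tokens_to_ids("</think>") == 151668
assert tok.convert_tokens_to_ids("<|im_end|>") == 151645  # also the eos token id in config.json
```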

all_results.json Normal file (16 lines)

@@ -0,0 +1,16 @@
{
"achieved_tflops_per_gpu": 101767.32661666165,
"achieved_tflops_per_gpu_theoretical": 2989585.235467485,
"epoch": 7.0,
"loss_nan_ranks": 0,
"loss_rank_avg": 0.2247583568096161,
"mfu_percent": 7192.037216725205,
"mfu_percent_theoretical": 211278.1085136032,
"total_flos": 2.6866574226798674e+18,
"train_loss": 0.0,
"train_runtime": 0.825,
"train_samples_per_second": 84297.403,
"train_steps_per_second": 882.429,
"valid_targets_mean": 7573.6,
"valid_targets_min": 2570
}

chat_template.jinja Normal file (89 lines)

@@ -0,0 +1,89 @@
{%- if tools %}
{{- '<|im_start|>system\n' }}
{%- if messages[0].role == 'system' %}
{{- messages[0].content + '\n\n' }}
{%- endif %}
{{- "# Tools\n\nYou may call one or more functions to assist with the user query.\n\nYou are provided with function signatures within <tools></tools> XML tags:\n<tools>" }}
{%- for tool in tools %}
{{- "\n" }}
{{- tool | tojson }}
{%- endfor %}
{{- "\n</tools>\n\nFor each function call, return a json object with function name and arguments within <tool_call></tool_call> XML tags:\n<tool_call>\n{\"name\": <function-name>, \"arguments\": <args-json-object>}\n</tool_call><|im_end|>\n" }}
{%- else %}
{%- if messages[0].role == 'system' %}
{{- '<|im_start|>system\n' + messages[0].content + '<|im_end|>\n' }}
{%- endif %}
{%- endif %}
{%- set ns = namespace(multi_step_tool=true, last_query_index=messages|length - 1) %}
{%- for message in messages[::-1] %}
{%- set index = (messages|length - 1) - loop.index0 %}
{%- if ns.multi_step_tool and message.role == "user" and message.content is string and not(message.content.startswith('<tool_response>') and message.content.endswith('</tool_response>')) %}
{%- set ns.multi_step_tool = false %}
{%- set ns.last_query_index = index %}
{%- endif %}
{%- endfor %}
{%- for message in messages %}
{%- if message.content is string %}
{%- set content = message.content %}
{%- else %}
{%- set content = '' %}
{%- endif %}
{%- if (message.role == "user") or (message.role == "system" and not loop.first) %}
{{- '<|im_start|>' + message.role + '\n' + content + '<|im_end|>' + '\n' }}
{%- elif message.role == "assistant" %}
{%- set reasoning_content = '' %}
{%- if message.reasoning_content is string %}
{%- set reasoning_content = message.reasoning_content %}
{%- else %}
{%- if '</think>' in content %}
{%- set reasoning_content = content.split('</think>')[0].rstrip('\n').split('<think>')[-1].lstrip('\n') %}
{%- set content = content.split('</think>')[-1].lstrip('\n') %}
{%- endif %}
{%- endif %}
{%- if loop.index0 > ns.last_query_index %}
{%- if loop.last or (not loop.last and reasoning_content) %}
{{- '<|im_start|>' + message.role + '\n<think>\n' + reasoning_content.strip('\n') + '\n</think>\n\n' + content.lstrip('\n') }}
{%- else %}
{{- '<|im_start|>' + message.role + '\n' + content }}
{%- endif %}
{%- else %}
{{- '<|im_start|>' + message.role + '\n' + content }}
{%- endif %}
{%- if message.tool_calls %}
{%- for tool_call in message.tool_calls %}
{%- if (loop.first and content) or (not loop.first) %}
{{- '\n' }}
{%- endif %}
{%- if tool_call.function %}
{%- set tool_call = tool_call.function %}
{%- endif %}
{{- '<tool_call>\n{"name": "' }}
{{- tool_call.name }}
{{- '", "arguments": ' }}
{%- if tool_call.arguments is string %}
{{- tool_call.arguments }}
{%- else %}
{{- tool_call.arguments | tojson }}
{%- endif %}
{{- '}\n</tool_call>' }}
{%- endfor %}
{%- endif %}
{{- '<|im_end|>\n' }}
{%- elif message.role == "tool" %}
{%- if loop.first or (messages[loop.index0 - 1].role != "tool") %}
{{- '<|im_start|>user' }}
{%- endif %}
{{- '\n<tool_response>\n' }}
{{- content }}
{{- '\n</tool_response>' }}
{%- if loop.last or (messages[loop.index0 + 1].role != "tool") %}
{{- '<|im_end|>\n' }}
{%- endif %}
{%- endif %}
{%- endfor %}
{%- if add_generation_prompt %}
{{- '<|im_start|>assistant\n' }}
{%- if enable_thinking is defined and enable_thinking is false %}
{{- '<think>\n\n</think>\n\n' }}
{%- endif %}
{%- endif %}
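
This template is normally rendered through `apply_chat_template`; a minimal sketch, where the messages are illustrative and `enable_thinking` is forwarded into the template's final branch:

```python
# Sketch: render the chat template above for one user turn.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("laion/nemotron-terminal-dependency_management__Qwen3-8B")
messages = [
    {"role": "system", "content": "You are a helpful terminal assistant."},
    {"role": "user", "content": "Pin numpy below 2.0 in requirements.txt."},
]
prompt = tok.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,  # appends "<|im_start|>assistant\n"
    enable_thinking=False,       # triggers the empty "<think>\n\n</think>" block
)
print(prompt)
```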

config.json Normal file (68 lines)

@@ -0,0 +1,68 @@
{
"architectures": [
"Qwen3ForCausalLM"
],
"attention_bias": false,
"attention_dropout": 0.0,
"dtype": "bfloat16",
"eos_token_id": 151645,
"head_dim": 128,
"hidden_act": "silu",
"hidden_size": 4096,
"initializer_range": 0.02,
"intermediate_size": 12288,
"layer_types": [
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention"
],
"max_position_embeddings": 40960,
"max_window_layers": 36,
"model_type": "qwen3",
"num_attention_heads": 32,
"num_hidden_layers": 36,
"num_key_value_heads": 8,
"pad_token_id": 151643,
"rms_norm_eps": 1e-06,
"rope_scaling": null,
"rope_theta": 1000000,
"sliding_window": null,
"tie_word_embeddings": false,
"transformers_version": "4.57.6",
"use_cache": false,
"use_sliding_window": false,
"vocab_size": 151936
}
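
A few shapes follow directly from this config: with 32 query heads, 8 KV heads, and an explicit head_dim of 128, attention uses 4-way grouped-query sharing, and the per-token KV cache stays small. A sketch of the arithmetic (no model download needed):

```python
# Sketch: quantities implied by config.json above.
hidden_size = 4096
num_attention_heads = 32
num_key_value_heads = 8
head_dim = 128
num_hidden_layers = 36

q_out = num_attention_heads * head_dim   # 4096 -> q_proj weight is (4096, 4096)
kv_out = num_key_value_heads * head_dim  # 1024 -> k_proj/v_proj are (1024, 4096)
gqa_group = num_attention_heads // num_key_value_heads  # 4 query heads per KV head

# Per-token KV cache in bfloat16 (2 bytes), K and V, across all layers:
kv_bytes_per_token = 2 * 2 * kv_out * num_hidden_layers  # 147,456 B = 144 KiB
print(q_out, kv_out, gqa_group, kv_bytes_per_token)
```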

generation_config.json Normal file (12 lines)

@@ -0,0 +1,12 @@
{
"do_sample": true,
"eos_token_id": [
151645,
151643
],
"pad_token_id": 151643,
"temperature": 0.6,
"top_k": 20,
"top_p": 0.95,
"transformers_version": "4.57.6"
}
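
`generate()` picks these defaults up automatically from the repo; a sketch that passes them explicitly (the prompt and `max_new_tokens` are placeholders):

```python
# Sketch: sample with the decoding defaults from generation_config.json.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "laion/nemotron-terminal-dependency_management__Qwen3-8B"
tok = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(repo, torch_dtype=torch.bfloat16, device_map="auto")

inputs = tok("How do I resolve a pip dependency conflict?", return_tensors="pt").to(model.device)
out = model.generate(
    **inputs,
    do_sample=True,
    temperature=0.6,
    top_k=20,
    top_p=0.95,
    max_new_tokens=256,  # placeholder; not part of generation_config.json
)
print(tok.decode(out[0], skip_special_tokens=True))
```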

merges.txt Normal file (151388 lines)

File diff suppressed because it is too large

model-00001-of-00004.safetensors (LFS) Normal file (3 lines)

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:8be988847f8464000d60d9247d4047f0e96c0fbb6e3be08a5fb5be05a35e4b88
size 4902257696

model-00002-of-00004.safetensors (LFS) Normal file (3 lines)

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:c468ce4676f171900109add05f07013eb63b014f8b466a34d2e4efdf5d1d2763
size 4915960368

model-00003-of-00004.safetensors (LFS) Normal file (3 lines)

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:b424296889090c9b0403d47cead791bdeca5a4cb7a7ca70a201fbf5cffa4e7e9
size 4983068496

model-00004-of-00004.safetensors (LFS) Normal file (3 lines)

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d93becd66ae2dfd72a1753767febfd3c312ba37ee9ec041106c037bc26054e01
size 1580230264

model.safetensors.index.json Normal file (407 lines)

@@ -0,0 +1,407 @@
{
"metadata": {
"total_parameters": 308224,
"total_size": 16381470720
},
"weight_map": {
"lm_head.weight": "model-00004-of-00004.safetensors",
"model.embed_tokens.weight": "model-00001-of-00004.safetensors",
"model.layers.0.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.0.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.0.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.0.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.0.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.0.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.0.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.0.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.0.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.0.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.0.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.1.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.1.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.1.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.1.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.10.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.10.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.10.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.10.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.10.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.10.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.10.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.10.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.10.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.10.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.10.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.11.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.11.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.11.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.11.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.12.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.12.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.12.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.12.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.13.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.13.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.13.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.13.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.14.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.14.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.14.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.14.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.15.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.15.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.15.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.15.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.16.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.16.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.16.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.16.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.17.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.17.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.17.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.17.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.18.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.18.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.18.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.18.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.19.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.19.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.19.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.19.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.2.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.2.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.2.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.2.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.2.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.2.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.2.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.2.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.2.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.2.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.2.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.20.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.20.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.20.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.20.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.20.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.20.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.20.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.20.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.20.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.20.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.20.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.21.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.21.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.21.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.21.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.22.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.22.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.22.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.22.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.22.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.22.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.22.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.22.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.22.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.22.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.22.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.23.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.23.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.23.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.23.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.23.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.23.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.23.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.23.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.23.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.23.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.23.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.24.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.24.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.24.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.24.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.25.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.25.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.25.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.25.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.26.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.26.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.26.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.26.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.27.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.27.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.27.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.27.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.28.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.28.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.28.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.28.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.29.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.29.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.29.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.29.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.3.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.3.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.3.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.3.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.3.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.3.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.3.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.3.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.3.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.3.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.3.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.30.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.30.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.30.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.30.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.30.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.30.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.30.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.30.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.30.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.30.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.30.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.31.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.31.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.31.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.31.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.32.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.32.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.32.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.32.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.33.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.33.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.33.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.33.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.34.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.34.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.34.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.34.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.35.input_layernorm.weight": "model-00004-of-00004.safetensors",
"model.layers.35.mlp.down_proj.weight": "model-00004-of-00004.safetensors",
"model.layers.35.mlp.gate_proj.weight": "model-00004-of-00004.safetensors",
"model.layers.35.mlp.up_proj.weight": "model-00004-of-00004.safetensors",
"model.layers.35.post_attention_layernorm.weight": "model-00004-of-00004.safetensors",
"model.layers.35.self_attn.k_norm.weight": "model-00004-of-00004.safetensors",
"model.layers.35.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.35.self_attn.o_proj.weight": "model-00004-of-00004.safetensors",
"model.layers.35.self_attn.q_norm.weight": "model-00004-of-00004.safetensors",
"model.layers.35.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.35.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.4.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.4.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.4.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.4.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.4.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.4.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.4.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.4.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.4.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.4.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.4.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.5.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.5.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.5.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.5.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.6.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.6.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.6.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.6.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.7.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.7.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.7.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.7.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.8.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.8.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.8.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.8.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.9.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.9.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.9.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.9.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.9.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.9.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.9.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.9.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.9.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.9.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.9.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.norm.weight": "model-00004-of-00004.safetensors"
}
}
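
The weight_map above is what lets a loader open only the shard that holds a given tensor; a minimal sketch with the `safetensors` library, assuming the four shards and the index sit in the current directory:

```python
# Sketch: resolve one tensor through model.safetensors.index.json.
import json
from safetensors import safe_open

with open("model.safetensors.index.json") as fp:
    index = json.load(fp)

name = "model.layers.0.self_attn.q_proj.weight"
shard = index["weight_map"][name]      # -> "model-00001-of-00004.safetensors"
with safe_open(shard, framework="pt") as f:
    tensor = f.get_tensor(name)
print(tensor.shape)                    # expect (4096, 4096) for q_proj
```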

run_summary.json Normal file (12 lines)

@@ -0,0 +1,12 @@
{
"agent_name": "e9ee8860828fcf38e70569d00375ff8427186562_thinking_preprocessed",
"training_start": null,
"training_end": null,
"created_by": "DCAgent",
"base_model_name": "Qwen/Qwen3-8B",
"dataset_name": "/e/data1/datasets/playground/ot/hf_hub/datasets--laion--nemotron-terminal-dependency_management/snapshots/e9ee8860828fcf38e70569d00375ff8427186562_thinking_preprocessed",
"training_type": "SFT",
"training_parameters": "https://huggingface.co/laion/nemotron-terminal-dependency_management__Qwen3-8B/blob/main/config.json",
"wandb_link": null,
"traces_location_s3": null
}

special_tokens_map.json Normal file (31 lines)

@@ -0,0 +1,31 @@
{
"additional_special_tokens": [
"<|im_start|>",
"<|im_end|>",
"<|object_ref_start|>",
"<|object_ref_end|>",
"<|box_start|>",
"<|box_end|>",
"<|quad_start|>",
"<|quad_end|>",
"<|vision_start|>",
"<|vision_end|>",
"<|vision_pad|>",
"<|image_pad|>",
"<|video_pad|>"
],
"eos_token": {
"content": "<|im_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false
},
"pad_token": {
"content": "<|endoftext|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false
}
}

tokenizer.json Normal file (3 lines)

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:aeb13307a71acd8fe81861d94ad54ab689df773318809eed3cbe794b4492dae4
size 11422654

tokenizer_config.json Normal file (240 lines)

@@ -0,0 +1,240 @@
{
"add_bos_token": false,
"add_prefix_space": false,
"added_tokens_decoder": {
"151643": {
"content": "<|endoftext|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151644": {
"content": "<|im_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151645": {
"content": "<|im_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151646": {
"content": "<|object_ref_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151647": {
"content": "<|object_ref_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151648": {
"content": "<|box_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151649": {
"content": "<|box_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151650": {
"content": "<|quad_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151651": {
"content": "<|quad_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151652": {
"content": "<|vision_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151653": {
"content": "<|vision_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151654": {
"content": "<|vision_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151655": {
"content": "<|image_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151656": {
"content": "<|video_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151657": {
"content": "<tool_call>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151658": {
"content": "</tool_call>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151659": {
"content": "<|fim_prefix|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151660": {
"content": "<|fim_middle|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151661": {
"content": "<|fim_suffix|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151662": {
"content": "<|fim_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151663": {
"content": "<|repo_name|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151664": {
"content": "<|file_sep|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151665": {
"content": "<tool_response>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151666": {
"content": "</tool_response>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151667": {
"content": "<think>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151668": {
"content": "</think>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
}
},
"additional_special_tokens": [
"<|im_start|>",
"<|im_end|>",
"<|object_ref_start|>",
"<|object_ref_end|>",
"<|box_start|>",
"<|box_end|>",
"<|quad_start|>",
"<|quad_end|>",
"<|vision_start|>",
"<|vision_end|>",
"<|vision_pad|>",
"<|image_pad|>",
"<|video_pad|>"
],
"bos_token": null,
"clean_up_tokenization_spaces": false,
"eos_token": "<|im_end|>",
"errors": "replace",
"extra_special_tokens": {},
"model_max_length": 32768,
"pad_token": "<|endoftext|>",
"padding_side": "right",
"split_special_tokens": false,
"tokenizer_class": "Qwen2Tokenizer",
"unk_token": null
}

train_results.json Normal file (12 lines)

@@ -0,0 +1,12 @@
{
"achieved_tflops_per_gpu": 101767.32661666165,
"achieved_tflops_per_gpu_theoretical": 2989585.235467485,
"epoch": 7.0,
"mfu_percent": 7192.037216725205,
"mfu_percent_theoretical": 211278.1085136032,
"total_flos": 2.6866574226798674e+18,
"train_loss": 0.0,
"train_runtime": 0.825,
"train_samples_per_second": 84297.403,
"train_steps_per_second": 882.429
}

trainer_log.jsonl Normal file (151 lines)

@@ -0,0 +1,151 @@
{"current_steps": 5, "total_steps": 728, "loss": 1.0168, "lr": 2.191780821917808e-06, "epoch": 0.04823151125401929, "percentage": 0.69, "elapsed_time": "0:02:07", "remaining_time": "5:07:44"}
{"current_steps": 10, "total_steps": 728, "loss": 0.9698, "lr": 4.931506849315069e-06, "epoch": 0.09646302250803858, "percentage": 1.37, "elapsed_time": "0:04:15", "remaining_time": "5:05:26"}
{"current_steps": 15, "total_steps": 728, "loss": 0.8828, "lr": 7.671232876712329e-06, "epoch": 0.14469453376205788, "percentage": 2.06, "elapsed_time": "0:06:04", "remaining_time": "4:49:08"}
{"current_steps": 20, "total_steps": 728, "loss": 0.8296, "lr": 1.0410958904109589e-05, "epoch": 0.19292604501607716, "percentage": 2.75, "elapsed_time": "0:08:00", "remaining_time": "4:43:22"}
{"current_steps": 25, "total_steps": 728, "loss": 0.7939, "lr": 1.3150684931506849e-05, "epoch": 0.24115755627009647, "percentage": 3.43, "elapsed_time": "0:09:54", "remaining_time": "4:38:29"}
{"current_steps": 30, "total_steps": 728, "loss": 0.7572, "lr": 1.589041095890411e-05, "epoch": 0.28938906752411575, "percentage": 4.12, "elapsed_time": "0:11:46", "remaining_time": "4:33:57"}
{"current_steps": 35, "total_steps": 728, "loss": 0.7285, "lr": 1.863013698630137e-05, "epoch": 0.33762057877813506, "percentage": 4.81, "elapsed_time": "0:13:40", "remaining_time": "4:30:46"}
{"current_steps": 40, "total_steps": 728, "loss": 0.6917, "lr": 2.1369863013698632e-05, "epoch": 0.3858520900321543, "percentage": 5.49, "elapsed_time": "0:15:28", "remaining_time": "4:26:16"}
{"current_steps": 45, "total_steps": 728, "loss": 0.6598, "lr": 2.410958904109589e-05, "epoch": 0.4340836012861736, "percentage": 6.18, "elapsed_time": "0:17:16", "remaining_time": "4:22:10"}
{"current_steps": 50, "total_steps": 728, "loss": 0.6235, "lr": 2.6849315068493153e-05, "epoch": 0.48231511254019294, "percentage": 6.87, "elapsed_time": "0:19:10", "remaining_time": "4:20:01"}
{"current_steps": 55, "total_steps": 728, "loss": 0.6019, "lr": 2.958904109589041e-05, "epoch": 0.5305466237942122, "percentage": 7.55, "elapsed_time": "0:21:05", "remaining_time": "4:18:09"}
{"current_steps": 60, "total_steps": 728, "loss": 0.5868, "lr": 3.2328767123287676e-05, "epoch": 0.5787781350482315, "percentage": 8.24, "elapsed_time": "0:22:58", "remaining_time": "4:15:52"}
{"current_steps": 65, "total_steps": 728, "loss": 0.5745, "lr": 3.506849315068493e-05, "epoch": 0.6270096463022508, "percentage": 8.93, "elapsed_time": "0:24:53", "remaining_time": "4:13:51"}
{"current_steps": 70, "total_steps": 728, "loss": 0.5558, "lr": 3.780821917808219e-05, "epoch": 0.6752411575562701, "percentage": 9.62, "elapsed_time": "0:26:46", "remaining_time": "4:11:39"}
{"current_steps": 75, "total_steps": 728, "loss": 0.5459, "lr": 3.999976995313839e-05, "epoch": 0.7234726688102894, "percentage": 10.3, "elapsed_time": "0:28:33", "remaining_time": "4:08:37"}
{"current_steps": 80, "total_steps": 728, "loss": 0.5318, "lr": 3.999171886864457e-05, "epoch": 0.7717041800643086, "percentage": 10.99, "elapsed_time": "0:30:24", "remaining_time": "4:06:20"}
{"current_steps": 85, "total_steps": 728, "loss": 0.5179, "lr": 3.997217073267859e-05, "epoch": 0.819935691318328, "percentage": 11.68, "elapsed_time": "0:32:18", "remaining_time": "4:04:21"}
{"current_steps": 90, "total_steps": 728, "loss": 0.5178, "lr": 3.9941136787191535e-05, "epoch": 0.8681672025723473, "percentage": 12.36, "elapsed_time": "0:34:08", "remaining_time": "4:02:03"}
{"current_steps": 95, "total_steps": 728, "loss": 0.5085, "lr": 3.989863487951665e-05, "epoch": 0.9163987138263665, "percentage": 13.05, "elapsed_time": "0:36:00", "remaining_time": "3:59:54"}
{"current_steps": 100, "total_steps": 728, "loss": 0.5051, "lr": 3.984468945210548e-05, "epoch": 0.9646302250803859, "percentage": 13.74, "elapsed_time": "0:37:57", "remaining_time": "3:58:22"}
{"current_steps": 105, "total_steps": 728, "loss": 0.4967, "lr": 3.977933152847132e-05, "epoch": 1.0096463022508038, "percentage": 14.42, "elapsed_time": "0:39:44", "remaining_time": "3:55:47"}
{"current_steps": 110, "total_steps": 728, "loss": 0.4907, "lr": 3.9702598695347794e-05, "epoch": 1.0578778135048232, "percentage": 15.11, "elapsed_time": "0:41:43", "remaining_time": "3:54:26"}
{"current_steps": 115, "total_steps": 728, "loss": 0.4852, "lr": 3.961453508107314e-05, "epoch": 1.1061093247588425, "percentage": 15.8, "elapsed_time": "0:43:34", "remaining_time": "3:52:16"}
{"current_steps": 120, "total_steps": 728, "loss": 0.4785, "lr": 3.951519133021237e-05, "epoch": 1.1543408360128617, "percentage": 16.48, "elapsed_time": "0:45:26", "remaining_time": "3:50:13"}
{"current_steps": 125, "total_steps": 728, "loss": 0.4809, "lr": 3.94046245744321e-05, "epoch": 1.202572347266881, "percentage": 17.17, "elapsed_time": "0:47:14", "remaining_time": "3:47:55"}
{"current_steps": 130, "total_steps": 728, "loss": 0.4764, "lr": 3.928289839964459e-05, "epoch": 1.2508038585209003, "percentage": 17.86, "elapsed_time": "0:49:03", "remaining_time": "3:45:38"}
{"current_steps": 135, "total_steps": 728, "loss": 0.4767, "lr": 3.915008280944014e-05, "epoch": 1.2990353697749195, "percentage": 18.54, "elapsed_time": "0:50:53", "remaining_time": "3:43:33"}
{"current_steps": 140, "total_steps": 728, "loss": 0.4751, "lr": 3.900625418482867e-05, "epoch": 1.347266881028939, "percentage": 19.23, "elapsed_time": "0:57:24", "remaining_time": "4:01:05"}
{"current_steps": 145, "total_steps": 728, "loss": 0.4704, "lr": 3.885149524031366e-05, "epoch": 1.3954983922829582, "percentage": 19.92, "elapsed_time": "1:00:03", "remaining_time": "4:01:29"}
{"current_steps": 150, "total_steps": 728, "loss": 0.4664, "lr": 3.868589497632388e-05, "epoch": 1.4437299035369775, "percentage": 20.6, "elapsed_time": "1:01:50", "remaining_time": "3:58:17"}
{"current_steps": 155, "total_steps": 728, "loss": 0.4595, "lr": 3.850954862803001e-05, "epoch": 1.4919614147909968, "percentage": 21.29, "elapsed_time": "1:03:35", "remaining_time": "3:55:05"}
{"current_steps": 160, "total_steps": 728, "loss": 0.4695, "lr": 3.8322557610575826e-05, "epoch": 1.540192926045016, "percentage": 21.98, "elapsed_time": "1:05:19", "remaining_time": "3:51:52"}
{"current_steps": 165, "total_steps": 728, "loss": 0.4623, "lr": 3.812502946075527e-05, "epoch": 1.5884244372990355, "percentage": 22.66, "elapsed_time": "1:07:14", "remaining_time": "3:49:26"}
{"current_steps": 170, "total_steps": 728, "loss": 0.4611, "lr": 3.791707777516904e-05, "epoch": 1.6366559485530545, "percentage": 23.35, "elapsed_time": "1:09:08", "remaining_time": "3:46:58"}
{"current_steps": 175, "total_steps": 728, "loss": 0.4593, "lr": 3.769882214489626e-05, "epoch": 1.684887459807074, "percentage": 24.04, "elapsed_time": "1:11:04", "remaining_time": "3:44:35"}
{"current_steps": 180, "total_steps": 728, "loss": 0.457, "lr": 3.7470388086718745e-05, "epoch": 1.7331189710610932, "percentage": 24.73, "elapsed_time": "1:12:52", "remaining_time": "3:41:53"}
{"current_steps": 185, "total_steps": 728, "loss": 0.4552, "lr": 3.7231906970937464e-05, "epoch": 1.7813504823151125, "percentage": 25.41, "elapsed_time": "1:14:49", "remaining_time": "3:39:36"}
{"current_steps": 190, "total_steps": 728, "loss": 0.4525, "lr": 3.6983515945822736e-05, "epoch": 1.829581993569132, "percentage": 26.1, "elapsed_time": "1:16:51", "remaining_time": "3:37:38"}
{"current_steps": 195, "total_steps": 728, "loss": 0.4459, "lr": 3.672535785874148e-05, "epoch": 1.877813504823151, "percentage": 26.79, "elapsed_time": "1:18:46", "remaining_time": "3:35:17"}
{"current_steps": 200, "total_steps": 728, "loss": 0.4481, "lr": 3.64575811740071e-05, "epoch": 1.9260450160771705, "percentage": 27.47, "elapsed_time": "1:20:29", "remaining_time": "3:32:30"}
{"current_steps": 205, "total_steps": 728, "loss": 0.4473, "lr": 3.6180339887498953e-05, "epoch": 1.9742765273311897, "percentage": 28.16, "elapsed_time": "1:22:29", "remaining_time": "3:30:26"}
{"current_steps": 210, "total_steps": 728, "loss": 0.445, "lr": 3.589379343810083e-05, "epoch": 2.0192926045016075, "percentage": 28.85, "elapsed_time": "1:24:09", "remaining_time": "3:27:35"}
{"current_steps": 215, "total_steps": 728, "loss": 0.4391, "lr": 3.559810661600907e-05, "epoch": 2.067524115755627, "percentage": 29.53, "elapsed_time": "1:26:03", "remaining_time": "3:25:19"}
{"current_steps": 220, "total_steps": 728, "loss": 0.4439, "lr": 3.529344946796333e-05, "epoch": 2.1157556270096465, "percentage": 30.22, "elapsed_time": "1:27:53", "remaining_time": "3:22:56"}
{"current_steps": 225, "total_steps": 728, "loss": 0.4375, "lr": 3.4979997199454195e-05, "epoch": 2.1639871382636655, "percentage": 30.91, "elapsed_time": "1:29:47", "remaining_time": "3:20:44"}
{"current_steps": 230, "total_steps": 728, "loss": 0.4348, "lr": 3.465793007396421e-05, "epoch": 2.212218649517685, "percentage": 31.59, "elapsed_time": "1:31:39", "remaining_time": "3:18:27"}
{"current_steps": 235, "total_steps": 728, "loss": 0.4405, "lr": 3.4327433309299986e-05, "epoch": 2.260450160771704, "percentage": 32.28, "elapsed_time": "1:33:25", "remaining_time": "3:16:00"}
{"current_steps": 240, "total_steps": 728, "loss": 0.4402, "lr": 3.398869697107517e-05, "epoch": 2.3086816720257235, "percentage": 32.97, "elapsed_time": "1:35:13", "remaining_time": "3:13:36"}
{"current_steps": 245, "total_steps": 728, "loss": 0.4427, "lr": 3.3641915863405486e-05, "epoch": 2.356913183279743, "percentage": 33.65, "elapsed_time": "1:37:00", "remaining_time": "3:11:15"}
{"current_steps": 250, "total_steps": 728, "loss": 0.4288, "lr": 3.328728941687871e-05, "epoch": 2.405144694533762, "percentage": 34.34, "elapsed_time": "1:38:50", "remaining_time": "3:08:59"}
{"current_steps": 255, "total_steps": 728, "loss": 0.4357, "lr": 3.292502157386397e-05, "epoch": 2.4533762057877815, "percentage": 35.03, "elapsed_time": "1:40:35", "remaining_time": "3:06:36"}
{"current_steps": 260, "total_steps": 728, "loss": 0.4327, "lr": 3.2555320671226405e-05, "epoch": 2.5016077170418005, "percentage": 35.71, "elapsed_time": "1:42:30", "remaining_time": "3:04:31"}
{"current_steps": 265, "total_steps": 728, "loss": 0.4325, "lr": 3.217839932051457e-05, "epoch": 2.54983922829582, "percentage": 36.4, "elapsed_time": "1:44:19", "remaining_time": "3:02:16"}
{"current_steps": 270, "total_steps": 728, "loss": 0.4306, "lr": 3.179447428568952e-05, "epoch": 2.598070739549839, "percentage": 37.09, "elapsed_time": "1:46:14", "remaining_time": "3:00:13"}
{"current_steps": 275, "total_steps": 728, "loss": 0.4321, "lr": 3.1403766358465833e-05, "epoch": 2.6463022508038585, "percentage": 37.77, "elapsed_time": "1:48:06", "remaining_time": "2:58:05"}
{"current_steps": 280, "total_steps": 728, "loss": 0.4293, "lr": 3.100650023133643e-05, "epoch": 2.694533762057878, "percentage": 38.46, "elapsed_time": "1:49:52", "remaining_time": "2:55:48"}
{"current_steps": 285, "total_steps": 728, "loss": 0.434, "lr": 3.060290436835392e-05, "epoch": 2.742765273311897, "percentage": 39.15, "elapsed_time": "1:51:45", "remaining_time": "2:53:43"}
{"current_steps": 290, "total_steps": 728, "loss": 0.4355, "lr": 3.019321087374313e-05, "epoch": 2.7909967845659165, "percentage": 39.84, "elapsed_time": "1:53:40", "remaining_time": "2:51:41"}
{"current_steps": 295, "total_steps": 728, "loss": 0.4329, "lr": 2.977765535842007e-05, "epoch": 2.839228295819936, "percentage": 40.52, "elapsed_time": "1:55:26", "remaining_time": "2:49:26"}
{"current_steps": 300, "total_steps": 728, "loss": 0.4286, "lr": 2.9356476804494306e-05, "epoch": 2.887459807073955, "percentage": 41.21, "elapsed_time": "1:57:26", "remaining_time": "2:47:33"}
{"current_steps": 305, "total_steps": 728, "loss": 0.4296, "lr": 2.892991742783259e-05, "epoch": 2.935691318327974, "percentage": 41.9, "elapsed_time": "1:59:24", "remaining_time": "2:45:36"}
{"current_steps": 310, "total_steps": 728, "loss": 0.4283, "lr": 2.8498222538762737e-05, "epoch": 2.9839228295819935, "percentage": 42.58, "elapsed_time": "2:01:17", "remaining_time": "2:43:33"}
{"current_steps": 315, "total_steps": 728, "loss": 0.4285, "lr": 2.8061640400997966e-05, "epoch": 3.0289389067524115, "percentage": 43.27, "elapsed_time": "2:03:08", "remaining_time": "2:41:27"}
{"current_steps": 320, "total_steps": 728, "loss": 0.4192, "lr": 2.7620422088862736e-05, "epoch": 3.077170418006431, "percentage": 43.96, "elapsed_time": "2:04:54", "remaining_time": "2:39:15"}
{"current_steps": 325, "total_steps": 728, "loss": 0.4203, "lr": 2.7174821342902234e-05, "epoch": 3.12540192926045, "percentage": 44.64, "elapsed_time": "2:06:46", "remaining_time": "2:37:12"}
{"current_steps": 330, "total_steps": 728, "loss": 0.4257, "lr": 2.6725094423958574e-05, "epoch": 3.1736334405144695, "percentage": 45.33, "elapsed_time": "2:08:31", "remaining_time": "2:35:00"}
{"current_steps": 335, "total_steps": 728, "loss": 0.4209, "lr": 2.6271499965797532e-05, "epoch": 3.221864951768489, "percentage": 46.02, "elapsed_time": "2:10:15", "remaining_time": "2:32:48"}
{"current_steps": 340, "total_steps": 728, "loss": 0.4234, "lr": 2.5814298826370702e-05, "epoch": 3.270096463022508, "percentage": 46.7, "elapsed_time": "2:12:07", "remaining_time": "2:30:47"}
{"current_steps": 345, "total_steps": 728, "loss": 0.4197, "lr": 2.5353753937798527e-05, "epoch": 3.3183279742765275, "percentage": 47.39, "elapsed_time": "2:13:59", "remaining_time": "2:28:45"}
{"current_steps": 350, "total_steps": 728, "loss": 0.4199, "lr": 2.4890130155160427e-05, "epoch": 3.3665594855305465, "percentage": 48.08, "elapsed_time": "2:15:45", "remaining_time": "2:26:36"}
{"current_steps": 355, "total_steps": 728, "loss": 0.4195, "lr": 2.4423694104179176e-05, "epoch": 3.414790996784566, "percentage": 48.76, "elapsed_time": "2:17:34", "remaining_time": "2:24:32"}
{"current_steps": 360, "total_steps": 728, "loss": 0.4137, "lr": 2.3954714027886904e-05, "epoch": 3.463022508038585, "percentage": 49.45, "elapsed_time": "2:19:27", "remaining_time": "2:22:33"}
{"current_steps": 365, "total_steps": 728, "loss": 0.4151, "lr": 2.3483459632361e-05, "epoch": 3.5112540192926045, "percentage": 50.14, "elapsed_time": "2:21:21", "remaining_time": "2:20:35"}
{"current_steps": 370, "total_steps": 728, "loss": 0.4159, "lr": 2.3010201931618696e-05, "epoch": 3.559485530546624, "percentage": 50.82, "elapsed_time": "2:23:14", "remaining_time": "2:18:36"}
{"current_steps": 375, "total_steps": 728, "loss": 0.4141, "lr": 2.2535213091759404e-05, "epoch": 3.607717041800643, "percentage": 51.51, "elapsed_time": "2:25:03", "remaining_time": "2:16:32"}
{"current_steps": 380, "total_steps": 728, "loss": 0.4213, "lr": 2.205876627444452e-05, "epoch": 3.6559485530546625, "percentage": 52.2, "elapsed_time": "2:27:02", "remaining_time": "2:14:39"}
{"current_steps": 385, "total_steps": 728, "loss": 0.42, "lr": 2.1581135479804735e-05, "epoch": 3.7041800643086815, "percentage": 52.88, "elapsed_time": "2:29:00", "remaining_time": "2:12:45"}
{"current_steps": 390, "total_steps": 728, "loss": 0.4152, "lr": 2.1102595388865054e-05, "epoch": 3.752411575562701, "percentage": 53.57, "elapsed_time": "2:30:51", "remaining_time": "2:10:44"}
{"current_steps": 395, "total_steps": 728, "loss": 0.42, "lr": 2.062342120557834e-05, "epoch": 3.80064308681672, "percentage": 54.26, "elapsed_time": "2:32:42", "remaining_time": "2:08:44"}
{"current_steps": 400, "total_steps": 728, "loss": 0.415, "lr": 2.0143888498558046e-05, "epoch": 3.8488745980707395, "percentage": 54.95, "elapsed_time": "2:34:24", "remaining_time": "2:06:36"}
{"current_steps": 405, "total_steps": 728, "loss": 0.4243, "lr": 1.9664273042601302e-05, "epoch": 3.897106109324759, "percentage": 55.63, "elapsed_time": "2:36:07", "remaining_time": "2:04:30"}
{"current_steps": 410, "total_steps": 728, "loss": 0.4175, "lr": 1.918485066009338e-05, "epoch": 3.945337620578778, "percentage": 56.32, "elapsed_time": "2:37:53", "remaining_time": "2:02:27"}
{"current_steps": 415, "total_steps": 728, "loss": 0.4122, "lr": 1.87058970623848e-05, "epoch": 3.9935691318327975, "percentage": 57.01, "elapsed_time": "2:39:38", "remaining_time": "2:00:24"}
{"current_steps": 420, "total_steps": 728, "loss": 0.4168, "lr": 1.8227687691232322e-05, "epoch": 4.038585209003215, "percentage": 57.69, "elapsed_time": "2:41:18", "remaining_time": "1:58:17"}
{"current_steps": 425, "total_steps": 728, "loss": 0.4123, "lr": 1.7750497560394918e-05, "epoch": 4.086816720257235, "percentage": 58.38, "elapsed_time": "2:43:07", "remaining_time": "1:56:17"}
{"current_steps": 430, "total_steps": 728, "loss": 0.4148, "lr": 1.7274601097475957e-05, "epoch": 4.135048231511254, "percentage": 59.07, "elapsed_time": "2:44:59", "remaining_time": "1:54:20"}
{"current_steps": 435, "total_steps": 728, "loss": 0.4111, "lr": 1.6800271986102418e-05, "epoch": 4.183279742765273, "percentage": 59.75, "elapsed_time": "2:46:49", "remaining_time": "1:52:21"}
{"current_steps": 440, "total_steps": 728, "loss": 0.4115, "lr": 1.6327783008532e-05, "epoch": 4.231511254019293, "percentage": 60.44, "elapsed_time": "2:48:37", "remaining_time": "1:50:22"}
{"current_steps": 445, "total_steps": 728, "loss": 0.4048, "lr": 1.5857405888778568e-05, "epoch": 4.279742765273312, "percentage": 61.13, "elapsed_time": "2:50:25", "remaining_time": "1:48:23"}
{"current_steps": 450, "total_steps": 728, "loss": 0.4165, "lr": 1.5389411136346225e-05, "epoch": 4.327974276527331, "percentage": 61.81, "elapsed_time": "2:52:11", "remaining_time": "1:46:22"}
{"current_steps": 455, "total_steps": 728, "loss": 0.4085, "lr": 1.4924067890661778e-05, "epoch": 4.37620578778135, "percentage": 62.5, "elapsed_time": "2:53:53", "remaining_time": "1:44:19"}
{"current_steps": 460, "total_steps": 728, "loss": 0.4115, "lr": 1.4461643766295196e-05, "epoch": 4.42443729903537, "percentage": 63.19, "elapsed_time": "2:55:39", "remaining_time": "1:42:20"}
{"current_steps": 465, "total_steps": 728, "loss": 0.4088, "lr": 1.4002404699056946e-05, "epoch": 4.472668810289389, "percentage": 63.87, "elapsed_time": "2:57:36", "remaining_time": "1:40:27"}
{"current_steps": 470, "total_steps": 728, "loss": 0.4089, "lr": 1.3546614793060757e-05, "epoch": 4.520900321543408, "percentage": 64.56, "elapsed_time": "2:59:22", "remaining_time": "1:38:28"}
{"current_steps": 475, "total_steps": 728, "loss": 0.4053, "lr": 1.3094536168839853e-05, "epoch": 4.569131832797428, "percentage": 65.25, "elapsed_time": "3:01:10", "remaining_time": "1:36:29"}
{"current_steps": 480, "total_steps": 728, "loss": 0.4039, "lr": 1.2646428812603838e-05, "epoch": 4.617363344051447, "percentage": 65.93, "elapsed_time": "3:02:52", "remaining_time": "1:34:29"}
{"current_steps": 485, "total_steps": 728, "loss": 0.4041, "lr": 1.2202550426723053e-05, "epoch": 4.665594855305466, "percentage": 66.62, "elapsed_time": "3:04:47", "remaining_time": "1:32:35"}
{"current_steps": 490, "total_steps": 728, "loss": 0.4106, "lr": 1.1763156281526348e-05, "epoch": 4.713826366559486, "percentage": 67.31, "elapsed_time": "3:06:50", "remaining_time": "1:30:45"}
{"current_steps": 495, "total_steps": 728, "loss": 0.4102, "lr": 1.1328499068497478e-05, "epoch": 4.762057877813505, "percentage": 67.99, "elapsed_time": "3:08:40", "remaining_time": "1:28:48"}
{"current_steps": 500, "total_steps": 728, "loss": 0.4073, "lr": 1.0898828754954618e-05, "epoch": 4.810289389067524, "percentage": 68.68, "elapsed_time": "3:10:32", "remaining_time": "1:26:53"}
{"current_steps": 505, "total_steps": 728, "loss": 0.4088, "lr": 1.047439244029642e-05, "epoch": 4.858520900321543, "percentage": 69.37, "elapsed_time": "3:12:20", "remaining_time": "1:24:56"}
{"current_steps": 510, "total_steps": 728, "loss": 0.4047, "lr": 1.0055434213897529e-05, "epoch": 4.906752411575563, "percentage": 70.05, "elapsed_time": "3:14:07", "remaining_time": "1:22:58"}
{"current_steps": 515, "total_steps": 728, "loss": 0.4104, "lr": 9.642195014734972e-06, "epoch": 4.954983922829582, "percentage": 70.74, "elapsed_time": "3:15:53", "remaining_time": "1:21:01"}
{"current_steps": 520, "total_steps": 728, "loss": 0.4062, "lr": 9.234912492826454e-06, "epoch": 5.0, "percentage": 71.43, "elapsed_time": "3:17:32", "remaining_time": "1:19:01"}
{"current_steps": 525, "total_steps": 728, "loss": 0.404, "lr": 8.833820872560035e-06, "epoch": 5.048231511254019, "percentage": 72.12, "elapsed_time": "3:19:16", "remaining_time": "1:17:03"}
{"current_steps": 530, "total_steps": 728, "loss": 0.4039, "lr": 8.439150817993836e-06, "epoch": 5.096463022508039, "percentage": 72.8, "elapsed_time": "3:21:13", "remaining_time": "1:15:10"}
{"current_steps": 535, "total_steps": 728, "loss": 0.4042, "lr": 8.051129300203324e-06, "epoch": 5.144694533762058, "percentage": 73.49, "elapsed_time": "3:22:58", "remaining_time": "1:13:13"}
{"current_steps": 540, "total_steps": 728, "loss": 0.4096, "lr": 7.669979466752322e-06, "epoch": 5.192926045016077, "percentage": 74.18, "elapsed_time": "3:24:49", "remaining_time": "1:11:18"}
{"current_steps": 545, "total_steps": 728, "loss": 0.4025, "lr": 7.295920513362957e-06, "epoch": 5.241157556270096, "percentage": 74.86, "elapsed_time": "3:26:33", "remaining_time": "1:09:21"}
{"current_steps": 550, "total_steps": 728, "loss": 0.4075, "lr": 6.92916755785821e-06, "epoch": 5.289389067524116, "percentage": 75.55, "elapsed_time": "3:28:22", "remaining_time": "1:07:26"}
{"current_steps": 555, "total_steps": 728, "loss": 0.4026, "lr": 6.5699315164496635e-06, "epoch": 5.337620578778135, "percentage": 76.24, "elapsed_time": "3:30:10", "remaining_time": "1:05:30"}
{"current_steps": 560, "total_steps": 728, "loss": 0.4059, "lr": 6.2184189824415855e-06, "epoch": 5.385852090032154, "percentage": 76.92, "elapsed_time": "3:31:57", "remaining_time": "1:03:35"}
{"current_steps": 565, "total_steps": 728, "loss": 0.407, "lr": 5.87483210742098e-06, "epoch": 5.434083601286174, "percentage": 77.61, "elapsed_time": "3:33:52", "remaining_time": "1:01:42"}
{"current_steps": 570, "total_steps": 728, "loss": 0.4, "lr": 5.539368485002161e-06, "epoch": 5.482315112540193, "percentage": 78.3, "elapsed_time": "3:35:44", "remaining_time": "0:59:48"}
{"current_steps": 575, "total_steps": 728, "loss": 0.4055, "lr": 5.21222103719244e-06, "epoch": 5.530546623794212, "percentage": 78.98, "elapsed_time": "3:37:41", "remaining_time": "0:57:55"}
{"current_steps": 580, "total_steps": 728, "loss": 0.4023, "lr": 4.893577903444524e-06, "epoch": 5.578778135048232, "percentage": 79.67, "elapsed_time": "3:39:31", "remaining_time": "0:56:00"}
{"current_steps": 585, "total_steps": 728, "loss": 0.3995, "lr": 4.58362233245923e-06, "epoch": 5.627009646302251, "percentage": 80.36, "elapsed_time": "3:41:26", "remaining_time": "0:54:07"}
{"current_steps": 590, "total_steps": 728, "loss": 0.4042, "lr": 4.2825325768008905e-06, "epoch": 5.67524115755627, "percentage": 81.04, "elapsed_time": "3:43:18", "remaining_time": "0:52:13"}
{"current_steps": 595, "total_steps": 728, "loss": 0.4002, "lr": 3.990481790385963e-06, "epoch": 5.723472668810289, "percentage": 81.73, "elapsed_time": "3:44:59", "remaining_time": "0:50:17"}
{"current_steps": 600, "total_steps": 728, "loss": 0.4069, "lr": 3.7076379289037755e-06, "epoch": 5.771704180064309, "percentage": 82.42, "elapsed_time": "3:46:40", "remaining_time": "0:48:21"}
{"current_steps": 605, "total_steps": 728, "loss": 0.399, "lr": 3.4341636532268476e-06, "epoch": 5.819935691318328, "percentage": 83.1, "elapsed_time": "3:48:44", "remaining_time": "0:46:30"}
{"current_steps": 610, "total_steps": 728, "loss": 0.403, "lr": 3.170216235866075e-06, "epoch": 5.868167202572347, "percentage": 83.79, "elapsed_time": "3:50:31", "remaining_time": "0:44:35"}
{"current_steps": 615, "total_steps": 728, "loss": 0.4056, "lr": 2.9159474705248093e-06, "epoch": 5.916398713826366, "percentage": 84.48, "elapsed_time": "3:52:25", "remaining_time": "0:42:42"}
{"current_steps": 620, "total_steps": 728, "loss": 0.407, "lr": 2.6715035848036962e-06, "epoch": 5.964630225080386, "percentage": 85.16, "elapsed_time": "3:54:13", "remaining_time": "0:40:48"}
{"current_steps": 625, "total_steps": 728, "loss": 0.4043, "lr": 2.4370251561065363e-06, "epoch": 6.009646302250804, "percentage": 85.85, "elapsed_time": "3:55:56", "remaining_time": "0:38:52"}
{"current_steps": 630, "total_steps": 728, "loss": 0.4, "lr": 2.2126470307955515e-06, "epoch": 6.057877813504823, "percentage": 86.54, "elapsed_time": "3:57:49", "remaining_time": "0:36:59"}
{"current_steps": 635, "total_steps": 728, "loss": 0.4046, "lr": 1.998498246642464e-06, "epoch": 6.106109324758842, "percentage": 87.23, "elapsed_time": "3:59:28", "remaining_time": "0:35:04"}
{"current_steps": 640, "total_steps": 728, "loss": 0.402, "lr": 1.7947019586201152e-06, "epoch": 6.154340836012862, "percentage": 87.91, "elapsed_time": "4:01:17", "remaining_time": "0:33:10"}
{"current_steps": 645, "total_steps": 728, "loss": 0.4019, "lr": 1.6013753680771493e-06, "epoch": 6.202572347266881, "percentage": 88.6, "elapsed_time": "4:03:04", "remaining_time": "0:31:16"}
{"current_steps": 650, "total_steps": 728, "loss": 0.4073, "lr": 1.4186296553366274e-06, "epoch": 6.2508038585209, "percentage": 89.29, "elapsed_time": "4:04:52", "remaining_time": "0:29:23"}
{"current_steps": 655, "total_steps": 728, "loss": 0.4038, "lr": 1.246569915757263e-06, "epoch": 6.29903536977492, "percentage": 89.97, "elapsed_time": "4:06:39", "remaining_time": "0:27:29"}
{"current_steps": 660, "total_steps": 728, "loss": 0.4024, "lr": 1.0852950992940415e-06, "epoch": 6.347266881028939, "percentage": 90.66, "elapsed_time": "4:08:31", "remaining_time": "0:25:36"}
{"current_steps": 665, "total_steps": 728, "loss": 0.3998, "lr": 9.348979535930391e-07, "epoch": 6.395498392282958, "percentage": 91.35, "elapsed_time": "4:10:28", "remaining_time": "0:23:43"}
{"current_steps": 670, "total_steps": 728, "loss": 0.4027, "lr": 7.95464970653106e-07, "epoch": 6.443729903536978, "percentage": 92.03, "elapsed_time": "4:12:19", "remaining_time": "0:21:50"}
{"current_steps": 675, "total_steps": 728, "loss": 0.3971, "lr": 6.670763370851241e-07, "epoch": 6.491961414790997, "percentage": 92.72, "elapsed_time": "4:14:11", "remaining_time": "0:19:57"}
{"current_steps": 680, "total_steps": 728, "loss": 0.3984, "lr": 5.4980588799743e-07, "epoch": 6.540192926045016, "percentage": 93.41, "elapsed_time": "4:15:57", "remaining_time": "0:18:04"}
{"current_steps": 685, "total_steps": 728, "loss": 0.4015, "lr": 4.4372106453394405e-07, "epoch": 6.588424437299035, "percentage": 94.09, "elapsed_time": "4:17:43", "remaining_time": "0:16:10"}
{"current_steps": 690, "total_steps": 728, "loss": 0.3977, "lr": 3.48882875089378e-07, "epoch": 6.636655948553055, "percentage": 94.78, "elapsed_time": "4:19:28", "remaining_time": "0:14:17"}
{"current_steps": 695, "total_steps": 728, "loss": 0.407, "lr": 2.653458602238845e-07, "epoch": 6.684887459807074, "percentage": 95.47, "elapsed_time": "4:21:10", "remaining_time": "0:12:24"}
{"current_steps": 700, "total_steps": 728, "loss": 0.4057, "lr": 1.931580612972983e-07, "epoch": 6.733118971061093, "percentage": 96.15, "elapsed_time": "4:22:56", "remaining_time": "0:10:31"}
{"current_steps": 705, "total_steps": 728, "loss": 0.4041, "lr": 1.3236099284097415e-07, "epoch": 6.781350482315112, "percentage": 96.84, "elapsed_time": "4:24:38", "remaining_time": "0:08:38"}
{"current_steps": 710, "total_steps": 728, "loss": 0.3964, "lr": 8.298961868318689e-08, "epoch": 6.829581993569132, "percentage": 97.53, "elapsed_time": "4:26:34", "remaining_time": "0:06:45"}
{"current_steps": 715, "total_steps": 728, "loss": 0.4001, "lr": 4.507233184174675e-08, "epoch": 6.877813504823151, "percentage": 98.21, "elapsed_time": "4:28:30", "remaining_time": "0:04:52"}
{"current_steps": 720, "total_steps": 728, "loss": 0.3993, "lr": 1.863093819545192e-08, "epoch": 6.92604501607717, "percentage": 98.9, "elapsed_time": "4:30:23", "remaining_time": "0:03:00"}
{"current_steps": 725, "total_steps": 728, "loss": 0.4013, "lr": 3.680643943708706e-09, "epoch": 6.97427652733119, "percentage": 99.59, "elapsed_time": "4:32:17", "remaining_time": "0:01:07"}
{"current_steps": 728, "total_steps": 728, "epoch": 7.0, "percentage": 100.0, "elapsed_time": "4:33:22", "remaining_time": "0:00:00"}
{"current_steps": 728, "total_steps": 728, "epoch": 7.0, "percentage": 100.0, "elapsed_time": "0:00:00", "remaining_time": "0:00:00"}
{"current_steps": 728, "total_steps": 728, "epoch": 7.0, "percentage": 100.0, "elapsed_time": "0:00:00", "remaining_time": "0:00:00"}
{"current_steps": 728, "total_steps": 728, "epoch": 7.0, "percentage": 100.0, "elapsed_time": "0:00:00", "remaining_time": "0:00:00"}
{"current_steps": 728, "total_steps": 728, "epoch": 7.0, "percentage": 100.0, "elapsed_time": "0:00:00", "remaining_time": "0:00:00"}
{"current_steps": 728, "total_steps": 728, "epoch": 7.0, "percentage": 100.0, "elapsed_time": "0:00:00", "remaining_time": "0:00:00"}

1638
trainer_state.json Normal file

File diff suppressed because it is too large

3
training_args.bin Normal file
View File

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:19c5dec9d4f85a5f62b125e69bdf7b66adba30c6e49637ebe30bb1ccf45fd710
size 8721
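training_args.bin stores the pickled TrainingArguments for this run; the three lines above are only its git-lfs pointer, so the real 8.7 KB binary must first be fetched with `git lfs pull`. A hedged sketch for inspecting it, assuming transformers is installed so the pickle can resolve the TrainingArguments class:

```python
# Sketch: inspect the serialized TrainingArguments in training_args.bin.
# weights_only=False is required because the file is a pickled Python
# object, not a tensor checkpoint.
import torch

args = torch.load("training_args.bin", weights_only=False)
print(args.learning_rate)                        # 4e-05, per the README
print(args.lr_scheduler_type, args.warmup_ratio) # cosine schedule settings
```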

BIN
training_loss.png Normal file

Binary file not shown.

Size: 37 KiB

1
vocab.json Normal file

File diff suppressed because one or more lines are too long
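vocab.json ships alongside the other tokenizer files in this commit. A quick sanity check, assuming the repository is checked out locally with LFS files pulled ("." below is a placeholder for the repo root):

```python
# Sketch: load the tokenizer shipped in this commit from the checked-out
# repo root and report its vocabulary size.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained(".")
print(len(tok))  # vocabulary size, including any added special tokens
```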