Initialize project; model provided by the ModelHub XC community

Model: laion/nemotron-terminal-data_querying__Qwen3-8B
Source: Original Platform
Author: ModelHub XC
Date: 2026-04-23 16:38:11 +08:00
Commit: 0267e54605
23 changed files with 154148 additions and 0 deletions

36
.gitattributes vendored Normal file

@@ -0,0 +1,36 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
tokenizer.json filter=lfs diff=lfs merge=lfs -text
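
The patterns above route every large binary artifact (checkpoints, archives, serialized tensors, `tokenizer.json`) through Git LFS, so the repository itself stores small pointer files rather than multi-gigabyte payloads. A minimal Python sketch, illustrative only and not part of this commit, that approximates the matching for a subset of the patterns with `fnmatch`:

```python
# Sketch: check which files would be routed through Git LFS by a subset of
# the patterns above. fnmatch only approximates gitattributes glob semantics
# (it ignores path-level patterns such as saved_model/**/*).
from fnmatch import fnmatch
from pathlib import Path

LFS_PATTERNS = ["*.safetensors", "*.bin", "*.pt", "*.pth", "*.onnx",
                "*.parquet", "*.zip", "*.gz", "tokenizer.json"]

def is_lfs_tracked(path: str) -> bool:
    name = Path(path).name
    return any(fnmatch(name, pat) for pat in LFS_PATTERNS)

for f in ["model-00001-of-00004.safetensors", "tokenizer.json", "config.json"]:
    print(f, is_lfs_tracked(f))
# config.json stays a regular Git blob; the other two resolve to LFS pointers.
```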

61
README.md Normal file

@@ -0,0 +1,61 @@
---
library_name: transformers
license: other
base_model: Qwen/Qwen3-8B
tags:
- llama-factory
- full
- generated_from_trainer
model-index:
- name: nemotron-data-querying__Qwen3-8B
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# nemotron-data-querying__Qwen3-8B
This model is a fine-tuned version of [Qwen/Qwen3-8B](https://huggingface.co/Qwen/Qwen3-8B) on the /e/data1/datasets/playground/ot/hf_hub/datasets--laion--nemotron-terminal-data_querying/snapshots/d916c690c1e21f34515e500e36eeae6463c91d5f_thinking_preprocessed dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 4e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 32
- gradient_accumulation_steps: 3
- total_train_batch_size: 96
- total_eval_batch_size: 256
- optimizer: adamw_torch_fused (betas=(0.9, 0.98), epsilon=1e-08, no additional optimizer arguments)
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 7.0
### Training results
### Framework versions
- Transformers 4.57.6
- Pytorch 2.9.1+cu130
- Datasets 4.7.0
- Tokenizers 0.22.2
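
For orientation, a minimal loading-and-inference sketch with `transformers`. Assumptions: the checkpoint is published under the repo id from this card, `accelerate` is installed for `device_map="auto"`, and a GPU with enough memory for the bfloat16 weights is available.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "laion/nemotron-terminal-data_querying__Qwen3-8B"  # repo id from this card
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{"role": "user", "content": "List the five largest files under /var/log."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
out = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True))
```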

28
added_tokens.json Normal file

@@ -0,0 +1,28 @@
{
"</think>": 151668,
"</tool_call>": 151658,
"</tool_response>": 151666,
"<think>": 151667,
"<tool_call>": 151657,
"<tool_response>": 151665,
"<|box_end|>": 151649,
"<|box_start|>": 151648,
"<|endoftext|>": 151643,
"<|file_sep|>": 151664,
"<|fim_middle|>": 151660,
"<|fim_pad|>": 151662,
"<|fim_prefix|>": 151659,
"<|fim_suffix|>": 151661,
"<|im_end|>": 151645,
"<|im_start|>": 151644,
"<|image_pad|>": 151655,
"<|object_ref_end|>": 151647,
"<|object_ref_start|>": 151646,
"<|quad_end|>": 151651,
"<|quad_start|>": 151650,
"<|repo_name|>": 151663,
"<|video_pad|>": 151656,
"<|vision_end|>": 151653,
"<|vision_pad|>": 151654,
"<|vision_start|>": 151652
}
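
These ids occupy the top of the 151,936-entry vocabulary and cover the reasoning (`<think>`), tool-calling, FIM, and vision delimiters. A quick sanity check after loading the tokenizer (sketch, assuming the repo id from the model card):

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("laion/nemotron-terminal-data_querying__Qwen3-8B")
for t in ["<think>", "</think>", "<tool_call>", "</tool_call>",
          "<|im_start|>", "<|im_end|>"]:
    print(t, tok.convert_tokens_to_ids(t))
# Expected per added_tokens.json: 151667, 151668, 151657, 151658, 151644, 151645
```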

16
all_results.json Normal file

@@ -0,0 +1,16 @@
{
"achieved_tflops_per_gpu": 99403.5744791307,
"achieved_tflops_per_gpu_theoretical": 2858818.8951516366,
"epoch": 7.0,
"loss_nan_ranks": 0,
"loss_rank_avg": 0.3694714307785034,
"mfu_percent": 7024.987595698282,
"mfu_percent_theoretical": 202036.67103545135,
"total_flos": 2.638568480974045e+18,
"train_loss": 0.0,
"train_runtime": 0.8295,
"train_samples_per_second": 80459.329,
"train_steps_per_second": 843.831,
"valid_targets_mean": 8521.5,
"valid_targets_min": 2998
}

89
chat_template.jinja Normal file

@@ -0,0 +1,89 @@
{%- if tools %}
{{- '<|im_start|>system\n' }}
{%- if messages[0].role == 'system' %}
{{- messages[0].content + '\n\n' }}
{%- endif %}
{{- "# Tools\n\nYou may call one or more functions to assist with the user query.\n\nYou are provided with function signatures within <tools></tools> XML tags:\n<tools>" }}
{%- for tool in tools %}
{{- "\n" }}
{{- tool | tojson }}
{%- endfor %}
{{- "\n</tools>\n\nFor each function call, return a json object with function name and arguments within <tool_call></tool_call> XML tags:\n<tool_call>\n{\"name\": <function-name>, \"arguments\": <args-json-object>}\n</tool_call><|im_end|>\n" }}
{%- else %}
{%- if messages[0].role == 'system' %}
{{- '<|im_start|>system\n' + messages[0].content + '<|im_end|>\n' }}
{%- endif %}
{%- endif %}
{%- set ns = namespace(multi_step_tool=true, last_query_index=messages|length - 1) %}
{%- for message in messages[::-1] %}
{%- set index = (messages|length - 1) - loop.index0 %}
{%- if ns.multi_step_tool and message.role == "user" and message.content is string and not(message.content.startswith('<tool_response>') and message.content.endswith('</tool_response>')) %}
{%- set ns.multi_step_tool = false %}
{%- set ns.last_query_index = index %}
{%- endif %}
{%- endfor %}
{%- for message in messages %}
{%- if message.content is string %}
{%- set content = message.content %}
{%- else %}
{%- set content = '' %}
{%- endif %}
{%- if (message.role == "user") or (message.role == "system" and not loop.first) %}
{{- '<|im_start|>' + message.role + '\n' + content + '<|im_end|>' + '\n' }}
{%- elif message.role == "assistant" %}
{%- set reasoning_content = '' %}
{%- if message.reasoning_content is string %}
{%- set reasoning_content = message.reasoning_content %}
{%- else %}
{%- if '</think>' in content %}
{%- set reasoning_content = content.split('</think>')[0].rstrip('\n').split('<think>')[-1].lstrip('\n') %}
{%- set content = content.split('</think>')[-1].lstrip('\n') %}
{%- endif %}
{%- endif %}
{%- if loop.index0 > ns.last_query_index %}
{%- if loop.last or (not loop.last and reasoning_content) %}
{{- '<|im_start|>' + message.role + '\n<think>\n' + reasoning_content.strip('\n') + '\n</think>\n\n' + content.lstrip('\n') }}
{%- else %}
{{- '<|im_start|>' + message.role + '\n' + content }}
{%- endif %}
{%- else %}
{{- '<|im_start|>' + message.role + '\n' + content }}
{%- endif %}
{%- if message.tool_calls %}
{%- for tool_call in message.tool_calls %}
{%- if (loop.first and content) or (not loop.first) %}
{{- '\n' }}
{%- endif %}
{%- if tool_call.function %}
{%- set tool_call = tool_call.function %}
{%- endif %}
{{- '<tool_call>\n{"name": "' }}
{{- tool_call.name }}
{{- '", "arguments": ' }}
{%- if tool_call.arguments is string %}
{{- tool_call.arguments }}
{%- else %}
{{- tool_call.arguments | tojson }}
{%- endif %}
{{- '}\n</tool_call>' }}
{%- endfor %}
{%- endif %}
{{- '<|im_end|>\n' }}
{%- elif message.role == "tool" %}
{%- if loop.first or (messages[loop.index0 - 1].role != "tool") %}
{{- '<|im_start|>user' }}
{%- endif %}
{{- '\n<tool_response>\n' }}
{{- content }}
{{- '\n</tool_response>' }}
{%- if loop.last or (messages[loop.index0 + 1].role != "tool") %}
{{- '<|im_end|>\n' }}
{%- endif %}
{%- endif %}
{%- endfor %}
{%- if add_generation_prompt %}
{{- '<|im_start|>assistant\n' }}
{%- if enable_thinking is defined and enable_thinking is false %}
{{- '<think>\n\n</think>\n\n' }}
{%- endif %}
{%- endif %}
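
The template wraps tool schemas in `<tools>` tags inside the system turn, folds tool results into user turns as `<tool_response>` blocks, and keeps the `<think>` reasoning block only for assistant turns after the last user query; when thinking is disabled it emits an empty `<think>\n\n</think>` block. A sketch exercising both paths through `apply_chat_template` (the `run_query` schema is a made-up example):

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("laion/nemotron-terminal-data_querying__Qwen3-8B")

tools = [{  # hypothetical function schema, only to exercise the tools branch
    "type": "function",
    "function": {
        "name": "run_query",
        "description": "Run a read-only SQL query.",
        "parameters": {"type": "object",
                       "properties": {"sql": {"type": "string"}},
                       "required": ["sql"]},
    },
}]
messages = [{"role": "user", "content": "How many rows are in the events table?"}]

# Thinking left enabled (default): the prompt ends with '<|im_start|>assistant\n'.
print(tok.apply_chat_template(messages, tools=tools, tokenize=False,
                              add_generation_prompt=True))

# Thinking disabled: the template appends '<think>\n\n</think>\n\n'.
print(tok.apply_chat_template(messages, tokenize=False,
                              add_generation_prompt=True, enable_thinking=False))
```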

68
config.json Normal file

@@ -0,0 +1,68 @@
{
"architectures": [
"Qwen3ForCausalLM"
],
"attention_bias": false,
"attention_dropout": 0.0,
"dtype": "bfloat16",
"eos_token_id": 151645,
"head_dim": 128,
"hidden_act": "silu",
"hidden_size": 4096,
"initializer_range": 0.02,
"intermediate_size": 12288,
"layer_types": [
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention"
],
"max_position_embeddings": 40960,
"max_window_layers": 36,
"model_type": "qwen3",
"num_attention_heads": 32,
"num_hidden_layers": 36,
"num_key_value_heads": 8,
"pad_token_id": 151643,
"rms_norm_eps": 1e-06,
"rope_scaling": null,
"rope_theta": 1000000,
"sliding_window": null,
"tie_word_embeddings": false,
"transformers_version": "4.57.6",
"use_cache": false,
"use_sliding_window": false,
"vocab_size": 151936
}
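
The geometry this config implies: 36 decoder layers, all full attention (no sliding window), with 32 query heads and 8 key/value heads of width 128, i.e. grouped-query attention with 4 query heads per KV head; hidden_size / num_attention_heads = 4096 / 32 = 128 matches `head_dim`. A sketch reading the same fields through `AutoConfig`:

```python
from transformers import AutoConfig

cfg = AutoConfig.from_pretrained("laion/nemotron-terminal-data_querying__Qwen3-8B")
print(cfg.model_type, cfg.num_hidden_layers, cfg.hidden_size)             # qwen3 36 4096
print("GQA groups:", cfg.num_attention_heads // cfg.num_key_value_heads)  # 4
print("head_dim:", cfg.head_dim)                                          # 128
print("context:", cfg.max_position_embeddings)                            # 40960
assert all(t == "full_attention" for t in cfg.layer_types)
```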

12
generation_config.json Normal file

@@ -0,0 +1,12 @@
{
"do_sample": true,
"eos_token_id": [
151645,
151643
],
"pad_token_id": 151643,
"temperature": 0.6,
"top_k": 20,
"top_p": 0.95,
"transformers_version": "4.57.6"
}
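
These match the sampling settings Qwen recommends for Qwen3 in thinking mode (temperature 0.6, top-p 0.95, top-k 20), and both `<|im_end|>` (151645) and `<|endoftext|>` (151643) terminate generation. `generate` picks this file up automatically; continuing the loading sketch above, the same defaults written out explicitly:

```python
# generation_config.json ships with the checkpoint, so generate() already
# applies these values; passing them explicitly only makes the call obvious.
out = model.generate(
    inputs,
    max_new_tokens=1024,
    do_sample=True,                  # per generation_config.json
    temperature=0.6,
    top_p=0.95,
    top_k=20,
    eos_token_id=[151645, 151643],   # <|im_end|>, <|endoftext|>
)
```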

151388
merges.txt Normal file

File diff suppressed because it is too large

3
model-00001-of-00004.safetensors Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:3bc8255a40e0722427bd8c344fe504fc3413ec34b1256bedfbc4aa902fc6434a
size 4902257696
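
Each weight shard is stored as a Git LFS pointer in the standard three-line format: a spec version line, a `sha256` object id, and the payload size in bytes; the actual tensors live in the LFS object store. A small parser sketch:

```python
# Sketch: parse the three-line Git LFS pointer format shown above.
def parse_lfs_pointer(text: str) -> dict:
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    algo, digest = fields["oid"].split(":", 1)
    return {"version": fields["version"], "oid_algo": algo,
            "oid": digest, "size": int(fields["size"])}

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:3bc8255a40e0722427bd8c344fe504fc3413ec34b1256bedfbc4aa902fc6434a
size 4902257696"""
info = parse_lfs_pointer(pointer)
print(info["oid"][:12], f"{info['size'] / 1e9:.2f} GB")   # ~4.90 GB shard
```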

3
model-00002-of-00004.safetensors Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0bfff865b32ecccb4028605a886b3429a331b0e29c81409f37775da5270a7f9d
size 4915960368

3
model-00003-of-00004.safetensors Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:00ad3031f2d92319195fe67b0a4ef52243188dca0b4811046113f24e205b69c5
size 4983068496

3
model-00004-of-00004.safetensors Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:567637bca903d9b5b81b3609f1368a58d91bd8fb9c5ad7728a559444a29bba93
size 1580230264

407
model.safetensors.index.json Normal file

@@ -0,0 +1,407 @@
{
"metadata": {
"total_parameters": 308224,
"total_size": 16381470720
},
"weight_map": {
"lm_head.weight": "model-00004-of-00004.safetensors",
"model.embed_tokens.weight": "model-00001-of-00004.safetensors",
"model.layers.0.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.0.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.0.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.0.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.0.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.0.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.0.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.0.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.0.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.0.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.0.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.1.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.1.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.1.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.1.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.10.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.10.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.10.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.10.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.10.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.10.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.10.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.10.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.10.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.10.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.10.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.11.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.11.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.11.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.11.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.12.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.12.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.12.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.12.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.13.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.13.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.13.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.13.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.14.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.14.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.14.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.14.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.15.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.15.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.15.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.15.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.16.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.16.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.16.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.16.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.17.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.17.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.17.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.17.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.18.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.18.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.18.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.18.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.19.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.19.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.19.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.19.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.2.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.2.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.2.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.2.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.2.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.2.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.2.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.2.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.2.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.2.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.2.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.20.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.20.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.20.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.20.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.20.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.20.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.20.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.20.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.20.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.20.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.20.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.21.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.21.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.21.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.21.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.22.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.22.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.22.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.22.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.22.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.22.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.22.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.22.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.22.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.22.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.22.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.23.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.23.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.23.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.23.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.23.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.23.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.23.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.23.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.23.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.23.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.23.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.24.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.24.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.24.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.24.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.25.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.25.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.25.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.25.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.26.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.26.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.26.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.26.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.27.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.27.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.27.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.27.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.28.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.28.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.28.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.28.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.29.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.29.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.29.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.29.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.3.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.3.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.3.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.3.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.3.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.3.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.3.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.3.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.3.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.3.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.3.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.30.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.30.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.30.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.30.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.30.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.30.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.30.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.30.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.30.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.30.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.30.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.31.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.31.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.31.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.31.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.32.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.32.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.32.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.32.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.33.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.33.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.33.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.33.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.34.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.34.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.34.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.34.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.35.input_layernorm.weight": "model-00004-of-00004.safetensors",
"model.layers.35.mlp.down_proj.weight": "model-00004-of-00004.safetensors",
"model.layers.35.mlp.gate_proj.weight": "model-00004-of-00004.safetensors",
"model.layers.35.mlp.up_proj.weight": "model-00004-of-00004.safetensors",
"model.layers.35.post_attention_layernorm.weight": "model-00004-of-00004.safetensors",
"model.layers.35.self_attn.k_norm.weight": "model-00004-of-00004.safetensors",
"model.layers.35.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.35.self_attn.o_proj.weight": "model-00004-of-00004.safetensors",
"model.layers.35.self_attn.q_norm.weight": "model-00004-of-00004.safetensors",
"model.layers.35.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.35.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.4.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.4.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.4.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.4.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.4.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.4.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.4.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.4.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.4.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.4.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.4.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.5.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.5.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.5.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.5.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.6.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.6.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.6.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.6.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.7.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.7.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.7.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.7.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.8.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.8.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.8.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.8.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.9.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.9.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.9.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.9.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.9.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.9.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.9.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.9.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.9.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.9.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.9.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.norm.weight": "model-00004-of-00004.safetensors"
}
}
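
The index maps every tensor name to the shard that contains it, which lets loaders open only the shards they need (note that some layers, e.g. 9, 22, and 35, straddle shard boundaries, with attention projections and MLP weights in different files). A sketch resolving one tensor, assuming a local snapshot directory and the `safetensors` package:

```python
# Sketch: find a tensor's shard via the index, then load just that tensor
# with safetensors' lazy reader.
import json
from safetensors import safe_open

with open("model.safetensors.index.json") as f:
    index = json.load(f)

name = "model.layers.35.mlp.down_proj.weight"
shard = index["weight_map"][name]      # -> model-00004-of-00004.safetensors
with safe_open(shard, framework="pt") as sf:
    w = sf.get_tensor(name)
print(shard, tuple(w.shape))
```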

12
run_summary.json Normal file

@@ -0,0 +1,12 @@
{
"agent_name": "d916c690c1e21f34515e500e36eeae6463c91d5f_thinking_preprocessed",
"training_start": null,
"training_end": null,
"created_by": "DCAgent",
"base_model_name": "Qwen/Qwen3-8B",
"dataset_name": "/e/data1/datasets/playground/ot/hf_hub/datasets--laion--nemotron-terminal-data_querying/snapshots/d916c690c1e21f34515e500e36eeae6463c91d5f_thinking_preprocessed",
"training_type": "SFT",
"training_parameters": "https://huggingface.co/laion/nemotron-terminal-data_querying__Qwen3-8B/blob/main/config.json",
"wandb_link": null,
"traces_location_s3": null
}

31
special_tokens_map.json Normal file

@@ -0,0 +1,31 @@
{
"additional_special_tokens": [
"<|im_start|>",
"<|im_end|>",
"<|object_ref_start|>",
"<|object_ref_end|>",
"<|box_start|>",
"<|box_end|>",
"<|quad_start|>",
"<|quad_end|>",
"<|vision_start|>",
"<|vision_end|>",
"<|vision_pad|>",
"<|image_pad|>",
"<|video_pad|>"
],
"eos_token": {
"content": "<|im_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false
},
"pad_token": {
"content": "<|endoftext|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false
}
}

3
tokenizer.json Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:aeb13307a71acd8fe81861d94ad54ab689df773318809eed3cbe794b4492dae4
size 11422654

240
tokenizer_config.json Normal file

@@ -0,0 +1,240 @@
{
"add_bos_token": false,
"add_prefix_space": false,
"added_tokens_decoder": {
"151643": {
"content": "<|endoftext|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151644": {
"content": "<|im_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151645": {
"content": "<|im_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151646": {
"content": "<|object_ref_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151647": {
"content": "<|object_ref_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151648": {
"content": "<|box_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151649": {
"content": "<|box_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151650": {
"content": "<|quad_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151651": {
"content": "<|quad_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151652": {
"content": "<|vision_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151653": {
"content": "<|vision_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151654": {
"content": "<|vision_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151655": {
"content": "<|image_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151656": {
"content": "<|video_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151657": {
"content": "<tool_call>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151658": {
"content": "</tool_call>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151659": {
"content": "<|fim_prefix|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151660": {
"content": "<|fim_middle|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151661": {
"content": "<|fim_suffix|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151662": {
"content": "<|fim_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151663": {
"content": "<|repo_name|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151664": {
"content": "<|file_sep|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151665": {
"content": "<tool_response>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151666": {
"content": "</tool_response>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151667": {
"content": "<think>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151668": {
"content": "</think>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
}
},
"additional_special_tokens": [
"<|im_start|>",
"<|im_end|>",
"<|object_ref_start|>",
"<|object_ref_end|>",
"<|box_start|>",
"<|box_end|>",
"<|quad_start|>",
"<|quad_end|>",
"<|vision_start|>",
"<|vision_end|>",
"<|vision_pad|>",
"<|image_pad|>",
"<|video_pad|>"
],
"bos_token": null,
"clean_up_tokenization_spaces": false,
"eos_token": "<|im_end|>",
"errors": "replace",
"extra_special_tokens": {},
"model_max_length": 32768,
"pad_token": "<|endoftext|>",
"padding_side": "right",
"split_special_tokens": false,
"tokenizer_class": "Qwen2Tokenizer",
"unk_token": null
}
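
Noteworthy settings: no BOS token, right-side padding with `<|endoftext|>`, and a `model_max_length` of 32,768 even though `config.json` allows 40,960 positions. A sketch verifying the loaded tokenizer against this file:

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("laion/nemotron-terminal-data_querying__Qwen3-8B")
assert tok.eos_token == "<|im_end|>" and tok.pad_token == "<|endoftext|>"
assert tok.padding_side == "right" and tok.bos_token is None
print(tok.model_max_length)   # 32768

# Right padding in action: the shorter sequence is padded with id 151643.
batch = tok(["short", "a somewhat longer input"], padding=True, return_tensors="pt")
print(batch["input_ids"])
```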

12
train_results.json Normal file

@@ -0,0 +1,12 @@
{
"achieved_tflops_per_gpu": 99403.5744791307,
"achieved_tflops_per_gpu_theoretical": 2858818.8951516366,
"epoch": 7.0,
"mfu_percent": 7024.987595698282,
"mfu_percent_theoretical": 202036.67103545135,
"total_flos": 2.638568480974045e+18,
"train_loss": 0.0,
"train_runtime": 0.8295,
"train_samples_per_second": 80459.329,
"train_steps_per_second": 843.831
}

146
trainer_log.jsonl Normal file

@@ -0,0 +1,146 @@
{"current_steps": 5, "total_steps": 700, "loss": 0.9272, "lr": 2.285714285714286e-06, "epoch": 0.050335570469798654, "percentage": 0.71, "elapsed_time": "0:02:16", "remaining_time": "5:16:49"}
{"current_steps": 10, "total_steps": 700, "loss": 0.8868, "lr": 5.142857142857142e-06, "epoch": 0.10067114093959731, "percentage": 1.43, "elapsed_time": "0:04:15", "remaining_time": "4:53:41"}
{"current_steps": 15, "total_steps": 700, "loss": 0.7848, "lr": 8.000000000000001e-06, "epoch": 0.15100671140939598, "percentage": 2.14, "elapsed_time": "0:06:25", "remaining_time": "4:53:08"}
{"current_steps": 20, "total_steps": 700, "loss": 0.7382, "lr": 1.0857142857142858e-05, "epoch": 0.20134228187919462, "percentage": 2.86, "elapsed_time": "0:08:29", "remaining_time": "4:48:33"}
{"current_steps": 25, "total_steps": 700, "loss": 0.7031, "lr": 1.3714285714285716e-05, "epoch": 0.2516778523489933, "percentage": 3.57, "elapsed_time": "0:10:34", "remaining_time": "4:45:33"}
{"current_steps": 30, "total_steps": 700, "loss": 0.6708, "lr": 1.6571428571428574e-05, "epoch": 0.30201342281879195, "percentage": 4.29, "elapsed_time": "0:12:39", "remaining_time": "4:42:32"}
{"current_steps": 35, "total_steps": 700, "loss": 0.6344, "lr": 1.942857142857143e-05, "epoch": 0.3523489932885906, "percentage": 5.0, "elapsed_time": "0:14:52", "remaining_time": "4:42:45"}
{"current_steps": 40, "total_steps": 700, "loss": 0.6074, "lr": 2.2285714285714287e-05, "epoch": 0.40268456375838924, "percentage": 5.71, "elapsed_time": "0:17:00", "remaining_time": "4:40:36"}
{"current_steps": 45, "total_steps": 700, "loss": 0.5747, "lr": 2.5142857142857143e-05, "epoch": 0.45302013422818793, "percentage": 6.43, "elapsed_time": "0:19:08", "remaining_time": "4:38:39"}
{"current_steps": 50, "total_steps": 700, "loss": 0.5489, "lr": 2.8e-05, "epoch": 0.5033557046979866, "percentage": 7.14, "elapsed_time": "0:21:12", "remaining_time": "4:35:46"}
{"current_steps": 55, "total_steps": 700, "loss": 0.5397, "lr": 3.085714285714286e-05, "epoch": 0.5536912751677853, "percentage": 7.86, "elapsed_time": "0:23:17", "remaining_time": "4:33:13"}
{"current_steps": 60, "total_steps": 700, "loss": 0.5154, "lr": 3.3714285714285716e-05, "epoch": 0.6040268456375839, "percentage": 8.57, "elapsed_time": "0:25:31", "remaining_time": "4:32:13"}
{"current_steps": 65, "total_steps": 700, "loss": 0.5001, "lr": 3.6571428571428576e-05, "epoch": 0.6543624161073825, "percentage": 9.29, "elapsed_time": "0:27:33", "remaining_time": "4:29:09"}
{"current_steps": 70, "total_steps": 700, "loss": 0.4901, "lr": 3.9428571428571435e-05, "epoch": 0.7046979865771812, "percentage": 10.0, "elapsed_time": "0:29:37", "remaining_time": "4:26:37"}
{"current_steps": 75, "total_steps": 700, "loss": 0.4879, "lr": 3.9996021455410475e-05, "epoch": 0.7550335570469798, "percentage": 10.71, "elapsed_time": "0:31:43", "remaining_time": "4:24:18"}
{"current_steps": 80, "total_steps": 700, "loss": 0.4644, "lr": 3.9979861330826295e-05, "epoch": 0.8053691275167785, "percentage": 11.43, "elapsed_time": "0:33:52", "remaining_time": "4:22:35"}
{"current_steps": 85, "total_steps": 700, "loss": 0.4603, "lr": 3.9951281005196486e-05, "epoch": 0.8557046979865772, "percentage": 12.14, "elapsed_time": "0:35:57", "remaining_time": "4:20:09"}
{"current_steps": 90, "total_steps": 700, "loss": 0.4573, "lr": 3.99102982450803e-05, "epoch": 0.9060402684563759, "percentage": 12.86, "elapsed_time": "0:38:08", "remaining_time": "4:18:28"}
{"current_steps": 95, "total_steps": 700, "loss": 0.4411, "lr": 3.985693852683675e-05, "epoch": 0.9563758389261745, "percentage": 13.57, "elapsed_time": "0:40:14", "remaining_time": "4:16:15"}
{"current_steps": 100, "total_steps": 700, "loss": 0.4358, "lr": 3.9791235020787546e-05, "epoch": 1.0, "percentage": 14.29, "elapsed_time": "0:42:07", "remaining_time": "4:12:43"}
{"current_steps": 105, "total_steps": 700, "loss": 0.4463, "lr": 3.971322857059726e-05, "epoch": 1.0503355704697988, "percentage": 15.0, "elapsed_time": "0:44:07", "remaining_time": "4:10:02"}
{"current_steps": 110, "total_steps": 700, "loss": 0.4345, "lr": 3.962296766788345e-05, "epoch": 1.1006711409395973, "percentage": 15.71, "elapsed_time": "0:46:13", "remaining_time": "4:07:58"}
{"current_steps": 115, "total_steps": 700, "loss": 0.4304, "lr": 3.952050842207249e-05, "epoch": 1.151006711409396, "percentage": 16.43, "elapsed_time": "0:48:21", "remaining_time": "4:05:57"}
{"current_steps": 120, "total_steps": 700, "loss": 0.4331, "lr": 3.940591452551993e-05, "epoch": 1.2013422818791946, "percentage": 17.14, "elapsed_time": "0:50:25", "remaining_time": "4:03:44"}
{"current_steps": 125, "total_steps": 700, "loss": 0.4195, "lr": 3.927925721391707e-05, "epoch": 1.2516778523489933, "percentage": 17.86, "elapsed_time": "0:52:34", "remaining_time": "4:01:50"}
{"current_steps": 130, "total_steps": 700, "loss": 0.4256, "lr": 3.914061522200825e-05, "epoch": 1.302013422818792, "percentage": 18.57, "elapsed_time": "1:01:03", "remaining_time": "4:27:43"}
{"current_steps": 135, "total_steps": 700, "loss": 0.4201, "lr": 3.899007473464653e-05, "epoch": 1.3523489932885906, "percentage": 19.29, "elapsed_time": "1:03:13", "remaining_time": "4:24:37"}
{"current_steps": 140, "total_steps": 700, "loss": 0.4214, "lr": 3.882772933321807e-05, "epoch": 1.4026845637583891, "percentage": 20.0, "elapsed_time": "1:05:23", "remaining_time": "4:21:34"}
{"current_steps": 145, "total_steps": 700, "loss": 0.4147, "lr": 3.8653679937468556e-05, "epoch": 1.4530201342281879, "percentage": 20.71, "elapsed_time": "1:07:26", "remaining_time": "4:18:08"}
{"current_steps": 150, "total_steps": 700, "loss": 0.4086, "lr": 3.846803474276789e-05, "epoch": 1.5033557046979866, "percentage": 21.43, "elapsed_time": "1:09:26", "remaining_time": "4:14:37"}
{"current_steps": 155, "total_steps": 700, "loss": 0.4146, "lr": 3.827090915285202e-05, "epoch": 1.5536912751677852, "percentage": 22.14, "elapsed_time": "1:11:39", "remaining_time": "4:11:56"}
{"current_steps": 160, "total_steps": 700, "loss": 0.4118, "lr": 3.806242570808384e-05, "epoch": 1.604026845637584, "percentage": 22.86, "elapsed_time": "1:13:43", "remaining_time": "4:08:48"}
{"current_steps": 165, "total_steps": 700, "loss": 0.4131, "lr": 3.7842714009277675e-05, "epoch": 1.6543624161073827, "percentage": 23.57, "elapsed_time": "1:15:46", "remaining_time": "4:05:43"}
{"current_steps": 170, "total_steps": 700, "loss": 0.4071, "lr": 3.761191063713476e-05, "epoch": 1.7046979865771812, "percentage": 24.29, "elapsed_time": "1:17:50", "remaining_time": "4:02:40"}
{"current_steps": 175, "total_steps": 700, "loss": 0.4106, "lr": 3.737015906733978e-05, "epoch": 1.7550335570469797, "percentage": 25.0, "elapsed_time": "1:20:02", "remaining_time": "4:00:06"}
{"current_steps": 180, "total_steps": 700, "loss": 0.4056, "lr": 3.711760958137118e-05, "epoch": 1.8053691275167785, "percentage": 25.71, "elapsed_time": "1:22:09", "remaining_time": "3:57:19"}
{"current_steps": 185, "total_steps": 700, "loss": 0.4115, "lr": 3.6854419173080784e-05, "epoch": 1.8557046979865772, "percentage": 26.43, "elapsed_time": "1:24:13", "remaining_time": "3:54:27"}
{"current_steps": 190, "total_steps": 700, "loss": 0.4071, "lr": 3.658075145110083e-05, "epoch": 1.9060402684563758, "percentage": 27.14, "elapsed_time": "1:26:12", "remaining_time": "3:51:24"}
{"current_steps": 195, "total_steps": 700, "loss": 0.4078, "lr": 3.6296776537138905e-05, "epoch": 1.9563758389261745, "percentage": 27.86, "elapsed_time": "1:28:15", "remaining_time": "3:48:33"}
{"current_steps": 200, "total_steps": 700, "loss": 0.4128, "lr": 3.600267096022413e-05, "epoch": 2.0, "percentage": 28.57, "elapsed_time": "1:30:04", "remaining_time": "3:45:12"}
{"current_steps": 205, "total_steps": 700, "loss": 0.3993, "lr": 3.569861754697045e-05, "epoch": 2.0503355704697985, "percentage": 29.29, "elapsed_time": "1:31:59", "remaining_time": "3:42:08"}
{"current_steps": 210, "total_steps": 700, "loss": 0.3982, "lr": 3.538480530792498e-05, "epoch": 2.1006711409395975, "percentage": 30.0, "elapsed_time": "1:34:06", "remaining_time": "3:39:34"}
{"current_steps": 215, "total_steps": 700, "loss": 0.3894, "lr": 3.5061429320072225e-05, "epoch": 2.151006711409396, "percentage": 30.71, "elapsed_time": "1:36:07", "remaining_time": "3:36:50"}
{"current_steps": 220, "total_steps": 700, "loss": 0.3953, "lr": 3.472869060556724e-05, "epoch": 2.2013422818791946, "percentage": 31.43, "elapsed_time": "1:38:16", "remaining_time": "3:34:24"}
{"current_steps": 225, "total_steps": 700, "loss": 0.3862, "lr": 3.438679600677303e-05, "epoch": 2.251677852348993, "percentage": 32.14, "elapsed_time": "1:40:18", "remaining_time": "3:31:45"}
{"current_steps": 230, "total_steps": 700, "loss": 0.3993, "lr": 3.4035958057679836e-05, "epoch": 2.302013422818792, "percentage": 32.86, "elapsed_time": "1:42:31", "remaining_time": "3:29:29"}
{"current_steps": 235, "total_steps": 700, "loss": 0.3919, "lr": 3.36763948517864e-05, "epoch": 2.3523489932885906, "percentage": 33.57, "elapsed_time": "1:44:37", "remaining_time": "3:27:02"}
{"current_steps": 240, "total_steps": 700, "loss": 0.3958, "lr": 3.330832990652523e-05, "epoch": 2.402684563758389, "percentage": 34.29, "elapsed_time": "1:46:39", "remaining_time": "3:24:26"}
{"current_steps": 245, "total_steps": 700, "loss": 0.3947, "lr": 3.293199202431599e-05, "epoch": 2.453020134228188, "percentage": 35.0, "elapsed_time": "1:48:49", "remaining_time": "3:22:06"}
{"current_steps": 250, "total_steps": 700, "loss": 0.3863, "lr": 3.2547615150333855e-05, "epoch": 2.5033557046979866, "percentage": 35.71, "elapsed_time": "1:50:56", "remaining_time": "3:19:41"}
{"current_steps": 255, "total_steps": 700, "loss": 0.3934, "lr": 3.2155438227080607e-05, "epoch": 2.553691275167785, "percentage": 36.43, "elapsed_time": "1:53:00", "remaining_time": "3:17:12"}
{"current_steps": 260, "total_steps": 700, "loss": 0.3912, "lr": 3.1755705045849465e-05, "epoch": 2.604026845637584, "percentage": 37.14, "elapsed_time": "1:54:59", "remaining_time": "3:14:36"}
{"current_steps": 265, "total_steps": 700, "loss": 0.3956, "lr": 3.134866409517564e-05, "epoch": 2.6543624161073827, "percentage": 37.86, "elapsed_time": "1:57:02", "remaining_time": "3:12:06"}
{"current_steps": 270, "total_steps": 700, "loss": 0.3917, "lr": 3.0934568406366875e-05, "epoch": 2.704697986577181, "percentage": 38.57, "elapsed_time": "1:59:07", "remaining_time": "3:09:43"}
{"current_steps": 275, "total_steps": 700, "loss": 0.3851, "lr": 3.0513675396210094e-05, "epoch": 2.7550335570469797, "percentage": 39.29, "elapsed_time": "2:01:13", "remaining_time": "3:07:20"}
{"current_steps": 280, "total_steps": 700, "loss": 0.3859, "lr": 3.0086246706951888e-05, "epoch": 2.8053691275167782, "percentage": 40.0, "elapsed_time": "2:03:21", "remaining_time": "3:05:02"}
{"current_steps": 285, "total_steps": 700, "loss": 0.3828, "lr": 2.965254804365222e-05, "epoch": 2.8557046979865772, "percentage": 40.71, "elapsed_time": "2:05:24", "remaining_time": "3:02:36"}
{"current_steps": 290, "total_steps": 700, "loss": 0.3876, "lr": 2.921284900901265e-05, "epoch": 2.9060402684563758, "percentage": 41.43, "elapsed_time": "2:07:35", "remaining_time": "3:00:23"}
{"current_steps": 295, "total_steps": 700, "loss": 0.3867, "lr": 2.876742293578155e-05, "epoch": 2.9563758389261743, "percentage": 42.14, "elapsed_time": "2:09:39", "remaining_time": "2:58:00"}
{"current_steps": 300, "total_steps": 700, "loss": 0.3839, "lr": 2.831654671684066e-05, "epoch": 3.0, "percentage": 42.86, "elapsed_time": "2:11:26", "remaining_time": "2:55:15"}
{"current_steps": 305, "total_steps": 700, "loss": 0.3796, "lr": 2.7860500633078475e-05, "epoch": 3.0503355704697985, "percentage": 43.57, "elapsed_time": "2:13:43", "remaining_time": "2:53:10"}
{"current_steps": 310, "total_steps": 700, "loss": 0.3766, "lr": 2.7399568179157582e-05, "epoch": 3.1006711409395975, "percentage": 44.29, "elapsed_time": "2:15:48", "remaining_time": "2:50:51"}
{"current_steps": 315, "total_steps": 700, "loss": 0.3801, "lr": 2.693403588728415e-05, "epoch": 3.151006711409396, "percentage": 45.0, "elapsed_time": "2:17:53", "remaining_time": "2:48:31"}
{"current_steps": 320, "total_steps": 700, "loss": 0.3812, "lr": 2.6464193149089204e-05, "epoch": 3.2013422818791946, "percentage": 45.71, "elapsed_time": "2:19:52", "remaining_time": "2:46:06"}
{"current_steps": 325, "total_steps": 700, "loss": 0.3745, "lr": 2.5990332035732388e-05, "epoch": 3.251677852348993, "percentage": 46.43, "elapsed_time": "2:21:48", "remaining_time": "2:43:37"}
{"current_steps": 330, "total_steps": 700, "loss": 0.3774, "lr": 2.5512747116339985e-05, "epoch": 3.302013422818792, "percentage": 47.14, "elapsed_time": "2:23:53", "remaining_time": "2:41:20"}
{"current_steps": 335, "total_steps": 700, "loss": 0.3814, "lr": 2.5031735274890176e-05, "epoch": 3.3523489932885906, "percentage": 47.86, "elapsed_time": "2:25:57", "remaining_time": "2:39:01"}
{"current_steps": 340, "total_steps": 700, "loss": 0.376, "lr": 2.454759552565923e-05, "epoch": 3.402684563758389, "percentage": 48.57, "elapsed_time": "2:28:02", "remaining_time": "2:36:44"}
{"current_steps": 345, "total_steps": 700, "loss": 0.3756, "lr": 2.4060628827343525e-05, "epoch": 3.453020134228188, "percentage": 49.29, "elapsed_time": "2:30:09", "remaining_time": "2:34:30"}
{"current_steps": 350, "total_steps": 700, "loss": 0.3786, "lr": 2.3571137895972735e-05, "epoch": 3.5033557046979866, "percentage": 50.0, "elapsed_time": "2:32:17", "remaining_time": "2:32:17"}
{"current_steps": 355, "total_steps": 700, "loss": 0.3791, "lr": 2.307942701673067e-05, "epoch": 3.553691275167785, "percentage": 50.71, "elapsed_time": "2:34:24", "remaining_time": "2:30:03"}
{"current_steps": 360, "total_steps": 700, "loss": 0.3864, "lr": 2.258580185480067e-05, "epoch": 3.604026845637584, "percentage": 51.43, "elapsed_time": "2:36:30", "remaining_time": "2:27:48"}
{"current_steps": 365, "total_steps": 700, "loss": 0.3783, "lr": 2.209056926535307e-05, "epoch": 3.6543624161073827, "percentage": 52.14, "elapsed_time": "2:38:44", "remaining_time": "2:25:41"}
{"current_steps": 370, "total_steps": 700, "loss": 0.3763, "lr": 2.1594037102793054e-05, "epoch": 3.704697986577181, "percentage": 52.86, "elapsed_time": "2:40:41", "remaining_time": "2:23:19"}
{"current_steps": 375, "total_steps": 700, "loss": 0.3747, "lr": 2.1096514029387204e-05, "epoch": 3.7550335570469797, "percentage": 53.57, "elapsed_time": "2:42:51", "remaining_time": "2:21:09"}
{"current_steps": 380, "total_steps": 700, "loss": 0.3776, "lr": 2.0598309323387974e-05, "epoch": 3.8053691275167782, "percentage": 54.29, "elapsed_time": "2:44:50", "remaining_time": "2:18:48"}
{"current_steps": 385, "total_steps": 700, "loss": 0.3745, "lr": 2.0099732686775165e-05, "epoch": 3.8557046979865772, "percentage": 55.0, "elapsed_time": "2:46:53", "remaining_time": "2:16:32"}
{"current_steps": 390, "total_steps": 700, "loss": 0.3767, "lr": 1.9601094052734043e-05, "epoch": 3.9060402684563758, "percentage": 55.71, "elapsed_time": "2:48:58", "remaining_time": "2:14:18"}
{"current_steps": 395, "total_steps": 700, "loss": 0.374, "lr": 1.910270339298971e-05, "epoch": 3.9563758389261743, "percentage": 56.43, "elapsed_time": "2:51:08", "remaining_time": "2:12:08"}
{"current_steps": 400, "total_steps": 700, "loss": 0.3722, "lr": 1.8604870525117496e-05, "epoch": 4.0, "percentage": 57.14, "elapsed_time": "2:52:52", "remaining_time": "2:09:39"}
{"current_steps": 405, "total_steps": 700, "loss": 0.3755, "lr": 1.810790491994926e-05, "epoch": 4.050335570469799, "percentage": 57.86, "elapsed_time": "2:54:52", "remaining_time": "2:07:22"}
{"current_steps": 410, "total_steps": 700, "loss": 0.3774, "lr": 1.7612115509195118e-05, "epoch": 4.100671140939597, "percentage": 58.57, "elapsed_time": "2:56:53", "remaining_time": "2:05:06"}
{"current_steps": 415, "total_steps": 700, "loss": 0.3698, "lr": 1.7117810493400403e-05, "epoch": 4.151006711409396, "percentage": 59.29, "elapsed_time": "2:58:55", "remaining_time": "2:02:52"}
{"current_steps": 420, "total_steps": 700, "loss": 0.3661, "lr": 1.6625297150357103e-05, "epoch": 4.201342281879195, "percentage": 60.0, "elapsed_time": "3:01:05", "remaining_time": "2:00:43"}
{"current_steps": 425, "total_steps": 700, "loss": 0.3726, "lr": 1.613488164408894e-05, "epoch": 4.251677852348993, "percentage": 60.71, "elapsed_time": "3:03:03", "remaining_time": "1:58:26"}
{"current_steps": 430, "total_steps": 700, "loss": 0.3754, "lr": 1.5646868834528756e-05, "epoch": 4.302013422818792, "percentage": 61.43, "elapsed_time": "3:05:09", "remaining_time": "1:56:15"}
{"current_steps": 435, "total_steps": 700, "loss": 0.3683, "lr": 1.5161562088006649e-05, "epoch": 4.35234899328859, "percentage": 62.14, "elapsed_time": "3:07:15", "remaining_time": "1:54:04"}
{"current_steps": 440, "total_steps": 700, "loss": 0.3727, "lr": 1.46792630886665e-05, "epoch": 4.402684563758389, "percentage": 62.86, "elapsed_time": "3:09:15", "remaining_time": "1:51:50"}
{"current_steps": 445, "total_steps": 700, "loss": 0.3691, "lr": 1.4200271650928277e-05, "epoch": 4.453020134228188, "percentage": 63.57, "elapsed_time": "3:11:23", "remaining_time": "1:49:40"}
{"current_steps": 450, "total_steps": 700, "loss": 0.3713, "lr": 1.3724885533112595e-05, "epoch": 4.503355704697986, "percentage": 64.29, "elapsed_time": "3:13:26", "remaining_time": "1:47:28"}
{"current_steps": 455, "total_steps": 700, "loss": 0.3666, "lr": 1.3253400252343403e-05, "epoch": 4.553691275167785, "percentage": 65.0, "elapsed_time": "3:15:31", "remaining_time": "1:45:16"}
{"current_steps": 460, "total_steps": 700, "loss": 0.3675, "lr": 1.2786108900843927e-05, "epoch": 4.604026845637584, "percentage": 65.71, "elapsed_time": "3:17:32", "remaining_time": "1:43:04"}
{"current_steps": 465, "total_steps": 700, "loss": 0.3664, "lr": 1.2323301963739995e-05, "epoch": 4.654362416107382, "percentage": 66.43, "elapsed_time": "3:19:37", "remaining_time": "1:40:53"}
{"current_steps": 470, "total_steps": 700, "loss": 0.367, "lr": 1.1865267138484e-05, "epoch": 4.704697986577181, "percentage": 67.14, "elapsed_time": "3:21:47", "remaining_time": "1:38:45"}
{"current_steps": 475, "total_steps": 700, "loss": 0.3686, "lr": 1.1412289156011816e-05, "epoch": 4.75503355704698, "percentage": 67.86, "elapsed_time": "3:23:55", "remaining_time": "1:36:35"}
{"current_steps": 480, "total_steps": 700, "loss": 0.3681, "lr": 1.0964649603743837e-05, "epoch": 4.805369127516778, "percentage": 68.57, "elapsed_time": "3:25:59", "remaining_time": "1:34:24"}
{"current_steps": 485, "total_steps": 700, "loss": 0.3656, "lr": 1.0522626750540029e-05, "epoch": 4.855704697986577, "percentage": 69.29, "elapsed_time": "3:28:03", "remaining_time": "1:32:14"}
{"current_steps": 490, "total_steps": 700, "loss": 0.3677, "lr": 1.0086495373718048e-05, "epoch": 4.906040268456376, "percentage": 70.0, "elapsed_time": "3:30:08", "remaining_time": "1:30:03"}
{"current_steps": 495, "total_steps": 700, "loss": 0.3679, "lr": 9.656526588241745e-06, "epoch": 4.956375838926174, "percentage": 70.71, "elapsed_time": "3:32:12", "remaining_time": "1:27:53"}
{"current_steps": 500, "total_steps": 700, "loss": 0.3708, "lr": 9.232987678186357e-06, "epoch": 5.0, "percentage": 71.43, "elapsed_time": "3:34:02", "remaining_time": "1:25:37"}
{"current_steps": 505, "total_steps": 700, "loss": 0.3629, "lr": 8.816141930585067e-06, "epoch": 5.050335570469799, "percentage": 72.14, "elapsed_time": "3:36:02", "remaining_time": "1:23:25"}
{"current_steps": 510, "total_steps": 700, "loss": 0.3678, "lr": 8.406248471760357e-06, "epoch": 5.100671140939597, "percentage": 72.86, "elapsed_time": "3:37:59", "remaining_time": "1:21:12"}
{"current_steps": 515, "total_steps": 700, "loss": 0.3623, "lr": 8.003562106241727e-06, "epoch": 5.151006711409396, "percentage": 73.57, "elapsed_time": "3:40:00", "remaining_time": "1:19:01"}
{"current_steps": 520, "total_steps": 700, "loss": 0.3611, "lr": 7.608333158370036e-06, "epoch": 5.201342281879195, "percentage": 74.29, "elapsed_time": "3:41:59", "remaining_time": "1:16:50"}
{"current_steps": 525, "total_steps": 700, "loss": 0.3691, "lr": 7.220807316686886e-06, "epoch": 5.251677852348993, "percentage": 75.0, "elapsed_time": "3:43:54", "remaining_time": "1:14:38"}
{"current_steps": 530, "total_steps": 700, "loss": 0.3635, "lr": 6.841225481205749e-06, "epoch": 5.302013422818792, "percentage": 75.71, "elapsed_time": "3:45:57", "remaining_time": "1:12:28"}
{"current_steps": 535, "total_steps": 700, "loss": 0.3651, "lr": 6.469823613659896e-06, "epoch": 5.35234899328859, "percentage": 76.43, "elapsed_time": "3:48:04", "remaining_time": "1:10:20"}
{"current_steps": 540, "total_steps": 700, "loss": 0.365, "lr": 6.106832590820053e-06, "epoch": 5.402684563758389, "percentage": 77.14, "elapsed_time": "3:50:07", "remaining_time": "1:08:11"}
{"current_steps": 545, "total_steps": 700, "loss": 0.3669, "lr": 5.752478060973108e-06, "epoch": 5.453020134228188, "percentage": 77.86, "elapsed_time": "3:52:20", "remaining_time": "1:06:04"}
{"current_steps": 550, "total_steps": 700, "loss": 0.3632, "lr": 5.406980303650984e-06, "epoch": 5.503355704697986, "percentage": 78.57, "elapsed_time": "3:54:27", "remaining_time": "1:03:56"}
{"current_steps": 555, "total_steps": 700, "loss": 0.3658, "lr": 5.070554092696997e-06, "epoch": 5.553691275167785, "percentage": 79.29, "elapsed_time": "3:56:36", "remaining_time": "1:01:49"}
{"current_steps": 560, "total_steps": 700, "loss": 0.3649, "lr": 4.74340856275467e-06, "epoch": 5.604026845637584, "percentage": 80.0, "elapsed_time": "3:58:43", "remaining_time": "0:59:40"}
{"current_steps": 565, "total_steps": 700, "loss": 0.3674, "lr": 4.425747079262121e-06, "epoch": 5.654362416107382, "percentage": 80.71, "elapsed_time": "4:00:42", "remaining_time": "0:57:30"}
{"current_steps": 570, "total_steps": 700, "loss": 0.3626, "lr": 4.11776711203278e-06, "epoch": 5.704697986577181, "percentage": 81.43, "elapsed_time": "4:02:53", "remaining_time": "0:55:23"}
{"current_steps": 575, "total_steps": 700, "loss": 0.3604, "lr": 3.819660112501053e-06, "epoch": 5.75503355704698, "percentage": 82.14, "elapsed_time": "4:05:00", "remaining_time": "0:53:15"}
{"current_steps": 580, "total_steps": 700, "loss": 0.3684, "lr": 3.531611394709216e-06, "epoch": 5.805369127516778, "percentage": 82.86, "elapsed_time": "4:07:05", "remaining_time": "0:51:07"}
{"current_steps": 585, "total_steps": 700, "loss": 0.3663, "lr": 3.2538000201095363e-06, "epoch": 5.855704697986577, "percentage": 83.57, "elapsed_time": "4:09:09", "remaining_time": "0:48:58"}
{"current_steps": 590, "total_steps": 700, "loss": 0.3701, "lr": 2.986398686253211e-06, "epoch": 5.906040268456376, "percentage": 84.29, "elapsed_time": "4:11:06", "remaining_time": "0:46:49"}
{"current_steps": 595, "total_steps": 700, "loss": 0.363, "lr": 2.729573619435384e-06, "epoch": 5.956375838926174, "percentage": 85.0, "elapsed_time": "4:13:03", "remaining_time": "0:44:39"}
{"current_steps": 600, "total_steps": 700, "loss": 0.362, "lr": 2.483484471362869e-06, "epoch": 6.0, "percentage": 85.71, "elapsed_time": "4:14:48", "remaining_time": "0:42:28"}
{"current_steps": 605, "total_steps": 700, "loss": 0.3644, "lr": 2.248284219908918e-06, "epoch": 6.050335570469799, "percentage": 86.43, "elapsed_time": "4:17:05", "remaining_time": "0:40:22"}
{"current_steps": 610, "total_steps": 700, "loss": 0.3661, "lr": 2.024119074016664e-06, "epoch": 6.100671140939597, "percentage": 87.14, "elapsed_time": "4:19:07", "remaining_time": "0:38:13"}
{"current_steps": 615, "total_steps": 700, "loss": 0.3666, "lr": 1.8111283828103566e-06, "epoch": 6.151006711409396, "percentage": 87.86, "elapsed_time": "4:21:15", "remaining_time": "0:36:06"}
{"current_steps": 620, "total_steps": 700, "loss": 0.3646, "lr": 1.6094445489709886e-06, "epoch": 6.201342281879195, "percentage": 88.57, "elapsed_time": "4:23:18", "remaining_time": "0:33:58"}
{"current_steps": 625, "total_steps": 700, "loss": 0.3607, "lr": 1.4191929464299481e-06, "epoch": 6.251677852348993, "percentage": 89.29, "elapsed_time": "4:25:30", "remaining_time": "0:31:51"}
{"current_steps": 630, "total_steps": 700, "loss": 0.3607, "lr": 1.2404918424321277e-06, "epoch": 6.302013422818792, "percentage": 90.0, "elapsed_time": "4:27:34", "remaining_time": "0:29:43"}
{"current_steps": 635, "total_steps": 700, "loss": 0.3614, "lr": 1.073452324016715e-06, "epoch": 6.35234899328859, "percentage": 90.71, "elapsed_time": "4:29:34", "remaining_time": "0:27:35"}
{"current_steps": 640, "total_steps": 700, "loss": 0.3569, "lr": 9.181782289615149e-07, "epoch": 6.402684563758389, "percentage": 91.43, "elapsed_time": "4:31:41", "remaining_time": "0:25:28"}
{"current_steps": 645, "total_steps": 700, "loss": 0.3612, "lr": 7.747660812336221e-07, "epoch": 6.453020134228188, "percentage": 92.14, "elapsed_time": "4:33:45", "remaining_time": "0:23:20"}
{"current_steps": 650, "total_steps": 700, "loss": 0.3636, "lr": 6.433050309866717e-07, "epoch": 6.503355704697986, "percentage": 92.86, "elapsed_time": "4:35:41", "remaining_time": "0:21:12"}
{"current_steps": 655, "total_steps": 700, "loss": 0.3608, "lr": 5.238767991418737e-07, "epoch": 6.553691275167785, "percentage": 93.57, "elapsed_time": "4:37:48", "remaining_time": "0:19:05"}
{"current_steps": 660, "total_steps": 700, "loss": 0.3589, "lr": 4.165556265873716e-07, "epoch": 6.604026845637584, "percentage": 94.29, "elapsed_time": "4:39:54", "remaining_time": "0:16:57"}
{"current_steps": 665, "total_steps": 700, "loss": 0.3652, "lr": 3.214082280274067e-07, "epoch": 6.654362416107382, "percentage": 95.0, "elapsed_time": "4:41:53", "remaining_time": "0:14:50"}
{"current_steps": 670, "total_steps": 700, "loss": 0.368, "lr": 2.384937505100804e-07, "epoch": 6.704697986577181, "percentage": 95.71, "elapsed_time": "4:43:56", "remaining_time": "0:12:42"}
{"current_steps": 675, "total_steps": 700, "loss": 0.3692, "lr": 1.6786373665939492e-07, "epoch": 6.75503355704698, "percentage": 96.43, "elapsed_time": "4:46:00", "remaining_time": "0:10:35"}
{"current_steps": 680, "total_steps": 700, "loss": 0.3633, "lr": 1.0956209263453421e-07, "epoch": 6.805369127516778, "percentage": 97.14, "elapsed_time": "4:47:56", "remaining_time": "0:08:28"}
{"current_steps": 685, "total_steps": 700, "loss": 0.3628, "lr": 6.362506083618103e-08, "epoch": 6.855704697986577, "percentage": 97.86, "elapsed_time": "4:50:02", "remaining_time": "0:06:21"}
{"current_steps": 690, "total_steps": 700, "loss": 0.3574, "lr": 3.0081197376965465e-08, "epoch": 6.906040268456376, "percentage": 98.57, "elapsed_time": "4:52:05", "remaining_time": "0:04:13"}
{"current_steps": 695, "total_steps": 700, "loss": 0.3671, "lr": 8.951354329933548e-09, "epoch": 6.956375838926174, "percentage": 99.29, "elapsed_time": "4:54:13", "remaining_time": "0:02:07"}
{"current_steps": 700, "total_steps": 700, "loss": 0.3647, "lr": 2.486667661627529e-10, "epoch": 7.0, "percentage": 100.0, "elapsed_time": "4:55:54", "remaining_time": "0:00:00"}
{"current_steps": 700, "total_steps": 700, "epoch": 7.0, "percentage": 100.0, "elapsed_time": "4:56:04", "remaining_time": "0:00:00"}
{"current_steps": 700, "total_steps": 700, "epoch": 7.0, "percentage": 100.0, "elapsed_time": "0:00:00", "remaining_time": "0:00:00"}
{"current_steps": 700, "total_steps": 700, "epoch": 7.0, "percentage": 100.0, "elapsed_time": "0:00:00", "remaining_time": "0:00:00"}
{"current_steps": 700, "total_steps": 700, "epoch": 7.0, "percentage": 100.0, "elapsed_time": "0:00:00", "remaining_time": "0:00:00"}
{"current_steps": 700, "total_steps": 700, "epoch": 7.0, "percentage": 100.0, "elapsed_time": "0:00:00", "remaining_time": "0:00:00"}
{"current_steps": 700, "total_steps": 700, "epoch": 7.0, "percentage": 100.0, "elapsed_time": "0:00:00", "remaining_time": "0:00:00"}

1583
trainer_state.json Normal file

File diff suppressed because it is too large

3
training_args.bin Normal file
View File

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:03455cd943574bd4ca748dcb5c2ef3419331defdcd09dd776ed4031577019278
size 8721
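The three lines above are a standard Git LFS pointer: the actual `training_args.bin` lives in LFS storage and is addressed by its SHA-256 digest and byte size. A minimal sketch for verifying a downloaded blob against such a pointer (the local path is an assumption):

```python
import hashlib
import os


def lfs_oid(path: str, chunk_size: int = 1 << 20) -> str:
    """SHA-256 of a file, streamed in chunks, as a hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


# Values copied from the pointer file above.
expected_oid = "03455cd943574bd4ca748dcb5c2ef3419331defdcd09dd776ed4031577019278"
expected_size = 8721

assert os.path.getsize("training_args.bin") == expected_size
assert lfs_oid("training_args.bin") == expected_oid
```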

BIN
training_loss.png Normal file

Binary file not shown.

After  |  Size: 37 KiB

1
vocab.json Normal file

File diff suppressed because one or more lines are too long