Initialize the project; model provided by the ModelHub XC community

Model: penfever/GLM-4_6-gemini25flash-stackexchange-overflow-32ep-512k-fixeps
Source: Original Platform
Commit: b2cdc4c017
Author: ModelHub XC
Date: 2026-04-23 16:51:09 +08:00
23 changed files with 155038 additions and 0 deletions

36
.gitattributes vendored Normal file

@@ -0,0 +1,36 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
tokenizer.json filter=lfs diff=lfs merge=lfs -text

60
README.md Normal file

@@ -0,0 +1,60 @@
---
library_name: transformers
license: apache-2.0
base_model: Qwen/Qwen3-8B
tags:
- llama-factory
- full
- generated_from_trainer
model-index:
- name: GLM-4_6-gemini25flash-stackexchange-overflow-32ep-512k-fixeps
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# GLM-4_6-gemini25flash-stackexchange-overflow-32ep-512k-fixeps
This model is a fine-tuned version of [Qwen/Qwen3-8B](https://huggingface.co/Qwen/Qwen3-8B) on the penfever/GLM-4.6-gemini25flash-stackexchange-overflow-32ep-512k dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 4e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 16
- total_train_batch_size: 16
- total_eval_batch_size: 128
- optimizer: AdamW (torch fused) with betas=(0.9, 0.98) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 7.0
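The per-device and aggregate batch sizes above are related through the device count. A minimal sketch of that relationship (the gradient-accumulation factor is an assumption — it is not listed in this card, but it must be 1 for the listed numbers to be consistent):

```python
# Per-device values and device count, as listed in the hyperparameters above.
train_batch_size = 1   # per device
eval_batch_size = 8    # per device
num_devices = 16
grad_accum_steps = 1   # assumed; not listed in the card

# Aggregate sizes: per-device batch x devices (x accumulation for training).
total_train_batch_size = train_batch_size * num_devices * grad_accum_steps
total_eval_batch_size = eval_batch_size * num_devices

print(total_train_batch_size)  # 16, as listed
print(total_eval_batch_size)   # 128, as listed
```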
### Training results
### Framework versions
- Transformers 4.56.0
- Pytorch 2.9.0+cu128
- Datasets 4.4.1
- Tokenizers 0.22.1

28
added_tokens.json Normal file

@@ -0,0 +1,28 @@
{
"</think>": 151668,
"</tool_call>": 151658,
"</tool_response>": 151666,
"<think>": 151667,
"<tool_call>": 151657,
"<tool_response>": 151665,
"<|box_end|>": 151649,
"<|box_start|>": 151648,
"<|endoftext|>": 151643,
"<|file_sep|>": 151664,
"<|fim_middle|>": 151660,
"<|fim_pad|>": 151662,
"<|fim_prefix|>": 151659,
"<|fim_suffix|>": 151661,
"<|im_end|>": 151645,
"<|im_start|>": 151644,
"<|image_pad|>": 151655,
"<|object_ref_end|>": 151647,
"<|object_ref_start|>": 151646,
"<|quad_end|>": 151651,
"<|quad_start|>": 151650,
"<|repo_name|>": 151663,
"<|video_pad|>": 151656,
"<|vision_end|>": 151653,
"<|vision_pad|>": 151654,
"<|vision_start|>": 151652
}
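Every ID in added_tokens.json must be a valid row of the embedding matrix, i.e. strictly less than the vocab_size of 151936 set in this repo's config.json. A quick consistency check over a representative subset of the map above:

```python
# A few entries from added_tokens.json (IDs copied verbatim from the file).
added_tokens = {
    "<|endoftext|>": 151643,
    "<|im_start|>": 151644,
    "<|im_end|>": 151645,
    "<think>": 151667,
    "</think>": 151668,  # highest ID in added_tokens.json
}
vocab_size = 151936  # from config.json in this commit

# Special-token IDs are appended after the ordinary BPE vocabulary and must
# stay inside the embedding table.
assert max(added_tokens.values()) < vocab_size

# Rows left unused beyond the last added token.
print(vocab_size - (max(added_tokens.values()) + 1))  # 267
```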

16
all_results.json Normal file

@@ -0,0 +1,16 @@
{
"achieved_tflops_per_gpu": 0.0015014057683548775,
"achieved_tflops_per_gpu_theoretical": 992.6560369071881,
"epoch": 7.0,
"loss_nan_ranks": 0,
"loss_rank_avg": 0.21700537204742432,
"mfu_percent": 0.00010610641472472633,
"mfu_percent_theoretical": 70.15237009944792,
"total_flos": 87884495978496.0,
"train_loss": 0.3235391679834696,
"train_runtime": 3658.4254,
"train_samples_per_second": 4.669,
"train_steps_per_second": 0.293,
"valid_targets_mean": 902.0,
"valid_targets_min": 329
}
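The achieved_tflops_per_gpu figure above can be reproduced from total_flos and train_runtime, assuming the divisor is the 16 devices listed in the README (an assumption about how the training script computed it):

```python
# Values copied from all_results.json.
total_flos = 87884495978496.0  # total floating-point operations
train_runtime = 3658.4254      # seconds
num_gpus = 16                  # assumed: num_devices from the README

# FLOPs per second per GPU, expressed in TFLOPs.
achieved_tflops_per_gpu = total_flos / train_runtime / num_gpus / 1e12
print(achieved_tflops_per_gpu)  # ~0.0015014, matching the JSON
```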

89
chat_template.jinja Normal file

@@ -0,0 +1,89 @@
{%- if tools %}
{{- '<|im_start|>system\n' }}
{%- if messages[0].role == 'system' %}
{{- messages[0].content + '\n\n' }}
{%- endif %}
{{- "# Tools\n\nYou may call one or more functions to assist with the user query.\n\nYou are provided with function signatures within <tools></tools> XML tags:\n<tools>" }}
{%- for tool in tools %}
{{- "\n" }}
{{- tool | tojson }}
{%- endfor %}
{{- "\n</tools>\n\nFor each function call, return a json object with function name and arguments within <tool_call></tool_call> XML tags:\n<tool_call>\n{\"name\": <function-name>, \"arguments\": <args-json-object>}\n</tool_call><|im_end|>\n" }}
{%- else %}
{%- if messages[0].role == 'system' %}
{{- '<|im_start|>system\n' + messages[0].content + '<|im_end|>\n' }}
{%- endif %}
{%- endif %}
{%- set ns = namespace(multi_step_tool=true, last_query_index=messages|length - 1) %}
{%- for message in messages[::-1] %}
{%- set index = (messages|length - 1) - loop.index0 %}
{%- if ns.multi_step_tool and message.role == "user" and message.content is string and not(message.content.startswith('<tool_response>') and message.content.endswith('</tool_response>')) %}
{%- set ns.multi_step_tool = false %}
{%- set ns.last_query_index = index %}
{%- endif %}
{%- endfor %}
{%- for message in messages %}
{%- if message.content is string %}
{%- set content = message.content %}
{%- else %}
{%- set content = '' %}
{%- endif %}
{%- if (message.role == "user") or (message.role == "system" and not loop.first) %}
{{- '<|im_start|>' + message.role + '\n' + content + '<|im_end|>' + '\n' }}
{%- elif message.role == "assistant" %}
{%- set reasoning_content = '' %}
{%- if message.reasoning_content is string %}
{%- set reasoning_content = message.reasoning_content %}
{%- else %}
{%- if '</think>' in content %}
{%- set reasoning_content = content.split('</think>')[0].rstrip('\n').split('<think>')[-1].lstrip('\n') %}
{%- set content = content.split('</think>')[-1].lstrip('\n') %}
{%- endif %}
{%- endif %}
{%- if loop.index0 > ns.last_query_index %}
{%- if loop.last or (not loop.last and reasoning_content) %}
{{- '<|im_start|>' + message.role + '\n<think>\n' + reasoning_content.strip('\n') + '\n</think>\n\n' + content.lstrip('\n') }}
{%- else %}
{{- '<|im_start|>' + message.role + '\n' + content }}
{%- endif %}
{%- else %}
{{- '<|im_start|>' + message.role + '\n' + content }}
{%- endif %}
{%- if message.tool_calls %}
{%- for tool_call in message.tool_calls %}
{%- if (loop.first and content) or (not loop.first) %}
{{- '\n' }}
{%- endif %}
{%- if tool_call.function %}
{%- set tool_call = tool_call.function %}
{%- endif %}
{{- '<tool_call>\n{"name": "' }}
{{- tool_call.name }}
{{- '", "arguments": ' }}
{%- if tool_call.arguments is string %}
{{- tool_call.arguments }}
{%- else %}
{{- tool_call.arguments | tojson }}
{%- endif %}
{{- '}\n</tool_call>' }}
{%- endfor %}
{%- endif %}
{{- '<|im_end|>\n' }}
{%- elif message.role == "tool" %}
{%- if loop.first or (messages[loop.index0 - 1].role != "tool") %}
{{- '<|im_start|>user' }}
{%- endif %}
{{- '\n<tool_response>\n' }}
{{- content }}
{{- '\n</tool_response>' }}
{%- if loop.last or (messages[loop.index0 + 1].role != "tool") %}
{{- '<|im_end|>\n' }}
{%- endif %}
{%- endif %}
{%- endfor %}
{%- if add_generation_prompt %}
{{- '<|im_start|>assistant\n' }}
{%- if enable_thinking is defined and enable_thinking is false %}
{{- '<think>\n\n</think>\n\n' }}
{%- endif %}
{%- endif %}
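The template above is easiest to grasp from the string it produces. Below is a minimal Python re-implementation of its plain-chat path only — no tools, no tool calls, no reasoning content, prompt ending with a user turn — a sketch for illustration, not a substitute for `tokenizer.apply_chat_template`:

```python
def render_plain_chat(messages, add_generation_prompt=True):
    """Mimic the template's simple path: each message becomes an
    <|im_start|>role ... <|im_end|> block, optionally followed by the
    assistant header that cues generation."""
    out = ""
    for m in messages:
        out += "<|im_start|>" + m["role"] + "\n" + m["content"] + "<|im_end|>\n"
    if add_generation_prompt:
        out += "<|im_start|>assistant\n"
    return out

prompt = render_plain_chat([
    {"role": "system", "content": "You are helpful."},
    {"role": "user", "content": "Hi"},
])
print(prompt)
```

Note the template does more in the general case: it injects tool signatures into the system block, wraps the final assistant turn's reasoning in `<think>...</think>`, and groups consecutive tool responses into a single user block.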

68
config.json Normal file

@@ -0,0 +1,68 @@
{
"architectures": [
"Qwen3ForCausalLM"
],
"attention_bias": false,
"attention_dropout": 0.0,
"dtype": "bfloat16",
"eos_token_id": 151645,
"head_dim": 128,
"hidden_act": "silu",
"hidden_size": 4096,
"initializer_range": 0.02,
"intermediate_size": 12288,
"layer_types": [
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention",
"full_attention"
],
"max_position_embeddings": 40960,
"max_window_layers": 36,
"model_type": "qwen3",
"num_attention_heads": 32,
"num_hidden_layers": 36,
"num_key_value_heads": 8,
"pad_token_id": 151643,
"rms_norm_eps": 1e-06,
"rope_scaling": null,
"rope_theta": 1000000,
"sliding_window": null,
"tie_word_embeddings": false,
"transformers_version": "4.56.0",
"use_cache": false,
"use_sliding_window": false,
"vocab_size": 151936
}
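The dimensions above determine the model's parameter count. A sketch under the assumption of the standard Qwen3 dense layout (q/k/v/o attention projections, per-head q/k RMSNorm, gated MLP, untied embeddings); at 2 bytes per bf16 parameter it reproduces the 16,381,470,720-byte total_size recorded in the safetensors index in this commit:

```python
# Dimensions copied from config.json.
hidden = 4096
inter = 12288
layers = 36
heads, kv_heads, head_dim = 32, 8, 128
vocab = 151936

# Attention: q/k/v/o projections plus the per-head q/k RMSNorm weights.
attn = hidden * heads * head_dim           # q_proj
attn += hidden * kv_heads * head_dim * 2   # k_proj, v_proj (GQA: 8 KV heads)
attn += heads * head_dim * hidden          # o_proj
attn += 2 * head_dim                       # q_norm, k_norm
# Gated MLP: gate_proj, up_proj, down_proj.
mlp = 3 * hidden * inter
# Input and post-attention RMSNorm per layer.
norms = 2 * hidden
per_layer = attn + mlp + norms

total = layers * per_layer
total += 2 * vocab * hidden  # embed_tokens + lm_head (tie_word_embeddings=false)
total += hidden              # final model.norm

print(total)      # 8190735360 parameters (~8.2B)
print(total * 2)  # 16381470720 bytes in bf16, matching the index's total_size
```

The index's total_size matches this exactly; its total_parameters field evidently records something other than the full count.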

12
generation_config.json Normal file

@@ -0,0 +1,12 @@
{
"do_sample": true,
"eos_token_id": [
151645,
151643
],
"pad_token_id": 151643,
"temperature": 0.6,
"top_k": 20,
"top_p": 0.95,
"transformers_version": "4.56.0"
}
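With do_sample enabled, generation filters the next-token distribution by temperature, top_k, and top_p before sampling. A self-contained sketch of that filtering over a toy logit vector (the helper name and toy values are illustrative, not from the repo):

```python
import math

def sample_filter(logits, temperature=0.6, top_k=20, top_p=0.95):
    """Return renormalized probabilities after the temperature / top-k /
    top-p (nucleus) filtering that generation_config.json configures."""
    scaled = [l / temperature for l in logits]
    # Numerically stable softmax.
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    z = sum(exps)
    probs = [e / z for e in exps]
    # Rank tokens by probability; keep at most top_k of them.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    # Nucleus: the smallest prefix of those whose cumulative mass >= top_p.
    cum, nucleus = 0.0, set()
    for i in order[:top_k]:
        nucleus.add(i)
        cum += probs[i]
        if cum >= top_p:
            break
    masked = [p if i in nucleus else 0.0 for i, p in enumerate(probs)]
    z = sum(masked)
    return [p / z for p in masked]

# Toy example: top_k=3 zeroes out the two least likely tokens.
probs = sample_filter([2.0, 1.0, 0.5, -1.0, -3.0], top_k=3, top_p=0.95)
```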

151388
merges.txt Normal file

File diff suppressed because it is too large.


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:5995b64cab1e5ca202ef6462850a7857e1a013e998aaef05d52b8af69a06343a
size 4902257696


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:c1035dc06ca16e465b020cd8d0168809444a238382e808b9f933cde6942ac64e
size 4915960368


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ab6631cdf6ab820aa5728f8b678967de18b526479e185fda74b031f957ea300a
size 4983068496


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:4ef3f504adb62bde4fc8ffec7d9a7c4300c27f29c3280c2388c16735ab9a81f0
size 1580230264


@@ -0,0 +1,407 @@
{
"metadata": {
"total_parameters": 308224,
"total_size": 16381470720
},
"weight_map": {
"lm_head.weight": "model-00004-of-00004.safetensors",
"model.embed_tokens.weight": "model-00001-of-00004.safetensors",
"model.layers.0.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.0.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.0.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.0.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.0.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.0.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.0.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.0.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.0.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.0.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.0.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.1.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.1.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.1.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.1.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.1.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.10.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.10.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.10.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.10.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.10.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.10.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.10.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.10.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.10.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.10.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.10.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.11.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.11.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.11.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.11.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.11.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.12.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.12.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.12.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.12.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.12.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.13.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.13.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.13.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.13.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.13.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.14.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.14.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.14.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.14.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.14.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.15.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.15.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.15.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.15.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.15.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.16.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.16.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.16.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.16.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.16.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.17.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.17.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.17.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.17.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.17.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.18.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.18.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.18.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.18.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.18.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.19.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.19.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.19.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.19.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.19.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.2.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.2.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.2.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.2.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.2.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.2.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.2.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.2.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.2.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.2.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.2.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.20.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.20.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.20.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.20.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.20.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.20.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.20.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.20.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.20.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.20.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.20.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.21.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.21.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.21.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.21.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.21.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.22.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.22.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.22.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.22.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.22.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.22.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.22.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.22.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.22.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
"model.layers.22.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.22.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.23.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.23.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.23.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.23.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.23.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.23.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.23.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.23.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.23.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.23.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.23.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.24.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.24.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.24.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.24.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.24.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.25.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.25.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.25.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.25.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.25.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.26.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.26.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.26.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.26.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.26.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.27.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.27.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.27.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.27.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.27.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.28.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.28.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.28.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.28.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.28.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.29.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.29.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.29.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.29.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.29.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.3.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.3.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.3.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.3.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.3.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.3.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.3.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.3.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.3.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.3.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.3.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.30.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.30.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.30.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.30.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.30.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.30.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.30.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.30.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.30.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.30.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.30.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.31.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.31.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.31.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.31.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.31.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.32.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.32.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.32.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.32.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.32.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.33.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.33.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.33.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.33.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.33.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.input_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.34.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
"model.layers.34.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.34.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
"model.layers.34.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.34.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.35.input_layernorm.weight": "model-00004-of-00004.safetensors",
"model.layers.35.mlp.down_proj.weight": "model-00004-of-00004.safetensors",
"model.layers.35.mlp.gate_proj.weight": "model-00004-of-00004.safetensors",
"model.layers.35.mlp.up_proj.weight": "model-00004-of-00004.safetensors",
"model.layers.35.post_attention_layernorm.weight": "model-00004-of-00004.safetensors",
"model.layers.35.self_attn.k_norm.weight": "model-00004-of-00004.safetensors",
"model.layers.35.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.35.self_attn.o_proj.weight": "model-00004-of-00004.safetensors",
"model.layers.35.self_attn.q_norm.weight": "model-00004-of-00004.safetensors",
"model.layers.35.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.35.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
"model.layers.4.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.4.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.4.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.4.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.4.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.4.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.4.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.4.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.4.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.4.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.4.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.5.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.5.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.5.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.5.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.5.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.6.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.6.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.6.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.6.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.6.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.7.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.7.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.7.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.7.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.7.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.8.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.layers.8.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.8.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.8.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.8.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.9.input_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.9.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.9.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.9.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
"model.layers.9.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
"model.layers.9.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.9.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.9.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.9.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
"model.layers.9.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
"model.layers.9.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
"model.norm.weight": "model-00004-of-00004.safetensors"
}
}

run_summary.json Normal file

@@ -0,0 +1,12 @@
{
"agent_name": null,
"training_start": null,
"training_end": null,
"created_by": "DCAgent",
"base_model_name": "Qwen/Qwen3-8B",
"dataset_name": "penfever/GLM-4.6-gemini25flash-stackexchange-overflow-32ep-512k",
"training_type": "SFT",
"training_parameters": "https://huggingface.co/GLM-4_6-gemini25flash-stackexchange-overflow-32ep-512k-fixeps/blob/main/config.json",
"wandb_link": "https://wandb.ai/dogml/dc-agent/runs/GLM-4.6-gemini25flash-stackexchange-overflow-32ep-512k-fixeps_Qwen3-8B",
"traces_location_s3": null
}

special_tokens_map.json Normal file

@@ -0,0 +1,31 @@
{
"additional_special_tokens": [
"<|im_start|>",
"<|im_end|>",
"<|object_ref_start|>",
"<|object_ref_end|>",
"<|box_start|>",
"<|box_end|>",
"<|quad_start|>",
"<|quad_end|>",
"<|vision_start|>",
"<|vision_end|>",
"<|vision_pad|>",
"<|image_pad|>",
"<|video_pad|>"
],
"eos_token": {
"content": "<|im_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false
},
"pad_token": {
"content": "<|endoftext|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false
}
}

tokenizer.json Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:aeb13307a71acd8fe81861d94ad54ab689df773318809eed3cbe794b4492dae4
size 11422654

tokenizer_config.json Normal file

@@ -0,0 +1,240 @@
{
"add_bos_token": false,
"add_prefix_space": false,
"added_tokens_decoder": {
"151643": {
"content": "<|endoftext|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151644": {
"content": "<|im_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151645": {
"content": "<|im_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151646": {
"content": "<|object_ref_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151647": {
"content": "<|object_ref_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151648": {
"content": "<|box_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151649": {
"content": "<|box_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151650": {
"content": "<|quad_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151651": {
"content": "<|quad_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151652": {
"content": "<|vision_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151653": {
"content": "<|vision_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151654": {
"content": "<|vision_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151655": {
"content": "<|image_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151656": {
"content": "<|video_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151657": {
"content": "<tool_call>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151658": {
"content": "</tool_call>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151659": {
"content": "<|fim_prefix|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151660": {
"content": "<|fim_middle|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151661": {
"content": "<|fim_suffix|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151662": {
"content": "<|fim_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151663": {
"content": "<|repo_name|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151664": {
"content": "<|file_sep|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151665": {
"content": "<tool_response>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151666": {
"content": "</tool_response>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151667": {
"content": "<think>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151668": {
"content": "</think>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
}
},
"additional_special_tokens": [
"<|im_start|>",
"<|im_end|>",
"<|object_ref_start|>",
"<|object_ref_end|>",
"<|box_start|>",
"<|box_end|>",
"<|quad_start|>",
"<|quad_end|>",
"<|vision_start|>",
"<|vision_end|>",
"<|vision_pad|>",
"<|image_pad|>",
"<|video_pad|>"
],
"bos_token": null,
"clean_up_tokenization_spaces": false,
"eos_token": "<|im_end|>",
"errors": "replace",
"extra_special_tokens": {},
"model_max_length": 32768,
"pad_token": "<|endoftext|>",
"padding_side": "right",
"split_special_tokens": false,
"tokenizer_class": "Qwen2Tokenizer",
"unk_token": null
}

train_results.json Normal file

@@ -0,0 +1,16 @@
{
"achieved_tflops_per_gpu": 0.0015014057683548775,
"achieved_tflops_per_gpu_theoretical": 992.6560369071881,
"epoch": 7.0,
"loss_nan_ranks": 0,
"loss_rank_avg": 0.21700537204742432,
"mfu_percent": 0.00010610641472472633,
"mfu_percent_theoretical": 70.15237009944792,
"total_flos": 87884495978496.0,
"train_loss": 0.3235391679834696,
"train_runtime": 3658.4254,
"train_samples_per_second": 4.669,
"train_steps_per_second": 0.293,
"valid_targets_mean": 902.0,
"valid_targets_min": 329
}

trainer_log.jsonl Normal file

@@ -0,0 +1,215 @@
{"current_steps": 5, "total_steps": 1071, "loss": 0.8827, "lr": 1.4814814814814815e-06, "epoch": 0.032679738562091505, "percentage": 0.47, "elapsed_time": "0:00:25", "remaining_time": "1:30:16"}
{"current_steps": 10, "total_steps": 1071, "loss": 0.8066, "lr": 3.3333333333333333e-06, "epoch": 0.06535947712418301, "percentage": 0.93, "elapsed_time": "0:00:41", "remaining_time": "1:13:42"}
{"current_steps": 15, "total_steps": 1071, "loss": 0.7469, "lr": 5.185185185185185e-06, "epoch": 0.09803921568627451, "percentage": 1.4, "elapsed_time": "0:00:58", "remaining_time": "1:08:18"}
{"current_steps": 20, "total_steps": 1071, "loss": 0.6986, "lr": 7.0370370370370375e-06, "epoch": 0.13071895424836602, "percentage": 1.87, "elapsed_time": "0:01:14", "remaining_time": "1:05:38"}
{"current_steps": 25, "total_steps": 1071, "loss": 0.7292, "lr": 8.888888888888888e-06, "epoch": 0.16339869281045752, "percentage": 2.33, "elapsed_time": "0:01:29", "remaining_time": "1:02:41"}
{"current_steps": 30, "total_steps": 1071, "loss": 0.6299, "lr": 1.0740740740740742e-05, "epoch": 0.19607843137254902, "percentage": 2.8, "elapsed_time": "0:01:45", "remaining_time": "1:00:46"}
{"current_steps": 35, "total_steps": 1071, "loss": 0.4857, "lr": 1.2592592592592593e-05, "epoch": 0.22875816993464052, "percentage": 3.27, "elapsed_time": "0:02:08", "remaining_time": "1:03:19"}
{"current_steps": 40, "total_steps": 1071, "loss": 0.5807, "lr": 1.4444444444444446e-05, "epoch": 0.26143790849673204, "percentage": 3.73, "elapsed_time": "0:02:26", "remaining_time": "1:03:01"}
{"current_steps": 45, "total_steps": 1071, "loss": 0.6403, "lr": 1.6296296296296297e-05, "epoch": 0.29411764705882354, "percentage": 4.2, "elapsed_time": "0:02:40", "remaining_time": "1:00:57"}
{"current_steps": 50, "total_steps": 1071, "loss": 0.5481, "lr": 1.814814814814815e-05, "epoch": 0.32679738562091504, "percentage": 4.67, "elapsed_time": "0:02:56", "remaining_time": "0:59:58"}
{"current_steps": 55, "total_steps": 1071, "loss": 0.475, "lr": 2e-05, "epoch": 0.35947712418300654, "percentage": 5.14, "elapsed_time": "0:03:16", "remaining_time": "1:00:36"}
{"current_steps": 60, "total_steps": 1071, "loss": 0.5648, "lr": 2.1851851851851852e-05, "epoch": 0.39215686274509803, "percentage": 5.6, "elapsed_time": "0:03:31", "remaining_time": "0:59:20"}
{"current_steps": 65, "total_steps": 1071, "loss": 0.5476, "lr": 2.3703703703703703e-05, "epoch": 0.42483660130718953, "percentage": 6.07, "elapsed_time": "0:03:48", "remaining_time": "0:58:58"}
{"current_steps": 70, "total_steps": 1071, "loss": 0.5201, "lr": 2.5555555555555554e-05, "epoch": 0.45751633986928103, "percentage": 6.54, "elapsed_time": "0:04:03", "remaining_time": "0:58:00"}
{"current_steps": 75, "total_steps": 1071, "loss": 0.5551, "lr": 2.740740740740741e-05, "epoch": 0.49019607843137253, "percentage": 7.0, "elapsed_time": "0:04:16", "remaining_time": "0:56:47"}
{"current_steps": 80, "total_steps": 1071, "loss": 0.5125, "lr": 2.9259259259259262e-05, "epoch": 0.5228758169934641, "percentage": 7.47, "elapsed_time": "0:04:32", "remaining_time": "0:56:14"}
{"current_steps": 85, "total_steps": 1071, "loss": 0.5344, "lr": 3.111111111111112e-05, "epoch": 0.5555555555555556, "percentage": 7.94, "elapsed_time": "0:04:47", "remaining_time": "0:55:29"}
{"current_steps": 90, "total_steps": 1071, "loss": 0.5168, "lr": 3.2962962962962964e-05, "epoch": 0.5882352941176471, "percentage": 8.4, "elapsed_time": "0:05:01", "remaining_time": "0:54:43"}
{"current_steps": 95, "total_steps": 1071, "loss": 0.5328, "lr": 3.481481481481482e-05, "epoch": 0.6209150326797386, "percentage": 8.87, "elapsed_time": "0:05:18", "remaining_time": "0:54:35"}
{"current_steps": 100, "total_steps": 1071, "loss": 0.582, "lr": 3.6666666666666666e-05, "epoch": 0.6535947712418301, "percentage": 9.34, "elapsed_time": "0:05:33", "remaining_time": "0:53:54"}
{"current_steps": 105, "total_steps": 1071, "loss": 0.4839, "lr": 3.851851851851852e-05, "epoch": 0.6862745098039216, "percentage": 9.8, "elapsed_time": "0:05:49", "remaining_time": "0:53:36"}
{"current_steps": 110, "total_steps": 1071, "loss": 0.5261, "lr": 3.9999893574233685e-05, "epoch": 0.7189542483660131, "percentage": 10.27, "elapsed_time": "0:06:02", "remaining_time": "0:52:47"}
{"current_steps": 115, "total_steps": 1071, "loss": 0.5166, "lr": 3.9996168791339075e-05, "epoch": 0.7516339869281046, "percentage": 10.74, "elapsed_time": "0:06:16", "remaining_time": "0:52:10"}
{"current_steps": 120, "total_steps": 1071, "loss": 0.5019, "lr": 3.998712385271904e-05, "epoch": 0.7843137254901961, "percentage": 11.2, "elapsed_time": "0:06:30", "remaining_time": "0:51:31"}
{"current_steps": 125, "total_steps": 1071, "loss": 0.5344, "lr": 3.997276116485867e-05, "epoch": 0.8169934640522876, "percentage": 11.67, "elapsed_time": "0:06:44", "remaining_time": "0:51:02"}
{"current_steps": 130, "total_steps": 1071, "loss": 0.542, "lr": 3.995308454907679e-05, "epoch": 0.8496732026143791, "percentage": 12.14, "elapsed_time": "0:06:59", "remaining_time": "0:50:39"}
{"current_steps": 135, "total_steps": 1071, "loss": 0.4896, "lr": 3.992809924050924e-05, "epoch": 0.8823529411764706, "percentage": 12.61, "elapsed_time": "0:07:15", "remaining_time": "0:50:20"}
{"current_steps": 140, "total_steps": 1071, "loss": 0.5363, "lr": 3.9897811886716054e-05, "epoch": 0.9150326797385621, "percentage": 13.07, "elapsed_time": "0:07:29", "remaining_time": "0:49:48"}
{"current_steps": 145, "total_steps": 1071, "loss": 0.5173, "lr": 3.986223054591281e-05, "epoch": 0.9477124183006536, "percentage": 13.54, "elapsed_time": "0:07:43", "remaining_time": "0:49:21"}
{"current_steps": 150, "total_steps": 1071, "loss": 0.4419, "lr": 3.982136468482665e-05, "epoch": 0.9803921568627451, "percentage": 14.01, "elapsed_time": "0:07:57", "remaining_time": "0:48:54"}
{"current_steps": 155, "total_steps": 1071, "loss": 0.3684, "lr": 3.9775225176177595e-05, "epoch": 1.0130718954248366, "percentage": 14.47, "elapsed_time": "0:08:21", "remaining_time": "0:49:23"}
{"current_steps": 160, "total_steps": 1071, "loss": 0.442, "lr": 3.972382429578577e-05, "epoch": 1.0457516339869282, "percentage": 14.94, "elapsed_time": "0:08:35", "remaining_time": "0:48:53"}
{"current_steps": 165, "total_steps": 1071, "loss": 0.4546, "lr": 3.966717571930529e-05, "epoch": 1.0784313725490196, "percentage": 15.41, "elapsed_time": "0:08:55", "remaining_time": "0:49:02"}
{"current_steps": 170, "total_steps": 1071, "loss": 0.4344, "lr": 3.960529451858575e-05, "epoch": 1.1111111111111112, "percentage": 15.87, "elapsed_time": "0:09:09", "remaining_time": "0:48:31"}
{"current_steps": 175, "total_steps": 1071, "loss": 0.4177, "lr": 3.9538197157662226e-05, "epoch": 1.1437908496732025, "percentage": 16.34, "elapsed_time": "0:09:30", "remaining_time": "0:48:39"}
{"current_steps": 180, "total_steps": 1071, "loss": 0.4335, "lr": 3.946590148837487e-05, "epoch": 1.1764705882352942, "percentage": 16.81, "elapsed_time": "0:09:45", "remaining_time": "0:48:20"}
{"current_steps": 185, "total_steps": 1071, "loss": 0.4002, "lr": 3.9388426745619266e-05, "epoch": 1.2091503267973855, "percentage": 17.27, "elapsed_time": "0:10:01", "remaining_time": "0:48:02"}
{"current_steps": 190, "total_steps": 1071, "loss": 0.4606, "lr": 3.930579354222883e-05, "epoch": 1.2418300653594772, "percentage": 17.74, "elapsed_time": "0:10:14", "remaining_time": "0:47:31"}
{"current_steps": 195, "total_steps": 1071, "loss": 0.4677, "lr": 3.921802386349057e-05, "epoch": 1.2745098039215685, "percentage": 18.21, "elapsed_time": "0:10:29", "remaining_time": "0:47:09"}
{"current_steps": 200, "total_steps": 1071, "loss": 0.458, "lr": 3.912514106129576e-05, "epoch": 1.3071895424836601, "percentage": 18.67, "elapsed_time": "0:10:45", "remaining_time": "0:46:50"}
{"current_steps": 205, "total_steps": 1071, "loss": 0.4572, "lr": 3.902716984792685e-05, "epoch": 1.3398692810457518, "percentage": 19.14, "elapsed_time": "0:12:07", "remaining_time": "0:51:15"}
{"current_steps": 210, "total_steps": 1071, "loss": 0.4438, "lr": 3.8924136289482686e-05, "epoch": 1.3725490196078431, "percentage": 19.61, "elapsed_time": "0:12:24", "remaining_time": "0:50:53"}
{"current_steps": 215, "total_steps": 1071, "loss": 0.476, "lr": 3.881606779894329e-05, "epoch": 1.4052287581699345, "percentage": 20.07, "elapsed_time": "0:12:40", "remaining_time": "0:50:27"}
{"current_steps": 220, "total_steps": 1071, "loss": 0.4424, "lr": 3.8702993128876455e-05, "epoch": 1.4379084967320261, "percentage": 20.54, "elapsed_time": "0:12:55", "remaining_time": "0:49:57"}
{"current_steps": 225, "total_steps": 1071, "loss": 0.4517, "lr": 3.858494236378785e-05, "epoch": 1.4705882352941178, "percentage": 21.01, "elapsed_time": "0:13:10", "remaining_time": "0:49:32"}
{"current_steps": 230, "total_steps": 1071, "loss": 0.4507, "lr": 3.846194691211678e-05, "epoch": 1.5032679738562091, "percentage": 21.48, "elapsed_time": "0:13:25", "remaining_time": "0:49:06"}
{"current_steps": 235, "total_steps": 1071, "loss": 0.4525, "lr": 3.8334039497879694e-05, "epoch": 1.5359477124183005, "percentage": 21.94, "elapsed_time": "0:13:38", "remaining_time": "0:48:32"}
{"current_steps": 240, "total_steps": 1071, "loss": 0.4507, "lr": 3.8201254151963664e-05, "epoch": 1.5686274509803921, "percentage": 22.41, "elapsed_time": "0:13:51", "remaining_time": "0:48:00"}
{"current_steps": 245, "total_steps": 1071, "loss": 0.4416, "lr": 3.8063626203072196e-05, "epoch": 1.6013071895424837, "percentage": 22.88, "elapsed_time": "0:14:06", "remaining_time": "0:47:33"}
{"current_steps": 250, "total_steps": 1071, "loss": 0.4301, "lr": 3.792119226832569e-05, "epoch": 1.6339869281045751, "percentage": 23.34, "elapsed_time": "0:14:27", "remaining_time": "0:47:29"}
{"current_steps": 255, "total_steps": 1071, "loss": 0.4653, "lr": 3.7773990243519154e-05, "epoch": 1.6666666666666665, "percentage": 23.81, "elapsed_time": "0:14:41", "remaining_time": "0:46:59"}
{"current_steps": 260, "total_steps": 1071, "loss": 0.443, "lr": 3.762205929303969e-05, "epoch": 1.6993464052287581, "percentage": 24.28, "elapsed_time": "0:14:54", "remaining_time": "0:46:30"}
{"current_steps": 265, "total_steps": 1071, "loss": 0.4581, "lr": 3.746543983944646e-05, "epoch": 1.7320261437908497, "percentage": 24.74, "elapsed_time": "0:15:07", "remaining_time": "0:46:00"}
{"current_steps": 270, "total_steps": 1071, "loss": 0.4599, "lr": 3.730417355271593e-05, "epoch": 1.7647058823529411, "percentage": 25.21, "elapsed_time": "0:15:21", "remaining_time": "0:45:32"}
{"current_steps": 275, "total_steps": 1071, "loss": 0.4125, "lr": 3.713830333915517e-05, "epoch": 1.7973856209150327, "percentage": 25.68, "elapsed_time": "0:15:36", "remaining_time": "0:45:11"}
{"current_steps": 280, "total_steps": 1071, "loss": 0.3786, "lr": 3.6967873329986305e-05, "epoch": 1.8300653594771243, "percentage": 26.14, "elapsed_time": "0:15:53", "remaining_time": "0:44:53"}
{"current_steps": 285, "total_steps": 1071, "loss": 0.436, "lr": 3.679292886960497e-05, "epoch": 1.8627450980392157, "percentage": 26.61, "elapsed_time": "0:16:11", "remaining_time": "0:44:39"}
{"current_steps": 290, "total_steps": 1071, "loss": 0.4431, "lr": 3.661351650351608e-05, "epoch": 1.8954248366013071, "percentage": 27.08, "elapsed_time": "0:16:25", "remaining_time": "0:44:13"}
{"current_steps": 295, "total_steps": 1071, "loss": 0.4515, "lr": 3.642968396594995e-05, "epoch": 1.9281045751633987, "percentage": 27.54, "elapsed_time": "0:16:39", "remaining_time": "0:43:48"}
{"current_steps": 300, "total_steps": 1071, "loss": 0.4676, "lr": 3.624148016716222e-05, "epoch": 1.9607843137254903, "percentage": 28.01, "elapsed_time": "0:16:53", "remaining_time": "0:43:24"}
{"current_steps": 305, "total_steps": 1071, "loss": 0.4447, "lr": 3.604895518042081e-05, "epoch": 1.9934640522875817, "percentage": 28.48, "elapsed_time": "0:17:11", "remaining_time": "0:43:09"}
{"current_steps": 310, "total_steps": 1071, "loss": 0.3317, "lr": 3.585216022868356e-05, "epoch": 2.026143790849673, "percentage": 28.94, "elapsed_time": "0:17:27", "remaining_time": "0:42:50"}
{"current_steps": 315, "total_steps": 1071, "loss": 0.3789, "lr": 3.565114767096984e-05, "epoch": 2.0588235294117645, "percentage": 29.41, "elapsed_time": "0:17:40", "remaining_time": "0:42:24"}
{"current_steps": 320, "total_steps": 1071, "loss": 0.3653, "lr": 3.544597098843001e-05, "epoch": 2.0915032679738563, "percentage": 29.88, "elapsed_time": "0:17:57", "remaining_time": "0:42:09"}
{"current_steps": 325, "total_steps": 1071, "loss": 0.3748, "lr": 3.5236684770116295e-05, "epoch": 2.1241830065359477, "percentage": 30.35, "elapsed_time": "0:18:15", "remaining_time": "0:41:54"}
{"current_steps": 330, "total_steps": 1071, "loss": 0.3466, "lr": 3.502334469845886e-05, "epoch": 2.156862745098039, "percentage": 30.81, "elapsed_time": "0:18:36", "remaining_time": "0:41:47"}
{"current_steps": 335, "total_steps": 1071, "loss": 0.3717, "lr": 3.4806007534451075e-05, "epoch": 2.189542483660131, "percentage": 31.28, "elapsed_time": "0:18:56", "remaining_time": "0:41:35"}
{"current_steps": 340, "total_steps": 1071, "loss": 0.3992, "lr": 3.458473110254767e-05, "epoch": 2.2222222222222223, "percentage": 31.75, "elapsed_time": "0:19:08", "remaining_time": "0:41:09"}
{"current_steps": 345, "total_steps": 1071, "loss": 0.3575, "lr": 3.43595742752801e-05, "epoch": 2.2549019607843137, "percentage": 32.21, "elapsed_time": "0:19:24", "remaining_time": "0:40:50"}
{"current_steps": 350, "total_steps": 1071, "loss": 0.3993, "lr": 3.413059695759297e-05, "epoch": 2.287581699346405, "percentage": 32.68, "elapsed_time": "0:19:38", "remaining_time": "0:40:27"}
{"current_steps": 355, "total_steps": 1071, "loss": 0.3489, "lr": 3.389786007090581e-05, "epoch": 2.3202614379084965, "percentage": 33.15, "elapsed_time": "0:19:57", "remaining_time": "0:40:16"}
{"current_steps": 360, "total_steps": 1071, "loss": 0.3707, "lr": 3.3661425536904354e-05, "epoch": 2.3529411764705883, "percentage": 33.61, "elapsed_time": "0:20:12", "remaining_time": "0:39:55"}
{"current_steps": 365, "total_steps": 1071, "loss": 0.344, "lr": 3.3421356261065805e-05, "epoch": 2.3856209150326797, "percentage": 34.08, "elapsed_time": "0:20:26", "remaining_time": "0:39:32"}
{"current_steps": 370, "total_steps": 1071, "loss": 0.3509, "lr": 3.317771611592222e-05, "epoch": 2.418300653594771, "percentage": 34.55, "elapsed_time": "0:20:41", "remaining_time": "0:39:12"}
{"current_steps": 375, "total_steps": 1071, "loss": 0.3718, "lr": 3.293056992406671e-05, "epoch": 2.450980392156863, "percentage": 35.01, "elapsed_time": "0:20:55", "remaining_time": "0:38:50"}
{"current_steps": 380, "total_steps": 1071, "loss": 0.3411, "lr": 3.267998344090679e-05, "epoch": 2.4836601307189543, "percentage": 35.48, "elapsed_time": "0:21:09", "remaining_time": "0:38:28"}
{"current_steps": 385, "total_steps": 1071, "loss": 0.3492, "lr": 3.242602333716958e-05, "epoch": 2.5163398692810457, "percentage": 35.95, "elapsed_time": "0:21:23", "remaining_time": "0:38:07"}
{"current_steps": 390, "total_steps": 1071, "loss": 0.3939, "lr": 3.21687571811635e-05, "epoch": 2.549019607843137, "percentage": 36.41, "elapsed_time": "0:21:38", "remaining_time": "0:37:47"}
{"current_steps": 395, "total_steps": 1071, "loss": 0.3632, "lr": 3.190825342080109e-05, "epoch": 2.581699346405229, "percentage": 36.88, "elapsed_time": "0:21:58", "remaining_time": "0:37:36"}
{"current_steps": 400, "total_steps": 1071, "loss": 0.3384, "lr": 3.164458136538789e-05, "epoch": 2.6143790849673203, "percentage": 37.35, "elapsed_time": "0:22:13", "remaining_time": "0:37:16"}
{"current_steps": 405, "total_steps": 1071, "loss": 0.3844, "lr": 3.137781116718206e-05, "epoch": 2.6470588235294117, "percentage": 37.82, "elapsed_time": "0:23:27", "remaining_time": "0:38:34"}
{"current_steps": 410, "total_steps": 1071, "loss": 0.3714, "lr": 3.110801380272975e-05, "epoch": 2.6797385620915035, "percentage": 38.28, "elapsed_time": "0:23:41", "remaining_time": "0:38:11"}
{"current_steps": 415, "total_steps": 1071, "loss": 0.374, "lr": 3.0835261053981226e-05, "epoch": 2.712418300653595, "percentage": 38.75, "elapsed_time": "0:23:56", "remaining_time": "0:37:49"}
{"current_steps": 420, "total_steps": 1071, "loss": 0.3541, "lr": 3.055962548919257e-05, "epoch": 2.7450980392156863, "percentage": 39.22, "elapsed_time": "0:24:09", "remaining_time": "0:37:26"}
{"current_steps": 425, "total_steps": 1071, "loss": 0.3622, "lr": 3.0281180443618337e-05, "epoch": 2.7777777777777777, "percentage": 39.68, "elapsed_time": "0:24:25", "remaining_time": "0:37:06"}
{"current_steps": 430, "total_steps": 1071, "loss": 0.3183, "lr": 3.0000000000000004e-05, "epoch": 2.810457516339869, "percentage": 40.15, "elapsed_time": "0:24:44", "remaining_time": "0:36:53"}
{"current_steps": 435, "total_steps": 1071, "loss": 0.3729, "lr": 2.9716158968855665e-05, "epoch": 2.843137254901961, "percentage": 40.62, "elapsed_time": "0:24:57", "remaining_time": "0:36:29"}
{"current_steps": 440, "total_steps": 1071, "loss": 0.3203, "lr": 2.9429732868576e-05, "epoch": 2.8758169934640523, "percentage": 41.08, "elapsed_time": "0:25:14", "remaining_time": "0:36:12"}
{"current_steps": 445, "total_steps": 1071, "loss": 0.3724, "lr": 2.9140797905331964e-05, "epoch": 2.9084967320261437, "percentage": 41.55, "elapsed_time": "0:25:29", "remaining_time": "0:35:50"}
{"current_steps": 450, "total_steps": 1071, "loss": 0.3809, "lr": 2.884943095279946e-05, "epoch": 2.9411764705882355, "percentage": 42.02, "elapsed_time": "0:25:43", "remaining_time": "0:35:29"}
{"current_steps": 455, "total_steps": 1071, "loss": 0.3911, "lr": 2.8555709531706423e-05, "epoch": 2.973856209150327, "percentage": 42.48, "elapsed_time": "0:25:58", "remaining_time": "0:35:09"}
{"current_steps": 460, "total_steps": 1071, "loss": 0.3648, "lr": 2.825971178920777e-05, "epoch": 3.0065359477124183, "percentage": 42.95, "elapsed_time": "0:26:10", "remaining_time": "0:34:45"}
{"current_steps": 465, "total_steps": 1071, "loss": 0.2969, "lr": 2.796151647809364e-05, "epoch": 3.0392156862745097, "percentage": 43.42, "elapsed_time": "0:26:23", "remaining_time": "0:34:24"}
{"current_steps": 470, "total_steps": 1071, "loss": 0.3121, "lr": 2.7661202935836536e-05, "epoch": 3.0718954248366015, "percentage": 43.88, "elapsed_time": "0:26:37", "remaining_time": "0:34:03"}
{"current_steps": 475, "total_steps": 1071, "loss": 0.3271, "lr": 2.73588510634829e-05, "epoch": 3.104575163398693, "percentage": 44.35, "elapsed_time": "0:26:52", "remaining_time": "0:33:42"}
{"current_steps": 480, "total_steps": 1071, "loss": 0.3019, "lr": 2.7054541304394736e-05, "epoch": 3.1372549019607843, "percentage": 44.82, "elapsed_time": "0:27:08", "remaining_time": "0:33:25"}
{"current_steps": 485, "total_steps": 1071, "loss": 0.251, "lr": 2.6748354622846962e-05, "epoch": 3.1699346405228757, "percentage": 45.28, "elapsed_time": "0:27:25", "remaining_time": "0:33:08"}
{"current_steps": 490, "total_steps": 1071, "loss": 0.2652, "lr": 2.6440372482486127e-05, "epoch": 3.2026143790849675, "percentage": 45.75, "elapsed_time": "0:27:42", "remaining_time": "0:32:51"}
{"current_steps": 495, "total_steps": 1071, "loss": 0.28, "lr": 2.613067682465631e-05, "epoch": 3.235294117647059, "percentage": 46.22, "elapsed_time": "0:27:57", "remaining_time": "0:32:32"}
{"current_steps": 500, "total_steps": 1071, "loss": 0.297, "lr": 2.5819350046597927e-05, "epoch": 3.2679738562091503, "percentage": 46.69, "elapsed_time": "0:28:12", "remaining_time": "0:32:12"}
{"current_steps": 505, "total_steps": 1071, "loss": 0.3077, "lr": 2.55064749795252e-05, "epoch": 3.3006535947712417, "percentage": 47.15, "elapsed_time": "0:28:26", "remaining_time": "0:31:52"}
{"current_steps": 510, "total_steps": 1071, "loss": 0.2939, "lr": 2.519213486658819e-05, "epoch": 3.3333333333333335, "percentage": 47.62, "elapsed_time": "0:28:41", "remaining_time": "0:31:34"}
{"current_steps": 515, "total_steps": 1071, "loss": 0.3398, "lr": 2.4876413340725244e-05, "epoch": 3.366013071895425, "percentage": 48.09, "elapsed_time": "0:28:56", "remaining_time": "0:31:14"}
{"current_steps": 520, "total_steps": 1071, "loss": 0.3096, "lr": 2.4559394402411703e-05, "epoch": 3.3986928104575163, "percentage": 48.55, "elapsed_time": "0:29:12", "remaining_time": "0:30:57"}
{"current_steps": 525, "total_steps": 1071, "loss": 0.2718, "lr": 2.4241162397310836e-05, "epoch": 3.431372549019608, "percentage": 49.02, "elapsed_time": "0:29:28", "remaining_time": "0:30:39"}
{"current_steps": 530, "total_steps": 1071, "loss": 0.2899, "lr": 2.3921801993832964e-05, "epoch": 3.4640522875816995, "percentage": 49.49, "elapsed_time": "0:29:43", "remaining_time": "0:30:20"}
{"current_steps": 535, "total_steps": 1071, "loss": 0.2905, "lr": 2.3601398160608667e-05, "epoch": 3.496732026143791, "percentage": 49.95, "elapsed_time": "0:30:01", "remaining_time": "0:30:04"}
{"current_steps": 540, "total_steps": 1071, "loss": 0.3064, "lr": 2.3280036143882145e-05, "epoch": 3.5294117647058822, "percentage": 50.42, "elapsed_time": "0:30:14", "remaining_time": "0:29:44"}
{"current_steps": 545, "total_steps": 1071, "loss": 0.2679, "lr": 2.2957801444830684e-05, "epoch": 3.5620915032679736, "percentage": 50.89, "elapsed_time": "0:30:29", "remaining_time": "0:29:25"}
{"current_steps": 550, "total_steps": 1071, "loss": 0.329, "lr": 2.2634779796816377e-05, "epoch": 3.5947712418300655, "percentage": 51.35, "elapsed_time": "0:30:43", "remaining_time": "0:29:06"}
{"current_steps": 555, "total_steps": 1071, "loss": 0.2683, "lr": 2.2311057142575953e-05, "epoch": 3.627450980392157, "percentage": 51.82, "elapsed_time": "0:30:58", "remaining_time": "0:28:48"}
{"current_steps": 560, "total_steps": 1071, "loss": 0.2962, "lr": 2.198671961135498e-05, "epoch": 3.6601307189542482, "percentage": 52.29, "elapsed_time": "0:31:14", "remaining_time": "0:28:30"}
{"current_steps": 565, "total_steps": 1071, "loss": 0.2646, "lr": 2.166185349599245e-05, "epoch": 3.69281045751634, "percentage": 52.75, "elapsed_time": "0:31:30", "remaining_time": "0:28:12"}
{"current_steps": 570, "total_steps": 1071, "loss": 0.3037, "lr": 2.1336545229961772e-05, "epoch": 3.7254901960784315, "percentage": 53.22, "elapsed_time": "0:31:46", "remaining_time": "0:27:55"}
{"current_steps": 575, "total_steps": 1071, "loss": 0.323, "lr": 2.1010881364374404e-05, "epoch": 3.758169934640523, "percentage": 53.69, "elapsed_time": "0:32:00", "remaining_time": "0:27:36"}
{"current_steps": 580, "total_steps": 1071, "loss": 0.2629, "lr": 2.0684948544952217e-05, "epoch": 3.7908496732026142, "percentage": 54.15, "elapsed_time": "0:32:19", "remaining_time": "0:27:21"}
{"current_steps": 585, "total_steps": 1071, "loss": 0.3031, "lr": 2.0358833488974556e-05, "epoch": 3.8235294117647056, "percentage": 54.62, "elapsed_time": "0:32:31", "remaining_time": "0:27:01"}
{"current_steps": 590, "total_steps": 1071, "loss": 0.3152, "lr": 2.0032622962206428e-05, "epoch": 3.8562091503267975, "percentage": 55.09, "elapsed_time": "0:32:44", "remaining_time": "0:26:41"}
{"current_steps": 595, "total_steps": 1071, "loss": 0.2989, "lr": 1.9706403755813672e-05, "epoch": 3.888888888888889, "percentage": 55.56, "elapsed_time": "0:32:59", "remaining_time": "0:26:23"}
{"current_steps": 600, "total_steps": 1071, "loss": 0.2982, "lr": 1.9380262663271407e-05, "epoch": 3.9215686274509802, "percentage": 56.02, "elapsed_time": "0:33:18", "remaining_time": "0:26:08"}
{"current_steps": 605, "total_steps": 1071, "loss": 0.2867, "lr": 1.9054286457271892e-05, "epoch": 3.954248366013072, "percentage": 56.49, "elapsed_time": "0:34:30", "remaining_time": "0:26:35"}
{"current_steps": 610, "total_steps": 1071, "loss": 0.3039, "lr": 1.8728561866637886e-05, "epoch": 3.9869281045751634, "percentage": 56.96, "elapsed_time": "0:34:43", "remaining_time": "0:26:14"}
{"current_steps": 615, "total_steps": 1071, "loss": 0.2386, "lr": 1.840317555324764e-05, "epoch": 4.019607843137255, "percentage": 57.42, "elapsed_time": "0:34:59", "remaining_time": "0:25:56"}
{"current_steps": 620, "total_steps": 1071, "loss": 0.2196, "lr": 1.8078214088977817e-05, "epoch": 4.052287581699346, "percentage": 57.89, "elapsed_time": "0:35:12", "remaining_time": "0:25:36"}
{"current_steps": 625, "total_steps": 1071, "loss": 0.2553, "lr": 1.7753763932670257e-05, "epoch": 4.084967320261438, "percentage": 58.36, "elapsed_time": "0:35:27", "remaining_time": "0:25:18"}
{"current_steps": 630, "total_steps": 1071, "loss": 0.2434, "lr": 1.742991140712881e-05, "epoch": 4.117647058823529, "percentage": 58.82, "elapsed_time": "0:35:42", "remaining_time": "0:24:59"}
{"current_steps": 635, "total_steps": 1071, "loss": 0.2344, "lr": 1.7106742676152454e-05, "epoch": 4.150326797385621, "percentage": 59.29, "elapsed_time": "0:35:58", "remaining_time": "0:24:41"}
{"current_steps": 640, "total_steps": 1071, "loss": 0.2277, "lr": 1.678434372161064e-05, "epoch": 4.183006535947713, "percentage": 59.76, "elapsed_time": "0:36:14", "remaining_time": "0:24:24"}
{"current_steps": 645, "total_steps": 1071, "loss": 0.2179, "lr": 1.646280032056704e-05, "epoch": 4.215686274509804, "percentage": 60.22, "elapsed_time": "0:36:28", "remaining_time": "0:24:05"}
{"current_steps": 650, "total_steps": 1071, "loss": 0.233, "lr": 1.6142198022457853e-05, "epoch": 4.248366013071895, "percentage": 60.69, "elapsed_time": "0:36:42", "remaining_time": "0:23:46"}
{"current_steps": 655, "total_steps": 1071, "loss": 0.2493, "lr": 1.5822622126330597e-05, "epoch": 4.281045751633987, "percentage": 61.16, "elapsed_time": "0:36:54", "remaining_time": "0:23:26"}
{"current_steps": 660, "total_steps": 1071, "loss": 0.2102, "lr": 1.550415765814955e-05, "epoch": 4.313725490196078, "percentage": 61.62, "elapsed_time": "0:37:10", "remaining_time": "0:23:08"}
{"current_steps": 665, "total_steps": 1071, "loss": 0.2354, "lr": 1.5186889348173857e-05, "epoch": 4.34640522875817, "percentage": 62.09, "elapsed_time": "0:37:26", "remaining_time": "0:22:51"}
{"current_steps": 670, "total_steps": 1071, "loss": 0.2582, "lr": 1.487090160841433e-05, "epoch": 4.379084967320262, "percentage": 62.56, "elapsed_time": "0:37:40", "remaining_time": "0:22:32"}
{"current_steps": 675, "total_steps": 1071, "loss": 0.2262, "lr": 1.4556278510174827e-05, "epoch": 4.411764705882353, "percentage": 63.03, "elapsed_time": "0:37:57", "remaining_time": "0:22:16"}
{"current_steps": 680, "total_steps": 1071, "loss": 0.238, "lr": 1.424310376168441e-05, "epoch": 4.444444444444445, "percentage": 63.49, "elapsed_time": "0:38:12", "remaining_time": "0:21:58"}
{"current_steps": 685, "total_steps": 1071, "loss": 0.2424, "lr": 1.3931460685826022e-05, "epoch": 4.477124183006536, "percentage": 63.96, "elapsed_time": "0:38:32", "remaining_time": "0:21:42"}
{"current_steps": 690, "total_steps": 1071, "loss": 0.2228, "lr": 1.3621432197967664e-05, "epoch": 4.509803921568627, "percentage": 64.43, "elapsed_time": "0:38:48", "remaining_time": "0:21:25"}
{"current_steps": 695, "total_steps": 1071, "loss": 0.2396, "lr": 1.3313100783902097e-05, "epoch": 4.542483660130719, "percentage": 64.89, "elapsed_time": "0:39:01", "remaining_time": "0:21:06"}
{"current_steps": 700, "total_steps": 1071, "loss": 0.2442, "lr": 1.3006548477900735e-05, "epoch": 4.57516339869281, "percentage": 65.36, "elapsed_time": "0:39:18", "remaining_time": "0:20:49"}
{"current_steps": 705, "total_steps": 1071, "loss": 0.221, "lr": 1.270185684088771e-05, "epoch": 4.607843137254902, "percentage": 65.83, "elapsed_time": "0:39:32", "remaining_time": "0:20:31"}
{"current_steps": 710, "total_steps": 1071, "loss": 0.2347, "lr": 1.2399106938739903e-05, "epoch": 4.640522875816993, "percentage": 66.29, "elapsed_time": "0:39:45", "remaining_time": "0:20:13"}
{"current_steps": 715, "total_steps": 1071, "loss": 0.2288, "lr": 1.2098379320718633e-05, "epoch": 4.673202614379085, "percentage": 66.76, "elapsed_time": "0:40:04", "remaining_time": "0:19:57"}
{"current_steps": 720, "total_steps": 1071, "loss": 0.2324, "lr": 1.179975399803881e-05, "epoch": 4.705882352941177, "percentage": 67.23, "elapsed_time": "0:40:17", "remaining_time": "0:19:38"}
{"current_steps": 725, "total_steps": 1071, "loss": 0.2347, "lr": 1.1503310422581286e-05, "epoch": 4.738562091503268, "percentage": 67.69, "elapsed_time": "0:40:34", "remaining_time": "0:19:21"}
{"current_steps": 730, "total_steps": 1071, "loss": 0.2208, "lr": 1.1209127465753978e-05, "epoch": 4.771241830065359, "percentage": 68.16, "elapsed_time": "0:40:51", "remaining_time": "0:19:05"}
{"current_steps": 735, "total_steps": 1071, "loss": 0.2089, "lr": 1.0917283397507392e-05, "epoch": 4.803921568627451, "percentage": 68.63, "elapsed_time": "0:41:06", "remaining_time": "0:18:47"}
{"current_steps": 740, "total_steps": 1071, "loss": 0.2332, "lr": 1.0627855865510294e-05, "epoch": 4.836601307189542, "percentage": 69.09, "elapsed_time": "0:41:21", "remaining_time": "0:18:30"}
{"current_steps": 745, "total_steps": 1071, "loss": 0.2475, "lr": 1.034092187449082e-05, "epoch": 4.8692810457516345, "percentage": 69.56, "elapsed_time": "0:41:34", "remaining_time": "0:18:11"}
{"current_steps": 750, "total_steps": 1071, "loss": 0.2137, "lr": 1.0056557765748684e-05, "epoch": 4.901960784313726, "percentage": 70.03, "elapsed_time": "0:41:48", "remaining_time": "0:17:53"}
{"current_steps": 755, "total_steps": 1071, "loss": 0.2251, "lr": 9.774839196843953e-06, "epoch": 4.934640522875817, "percentage": 70.49, "elapsed_time": "0:42:04", "remaining_time": "0:17:36"}
{"current_steps": 760, "total_steps": 1071, "loss": 0.2221, "lr": 9.49584112146765e-06, "epoch": 4.967320261437909, "percentage": 70.96, "elapsed_time": "0:42:19", "remaining_time": "0:17:19"}
{"current_steps": 765, "total_steps": 1071, "loss": 0.2197, "lr": 9.21963776949969e-06, "epoch": 5.0, "percentage": 71.43, "elapsed_time": "0:42:34", "remaining_time": "0:17:01"}
{"current_steps": 770, "total_steps": 1071, "loss": 0.1998, "lr": 8.946302627259363e-06, "epoch": 5.032679738562091, "percentage": 71.9, "elapsed_time": "0:42:48", "remaining_time": "0:16:43"}
{"current_steps": 775, "total_steps": 1071, "loss": 0.1893, "lr": 8.67590841795366e-06, "epoch": 5.065359477124183, "percentage": 72.36, "elapsed_time": "0:43:06", "remaining_time": "0:16:27"}
{"current_steps": 780, "total_steps": 1071, "loss": 0.1854, "lr": 8.408527082328605e-06, "epoch": 5.098039215686274, "percentage": 72.83, "elapsed_time": "0:43:21", "remaining_time": "0:16:10"}
{"current_steps": 785, "total_steps": 1071, "loss": 0.1881, "lr": 8.144229759528835e-06, "epoch": 5.130718954248366, "percentage": 73.3, "elapsed_time": "0:43:33", "remaining_time": "0:15:52"}
{"current_steps": 790, "total_steps": 1071, "loss": 0.1847, "lr": 7.883086768170369e-06, "epoch": 5.163398692810458, "percentage": 73.76, "elapsed_time": "0:43:53", "remaining_time": "0:15:36"}
{"current_steps": 795, "total_steps": 1071, "loss": 0.2001, "lr": 7.625167587631732e-06, "epoch": 5.196078431372549, "percentage": 74.23, "elapsed_time": "0:44:05", "remaining_time": "0:15:18"}
{"current_steps": 800, "total_steps": 1071, "loss": 0.1901, "lr": 7.370540839568372e-06, "epoch": 5.228758169934641, "percentage": 74.7, "elapsed_time": "0:44:20", "remaining_time": "0:15:01"}
{"current_steps": 805, "total_steps": 1071, "loss": 0.2073, "lr": 7.119274269655265e-06, "epoch": 5.261437908496732, "percentage": 75.16, "elapsed_time": "0:45:34", "remaining_time": "0:15:03"}
{"current_steps": 810, "total_steps": 1071, "loss": 0.1913, "lr": 6.87143472956256e-06, "epoch": 5.294117647058823, "percentage": 75.63, "elapsed_time": "0:45:49", "remaining_time": "0:14:46"}
{"current_steps": 815, "total_steps": 1071, "loss": 0.1651, "lr": 6.627088159169146e-06, "epoch": 5.326797385620915, "percentage": 76.1, "elapsed_time": "0:46:06", "remaining_time": "0:14:28"}
{"current_steps": 820, "total_steps": 1071, "loss": 0.1777, "lr": 6.3862995690187505e-06, "epoch": 5.359477124183006, "percentage": 76.56, "elapsed_time": "0:46:19", "remaining_time": "0:14:10"}
{"current_steps": 825, "total_steps": 1071, "loss": 0.2005, "lr": 6.1491330230232944e-06, "epoch": 5.392156862745098, "percentage": 77.03, "elapsed_time": "0:46:34", "remaining_time": "0:13:53"}
{"current_steps": 830, "total_steps": 1071, "loss": 0.1839, "lr": 5.915651621418172e-06, "epoch": 5.42483660130719, "percentage": 77.5, "elapsed_time": "0:46:52", "remaining_time": "0:13:36"}
{"current_steps": 835, "total_steps": 1071, "loss": 0.2127, "lr": 5.6859174839738576e-06, "epoch": 5.457516339869281, "percentage": 77.96, "elapsed_time": "0:47:06", "remaining_time": "0:13:18"}
{"current_steps": 840, "total_steps": 1071, "loss": 0.1693, "lr": 5.459991733468375e-06, "epoch": 5.490196078431373, "percentage": 78.43, "elapsed_time": "0:47:28", "remaining_time": "0:13:03"}
{"current_steps": 845, "total_steps": 1071, "loss": 0.2142, "lr": 5.237934479425091e-06, "epoch": 5.522875816993464, "percentage": 78.9, "elapsed_time": "0:47:41", "remaining_time": "0:12:45"}
{"current_steps": 850, "total_steps": 1071, "loss": 0.2004, "lr": 5.019804802120027e-06, "epoch": 5.555555555555555, "percentage": 79.37, "elapsed_time": "0:47:55", "remaining_time": "0:12:27"}
{"current_steps": 855, "total_steps": 1071, "loss": 0.1831, "lr": 4.805660736863023e-06, "epoch": 5.588235294117647, "percentage": 79.83, "elapsed_time": "0:48:11", "remaining_time": "0:12:10"}
{"current_steps": 860, "total_steps": 1071, "loss": 0.188, "lr": 4.595559258556963e-06, "epoch": 5.620915032679738, "percentage": 80.3, "elapsed_time": "0:48:25", "remaining_time": "0:11:52"}
{"current_steps": 865, "total_steps": 1071, "loss": 0.2014, "lr": 4.389556266539081e-06, "epoch": 5.65359477124183, "percentage": 80.77, "elapsed_time": "0:48:41", "remaining_time": "0:11:35"}
{"current_steps": 870, "total_steps": 1071, "loss": 0.2089, "lr": 4.187706569708472e-06, "epoch": 5.686274509803922, "percentage": 81.23, "elapsed_time": "0:48:55", "remaining_time": "0:11:18"}
{"current_steps": 875, "total_steps": 1071, "loss": 0.1865, "lr": 3.990063871943681e-06, "epoch": 5.718954248366013, "percentage": 81.7, "elapsed_time": "0:49:09", "remaining_time": "0:11:00"}
{"current_steps": 880, "total_steps": 1071, "loss": 0.2047, "lr": 3.796680757814344e-06, "epoch": 5.751633986928105, "percentage": 82.17, "elapsed_time": "0:49:25", "remaining_time": "0:10:43"}
{"current_steps": 885, "total_steps": 1071, "loss": 0.1978, "lr": 3.6076086785905708e-06, "epoch": 5.784313725490196, "percentage": 82.63, "elapsed_time": "0:49:38", "remaining_time": "0:10:26"}
{"current_steps": 890, "total_steps": 1071, "loss": 0.1838, "lr": 3.4228979385539153e-06, "epoch": 5.816993464052287, "percentage": 83.1, "elapsed_time": "0:49:52", "remaining_time": "0:10:08"}
{"current_steps": 895, "total_steps": 1071, "loss": 0.2068, "lr": 3.242597681613471e-06, "epoch": 5.849673202614379, "percentage": 83.57, "elapsed_time": "0:50:07", "remaining_time": "0:09:51"}
{"current_steps": 900, "total_steps": 1071, "loss": 0.1695, "lr": 3.0667558782306782e-06, "epoch": 5.882352941176471, "percentage": 84.03, "elapsed_time": "0:50:21", "remaining_time": "0:09:34"}
{"current_steps": 905, "total_steps": 1071, "loss": 0.1708, "lr": 2.895419312656409e-06, "epoch": 5.915032679738562, "percentage": 84.5, "elapsed_time": "0:50:38", "remaining_time": "0:09:17"}
{"current_steps": 910, "total_steps": 1071, "loss": 0.1935, "lr": 2.7286335704835788e-06, "epoch": 5.947712418300654, "percentage": 84.97, "elapsed_time": "0:50:53", "remaining_time": "0:09:00"}
{"current_steps": 915, "total_steps": 1071, "loss": 0.2118, "lr": 2.566443026518692e-06, "epoch": 5.980392156862745, "percentage": 85.43, "elapsed_time": "0:51:08", "remaining_time": "0:08:43"}
{"current_steps": 920, "total_steps": 1071, "loss": 0.187, "lr": 2.4088908329755678e-06, "epoch": 6.0130718954248366, "percentage": 85.9, "elapsed_time": "0:51:21", "remaining_time": "0:08:25"}
{"current_steps": 925, "total_steps": 1071, "loss": 0.139, "lr": 2.256018907994284e-06, "epoch": 6.045751633986928, "percentage": 86.37, "elapsed_time": "0:51:40", "remaining_time": "0:08:09"}
{"current_steps": 930, "total_steps": 1071, "loss": 0.1548, "lr": 2.107867924488509e-06, "epoch": 6.078431372549019, "percentage": 86.83, "elapsed_time": "0:51:56", "remaining_time": "0:07:52"}
{"current_steps": 935, "total_steps": 1071, "loss": 0.1588, "lr": 1.9644772993241166e-06, "epoch": 6.111111111111111, "percentage": 87.3, "elapsed_time": "0:52:10", "remaining_time": "0:07:35"}
{"current_steps": 940, "total_steps": 1071, "loss": 0.1741, "lr": 1.8258851828319678e-06, "epoch": 6.143790849673203, "percentage": 87.77, "elapsed_time": "0:52:24", "remaining_time": "0:07:18"}
{"current_steps": 945, "total_steps": 1071, "loss": 0.1674, "lr": 1.692128448657695e-06, "epoch": 6.176470588235294, "percentage": 88.24, "elapsed_time": "0:52:38", "remaining_time": "0:07:01"}
{"current_steps": 950, "total_steps": 1071, "loss": 0.1791, "lr": 1.5632426839511494e-06, "epoch": 6.209150326797386, "percentage": 88.7, "elapsed_time": "0:52:52", "remaining_time": "0:06:44"}
{"current_steps": 955, "total_steps": 1071, "loss": 0.1831, "lr": 1.4392621798981154e-06, "epoch": 6.241830065359477, "percentage": 89.17, "elapsed_time": "0:53:04", "remaining_time": "0:06:26"}
{"current_steps": 960, "total_steps": 1071, "loss": 0.1799, "lr": 1.3202199225968481e-06, "epoch": 6.2745098039215685, "percentage": 89.64, "elapsed_time": "0:53:20", "remaining_time": "0:06:10"}
{"current_steps": 965, "total_steps": 1071, "loss": 0.178, "lr": 1.2061475842818337e-06, "epoch": 6.30718954248366, "percentage": 90.1, "elapsed_time": "0:53:38", "remaining_time": "0:05:53"}
{"current_steps": 970, "total_steps": 1071, "loss": 0.1761, "lr": 1.0970755148971057e-06, "epoch": 6.339869281045751, "percentage": 90.57, "elapsed_time": "0:53:51", "remaining_time": "0:05:36"}
{"current_steps": 975, "total_steps": 1071, "loss": 0.1725, "lr": 9.930327340213908e-07, "epoch": 6.372549019607844, "percentage": 91.04, "elapsed_time": "0:54:04", "remaining_time": "0:05:19"}
{"current_steps": 980, "total_steps": 1071, "loss": 0.163, "lr": 8.940469231471893e-07, "epoch": 6.405228758169935, "percentage": 91.5, "elapsed_time": "0:54:20", "remaining_time": "0:05:02"}
{"current_steps": 985, "total_steps": 1071, "loss": 0.1796, "lr": 8.001444183158602e-07, "epoch": 6.437908496732026, "percentage": 91.97, "elapsed_time": "0:54:37", "remaining_time": "0:04:46"}
{"current_steps": 990, "total_steps": 1071, "loss": 0.1758, "lr": 7.1135020311071e-07, "epoch": 6.470588235294118, "percentage": 92.44, "elapsed_time": "0:54:52", "remaining_time": "0:04:29"}
{"current_steps": 995, "total_steps": 1071, "loss": 0.1753, "lr": 6.276879020098769e-07, "epoch": 6.503267973856209, "percentage": 92.9, "elapsed_time": "0:55:08", "remaining_time": "0:04:12"}
{"current_steps": 1000, "total_steps": 1071, "loss": 0.1872, "lr": 5.491797741008232e-07, "epoch": 6.5359477124183005, "percentage": 93.37, "elapsed_time": "0:55:24", "remaining_time": "0:03:56"}
{"current_steps": 1005, "total_steps": 1071, "loss": 0.1598, "lr": 4.758467071581363e-07, "epoch": 6.568627450980392, "percentage": 93.84, "elapsed_time": "0:56:37", "remaining_time": "0:03:43"}
{"current_steps": 1010, "total_steps": 1071, "loss": 0.1659, "lr": 4.077082120861309e-07, "epoch": 6.601307189542483, "percentage": 94.3, "elapsed_time": "0:56:56", "remaining_time": "0:03:26"}
{"current_steps": 1015, "total_steps": 1071, "loss": 0.1596, "lr": 3.4478241772780695e-07, "epoch": 6.633986928104575, "percentage": 94.77, "elapsed_time": "0:57:11", "remaining_time": "0:03:09"}
{"current_steps": 1020, "total_steps": 1071, "loss": 0.1885, "lr": 2.8708606604151757e-07, "epoch": 6.666666666666667, "percentage": 95.24, "elapsed_time": "0:57:23", "remaining_time": "0:02:52"}
{"current_steps": 1025, "total_steps": 1071, "loss": 0.1514, "lr": 2.346345076466272e-07, "epoch": 6.699346405228758, "percentage": 95.7, "elapsed_time": "0:57:39", "remaining_time": "0:02:35"}
{"current_steps": 1030, "total_steps": 1071, "loss": 0.1579, "lr": 1.8744169773932784e-07, "epoch": 6.73202614379085, "percentage": 96.17, "elapsed_time": "0:57:54", "remaining_time": "0:02:18"}
{"current_steps": 1035, "total_steps": 1071, "loss": 0.1979, "lr": 1.4552019237976e-07, "epoch": 6.764705882352941, "percentage": 96.64, "elapsed_time": "0:58:07", "remaining_time": "0:02:01"}
{"current_steps": 1040, "total_steps": 1071, "loss": 0.1934, "lr": 1.0888114515134274e-07, "epoch": 6.7973856209150325, "percentage": 97.11, "elapsed_time": "0:58:22", "remaining_time": "0:01:44"}
{"current_steps": 1045, "total_steps": 1071, "loss": 0.1638, "lr": 7.753430419328301e-08, "epoch": 6.830065359477124, "percentage": 97.57, "elapsed_time": "0:58:34", "remaining_time": "0:01:27"}
{"current_steps": 1050, "total_steps": 1071, "loss": 0.1568, "lr": 5.1488009606979195e-08, "epoch": 6.862745098039216, "percentage": 98.04, "elapsed_time": "0:58:53", "remaining_time": "0:01:10"}
{"current_steps": 1055, "total_steps": 1071, "loss": 0.191, "lr": 3.074919123708275e-08, "epoch": 6.895424836601308, "percentage": 98.51, "elapsed_time": "0:59:07", "remaining_time": "0:00:53"}
{"current_steps": 1060, "total_steps": 1071, "loss": 0.193, "lr": 1.5323366827737496e-08, "epoch": 6.928104575163399, "percentage": 98.97, "elapsed_time": "0:59:21", "remaining_time": "0:00:36"}
{"current_steps": 1065, "total_steps": 1071, "loss": 0.1605, "lr": 5.2146405545427935e-09, "epoch": 6.96078431372549, "percentage": 99.44, "elapsed_time": "0:59:35", "remaining_time": "0:00:20"}
{"current_steps": 1070, "total_steps": 1071, "loss": 0.1657, "lr": 4.2570193260482727e-10, "epoch": 6.993464052287582, "percentage": 99.91, "elapsed_time": "0:59:50", "remaining_time": "0:00:03"}
{"current_steps": 1071, "total_steps": 1071, "epoch": 7.0, "percentage": 100.0, "elapsed_time": "1:00:56", "remaining_time": "0:00:00"}
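Each line of the log above is a self-contained JSON record, so the file can be consumed with a plain line-by-line parse. A minimal sketch (the file name `trainer_log.jsonl` is an assumption; note the final record of a run omits `"loss"` and only marks completion, so parsing must tolerate missing keys):

```python
import json

# Three records copied verbatim from the log above; in practice you would
# read them from the log file, e.g. open("trainer_log.jsonl").
log_text = """\
{"current_steps": 1065, "total_steps": 1071, "loss": 0.1605, "lr": 5.2146405545427935e-09, "epoch": 6.96078431372549, "percentage": 99.44, "elapsed_time": "0:59:35", "remaining_time": "0:00:20"}
{"current_steps": 1070, "total_steps": 1071, "loss": 0.1657, "lr": 4.2570193260482727e-10, "epoch": 6.993464052287582, "percentage": 99.91, "elapsed_time": "0:59:50", "remaining_time": "0:00:03"}
{"current_steps": 1071, "total_steps": 1071, "epoch": 7.0, "percentage": 100.0, "elapsed_time": "1:00:56", "remaining_time": "0:00:00"}
"""

records = [json.loads(line) for line in log_text.splitlines() if line.strip()]

# Keep only records that actually report a loss (the terminal record does not).
losses = [(r["current_steps"], r["loss"]) for r in records if "loss" in r]
print(losses[-1])  # last (step, loss) pair that was logged
```

The same pattern (filter on `"loss" in r`) is what a plotting script would use to produce a curve like the `training_loss.png` committed alongside this log.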

trainer_state.json (new file, 2401 lines)
File diff suppressed because it is too large

training_args.bin (new file)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:4cbe37823df2c5caa0a66057022b5e5709aab46ed959ff65a8651095da40ca7d
size 8721

training_loss.png (new binary file, 49 KiB)

vocab.json (new file, 1 line)
File diff suppressed because one or more lines are too long