Initialize project; model provided by the ModelHub XC community
Model: DCAgent/a1-wizardlm_orca Source: Original Platform
36
.gitattributes
vendored
Normal file
@@ -0,0 +1,36 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
tokenizer.json filter=lfs diff=lfs merge=lfs -text
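Each line in the .gitattributes above routes a path pattern through Git LFS (`filter=lfs`) so large binaries are stored as pointer files. A minimal sketch of parsing such lines, using a small sample taken from the file above:

```python
# Parse .gitattributes lines and collect the patterns routed through Git LFS.
# The sample lines are a subset of the .gitattributes shown above.
gitattributes = """\
*.safetensors filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
tokenizer.json filter=lfs diff=lfs merge=lfs -text
"""

def lfs_patterns(text: str) -> list[str]:
    patterns = []
    for line in text.splitlines():
        parts = line.split()
        # The first token is the path pattern; the rest are attribute settings.
        if parts and "filter=lfs" in parts[1:]:
            patterns.append(parts[0])
    return patterns

print(lfs_patterns(gitattributes))
```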
60
README.md
Normal file
@@ -0,0 +1,60 @@
---
library_name: transformers
license: other
base_model: Qwen/Qwen3-8B
tags:
- llama-factory
- full
- generated_from_trainer
model-index:
- name: sft_a1_wizardlm_orca__Qwen3-8B
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# sft_a1_wizardlm_orca__Qwen3-8B

This model is a fine-tuned version of [Qwen/Qwen3-8B](https://huggingface.co/Qwen/Qwen3-8B) on the /e/scratch/jureap59/raoof1/sft_data/hf_hub/datasets--DCAgent--wizardlm-orca-sandboxes_glm_4.7_traces_jupiter/snapshots/626348c3167696dcbb1969897e0b1d79bde013d3_thinking_preprocessed dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 4e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 16
- total_train_batch_size: 16
- total_eval_batch_size: 128
- optimizer: adamw_torch_fused with betas=(0.9, 0.98) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 7.0

### Training results

### Framework versions

- Transformers 4.57.6
- Pytorch 2.9.1+cu130
- Datasets 4.7.0
- Tokenizers 0.22.2
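The hyperparameters in the model card combine in two simple ways: the effective batch size is the per-device batch times the device count (1 × 16 = 16), and the cosine schedule warms up linearly over the first 10% of steps. A sketch of that schedule shape, assuming the usual warmup-then-cosine form (the exact Trainer implementation may differ slightly; the step count here is inferred from all_results.json, roughly 0.291 steps/s × 13950 s ≈ 4060 steps, and is an estimate):

```python
import math

# Effective batch size from the listed hyperparameters:
# per-device batch 1 across 16 GPUs (no gradient accumulation reported).
train_batch_size, num_devices = 1, 16
total_train_batch_size = train_batch_size * num_devices

# Linear warmup followed by cosine decay, matching lr_scheduler_type=cosine
# and lr_scheduler_warmup_ratio=0.1.
def lr_at(step: int, total_steps: int, peak_lr: float = 4e-05,
          warmup_ratio: float = 0.1) -> float:
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return peak_lr * step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return peak_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

total = 4060  # estimated total optimizer steps (see note above)
print(lr_at(0, total), lr_at(406, total), lr_at(total, total))
```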
28
added_tokens.json
Normal file
@@ -0,0 +1,28 @@
{
  "</think>": 151668,
  "</tool_call>": 151658,
  "</tool_response>": 151666,
  "<think>": 151667,
  "<tool_call>": 151657,
  "<tool_response>": 151665,
  "<|box_end|>": 151649,
  "<|box_start|>": 151648,
  "<|endoftext|>": 151643,
  "<|file_sep|>": 151664,
  "<|fim_middle|>": 151660,
  "<|fim_pad|>": 151662,
  "<|fim_prefix|>": 151659,
  "<|fim_suffix|>": 151661,
  "<|im_end|>": 151645,
  "<|im_start|>": 151644,
  "<|image_pad|>": 151655,
  "<|object_ref_end|>": 151647,
  "<|object_ref_start|>": 151646,
  "<|quad_end|>": 151651,
  "<|quad_start|>": 151650,
  "<|repo_name|>": 151663,
  "<|video_pad|>": 151656,
  "<|vision_end|>": 151653,
  "<|vision_pad|>": 151654,
  "<|vision_start|>": 151652
}
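The added tokens above occupy a contiguous id block (151643 to 151668) on top of the base vocabulary. A small sketch using a subset of the mapping shown above, inverting it the way a decoder would:

```python
# Subset of the added_tokens.json mapping shown above.
added_tokens = {
    "<|endoftext|>": 151643,
    "<|im_start|>": 151644,
    "<|im_end|>": 151645,
    "<think>": 151667,
    "</think>": 151668,
}

# Invert the mapping to decode ids back to token strings.
id_to_token = {v: k for k, v in added_tokens.items()}
print(id_to_token[151645])

# All 26 added tokens in the full file fall in the range 151643..151668.
assert min(added_tokens.values()) >= 151643
assert max(added_tokens.values()) <= 151668
```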
16
all_results.json
Normal file
@@ -0,0 +1,16 @@
{
  "achieved_tflops_per_gpu": 0.0028883878665412407,
  "achieved_tflops_per_gpu_theoretical": 985.8802277815419,
  "epoch": 7.0,
  "loss_nan_ranks": 0,
  "loss_rank_avg": 0.25369924306869507,
  "mfu_percent": 0.00020412635099231386,
  "mfu_percent_theoretical": 69.67351433085102,
  "total_flos": 644683152949248.0,
  "train_loss": 0.3271850697072269,
  "train_runtime": 13949.8914,
  "train_samples_per_second": 4.655,
  "train_steps_per_second": 0.291,
  "valid_targets_mean": 2212.8,
  "valid_targets_min": 431
}
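The throughput figures in all_results.json are internally consistent: samples per second should equal steps per second times the effective batch size of 16 reported in the model card. A quick cross-check:

```python
# Cross-check the throughput numbers reported in all_results.json.
train_steps_per_second = 0.291
total_train_batch_size = 16
train_samples_per_second = 4.655

derived = train_steps_per_second * total_train_batch_size
assert abs(derived - train_samples_per_second) < 0.01  # 4.656 vs 4.655

# Total runtime of the reported 13949.89 s run, in hours.
hours = 13949.8914 / 3600
print(round(hours, 2))
```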
89
chat_template.jinja
Normal file
@@ -0,0 +1,89 @@
{%- if tools %}
    {{- '<|im_start|>system\n' }}
    {%- if messages[0].role == 'system' %}
        {{- messages[0].content + '\n\n' }}
    {%- endif %}
    {{- "# Tools\n\nYou may call one or more functions to assist with the user query.\n\nYou are provided with function signatures within <tools></tools> XML tags:\n<tools>" }}
    {%- for tool in tools %}
        {{- "\n" }}
        {{- tool | tojson }}
    {%- endfor %}
    {{- "\n</tools>\n\nFor each function call, return a json object with function name and arguments within <tool_call></tool_call> XML tags:\n<tool_call>\n{\"name\": <function-name>, \"arguments\": <args-json-object>}\n</tool_call><|im_end|>\n" }}
{%- else %}
    {%- if messages[0].role == 'system' %}
        {{- '<|im_start|>system\n' + messages[0].content + '<|im_end|>\n' }}
    {%- endif %}
{%- endif %}
{%- set ns = namespace(multi_step_tool=true, last_query_index=messages|length - 1) %}
{%- for message in messages[::-1] %}
    {%- set index = (messages|length - 1) - loop.index0 %}
    {%- if ns.multi_step_tool and message.role == "user" and message.content is string and not(message.content.startswith('<tool_response>') and message.content.endswith('</tool_response>')) %}
        {%- set ns.multi_step_tool = false %}
        {%- set ns.last_query_index = index %}
    {%- endif %}
{%- endfor %}
{%- for message in messages %}
    {%- if message.content is string %}
        {%- set content = message.content %}
    {%- else %}
        {%- set content = '' %}
    {%- endif %}
    {%- if (message.role == "user") or (message.role == "system" and not loop.first) %}
        {{- '<|im_start|>' + message.role + '\n' + content + '<|im_end|>' + '\n' }}
    {%- elif message.role == "assistant" %}
        {%- set reasoning_content = '' %}
        {%- if message.reasoning_content is string %}
            {%- set reasoning_content = message.reasoning_content %}
        {%- else %}
            {%- if '</think>' in content %}
                {%- set reasoning_content = content.split('</think>')[0].rstrip('\n').split('<think>')[-1].lstrip('\n') %}
                {%- set content = content.split('</think>')[-1].lstrip('\n') %}
            {%- endif %}
        {%- endif %}
        {%- if loop.index0 > ns.last_query_index %}
            {%- if loop.last or (not loop.last and reasoning_content) %}
                {{- '<|im_start|>' + message.role + '\n<think>\n' + reasoning_content.strip('\n') + '\n</think>\n\n' + content.lstrip('\n') }}
            {%- else %}
                {{- '<|im_start|>' + message.role + '\n' + content }}
            {%- endif %}
        {%- else %}
            {{- '<|im_start|>' + message.role + '\n' + content }}
        {%- endif %}
        {%- if message.tool_calls %}
            {%- for tool_call in message.tool_calls %}
                {%- if (loop.first and content) or (not loop.first) %}
                    {{- '\n' }}
                {%- endif %}
                {%- if tool_call.function %}
                    {%- set tool_call = tool_call.function %}
                {%- endif %}
                {{- '<tool_call>\n{"name": "' }}
                {{- tool_call.name }}
                {{- '", "arguments": ' }}
                {%- if tool_call.arguments is string %}
                    {{- tool_call.arguments }}
                {%- else %}
                    {{- tool_call.arguments | tojson }}
                {%- endif %}
                {{- '}\n</tool_call>' }}
            {%- endfor %}
        {%- endif %}
        {{- '<|im_end|>\n' }}
    {%- elif message.role == "tool" %}
        {%- if loop.first or (messages[loop.index0 - 1].role != "tool") %}
            {{- '<|im_start|>user' }}
        {%- endif %}
        {{- '\n<tool_response>\n' }}
        {{- content }}
        {{- '\n</tool_response>' }}
        {%- if loop.last or (messages[loop.index0 + 1].role != "tool") %}
            {{- '<|im_end|>\n' }}
        {%- endif %}
    {%- endif %}
{%- endfor %}
{%- if add_generation_prompt %}
    {{- '<|im_start|>assistant\n' }}
    {%- if enable_thinking is defined and enable_thinking is false %}
        {{- '<think>\n\n</think>\n\n' }}
    {%- endif %}
{%- endif %}
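The template above renders conversations in ChatML framing: each message is wrapped in `<|im_start|>{role}\n ... <|im_end|>\n`, and the generation prompt opens an assistant turn. A minimal Python sketch of the plain-chat path (no tools, no retained reasoning), with the framing strings taken from the template literals above:

```python
# Minimal sketch of the ChatML framing produced by the chat template's
# simple path (system + user messages, no tool calls, no <think> handling).
def render_chatml(messages, add_generation_prompt=True):
    out = []
    for m in messages:
        out.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    if add_generation_prompt:
        out.append("<|im_start|>assistant\n")
    return "".join(out)

prompt = render_chatml([
    {"role": "system", "content": "You are helpful."},
    {"role": "user", "content": "Hi"},
])
print(prompt)
```

In practice this rendering is done by `tokenizer.apply_chat_template`, which also handles the tool-call and reasoning branches of the template.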
68
config.json
Normal file
@@ -0,0 +1,68 @@
{
  "architectures": [
    "Qwen3ForCausalLM"
  ],
  "attention_bias": false,
  "attention_dropout": 0.0,
  "dtype": "bfloat16",
  "eos_token_id": 151645,
  "head_dim": 128,
  "hidden_act": "silu",
  "hidden_size": 4096,
  "initializer_range": 0.02,
  "intermediate_size": 12288,
  "layer_types": [
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention"
  ],
  "max_position_embeddings": 40960,
  "max_window_layers": 36,
  "model_type": "qwen3",
  "num_attention_heads": 32,
  "num_hidden_layers": 36,
  "num_key_value_heads": 8,
  "pad_token_id": 151643,
  "rms_norm_eps": 1e-06,
  "rope_scaling": null,
  "rope_theta": 1000000,
  "sliding_window": null,
  "tie_word_embeddings": false,
  "transformers_version": "4.57.6",
  "use_cache": false,
  "use_sliding_window": false,
  "vocab_size": 151936
}
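The config above fully determines the parameter count. A back-of-the-envelope check, assuming the standard Qwen3 weight layout (untied embeddings, GQA with 32 query and 8 KV heads of dim 128, a SwiGLU MLP, and RMSNorm weights): the total matches the shard index's total_size of 16381470720 bytes at 2 bytes per bfloat16 parameter.

```python
# Parameter count derived from the config.json values above.
hidden, inter, vocab, layers = 4096, 12288, 151936, 36
heads, kv_heads, head_dim = 32, 8, 128

# q_proj and o_proj are hidden x (heads*head_dim); k_proj/v_proj are
# hidden x (kv_heads*head_dim) each under grouped-query attention.
attn = hidden * (2 * heads * head_dim + 2 * kv_heads * head_dim)
mlp = 3 * hidden * inter           # gate_proj, up_proj, down_proj
norms = 2 * hidden + 2 * head_dim  # input/post_attention + q_norm/k_norm
per_layer = attn + mlp + norms

# Untied embeddings: embed_tokens plus lm_head, plus the final norm.
total = layers * per_layer + 2 * vocab * hidden + hidden
print(total)  # 8_190_735_360 parameters, ~8.2B
```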
12
generation_config.json
Normal file
@@ -0,0 +1,12 @@
{
  "do_sample": true,
  "eos_token_id": [
    151645,
    151643
  ],
  "pad_token_id": 151643,
  "temperature": 0.6,
  "top_k": 20,
  "top_p": 0.95,
  "transformers_version": "4.57.6"
}
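The generation config above enables sampling with temperature 0.6, top-k 20, and top-p 0.95. A toy sketch of that filtering pipeline, assuming the usual order (temperature scaling, then top-k, then nucleus truncation over the survivors); the real implementation lives inside `model.generate` in transformers:

```python
import math

# Toy version of temperature + top-k + top-p filtering on a logit list.
def sample_filter(logits, temperature=0.6, top_k=20, top_p=0.95):
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    probs = [math.exp(l - m) for l in scaled]  # stable softmax
    z = sum(probs)
    probs = [p / z for p in probs]
    # Keep the top-k most likely ids.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:top_k]
    # Then keep the smallest prefix whose cumulative mass reaches top_p.
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break
    return kept

print(sample_filter([5.0, 4.0, 0.1, 0.05]))
```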
151388
merges.txt
Normal file
File diff suppressed because it is too large
3
model-00001-of-00004.safetensors
Normal file
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:280739439903aebb83455b22004eca22f343a315f094bfa85db159f8d54d5fa1
size 4902257696
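Each model shard is committed as a Git LFS pointer file like the one above: three key-value lines naming the spec version, a content hash, and the byte size of the real object. A small parser for that format:

```python
# Parse a Git LFS pointer file (the example text is the shard pointer above).
pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:280739439903aebb83455b22004eca22f343a315f094bfa85db159f8d54d5fa1
size 4902257696
"""

def parse_lfs_pointer(text: str) -> dict:
    fields = dict(line.split(" ", 1) for line in text.splitlines())
    algo, digest = fields["oid"].split(":", 1)
    return {"version": fields["version"], "algo": algo,
            "digest": digest, "size": int(fields["size"])}

info = parse_lfs_pointer(pointer)
print(info["algo"], info["size"])
```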
3
model-00002-of-00004.safetensors
Normal file
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:df4830ff023bb2a85c9d74c3bc80fa7accc413c46ea3529a142b6843eeb4dca4
size 4915960368
3
model-00003-of-00004.safetensors
Normal file
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f84c46c2abae04d9f74f72cd8af38dc79bd0c4b6328f407cacd6cc581d0bf742
size 4983068496
3
model-00004-of-00004.safetensors
Normal file
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:480843d36a26dad7e036063b4eb782079fd84ef1fe2e9710354e48fe05ee1af5
size 1580230264
407
model.safetensors.index.json
Normal file
@@ -0,0 +1,407 @@
{
  "metadata": {
    "total_parameters": 308224,
    "total_size": 16381470720
  },
  "weight_map": {
    "lm_head.weight": "model-00004-of-00004.safetensors",
    "model.embed_tokens.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.input_layernorm.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.input_layernorm.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.10.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.2.input_layernorm.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.20.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.22.input_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.22.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.22.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.22.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.22.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.22.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.22.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.22.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.22.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.22.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.22.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.23.input_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.input_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.input_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.26.input_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.26.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.26.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.26.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.26.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.26.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.26.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.26.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.26.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.26.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.26.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.27.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.27.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.27.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.27.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.27.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.27.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.27.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.27.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.27.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.27.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.27.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.28.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.28.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.28.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.28.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.28.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.28.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.28.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.28.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.28.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.28.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.28.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.29.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.29.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.29.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.29.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.29.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.29.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.29.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.29.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.29.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.29.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.29.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.3.input_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.3.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.3.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.3.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.3.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.3.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.3.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.3.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.3.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.3.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.3.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.30.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.30.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.30.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.30.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.30.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.30.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.30.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.30.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.30.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.30.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.30.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.31.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.31.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.31.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.31.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.31.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.31.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.31.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.31.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.31.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.31.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.31.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.32.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.32.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.32.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.32.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.32.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.32.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.32.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.32.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.32.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.32.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.32.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.33.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.33.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.33.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.33.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.33.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.33.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.33.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.33.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.33.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.33.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.33.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.34.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.34.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.34.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.34.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.34.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.34.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.34.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.34.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.34.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.34.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.34.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.35.input_layernorm.weight": "model-00004-of-00004.safetensors",
|
||||||
|
"model.layers.35.mlp.down_proj.weight": "model-00004-of-00004.safetensors",
|
||||||
|
"model.layers.35.mlp.gate_proj.weight": "model-00004-of-00004.safetensors",
|
||||||
|
"model.layers.35.mlp.up_proj.weight": "model-00004-of-00004.safetensors",
|
||||||
|
"model.layers.35.post_attention_layernorm.weight": "model-00004-of-00004.safetensors",
|
||||||
|
"model.layers.35.self_attn.k_norm.weight": "model-00004-of-00004.safetensors",
|
||||||
|
"model.layers.35.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.35.self_attn.o_proj.weight": "model-00004-of-00004.safetensors",
|
||||||
|
"model.layers.35.self_attn.q_norm.weight": "model-00004-of-00004.safetensors",
|
||||||
|
"model.layers.35.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.35.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||||
|
"model.layers.4.input_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.4.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.4.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.4.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.4.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.4.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.4.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.4.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.4.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.4.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.4.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.5.input_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.5.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.5.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.5.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.5.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.5.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.5.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.5.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.5.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.5.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.5.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.6.input_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.6.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.6.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.6.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.6.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.6.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.6.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.6.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.6.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.6.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.6.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.7.input_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.7.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.7.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.7.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.7.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.7.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.7.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.7.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.7.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.7.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.7.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.8.input_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.8.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.8.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.8.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.8.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.8.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.8.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.8.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.8.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.8.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.8.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.9.input_layernorm.weight": "model-00002-of-00004.safetensors",
|
||||||
|
"model.layers.9.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
|
||||||
|
"model.layers.9.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.9.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
|
||||||
|
"model.layers.9.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
|
||||||
|
"model.layers.9.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.9.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.9.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.9.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.9.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.layers.9.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||||
|
"model.norm.weight": "model-00004-of-00004.safetensors"
|
||||||
|
}
|
||||||
|
}
|
||||||
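The mapping above is the tail of the sharded-checkpoint index (the `weight_map` of a `model.safetensors.index.json`), which assigns each tensor name to the shard file that stores it. A minimal sketch of looking tensors up in such an index; the three entries are copied from the map above, and reading the actual shard contents would additionally require the `safetensors` library:

```python
# Fragment of the "weight_map" shown above: tensor name -> shard file.
index = {
    "weight_map": {
        "model.layers.22.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
        "model.layers.23.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
        "model.norm.weight": "model-00004-of-00004.safetensors",
    }
}

def shard_for(tensor_name: str) -> str:
    """Return the shard file that holds `tensor_name`."""
    return index["weight_map"][tensor_name]

def tensors_in_shard(shard: str) -> list[str]:
    """List every tensor the index assigns to `shard`."""
    return [name for name, s in index["weight_map"].items() if s == shard]

print(shard_for("model.norm.weight"))  # model-00004-of-00004.safetensors
```

A loader only needs to open the shards that actually contain the tensors it wants, which is the point of keeping this map separate from the weights.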
12
run_summary.json
Normal file
@@ -0,0 +1,12 @@
{
"agent_name": "626348c3167696dcbb1969897e0b1d79bde013d3_thinking_preprocessed",
"training_start": null,
"training_end": null,
"created_by": "raoof1",
"base_model_name": "Qwen/Qwen3-8B",
"dataset_name": "/e/scratch/jureap59/raoof1/sft_data/hf_hub/datasets--DCAgent--wizardlm-orca-sandboxes_glm_4.7_traces_jupiter/snapshots/626348c3167696dcbb1969897e0b1d79bde013d3_thinking_preprocessed",
"training_type": "SFT",
"training_parameters": "https://huggingface.co/DCAgent/a1-wizardlm_orca/blob/main/config.json",
"wandb_link": null,
"traces_location_s3": null
}
31
special_tokens_map.json
Normal file
@@ -0,0 +1,31 @@
{
"additional_special_tokens": [
"<|im_start|>",
"<|im_end|>",
"<|object_ref_start|>",
"<|object_ref_end|>",
"<|box_start|>",
"<|box_end|>",
"<|quad_start|>",
"<|quad_end|>",
"<|vision_start|>",
"<|vision_end|>",
"<|vision_pad|>",
"<|image_pad|>",
"<|video_pad|>"
],
"eos_token": {
"content": "<|im_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false
},
"pad_token": {
"content": "<|endoftext|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false
}
}
3
tokenizer.json
Normal file
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:aeb13307a71acd8fe81861d94ad54ab689df773318809eed3cbe794b4492dae4
size 11422654
240
tokenizer_config.json
Normal file
@@ -0,0 +1,240 @@
{
"add_bos_token": false,
"add_prefix_space": false,
"added_tokens_decoder": {
"151643": {
|
||||||
|
"content": "<|endoftext|>",
|
||||||
|
"lstrip": false,
|
||||||
|
"normalized": false,
|
||||||
|
"rstrip": false,
|
||||||
|
"single_word": false,
|
||||||
|
"special": true
|
||||||
|
},
|
||||||
|
"151644": {
|
||||||
|
"content": "<|im_start|>",
|
||||||
|
"lstrip": false,
|
||||||
|
"normalized": false,
|
||||||
|
"rstrip": false,
|
||||||
|
"single_word": false,
|
||||||
|
"special": true
|
||||||
|
},
|
||||||
|
"151645": {
|
||||||
|
"content": "<|im_end|>",
|
||||||
|
"lstrip": false,
|
||||||
|
"normalized": false,
|
||||||
|
"rstrip": false,
|
||||||
|
"single_word": false,
|
||||||
|
"special": true
|
||||||
|
},
|
||||||
|
"151646": {
|
||||||
|
"content": "<|object_ref_start|>",
|
||||||
|
"lstrip": false,
|
||||||
|
"normalized": false,
|
||||||
|
"rstrip": false,
|
||||||
|
"single_word": false,
|
||||||
|
"special": true
|
||||||
|
},
|
||||||
|
"151647": {
|
||||||
|
"content": "<|object_ref_end|>",
|
||||||
|
"lstrip": false,
|
||||||
|
"normalized": false,
|
||||||
|
"rstrip": false,
|
||||||
|
"single_word": false,
|
||||||
|
"special": true
|
||||||
|
},
|
||||||
|
"151648": {
|
||||||
|
"content": "<|box_start|>",
|
||||||
|
"lstrip": false,
|
||||||
|
"normalized": false,
|
||||||
|
"rstrip": false,
|
||||||
|
"single_word": false,
|
||||||
|
"special": true
|
||||||
|
},
|
||||||
|
"151649": {
|
||||||
|
"content": "<|box_end|>",
|
||||||
|
"lstrip": false,
|
||||||
|
"normalized": false,
|
||||||
|
"rstrip": false,
|
||||||
|
"single_word": false,
|
||||||
|
"special": true
|
||||||
|
},
|
||||||
|
"151650": {
|
||||||
|
"content": "<|quad_start|>",
|
||||||
|
"lstrip": false,
|
||||||
|
"normalized": false,
|
||||||
|
"rstrip": false,
|
||||||
|
"single_word": false,
|
||||||
|
"special": true
|
||||||
|
},
|
||||||
|
"151651": {
|
||||||
|
"content": "<|quad_end|>",
|
||||||
|
"lstrip": false,
|
||||||
|
"normalized": false,
|
||||||
|
"rstrip": false,
|
||||||
|
"single_word": false,
|
||||||
|
"special": true
|
||||||
|
},
|
||||||
|
"151652": {
|
||||||
|
"content": "<|vision_start|>",
|
||||||
|
"lstrip": false,
|
||||||
|
"normalized": false,
|
||||||
|
"rstrip": false,
|
||||||
|
"single_word": false,
|
||||||
|
"special": true
|
||||||
|
},
|
||||||
|
"151653": {
|
||||||
|
"content": "<|vision_end|>",
|
||||||
|
"lstrip": false,
|
||||||
|
"normalized": false,
|
||||||
|
"rstrip": false,
|
||||||
|
"single_word": false,
|
||||||
|
"special": true
|
||||||
|
},
|
||||||
|
"151654": {
|
||||||
|
"content": "<|vision_pad|>",
|
||||||
|
"lstrip": false,
|
||||||
|
"normalized": false,
|
||||||
|
"rstrip": false,
|
||||||
|
"single_word": false,
|
||||||
|
"special": true
|
||||||
|
},
|
||||||
|
"151655": {
|
||||||
|
"content": "<|image_pad|>",
|
||||||
|
"lstrip": false,
|
||||||
|
"normalized": false,
|
||||||
|
"rstrip": false,
|
||||||
|
"single_word": false,
|
||||||
|
"special": true
|
||||||
|
},
|
||||||
|
"151656": {
|
||||||
|
"content": "<|video_pad|>",
|
||||||
|
"lstrip": false,
|
||||||
|
"normalized": false,
|
||||||
|
"rstrip": false,
|
||||||
|
"single_word": false,
|
||||||
|
"special": true
|
||||||
|
},
|
||||||
|
"151657": {
|
||||||
|
"content": "<tool_call>",
|
||||||
|
"lstrip": false,
|
||||||
|
"normalized": false,
|
||||||
|
"rstrip": false,
|
||||||
|
"single_word": false,
|
||||||
|
"special": false
|
||||||
|
},
|
||||||
|
"151658": {
|
||||||
|
"content": "</tool_call>",
|
||||||
|
"lstrip": false,
|
||||||
|
"normalized": false,
|
||||||
|
"rstrip": false,
|
||||||
|
"single_word": false,
|
||||||
|
"special": false
|
||||||
|
},
|
||||||
|
"151659": {
|
||||||
|
"content": "<|fim_prefix|>",
|
||||||
|
"lstrip": false,
|
||||||
|
"normalized": false,
|
||||||
|
"rstrip": false,
|
||||||
|
"single_word": false,
|
||||||
|
"special": false
|
||||||
|
},
|
||||||
|
"151660": {
|
||||||
|
"content": "<|fim_middle|>",
|
||||||
|
"lstrip": false,
|
||||||
|
"normalized": false,
|
||||||
|
"rstrip": false,
|
||||||
|
"single_word": false,
|
||||||
|
"special": false
|
||||||
|
},
|
||||||
|
"151661": {
|
||||||
|
"content": "<|fim_suffix|>",
|
||||||
|
"lstrip": false,
|
||||||
|
"normalized": false,
|
||||||
|
"rstrip": false,
|
||||||
|
"single_word": false,
|
||||||
|
"special": false
|
||||||
|
},
|
||||||
|
"151662": {
|
||||||
|
"content": "<|fim_pad|>",
|
||||||
|
"lstrip": false,
|
||||||
|
"normalized": false,
|
||||||
|
"rstrip": false,
|
||||||
|
"single_word": false,
|
||||||
|
"special": false
|
||||||
|
},
|
||||||
|
"151663": {
|
||||||
|
"content": "<|repo_name|>",
|
||||||
|
"lstrip": false,
|
||||||
|
"normalized": false,
|
||||||
|
"rstrip": false,
|
||||||
|
"single_word": false,
|
||||||
|
"special": false
|
||||||
|
},
|
||||||
|
"151664": {
|
||||||
|
"content": "<|file_sep|>",
|
||||||
|
"lstrip": false,
|
||||||
|
"normalized": false,
|
||||||
|
"rstrip": false,
|
||||||
|
"single_word": false,
|
||||||
|
"special": false
|
||||||
|
},
|
||||||
|
"151665": {
|
||||||
|
"content": "<tool_response>",
|
||||||
|
"lstrip": false,
|
||||||
|
"normalized": false,
|
||||||
|
"rstrip": false,
|
||||||
|
"single_word": false,
|
||||||
|
"special": false
|
||||||
|
},
|
||||||
|
"151666": {
|
||||||
|
"content": "</tool_response>",
|
||||||
|
"lstrip": false,
|
||||||
|
"normalized": false,
|
||||||
|
"rstrip": false,
|
||||||
|
"single_word": false,
|
||||||
|
"special": false
|
||||||
|
},
|
||||||
|
"151667": {
|
||||||
|
"content": "<think>",
|
||||||
|
"lstrip": false,
|
||||||
|
"normalized": false,
|
||||||
|
"rstrip": false,
|
||||||
|
"single_word": false,
|
||||||
|
"special": false
|
||||||
|
},
|
||||||
|
"151668": {
|
||||||
|
"content": "</think>",
|
||||||
|
"lstrip": false,
|
||||||
|
"normalized": false,
|
||||||
|
"rstrip": false,
|
||||||
|
"single_word": false,
|
||||||
|
"special": false
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"additional_special_tokens": [
|
||||||
|
"<|im_start|>",
|
||||||
|
"<|im_end|>",
|
||||||
|
"<|object_ref_start|>",
|
||||||
|
"<|object_ref_end|>",
|
||||||
|
"<|box_start|>",
|
||||||
|
"<|box_end|>",
|
||||||
|
"<|quad_start|>",
|
||||||
|
"<|quad_end|>",
|
||||||
|
"<|vision_start|>",
|
||||||
|
"<|vision_end|>",
|
||||||
|
"<|vision_pad|>",
|
||||||
|
"<|image_pad|>",
|
||||||
|
"<|video_pad|>"
|
||||||
|
],
|
||||||
|
"bos_token": null,
|
||||||
|
"clean_up_tokenization_spaces": false,
|
||||||
|
"eos_token": "<|im_end|>",
|
||||||
|
"errors": "replace",
|
||||||
|
"extra_special_tokens": {},
|
||||||
|
"model_max_length": 32768,
|
||||||
|
"pad_token": "<|endoftext|>",
|
||||||
|
"padding_side": "right",
|
||||||
|
"split_special_tokens": false,
|
||||||
|
"tokenizer_class": "Qwen2Tokenizer",
|
||||||
|
"unk_token": null
|
||||||
|
}
|
||||||
16
train_results.json
Normal file
@@ -0,0 +1,16 @@
{
"achieved_tflops_per_gpu": 0.0028883878665412407,
"achieved_tflops_per_gpu_theoretical": 985.8802277815419,
"epoch": 7.0,
"loss_nan_ranks": 0,
"loss_rank_avg": 0.25369924306869507,
"mfu_percent": 0.00020412635099231386,
"mfu_percent_theoretical": 69.67351433085102,
"total_flos": 644683152949248.0,
"train_loss": 0.3271850697072269,
"train_runtime": 13949.8914,
"train_samples_per_second": 4.655,
"train_steps_per_second": 0.291,
"valid_targets_mean": 2212.8,
"valid_targets_min": 431
}
813
trainer_log.jsonl
Normal file
@@ -0,0 +1,813 @@
{"current_steps": 5, "total_steps": 4060, "loss": 0.8415, "lr": 3.9408866995073894e-07, "epoch": 0.008620689655172414, "percentage": 0.12, "elapsed_time": "0:00:26", "remaining_time": "5:58:10"}
{"current_steps": 10, "total_steps": 4060, "loss": 0.8298, "lr": 8.866995073891626e-07, "epoch": 0.017241379310344827, "percentage": 0.25, "elapsed_time": "0:00:47", "remaining_time": "5:20:10"}
{"current_steps": 15, "total_steps": 4060, "loss": 0.8474, "lr": 1.3793103448275862e-06, "epoch": 0.02586206896551724, "percentage": 0.37, "elapsed_time": "0:01:05", "remaining_time": "4:53:55"}
{"current_steps": 20, "total_steps": 4060, "loss": 0.8062, "lr": 1.8719211822660098e-06, "epoch": 0.034482758620689655, "percentage": 0.49, "elapsed_time": "0:01:22", "remaining_time": "4:38:03"}
{"current_steps": 25, "total_steps": 4060, "loss": 0.7396, "lr": 2.3645320197044334e-06, "epoch": 0.04310344827586207, "percentage": 0.62, "elapsed_time": "0:01:39", "remaining_time": "4:27:55"}
{"current_steps": 30, "total_steps": 4060, "loss": 0.7238, "lr": 2.8571428571428573e-06, "epoch": 0.05172413793103448, "percentage": 0.74, "elapsed_time": "0:01:53", "remaining_time": "4:13:51"}
{"current_steps": 35, "total_steps": 4060, "loss": 0.7084, "lr": 3.349753694581281e-06, "epoch": 0.0603448275862069, "percentage": 0.86, "elapsed_time": "0:02:10", "remaining_time": "4:09:56"}
{"current_steps": 40, "total_steps": 4060, "loss": 0.6874, "lr": 3.842364532019705e-06, "epoch": 0.06896551724137931, "percentage": 0.99, "elapsed_time": "0:02:29", "remaining_time": "4:10:01"}
{"current_steps": 45, "total_steps": 4060, "loss": 0.6636, "lr": 4.334975369458129e-06, "epoch": 0.07758620689655173, "percentage": 1.11, "elapsed_time": "0:02:47", "remaining_time": "4:09:31"}
{"current_steps": 50, "total_steps": 4060, "loss": 0.6038, "lr": 4.8275862068965525e-06, "epoch": 0.08620689655172414, "percentage": 1.23, "elapsed_time": "0:03:08", "remaining_time": "4:11:34"}
{"current_steps": 55, "total_steps": 4060, "loss": 0.6209, "lr": 5.320197044334976e-06, "epoch": 0.09482758620689655, "percentage": 1.35, "elapsed_time": "0:03:26", "remaining_time": "4:10:34"}
{"current_steps": 60, "total_steps": 4060, "loss": 0.5396, "lr": 5.812807881773399e-06, "epoch": 0.10344827586206896, "percentage": 1.48, "elapsed_time": "0:03:49", "remaining_time": "4:14:40"}
{"current_steps": 65, "total_steps": 4060, "loss": 0.5647, "lr": 6.305418719211823e-06, "epoch": 0.11206896551724138, "percentage": 1.6, "elapsed_time": "0:04:10", "remaining_time": "4:16:33"}
{"current_steps": 70, "total_steps": 4060, "loss": 0.5177, "lr": 6.798029556650246e-06, "epoch": 0.1206896551724138, "percentage": 1.72, "elapsed_time": "0:04:34", "remaining_time": "4:20:46"}
{"current_steps": 75, "total_steps": 4060, "loss": 0.5517, "lr": 7.290640394088671e-06, "epoch": 0.12931034482758622, "percentage": 1.85, "elapsed_time": "0:04:51", "remaining_time": "4:18:18"}
{"current_steps": 80, "total_steps": 4060, "loss": 0.6183, "lr": 7.783251231527095e-06, "epoch": 0.13793103448275862, "percentage": 1.97, "elapsed_time": "0:05:05", "remaining_time": "4:12:57"}
{"current_steps": 85, "total_steps": 4060, "loss": 0.4737, "lr": 8.275862068965518e-06, "epoch": 0.14655172413793102, "percentage": 2.09, "elapsed_time": "0:05:23", "remaining_time": "4:12:15"}
{"current_steps": 90, "total_steps": 4060, "loss": 0.5053, "lr": 8.768472906403942e-06, "epoch": 0.15517241379310345, "percentage": 2.22, "elapsed_time": "0:05:37", "remaining_time": "4:07:53"}
{"current_steps": 95, "total_steps": 4060, "loss": 0.5679, "lr": 9.261083743842364e-06, "epoch": 0.16379310344827586, "percentage": 2.34, "elapsed_time": "0:05:58", "remaining_time": "4:09:14"}
{"current_steps": 100, "total_steps": 4060, "loss": 0.5457, "lr": 9.75369458128079e-06, "epoch": 0.1724137931034483, "percentage": 2.46, "elapsed_time": "0:06:12", "remaining_time": "4:06:06"}
{"current_steps": 105, "total_steps": 4060, "loss": 0.5007, "lr": 1.0246305418719214e-05, "epoch": 0.1810344827586207, "percentage": 2.59, "elapsed_time": "0:06:26", "remaining_time": "4:02:51"}
{"current_steps": 110, "total_steps": 4060, "loss": 0.4347, "lr": 1.0738916256157637e-05, "epoch": 0.1896551724137931, "percentage": 2.71, "elapsed_time": "0:06:50", "remaining_time": "4:05:29"}
{"current_steps": 115, "total_steps": 4060, "loss": 0.502, "lr": 1.123152709359606e-05, "epoch": 0.19827586206896552, "percentage": 2.83, "elapsed_time": "0:07:04", "remaining_time": "4:02:58"}
{"current_steps": 120, "total_steps": 4060, "loss": 0.552, "lr": 1.1724137931034483e-05, "epoch": 0.20689655172413793, "percentage": 2.96, "elapsed_time": "0:07:21", "remaining_time": "4:01:23"}
{"current_steps": 125, "total_steps": 4060, "loss": 0.4823, "lr": 1.2216748768472909e-05, "epoch": 0.21551724137931033, "percentage": 3.08, "elapsed_time": "0:07:34", "remaining_time": "3:58:38"}
{"current_steps": 130, "total_steps": 4060, "loss": 0.443, "lr": 1.2709359605911331e-05, "epoch": 0.22413793103448276, "percentage": 3.2, "elapsed_time": "0:07:51", "remaining_time": "3:57:48"}
{"current_steps": 135, "total_steps": 4060, "loss": 0.5249, "lr": 1.3201970443349755e-05, "epoch": 0.23275862068965517, "percentage": 3.33, "elapsed_time": "0:08:08", "remaining_time": "3:56:55"}
{"current_steps": 140, "total_steps": 4060, "loss": 0.5116, "lr": 1.369458128078818e-05, "epoch": 0.2413793103448276, "percentage": 3.45, "elapsed_time": "0:08:27", "remaining_time": "3:56:51"}
{"current_steps": 145, "total_steps": 4060, "loss": 0.5083, "lr": 1.4187192118226602e-05, "epoch": 0.25, "percentage": 3.57, "elapsed_time": "0:08:47", "remaining_time": "3:57:32"}
{"current_steps": 150, "total_steps": 4060, "loss": 0.4255, "lr": 1.4679802955665026e-05, "epoch": 0.25862068965517243, "percentage": 3.69, "elapsed_time": "0:09:06", "remaining_time": "3:57:21"}
{"current_steps": 155, "total_steps": 4060, "loss": 0.4816, "lr": 1.5172413793103448e-05, "epoch": 0.2672413793103448, "percentage": 3.82, "elapsed_time": "0:09:25", "remaining_time": "3:57:15"}
{"current_steps": 160, "total_steps": 4060, "loss": 0.5105, "lr": 1.5665024630541875e-05, "epoch": 0.27586206896551724, "percentage": 3.94, "elapsed_time": "0:09:45", "remaining_time": "3:57:43"}
{"current_steps": 165, "total_steps": 4060, "loss": 0.4612, "lr": 1.6157635467980298e-05, "epoch": 0.28448275862068967, "percentage": 4.06, "elapsed_time": "0:10:06", "remaining_time": "3:58:35"}
{"current_steps": 170, "total_steps": 4060, "loss": 0.475, "lr": 1.665024630541872e-05, "epoch": 0.29310344827586204, "percentage": 4.19, "elapsed_time": "0:10:24", "remaining_time": "3:58:11"}
{"current_steps": 175, "total_steps": 4060, "loss": 0.4598, "lr": 1.7142857142857142e-05, "epoch": 0.3017241379310345, "percentage": 4.31, "elapsed_time": "0:10:42", "remaining_time": "3:57:49"}
{"current_steps": 180, "total_steps": 4060, "loss": 0.4672, "lr": 1.7635467980295567e-05, "epoch": 0.3103448275862069, "percentage": 4.43, "elapsed_time": "0:10:56", "remaining_time": "3:55:49"}
{"current_steps": 185, "total_steps": 4060, "loss": 0.4961, "lr": 1.8128078817733993e-05, "epoch": 0.31896551724137934, "percentage": 4.56, "elapsed_time": "0:11:19", "remaining_time": "3:57:04"}
{"current_steps": 190, "total_steps": 4060, "loss": 0.4817, "lr": 1.8620689655172415e-05, "epoch": 0.3275862068965517, "percentage": 4.68, "elapsed_time": "0:11:39", "remaining_time": "3:57:32"}
{"current_steps": 195, "total_steps": 4060, "loss": 0.468, "lr": 1.911330049261084e-05, "epoch": 0.33620689655172414, "percentage": 4.8, "elapsed_time": "0:11:59", "remaining_time": "3:57:48"}
{"current_steps": 200, "total_steps": 4060, "loss": 0.4526, "lr": 1.9605911330049263e-05, "epoch": 0.3448275862068966, "percentage": 4.93, "elapsed_time": "0:12:27", "remaining_time": "4:00:24"}
{"current_steps": 205, "total_steps": 4060, "loss": 0.4307, "lr": 2.0098522167487688e-05, "epoch": 0.35344827586206895, "percentage": 5.05, "elapsed_time": "0:12:51", "remaining_time": "4:01:42"}
{"current_steps": 210, "total_steps": 4060, "loss": 0.5102, "lr": 2.059113300492611e-05, "epoch": 0.3620689655172414, "percentage": 5.17, "elapsed_time": "0:13:05", "remaining_time": "4:00:01"}
{"current_steps": 215, "total_steps": 4060, "loss": 0.4855, "lr": 2.1083743842364536e-05, "epoch": 0.3706896551724138, "percentage": 5.3, "elapsed_time": "0:13:19", "remaining_time": "3:58:20"}
{"current_steps": 220, "total_steps": 4060, "loss": 0.4621, "lr": 2.1576354679802954e-05, "epoch": 0.3793103448275862, "percentage": 5.42, "elapsed_time": "0:13:30", "remaining_time": "3:55:54"}
{"current_steps": 225, "total_steps": 4060, "loss": 0.4648, "lr": 2.206896551724138e-05, "epoch": 0.3879310344827586, "percentage": 5.54, "elapsed_time": "0:13:45", "remaining_time": "3:54:23"}
{"current_steps": 230, "total_steps": 4060, "loss": 0.4487, "lr": 2.2561576354679805e-05, "epoch": 0.39655172413793105, "percentage": 5.67, "elapsed_time": "0:14:04", "remaining_time": "3:54:17"}
{"current_steps": 235, "total_steps": 4060, "loss": 0.4621, "lr": 2.3054187192118228e-05, "epoch": 0.4051724137931034, "percentage": 5.79, "elapsed_time": "0:14:20", "remaining_time": "3:53:19"}
{"current_steps": 240, "total_steps": 4060, "loss": 0.5309, "lr": 2.3546798029556653e-05, "epoch": 0.41379310344827586, "percentage": 5.91, "elapsed_time": "0:14:42", "remaining_time": "3:54:14"}
{"current_steps": 245, "total_steps": 4060, "loss": 0.4855, "lr": 2.403940886699508e-05, "epoch": 0.4224137931034483, "percentage": 6.03, "elapsed_time": "0:15:07", "remaining_time": "3:55:31"}
{"current_steps": 250, "total_steps": 4060, "loss": 0.4903, "lr": 2.4532019704433497e-05, "epoch": 0.43103448275862066, "percentage": 6.16, "elapsed_time": "0:15:19", "remaining_time": "3:53:37"}
{"current_steps": 255, "total_steps": 4060, "loss": 0.4304, "lr": 2.5024630541871923e-05, "epoch": 0.4396551724137931, "percentage": 6.28, "elapsed_time": "0:15:33", "remaining_time": "3:52:14"}
{"current_steps": 260, "total_steps": 4060, "loss": 0.4459, "lr": 2.551724137931035e-05, "epoch": 0.4482758620689655, "percentage": 6.4, "elapsed_time": "0:15:48", "remaining_time": "3:51:01"}
{"current_steps": 265, "total_steps": 4060, "loss": 0.4728, "lr": 2.600985221674877e-05, "epoch": 0.45689655172413796, "percentage": 6.53, "elapsed_time": "0:16:04", "remaining_time": "3:50:15"}
{"current_steps": 270, "total_steps": 4060, "loss": 0.4629, "lr": 2.6502463054187196e-05, "epoch": 0.46551724137931033, "percentage": 6.65, "elapsed_time": "0:16:20", "remaining_time": "3:49:24"}
{"current_steps": 275, "total_steps": 4060, "loss": 0.5005, "lr": 2.6995073891625615e-05, "epoch": 0.47413793103448276, "percentage": 6.77, "elapsed_time": "0:16:41", "remaining_time": "3:49:49"}
{"current_steps": 280, "total_steps": 4060, "loss": 0.4559, "lr": 2.748768472906404e-05, "epoch": 0.4827586206896552, "percentage": 6.9, "elapsed_time": "0:17:01", "remaining_time": "3:49:49"}
{"current_steps": 285, "total_steps": 4060, "loss": 0.494, "lr": 2.7980295566502466e-05, "epoch": 0.49137931034482757, "percentage": 7.02, "elapsed_time": "0:17:14", "remaining_time": "3:48:23"}
{"current_steps": 290, "total_steps": 4060, "loss": 0.4814, "lr": 2.8472906403940888e-05, "epoch": 0.5, "percentage": 7.14, "elapsed_time": "0:17:31", "remaining_time": "3:47:50"}
{"current_steps": 295, "total_steps": 4060, "loss": 0.4614, "lr": 2.8965517241379313e-05, "epoch": 0.5086206896551724, "percentage": 7.27, "elapsed_time": "0:17:52", "remaining_time": "3:48:07"}
{"current_steps": 300, "total_steps": 4060, "loss": 0.4388, "lr": 2.945812807881774e-05, "epoch": 0.5172413793103449, "percentage": 7.39, "elapsed_time": "0:18:12", "remaining_time": "3:48:09"}
{"current_steps": 305, "total_steps": 4060, "loss": 0.5058, "lr": 2.9950738916256158e-05, "epoch": 0.5258620689655172, "percentage": 7.51, "elapsed_time": "0:18:27", "remaining_time": "3:47:16"}
{"current_steps": 310, "total_steps": 4060, "loss": 0.4494, "lr": 3.0443349753694583e-05, "epoch": 0.5344827586206896, "percentage": 7.64, "elapsed_time": "0:18:44", "remaining_time": "3:46:38"}
{"current_steps": 315, "total_steps": 4060, "loss": 0.4426, "lr": 3.093596059113301e-05, "epoch": 0.5431034482758621, "percentage": 7.76, "elapsed_time": "0:19:05", "remaining_time": "3:46:54"}
{"current_steps": 320, "total_steps": 4060, "loss": 0.478, "lr": 3.142857142857143e-05, "epoch": 0.5517241379310345, "percentage": 7.88, "elapsed_time": "0:19:22", "remaining_time": "3:46:32"}
{"current_steps": 325, "total_steps": 4060, "loss": 0.4686, "lr": 3.1921182266009856e-05, "epoch": 0.5603448275862069, "percentage": 8.0, "elapsed_time": "0:19:46", "remaining_time": "3:47:15"}
{"current_steps": 330, "total_steps": 4060, "loss": 0.4703, "lr": 3.2413793103448275e-05, "epoch": 0.5689655172413793, "percentage": 8.13, "elapsed_time": "0:20:01", "remaining_time": "3:46:23"}
{"current_steps": 335, "total_steps": 4060, "loss": 0.4245, "lr": 3.29064039408867e-05, "epoch": 0.5775862068965517, "percentage": 8.25, "elapsed_time": "0:20:19", "remaining_time": "3:45:55"}
{"current_steps": 340, "total_steps": 4060, "loss": 0.463, "lr": 3.3399014778325126e-05, "epoch": 0.5862068965517241, "percentage": 8.37, "elapsed_time": "0:20:36", "remaining_time": "3:45:28"}
{"current_steps": 345, "total_steps": 4060, "loss": 0.4665, "lr": 3.389162561576355e-05, "epoch": 0.5948275862068966, "percentage": 8.5, "elapsed_time": "0:20:55", "remaining_time": "3:45:23"}
{"current_steps": 350, "total_steps": 4060, "loss": 0.4797, "lr": 3.438423645320197e-05, "epoch": 0.603448275862069, "percentage": 8.62, "elapsed_time": "0:21:15", "remaining_time": "3:45:22"}
{"current_steps": 355, "total_steps": 4060, "loss": 0.4418, "lr": 3.4876847290640396e-05, "epoch": 0.6120689655172413, "percentage": 8.74, "elapsed_time": "0:21:29", "remaining_time": "3:44:16"}
{"current_steps": 360, "total_steps": 4060, "loss": 0.4798, "lr": 3.536945812807882e-05, "epoch": 0.6206896551724138, "percentage": 8.87, "elapsed_time": "0:21:47", "remaining_time": "3:43:54"}
{"current_steps": 365, "total_steps": 4060, "loss": 0.4045, "lr": 3.586206896551725e-05, "epoch": 0.6293103448275862, "percentage": 8.99, "elapsed_time": "0:22:03", "remaining_time": "3:43:18"}
{"current_steps": 370, "total_steps": 4060, "loss": 0.4704, "lr": 3.6354679802955665e-05, "epoch": 0.6379310344827587, "percentage": 9.11, "elapsed_time": "0:22:23", "remaining_time": "3:43:23"}
{"current_steps": 375, "total_steps": 4060, "loss": 0.43, "lr": 3.684729064039409e-05, "epoch": 0.646551724137931, "percentage": 9.24, "elapsed_time": "0:22:40", "remaining_time": "3:42:51"}
{"current_steps": 380, "total_steps": 4060, "loss": 0.4636, "lr": 3.7339901477832516e-05, "epoch": 0.6551724137931034, "percentage": 9.36, "elapsed_time": "0:22:55", "remaining_time": "3:42:03"}
{"current_steps": 385, "total_steps": 4060, "loss": 0.4305, "lr": 3.7832512315270935e-05, "epoch": 0.6637931034482759, "percentage": 9.48, "elapsed_time": "0:23:16", "remaining_time": "3:42:11"}
{"current_steps": 390, "total_steps": 4060, "loss": 0.4543, "lr": 3.832512315270936e-05, "epoch": 0.6724137931034483, "percentage": 9.61, "elapsed_time": "0:23:37", "remaining_time": "3:42:21"}
{"current_steps": 395, "total_steps": 4060, "loss": 0.4919, "lr": 3.8817733990147786e-05, "epoch": 0.6810344827586207, "percentage": 9.73, "elapsed_time": "0:23:51", "remaining_time": "3:41:23"}
{"current_steps": 400, "total_steps": 4060, "loss": 0.4897, "lr": 3.931034482758621e-05, "epoch": 0.6896551724137931, "percentage": 9.85, "elapsed_time": "0:24:05", "remaining_time": "3:40:30"}
{"current_steps": 405, "total_steps": 4060, "loss": 0.4299, "lr": 3.980295566502464e-05, "epoch": 0.6982758620689655, "percentage": 9.98, "elapsed_time": "0:24:25", "remaining_time": "3:40:26"}
{"current_steps": 410, "total_steps": 4060, "loss": 0.4158, "lr": 3.999993347192948e-05, "epoch": 0.7068965517241379, "percentage": 10.1, "elapsed_time": "0:24:42", "remaining_time": "3:39:55"}
{"current_steps": 415, "total_steps": 4060, "loss": 0.4429, "lr": 3.9999526913101334e-05, "epoch": 0.7155172413793104, "percentage": 10.22, "elapsed_time": "0:25:00", "remaining_time": "3:39:36"}
{"current_steps": 420, "total_steps": 4060, "loss": 0.4298, "lr": 3.999875076298832e-05, "epoch": 0.7241379310344828, "percentage": 10.34, "elapsed_time": "0:25:25", "remaining_time": "3:40:24"}
{"current_steps": 425, "total_steps": 4060, "loss": 0.4076, "lr": 3.9997605035933704e-05, "epoch": 0.7327586206896551, "percentage": 10.47, "elapsed_time": "0:25:48", "remaining_time": "3:40:48"}
{"current_steps": 430, "total_steps": 4060, "loss": 0.4023, "lr": 3.99960897531105e-05, "epoch": 0.7413793103448276, "percentage": 10.59, "elapsed_time": "0:26:11", "remaining_time": "3:41:05"}
{"current_steps": 435, "total_steps": 4060, "loss": 0.4612, "lr": 3.999420494252116e-05, "epoch": 0.75, "percentage": 10.71, "elapsed_time": "0:26:26", "remaining_time": "3:40:22"}
{"current_steps": 440, "total_steps": 4060, "loss": 0.4775, "lr": 3.9991950638996976e-05, "epoch": 0.7586206896551724, "percentage": 10.84, "elapsed_time": "0:26:40", "remaining_time": "3:39:27"}
{"current_steps": 445, "total_steps": 4060, "loss": 0.4424, "lr": 3.998932688419748e-05, "epoch": 0.7672413793103449, "percentage": 10.96, "elapsed_time": "0:26:57", "remaining_time": "3:38:58"}
{"current_steps": 450, "total_steps": 4060, "loss": 0.4409, "lr": 3.9986333726609674e-05, "epoch": 0.7758620689655172, "percentage": 11.08, "elapsed_time": "0:27:12", "remaining_time": "3:38:16"}
{"current_steps": 455, "total_steps": 4060, "loss": 0.4696, "lr": 3.99829712215471e-05, "epoch": 0.7844827586206896, "percentage": 11.21, "elapsed_time": "0:27:32", "remaining_time": "3:38:08"}
{"current_steps": 460, "total_steps": 4060, "loss": 0.4595, "lr": 3.997923943114886e-05, "epoch": 0.7931034482758621, "percentage": 11.33, "elapsed_time": "0:27:49", "remaining_time": "3:37:44"}
{"current_steps": 465, "total_steps": 4060, "loss": 0.4311, "lr": 3.997513842437845e-05, "epoch": 0.8017241379310345, "percentage": 11.45, "elapsed_time": "0:28:03", "remaining_time": "3:36:54"}
{"current_steps": 470, "total_steps": 4060, "loss": 0.4858, "lr": 3.997066827702248e-05, "epoch": 0.8103448275862069, "percentage": 11.58, "elapsed_time": "0:28:17", "remaining_time": "3:36:09"}
{"current_steps": 475, "total_steps": 4060, "loss": 0.5172, "lr": 3.996582907168928e-05, "epoch": 0.8189655172413793, "percentage": 11.7, "elapsed_time": "0:28:46", "remaining_time": "3:37:09"}
{"current_steps": 480, "total_steps": 4060, "loss": 0.4354, "lr": 3.996062089780737e-05, "epoch": 0.8275862068965517, "percentage": 11.82, "elapsed_time": "0:29:04", "remaining_time": "3:36:48"}
{"current_steps": 485, "total_steps": 4060, "loss": 0.5466, "lr": 3.99550438516238e-05, "epoch": 0.8362068965517241, "percentage": 11.95, "elapsed_time": "0:29:26", "remaining_time": "3:36:59"}
{"current_steps": 490, "total_steps": 4060, "loss": 0.456, "lr": 3.994909803620241e-05, "epoch": 0.8448275862068966, "percentage": 12.07, "elapsed_time": "0:29:43", "remaining_time": "3:36:30"}
{"current_steps": 495, "total_steps": 4060, "loss": 0.4461, "lr": 3.994278356142187e-05, "epoch": 0.853448275862069, "percentage": 12.19, "elapsed_time": "0:29:56", "remaining_time": "3:35:34"}
{"current_steps": 500, "total_steps": 4060, "loss": 0.4587, "lr": 3.993610054397368e-05, "epoch": 0.8620689655172413, "percentage": 12.32, "elapsed_time": "0:30:17", "remaining_time": "3:35:43"}
{"current_steps": 505, "total_steps": 4060, "loss": 0.4604, "lr": 3.992904910736001e-05, "epoch": 0.8706896551724138, "percentage": 12.44, "elapsed_time": "0:30:32", "remaining_time": "3:35:01"}
{"current_steps": 510, "total_steps": 4060, "loss": 0.39, "lr": 3.9921629381891425e-05, "epoch": 0.8793103448275862, "percentage": 12.56, "elapsed_time": "0:30:52", "remaining_time": "3:34:53"}
{"current_steps": 515, "total_steps": 4060, "loss": 0.4546, "lr": 3.991384150468445e-05, "epoch": 0.8879310344827587, "percentage": 12.68, "elapsed_time": "0:31:04", "remaining_time": "3:33:55"}
{"current_steps": 520, "total_steps": 4060, "loss": 0.444, "lr": 3.9905685619659074e-05, "epoch": 0.896551724137931, "percentage": 12.81, "elapsed_time": "0:31:23", "remaining_time": "3:33:43"}
{"current_steps": 525, "total_steps": 4060, "loss": 0.4599, "lr": 3.9897161877536076e-05, "epoch": 0.9051724137931034, "percentage": 12.93, "elapsed_time": "0:31:37", "remaining_time": "3:32:59"}
{"current_steps": 530, "total_steps": 4060, "loss": 0.4432, "lr": 3.9888270435834196e-05, "epoch": 0.9137931034482759, "percentage": 13.05, "elapsed_time": "0:31:51", "remaining_time": "3:32:13"}
{"current_steps": 535, "total_steps": 4060, "loss": 0.4055, "lr": 3.987901145886731e-05, "epoch": 0.9224137931034483, "percentage": 13.18, "elapsed_time": "0:32:14", "remaining_time": "3:32:23"}
{"current_steps": 540, "total_steps": 4060, "loss": 0.4626, "lr": 3.9869385117741314e-05, "epoch": 0.9310344827586207, "percentage": 13.3, "elapsed_time": "0:32:34", "remaining_time": "3:32:17"}
{"current_steps": 545, "total_steps": 4060, "loss": 0.3871, "lr": 3.985939159035101e-05, "epoch": 0.9396551724137931, "percentage": 13.42, "elapsed_time": "0:32:55", "remaining_time": "3:32:21"}
{"current_steps": 550, "total_steps": 4060, "loss": 0.4763, "lr": 3.98490310613768e-05, "epoch": 0.9482758620689655, "percentage": 13.55, "elapsed_time": "0:33:09", "remaining_time": "3:31:35"}
{"current_steps": 555, "total_steps": 4060, "loss": 0.4762, "lr": 3.983830372228127e-05, "epoch": 0.9568965517241379, "percentage": 13.67, "elapsed_time": "0:33:26", "remaining_time": "3:31:12"}
{"current_steps": 560, "total_steps": 4060, "loss": 0.4584, "lr": 3.982720977130567e-05, "epoch": 0.9655172413793104, "percentage": 13.79, "elapsed_time": "0:33:45", "remaining_time": "3:30:57"}
{"current_steps": 565, "total_steps": 4060, "loss": 0.4184, "lr": 3.9815749413466204e-05, "epoch": 0.9741379310344828, "percentage": 13.92, "elapsed_time": "0:34:02", "remaining_time": "3:30:34"}
{"current_steps": 570, "total_steps": 4060, "loss": 0.4518, "lr": 3.980392286055033e-05, "epoch": 0.9827586206896551, "percentage": 14.04, "elapsed_time": "0:34:18", "remaining_time": "3:30:00"}
{"current_steps": 575, "total_steps": 4060, "loss": 0.3977, "lr": 3.979173033111275e-05, "epoch": 0.9913793103448276, "percentage": 14.16, "elapsed_time": "0:34:41", "remaining_time": "3:30:14"}
{"current_steps": 580, "total_steps": 4060, "loss": 0.422, "lr": 3.977917205047142e-05, "epoch": 1.0, "percentage": 14.29, "elapsed_time": "0:35:01", "remaining_time": "3:30:10"}
{"current_steps": 585, "total_steps": 4060, "loss": 0.3625, "lr": 3.976624825070339e-05, "epoch": 1.0086206896551724, "percentage": 14.41, "elapsed_time": "0:35:17", "remaining_time": "3:29:38"}
{"current_steps": 590, "total_steps": 4060, "loss": 0.3493, "lr": 3.97529591706405e-05, "epoch": 1.0172413793103448, "percentage": 14.53, "elapsed_time": "0:35:35", "remaining_time": "3:29:22"}
{"current_steps": 595, "total_steps": 4060, "loss": 0.4321, "lr": 3.973930505586496e-05, "epoch": 1.0258620689655173, "percentage": 14.66, "elapsed_time": "0:35:53", "remaining_time": "3:28:58"}
{"current_steps": 600, "total_steps": 4060, "loss": 0.3917, "lr": 3.972528615870483e-05, "epoch": 1.0344827586206897, "percentage": 14.78, "elapsed_time": "0:36:09", "remaining_time": "3:28:31"}
{"current_steps": 605, "total_steps": 4060, "loss": 0.3824, "lr": 3.9710902738229354e-05, "epoch": 1.043103448275862, "percentage": 14.9, "elapsed_time": "0:36:31", "remaining_time": "3:28:35"}
{"current_steps": 610, "total_steps": 4060, "loss": 0.4383, "lr": 3.9696155060244166e-05, "epoch": 1.0517241379310345, "percentage": 15.02, "elapsed_time": "0:36:44", "remaining_time": "3:27:50"}
{"current_steps": 615, "total_steps": 4060, "loss": 0.3815, "lr": 3.968104339728636e-05, "epoch": 1.0603448275862069, "percentage": 15.15, "elapsed_time": "0:37:01", "remaining_time": "3:27:26"}
{"current_steps": 620, "total_steps": 4060, "loss": 0.4487, "lr": 3.966556802861951e-05, "epoch": 1.0689655172413792, "percentage": 15.27, "elapsed_time": "0:37:16", "remaining_time": "3:26:50"}
{"current_steps": 625, "total_steps": 4060, "loss": 0.4245, "lr": 3.964972924022843e-05, "epoch": 1.0775862068965518, "percentage": 15.39, "elapsed_time": "0:37:30", "remaining_time": "3:26:08"}
{"current_steps": 630, "total_steps": 4060, "loss": 0.4001, "lr": 3.963352732481396e-05, "epoch": 1.0862068965517242, "percentage": 15.52, "elapsed_time": "0:37:52", "remaining_time": "3:26:15"}
{"current_steps": 635, "total_steps": 4060, "loss": 0.4073, "lr": 3.961696258178752e-05, "epoch": 1.0948275862068966, "percentage": 15.64, "elapsed_time": "0:38:07", "remaining_time": "3:25:35"}
{"current_steps": 640, "total_steps": 4060, "loss": 0.3559, "lr": 3.960003531726559e-05, "epoch": 1.103448275862069, "percentage": 15.76, "elapsed_time": "0:38:25", "remaining_time": "3:25:19"}
{"current_steps": 645, "total_steps": 4060, "loss": 0.4336, "lr": 3.958274584406403e-05, "epoch": 1.1120689655172413, "percentage": 15.89, "elapsed_time": "0:38:41", "remaining_time": "3:24:53"}
{"current_steps": 650, "total_steps": 4060, "loss": 0.4048, "lr": 3.956509448169233e-05, "epoch": 1.1206896551724137, "percentage": 16.01, "elapsed_time": "0:38:59", "remaining_time": "3:24:32"}
{"current_steps": 655, "total_steps": 4060, "loss": 0.3934, "lr": 3.9547081556347693e-05, "epoch": 1.1293103448275863, "percentage": 16.13, "elapsed_time": "0:39:17", "remaining_time": "3:24:15"}
{"current_steps": 660, "total_steps": 4060, "loss": 0.413, "lr": 3.952870740090901e-05, "epoch": 1.1379310344827587, "percentage": 16.26, "elapsed_time": "0:39:28", "remaining_time": "3:23:22"}
{"current_steps": 665, "total_steps": 4060, "loss": 0.3897, "lr": 3.950997235493069e-05, "epoch": 1.146551724137931, "percentage": 16.38, "elapsed_time": "0:39:38", "remaining_time": "3:22:22"}
{"current_steps": 670, "total_steps": 4060, "loss": 0.4095, "lr": 3.9490876764636414e-05, "epoch": 1.1551724137931034, "percentage": 16.5, "elapsed_time": "0:39:50", "remaining_time": "3:21:37"}
{"current_steps": 675, "total_steps": 4060, "loss": 0.3831, "lr": 3.947142098291272e-05, "epoch": 1.1637931034482758, "percentage": 16.63, "elapsed_time": "0:40:11", "remaining_time": "3:21:34"}
{"current_steps": 680, "total_steps": 4060, "loss": 0.3948, "lr": 3.945160536930247e-05, "epoch": 1.1724137931034484, "percentage": 16.75, "elapsed_time": "0:40:25", "remaining_time": "3:20:53"}
{"current_steps": 685, "total_steps": 4060, "loss": 0.4309, "lr": 3.9431430289998235e-05, "epoch": 1.1810344827586208, "percentage": 16.87, "elapsed_time": "0:40:38", "remaining_time": "3:20:16"}
{"current_steps": 690, "total_steps": 4060, "loss": 0.4384, "lr": 3.941089611783551e-05, "epoch": 1.1896551724137931, "percentage": 17.0, "elapsed_time": "0:40:56", "remaining_time": "3:19:59"}
{"current_steps": 695, "total_steps": 4060, "loss": 0.3825, "lr": 3.939000323228583e-05, "epoch": 1.1982758620689655, "percentage": 17.12, "elapsed_time": "0:41:08", "remaining_time": "3:19:10"}
{"current_steps": 700, "total_steps": 4060, "loss": 0.4469, "lr": 3.9368752019449744e-05, "epoch": 1.206896551724138, "percentage": 17.24, "elapsed_time": "0:41:24", "remaining_time": "3:18:47"}
{"current_steps": 705, "total_steps": 4060, "loss": 0.3956, "lr": 3.934714287204969e-05, "epoch": 1.2155172413793103, "percentage": 17.36, "elapsed_time": "0:41:42", "remaining_time": "3:18:28"}
{"current_steps": 710, "total_steps": 4060, "loss": 0.4363, "lr": 3.932517618942275e-05, "epoch": 1.2241379310344827, "percentage": 17.49, "elapsed_time": "0:41:55", "remaining_time": "3:17:48"}
{"current_steps": 715, "total_steps": 4060, "loss": 0.3825, "lr": 3.930285237751324e-05, "epoch": 1.2327586206896552, "percentage": 17.61, "elapsed_time": "0:42:09", "remaining_time": "3:17:14"}
{"current_steps": 720, "total_steps": 4060, "loss": 0.4158, "lr": 3.928017184886525e-05, "epoch": 1.2413793103448276, "percentage": 17.73, "elapsed_time": "0:42:27", "remaining_time": "3:16:59"}
{"current_steps": 725, "total_steps": 4060, "loss": 0.3828, "lr": 3.925713502261496e-05, "epoch": 1.25, "percentage": 17.86, "elapsed_time": "0:42:43", "remaining_time": "3:16:31"}
{"current_steps": 730, "total_steps": 4060, "loss": 0.3671, "lr": 3.9233742324482965e-05, "epoch": 1.2586206896551724, "percentage": 17.98, "elapsed_time": "0:43:03", "remaining_time": "3:16:23"}
{"current_steps": 735, "total_steps": 4060, "loss": 0.4012, "lr": 3.920999418676636e-05, "epoch": 1.2672413793103448, "percentage": 18.1, "elapsed_time": "0:43:14", "remaining_time": "3:15:35"}
{"current_steps": 740, "total_steps": 4060, "loss": 0.4271, "lr": 3.918589104833075e-05, "epoch": 1.2758620689655173, "percentage": 18.23, "elapsed_time": "0:43:35", "remaining_time": "3:15:35"}
{"current_steps": 745, "total_steps": 4060, "loss": 0.4317, "lr": 3.916143335460218e-05, "epoch": 1.2844827586206897, "percentage": 18.35, "elapsed_time": "0:43:47", "remaining_time": "3:14:51"}
{"current_steps": 750, "total_steps": 4060, "loss": 0.4397, "lr": 3.913662155755885e-05, "epoch": 1.293103448275862, "percentage": 18.47, "elapsed_time": "0:44:09", "remaining_time": "3:14:52"}
{"current_steps": 755, "total_steps": 4060, "loss": 0.4318, "lr": 3.911145611572282e-05, "epoch": 1.3017241379310345, "percentage": 18.6, "elapsed_time": "0:44:27", "remaining_time": "3:14:39"}
{"current_steps": 760, "total_steps": 4060, "loss": 0.4629, "lr": 3.908593749415148e-05, "epoch": 1.3103448275862069, "percentage": 18.72, "elapsed_time": "0:44:51", "remaining_time": "3:14:47"}
{"current_steps": 765, "total_steps": 4060, "loss": 0.3818, "lr": 3.9060066164428986e-05, "epoch": 1.3189655172413794, "percentage": 18.84, "elapsed_time": "0:45:16", "remaining_time": "3:14:59"}
{"current_steps": 770, "total_steps": 4060, "loss": 0.3751, "lr": 3.903384260465756e-05, "epoch": 1.3275862068965516, "percentage": 18.97, "elapsed_time": "0:45:30", "remaining_time": "3:14:25"}
{"current_steps": 775, "total_steps": 4060, "loss": 0.431, "lr": 3.900726729944861e-05, "epoch": 1.3362068965517242, "percentage": 19.09, "elapsed_time": "0:45:53", "remaining_time": "3:14:29"}
{"current_steps": 780, "total_steps": 4060, "loss": 0.4135, "lr": 3.898034073991382e-05, "epoch": 1.3448275862068966, "percentage": 19.21, "elapsed_time": "0:46:17", "remaining_time": "3:14:38"}
{"current_steps": 785, "total_steps": 4060, "loss": 0.424, "lr": 3.8953063423656055e-05, "epoch": 1.353448275862069, "percentage": 19.33, "elapsed_time": "0:46:38", "remaining_time": "3:14:33"}
{"current_steps": 790, "total_steps": 4060, "loss": 0.4081, "lr": 3.892543585476014e-05, "epoch": 1.3620689655172413, "percentage": 19.46, "elapsed_time": "0:46:56", "remaining_time": "3:14:17"}
{"current_steps": 795, "total_steps": 4060, "loss": 0.4337, "lr": 3.88974585437836e-05, "epoch": 1.3706896551724137, "percentage": 19.58, "elapsed_time": "0:47:11", "remaining_time": "3:13:48"}
{"current_steps": 800, "total_steps": 4060, "loss": 0.3736, "lr": 3.886913200774717e-05, "epoch": 1.3793103448275863, "percentage": 19.7, "elapsed_time": "0:47:34", "remaining_time": "3:13:53"}
{"current_steps": 805, "total_steps": 4060, "loss": 0.4324, "lr": 3.884045677012528e-05, "epoch": 1.3879310344827587, "percentage": 19.83, "elapsed_time": "0:47:47", "remaining_time": "3:13:16"}
{"current_steps": 810, "total_steps": 4060, "loss": 0.4064, "lr": 3.8811433360836364e-05, "epoch": 1.396551724137931, "percentage": 19.95, "elapsed_time": "0:48:02", "remaining_time": "3:12:43"}
{"current_steps": 815, "total_steps": 4060, "loss": 0.4291, "lr": 3.878206231623306e-05, "epoch": 1.4051724137931034, "percentage": 20.07, "elapsed_time": "0:48:28", "remaining_time": "3:12:59"}
{"current_steps": 820, "total_steps": 4060, "loss": 0.4534, "lr": 3.8752344179092315e-05, "epoch": 1.4137931034482758, "percentage": 20.2, "elapsed_time": "0:48:46", "remaining_time": "3:12:44"}
{"current_steps": 825, "total_steps": 4060, "loss": 0.3673, "lr": 3.8722279498605344e-05, "epoch": 1.4224137931034484, "percentage": 20.32, "elapsed_time": "0:49:05", "remaining_time": "3:12:31"}
{"current_steps": 830, "total_steps": 4060, "loss": 0.4027, "lr": 3.869186883036748e-05, "epoch": 1.4310344827586206, "percentage": 20.44, "elapsed_time": "0:49:18", "remaining_time": "3:11:52"}
{"current_steps": 835, "total_steps": 4060, "loss": 0.4036, "lr": 3.8661112736367924e-05, "epoch": 1.4396551724137931, "percentage": 20.57, "elapsed_time": "0:49:34", "remaining_time": "3:11:29"}
{"current_steps": 840, "total_steps": 4060, "loss": 0.3987, "lr": 3.863001178497933e-05, "epoch": 1.4482758620689655, "percentage": 20.69, "elapsed_time": "0:49:55", "remaining_time": "3:11:22"}
{"current_steps": 845, "total_steps": 4060, "loss": 0.3881, "lr": 3.8598566550947316e-05, "epoch": 1.456896551724138, "percentage": 20.81, "elapsed_time": "0:50:22", "remaining_time": "3:11:39"}
{"current_steps": 850, "total_steps": 4060, "loss": 0.4021, "lr": 3.856677761537986e-05, "epoch": 1.4655172413793103, "percentage": 20.94, "elapsed_time": "0:50:41", "remaining_time": "3:11:25"}
{"current_steps": 855, "total_steps": 4060, "loss": 0.3916, "lr": 3.853464556573652e-05, "epoch": 1.4741379310344827, "percentage": 21.06, "elapsed_time": "0:51:02", "remaining_time": "3:11:18"}
{"current_steps": 860, "total_steps": 4060, "loss": 0.396, "lr": 3.850217099581764e-05, "epoch": 1.4827586206896552, "percentage": 21.18, "elapsed_time": "0:51:14", "remaining_time": "3:10:40"}
{"current_steps": 865, "total_steps": 4060, "loss": 0.3623, "lr": 3.8469354505753305e-05, "epoch": 1.4913793103448276, "percentage": 21.31, "elapsed_time": "0:51:31", "remaining_time": "3:10:19"}
{"current_steps": 870, "total_steps": 4060, "loss": 0.3986, "lr": 3.843619670199229e-05, "epoch": 1.5, "percentage": 21.43, "elapsed_time": "0:51:43", "remaining_time": "3:09:40"}
{"current_steps": 875, "total_steps": 4060, "loss": 0.4389, "lr": 3.8402698197290865e-05, "epoch": 1.5086206896551724, "percentage": 21.55, "elapsed_time": "0:51:53", "remaining_time": "3:08:54"}
{"current_steps": 880, "total_steps": 4060, "loss": 0.4321, "lr": 3.8368859610701443e-05, "epoch": 1.5172413793103448, "percentage": 21.67, "elapsed_time": "0:52:07", "remaining_time": "3:08:23"}
{"current_steps": 885, "total_steps": 4060, "loss": 0.3699, "lr": 3.833468156756114e-05, "epoch": 1.5258620689655173, "percentage": 21.8, "elapsed_time": "0:52:24", "remaining_time": "3:08:00"}
{"current_steps": 890, "total_steps": 4060, "loss": 0.4074, "lr": 3.8300164699480246e-05, "epoch": 1.5344827586206895, "percentage": 21.92, "elapsed_time": "0:52:36", "remaining_time": "3:07:23"}
{"current_steps": 895, "total_steps": 4060, "loss": 0.3796, "lr": 3.8265309644330535e-05, "epoch": 1.543103448275862, "percentage": 22.04, "elapsed_time": "0:52:54", "remaining_time": "3:07:07"}
{"current_steps": 900, "total_steps": 4060, "loss": 0.4127, "lr": 3.823011704623347e-05, "epoch": 1.5517241379310345, "percentage": 22.17, "elapsed_time": "0:53:11", "remaining_time": "3:06:44"}
{"current_steps": 905, "total_steps": 4060, "loss": 0.4045, "lr": 3.81945875555483e-05, "epoch": 1.5603448275862069, "percentage": 22.29, "elapsed_time": "0:53:24", "remaining_time": "3:06:13"}
{"current_steps": 910, "total_steps": 4060, "loss": 0.3494, "lr": 3.8158721828860094e-05, "epoch": 1.5689655172413794, "percentage": 22.41, "elapsed_time": "0:53:41", "remaining_time": "3:05:51"}
{"current_steps": 915, "total_steps": 4060, "loss": 0.4902, "lr": 3.81225205289675e-05, "epoch": 1.5775862068965516, "percentage": 22.54, "elapsed_time": "0:53:55", "remaining_time": "3:05:22"}
{"current_steps": 920, "total_steps": 4060, "loss": 0.4345, "lr": 3.808598432487061e-05, "epoch": 1.5862068965517242, "percentage": 22.66, "elapsed_time": "0:54:09", "remaining_time": "3:04:49"}
{"current_steps": 925, "total_steps": 4060, "loss": 0.4502, "lr": 3.8049113891758506e-05, "epoch": 1.5948275862068966, "percentage": 22.78, "elapsed_time": "0:54:19", "remaining_time": "3:04:07"}
{"current_steps": 930, "total_steps": 4060, "loss": 0.3747, "lr": 3.8011909910996856e-05, "epoch": 1.603448275862069, "percentage": 22.91, "elapsed_time": "0:54:37", "remaining_time": "3:03:50"}
{"current_steps": 935, "total_steps": 4060, "loss": 0.378, "lr": 3.797437307011527e-05, "epoch": 1.6120689655172413, "percentage": 23.03, "elapsed_time": "0:54:57", "remaining_time": "3:03:41"}
{"current_steps": 940, "total_steps": 4060, "loss": 0.4217, "lr": 3.793650406279463e-05, "epoch": 1.6206896551724137, "percentage": 23.15, "elapsed_time": "0:55:17", "remaining_time": "3:03:29"}
{"current_steps": 945, "total_steps": 4060, "loss": 0.3822, "lr": 3.789830358885423e-05, "epoch": 1.6293103448275863, "percentage": 23.28, "elapsed_time": "0:55:29", "remaining_time": "3:02:55"}
{"current_steps": 950, "total_steps": 4060, "loss": 0.3921, "lr": 3.7859772354238885e-05, "epoch": 1.6379310344827587, "percentage": 23.4, "elapsed_time": "0:55:43", "remaining_time": "3:02:24"}
{"current_steps": 955, "total_steps": 4060, "loss": 0.4034, "lr": 3.782091107100587e-05, "epoch": 1.646551724137931, "percentage": 23.52, "elapsed_time": "0:55:55", "remaining_time": "3:01:49"}
{"current_steps": 960, "total_steps": 4060, "loss": 0.4166, "lr": 3.7781720457311746e-05, "epoch": 1.6551724137931034, "percentage": 23.65, "elapsed_time": "0:56:13", "remaining_time": "3:01:32"}
{"current_steps": 965, "total_steps": 4060, "loss": 0.4514, "lr": 3.7742201237399105e-05, "epoch": 1.6637931034482758, "percentage": 23.77, "elapsed_time": "0:56:39", "remaining_time": "3:01:42"}
{"current_steps": 970, "total_steps": 4060, "loss": 0.3931, "lr": 3.77023541415832e-05, "epoch": 1.6724137931034484, "percentage": 23.89, "elapsed_time": "0:56:53", "remaining_time": "3:01:12"}
{"current_steps": 975, "total_steps": 4060, "loss": 0.4068, "lr": 3.7662179906238405e-05, "epoch": 1.6810344827586206, "percentage": 24.01, "elapsed_time": "0:57:17", "remaining_time": "3:01:16"}
{"current_steps": 980, "total_steps": 4060, "loss": 0.3599, "lr": 3.762167927378464e-05, "epoch": 1.6896551724137931, "percentage": 24.14, "elapsed_time": "0:57:36", "remaining_time": "3:01:03"}
{"current_steps": 985, "total_steps": 4060, "loss": 0.392, "lr": 3.7580852992673656e-05, "epoch": 1.6982758620689655, "percentage": 24.26, "elapsed_time": "0:57:52", "remaining_time": "3:00:40"}
{"current_steps": 990, "total_steps": 4060, "loss": 0.3932, "lr": 3.7539701817375185e-05, "epoch": 1.706896551724138, "percentage": 24.38, "elapsed_time": "0:58:13", "remaining_time": "3:00:34"}
{"current_steps": 995, "total_steps": 4060, "loss": 0.4305, "lr": 3.7498226508362996e-05, "epoch": 1.7155172413793105, "percentage": 24.51, "elapsed_time": "0:58:35", "remaining_time": "3:00:29"}
{"current_steps": 1000, "total_steps": 4060, "loss": 0.3872, "lr": 3.7456427832100864e-05, "epoch": 1.7241379310344827, "percentage": 24.63, "elapsed_time": "0:58:54", "remaining_time": "3:00:16"}
{"current_steps": 1005, "total_steps": 4060, "loss": 0.4164, "lr": 3.7414306561028385e-05, "epoch": 1.7327586206896552, "percentage": 24.75, "elapsed_time": "0:59:09", "remaining_time": "2:59:49"}
{"current_steps": 1010, "total_steps": 4060, "loss": 0.4169, "lr": 3.73718634735467e-05, "epoch": 1.7413793103448276, "percentage": 24.88, "elapsed_time": "0:59:23", "remaining_time": "2:59:20"}
{"current_steps": 1015, "total_steps": 4060, "loss": 0.3606, "lr": 3.732909935400412e-05, "epoch": 1.75, "percentage": 25.0, "elapsed_time": "0:59:40", "remaining_time": "2:59:00"}
{"current_steps": 1020, "total_steps": 4060, "loss": 0.3841, "lr": 3.7286014992681645e-05, "epoch": 1.7586206896551724, "percentage": 25.12, "elapsed_time": "1:00:03", "remaining_time": "2:58:58"}
{"current_steps": 1025, "total_steps": 4060, "loss": 0.3862, "lr": 3.7242611185778325e-05, "epoch": 1.7672413793103448, "percentage": 25.25, "elapsed_time": "1:00:30", "remaining_time": "2:59:11"}
{"current_steps": 1030, "total_steps": 4060, "loss": 0.4318, "lr": 3.7198888735396574e-05, "epoch": 1.7758620689655173, "percentage": 25.37, "elapsed_time": "1:00:46", "remaining_time": "2:58:46"}
{"current_steps": 1035, "total_steps": 4060, "loss": 0.3794, "lr": 3.7154848449527334e-05, "epoch": 1.7844827586206895, "percentage": 25.49, "elapsed_time": "1:01:03", "remaining_time": "2:58:28"}
{"current_steps": 1040, "total_steps": 4060, "loss": 0.3694, "lr": 3.7110491142035145e-05, "epoch": 1.793103448275862, "percentage": 25.62, "elapsed_time": "1:01:21", "remaining_time": "2:58:09"}
{"current_steps": 1045, "total_steps": 4060, "loss": 0.3729, "lr": 3.7065817632643115e-05, "epoch": 1.8017241379310345, "percentage": 25.74, "elapsed_time": "1:01:35", "remaining_time": "2:57:43"}
{"current_steps": 1050, "total_steps": 4060, "loss": 0.3779, "lr": 3.702082874691776e-05, "epoch": 1.8103448275862069, "percentage": 25.86, "elapsed_time": "1:01:54", "remaining_time": "2:57:28"}
{"current_steps": 1055, "total_steps": 4060, "loss": 0.4054, "lr": 3.6975525316253744e-05, "epoch": 1.8189655172413794, "percentage": 25.99, "elapsed_time": "1:02:09", "remaining_time": "2:57:03"}
{"current_steps": 1060, "total_steps": 4060, "loss": 0.3681, "lr": 3.692990817785853e-05, "epoch": 1.8275862068965516, "percentage": 26.11, "elapsed_time": "1:02:24", "remaining_time": "2:56:38"}
{"current_steps": 1065, "total_steps": 4060, "loss": 0.3863, "lr": 3.68839781747369e-05, "epoch": 1.8362068965517242, "percentage": 26.23, "elapsed_time": "1:02:39", "remaining_time": "2:56:11"}
{"current_steps": 1070, "total_steps": 4060, "loss": 0.3954, "lr": 3.683773615567538e-05, "epoch": 1.8448275862068966, "percentage": 26.35, "elapsed_time": "1:02:55", "remaining_time": "2:55:50"}
{"current_steps": 1075, "total_steps": 4060, "loss": 0.4454, "lr": 3.679118297522654e-05, "epoch": 1.853448275862069, "percentage": 26.48, "elapsed_time": "1:03:15", "remaining_time": "2:55:39"}
{"current_steps": 1080, "total_steps": 4060, "loss": 0.3673, "lr": 3.674431949369321e-05, "epoch": 1.8620689655172413, "percentage": 26.6, "elapsed_time": "1:03:35", "remaining_time": "2:55:27"}
{"current_steps": 1085, "total_steps": 4060, "loss": 0.3936, "lr": 3.6697146577112614e-05, "epoch": 1.8706896551724137, "percentage": 26.72, "elapsed_time": "1:03:55", "remaining_time": "2:55:15"}
{"current_steps": 1090, "total_steps": 4060, "loss": 0.3942, "lr": 3.6649665097240304e-05, "epoch": 1.8793103448275863, "percentage": 26.85, "elapsed_time": "1:04:12", "remaining_time": "2:54:57"}
{"current_steps": 1095, "total_steps": 4060, "loss": 0.3816, "lr": 3.660187593153408e-05, "epoch": 1.8879310344827587, "percentage": 26.97, "elapsed_time": "1:04:25", "remaining_time": "2:54:27"}
{"current_steps": 1100, "total_steps": 4060, "loss": 0.4069, "lr": 3.655377996313782e-05, "epoch": 1.896551724137931, "percentage": 27.09, "elapsed_time": "1:04:37", "remaining_time": "2:53:53"}
{"current_steps": 1105, "total_steps": 4060, "loss": 0.4113, "lr": 3.6505378080865054e-05, "epoch": 1.9051724137931034, "percentage": 27.22, "elapsed_time": "1:04:52", "remaining_time": "2:53:30"}
{"current_steps": 1110, "total_steps": 4060, "loss": 0.4192, "lr": 3.645667117918265e-05, "epoch": 1.9137931034482758, "percentage": 27.34, "elapsed_time": "1:05:09", "remaining_time": "2:53:09"}
{"current_steps": 1115, "total_steps": 4060, "loss": 0.4441, "lr": 3.640766015819423e-05, "epoch": 1.9224137931034484, "percentage": 27.46, "elapsed_time": "1:05:26", "remaining_time": "2:52:50"}
{"current_steps": 1120, "total_steps": 4060, "loss": 0.3768, "lr": 3.6358345923623506e-05, "epoch": 1.9310344827586206, "percentage": 27.59, "elapsed_time": "1:05:42", "remaining_time": "2:52:30"}
{"current_steps": 1125, "total_steps": 4060, "loss": 0.3891, "lr": 3.630872938679761e-05, "epoch": 1.9396551724137931, "percentage": 27.71, "elapsed_time": "1:06:04", "remaining_time": "2:52:22"}
{"current_steps": 1130, "total_steps": 4060, "loss": 0.4098, "lr": 3.6258811464630215e-05, "epoch": 1.9482758620689655, "percentage": 27.83, "elapsed_time": "1:06:29", "remaining_time": "2:52:25"}
{"current_steps": 1135, "total_steps": 4060, "loss": 0.4763, "lr": 3.620859307960458e-05, "epoch": 1.956896551724138, "percentage": 27.96, "elapsed_time": "1:06:57", "remaining_time": "2:52:32"}
{"current_steps": 1140, "total_steps": 4060, "loss": 0.3811, "lr": 3.615807515975654e-05, "epoch": 1.9655172413793105, "percentage": 28.08, "elapsed_time": "1:07:12", "remaining_time": "2:52:09"}
{"current_steps": 1145, "total_steps": 4060, "loss": 0.3751, "lr": 3.6107258638657324e-05, "epoch": 1.9741379310344827, "percentage": 28.2, "elapsed_time": "1:07:28", "remaining_time": "2:51:47"}
{"current_steps": 1150, "total_steps": 4060, "loss": 0.4329, "lr": 3.60561444553963e-05, "epoch": 1.9827586206896552, "percentage": 28.33, "elapsed_time": "1:07:51", "remaining_time": "2:51:41"}
{"current_steps": 1155, "total_steps": 4060, "loss": 0.3904, "lr": 3.600473355456366e-05, "epoch": 1.9913793103448276, "percentage": 28.45, "elapsed_time": "1:08:06", "remaining_time": "2:51:17"}
{"current_steps": 1160, "total_steps": 4060, "loss": 0.365, "lr": 3.595302688623291e-05, "epoch": 2.0, "percentage": 28.57, "elapsed_time": "1:08:25", "remaining_time": "2:51:03"}
{"current_steps": 1165, "total_steps": 4060, "loss": 0.3466, "lr": 3.590102540594337e-05, "epoch": 2.0086206896551726, "percentage": 28.69, "elapsed_time": "1:08:40", "remaining_time": "2:50:39"}
{"current_steps": 1170, "total_steps": 4060, "loss": 0.3232, "lr": 3.584873007468244e-05, "epoch": 2.0172413793103448, "percentage": 28.82, "elapsed_time": "1:08:56", "remaining_time": "2:50:17"}
{"current_steps": 1175, "total_steps": 4060, "loss": 0.3589, "lr": 3.5796141858867935e-05, "epoch": 2.0258620689655173, "percentage": 28.94, "elapsed_time": "1:09:17", "remaining_time": "2:50:08"}
{"current_steps": 1180, "total_steps": 4060, "loss": 0.3388, "lr": 3.5743261730330144e-05, "epoch": 2.0344827586206895, "percentage": 29.06, "elapsed_time": "1:09:31", "remaining_time": "2:49:42"}
{"current_steps": 1185, "total_steps": 4060, "loss": 0.3406, "lr": 3.569009066629392e-05, "epoch": 2.043103448275862, "percentage": 29.19, "elapsed_time": "1:09:50", "remaining_time": "2:49:26"}
{"current_steps": 1190, "total_steps": 4060, "loss": 0.332, "lr": 3.56366296493606e-05, "epoch": 2.0517241379310347, "percentage": 29.31, "elapsed_time": "1:10:04", "remaining_time": "2:49:00"}
{"current_steps": 1195, "total_steps": 4060, "loss": 0.3324, "lr": 3.558287966748985e-05, "epoch": 2.060344827586207, "percentage": 29.43, "elapsed_time": "1:10:18", "remaining_time": "2:48:34"}
{"current_steps": 1200, "total_steps": 4060, "loss": 0.3567, "lr": 3.552884171398141e-05, "epoch": 2.0689655172413794, "percentage": 29.56, "elapsed_time": "1:10:34", "remaining_time": "2:48:11"}
{"current_steps": 1205, "total_steps": 4060, "loss": 0.3679, "lr": 3.547451678745673e-05, "epoch": 2.0775862068965516, "percentage": 29.68, "elapsed_time": "1:10:50", "remaining_time": "2:47:51"}
{"current_steps": 1210, "total_steps": 4060, "loss": 0.3874, "lr": 3.541990589184053e-05, "epoch": 2.086206896551724, "percentage": 29.8, "elapsed_time": "1:11:06", "remaining_time": "2:47:28"}
{"current_steps": 1215, "total_steps": 4060, "loss": 0.3094, "lr": 3.5365010036342245e-05, "epoch": 2.0948275862068964, "percentage": 29.93, "elapsed_time": "1:11:20", "remaining_time": "2:47:02"}
{"current_steps": 1220, "total_steps": 4060, "loss": 0.344, "lr": 3.530983023543734e-05, "epoch": 2.103448275862069, "percentage": 30.05, "elapsed_time": "1:11:38", "remaining_time": "2:46:46"}
{"current_steps": 1225, "total_steps": 4060, "loss": 0.3505, "lr": 3.525436750884863e-05, "epoch": 2.1120689655172415, "percentage": 30.17, "elapsed_time": "1:11:51", "remaining_time": "2:46:18"}
{"current_steps": 1230, "total_steps": 4060, "loss": 0.3163, "lr": 3.5198622881527374e-05, "epoch": 2.1206896551724137, "percentage": 30.3, "elapsed_time": "1:12:09", "remaining_time": "2:46:01"}
{"current_steps": 1235, "total_steps": 4060, "loss": 0.3408, "lr": 3.514259738363436e-05, "epoch": 2.1293103448275863, "percentage": 30.42, "elapsed_time": "1:12:24", "remaining_time": "2:45:37"}
{"current_steps": 1240, "total_steps": 4060, "loss": 0.3613, "lr": 3.5086292050520855e-05, "epoch": 2.1379310344827585, "percentage": 30.54, "elapsed_time": "1:12:48", "remaining_time": "2:45:34"}
{"current_steps": 1245, "total_steps": 4060, "loss": 0.3169, "lr": 3.502970792270951e-05, "epoch": 2.146551724137931, "percentage": 30.67, "elapsed_time": "1:12:58", "remaining_time": "2:45:00"}
{"current_steps": 1250, "total_steps": 4060, "loss": 0.356, "lr": 3.497284604587508e-05, "epoch": 2.1551724137931036, "percentage": 30.79, "elapsed_time": "1:13:15", "remaining_time": "2:44:41"}
{"current_steps": 1255, "total_steps": 4060, "loss": 0.2985, "lr": 3.491570747082512e-05, "epoch": 2.163793103448276, "percentage": 30.91, "elapsed_time": "1:13:34", "remaining_time": "2:44:26"}
{"current_steps": 1260, "total_steps": 4060, "loss": 0.3621, "lr": 3.485829325348059e-05, "epoch": 2.1724137931034484, "percentage": 31.03, "elapsed_time": "1:13:53", "remaining_time": "2:44:11"}
{"current_steps": 1265, "total_steps": 4060, "loss": 0.3506, "lr": 3.4800604454856284e-05, "epoch": 2.1810344827586206, "percentage": 31.16, "elapsed_time": "1:14:10", "remaining_time": "2:43:53"}
{"current_steps": 1270, "total_steps": 4060, "loss": 0.3751, "lr": 3.47426421410413e-05, "epoch": 2.189655172413793, "percentage": 31.28, "elapsed_time": "1:14:26", "remaining_time": "2:43:32"}
{"current_steps": 1275, "total_steps": 4060, "loss": 0.3612, "lr": 3.468440738317926e-05, "epoch": 2.1982758620689653, "percentage": 31.4, "elapsed_time": "1:14:46", "remaining_time": "2:43:19"}
{"current_steps": 1280, "total_steps": 4060, "loss": 0.3481, "lr": 3.4625901257448596e-05, "epoch": 2.206896551724138, "percentage": 31.53, "elapsed_time": "1:15:03", "remaining_time": "2:43:00"}
{"current_steps": 1285, "total_steps": 4060, "loss": 0.2697, "lr": 3.4567124845042564e-05, "epoch": 2.2155172413793105, "percentage": 31.65, "elapsed_time": "1:15:25", "remaining_time": "2:42:52"}
{"current_steps": 1290, "total_steps": 4060, "loss": 0.3545, "lr": 3.4508079232149354e-05, "epoch": 2.2241379310344827, "percentage": 31.77, "elapsed_time": "1:15:40", "remaining_time": "2:42:28"}
{"current_steps": 1295, "total_steps": 4060, "loss": 0.3449, "lr": 3.444876550993198e-05, "epoch": 2.2327586206896552, "percentage": 31.9, "elapsed_time": "1:16:00", "remaining_time": "2:42:18"}
{"current_steps": 1300, "total_steps": 4060, "loss": 0.3693, "lr": 3.4389184774508105e-05, "epoch": 2.2413793103448274, "percentage": 32.02, "elapsed_time": "1:16:22", "remaining_time": "2:42:09"}
{"current_steps": 1305, "total_steps": 4060, "loss": 0.3623, "lr": 3.43293381269298e-05, "epoch": 2.25, "percentage": 32.14, "elapsed_time": "1:16:42", "remaining_time": "2:41:57"}
{"current_steps": 1310, "total_steps": 4060, "loss": 0.381, "lr": 3.4269226673163204e-05, "epoch": 2.2586206896551726, "percentage": 32.27, "elapsed_time": "1:16:59", "remaining_time": "2:41:38"}
{"current_steps": 1315, "total_steps": 4060, "loss": 0.3438, "lr": 3.420885152406805e-05, "epoch": 2.2672413793103448, "percentage": 32.39, "elapsed_time": "1:17:13", "remaining_time": "2:41:12"}
{"current_steps": 1320, "total_steps": 4060, "loss": 0.3724, "lr": 3.4148213795377194e-05, "epoch": 2.2758620689655173, "percentage": 32.51, "elapsed_time": "1:17:26", "remaining_time": "2:40:45"}
{"current_steps": 1325, "total_steps": 4060, "loss": 0.3478, "lr": 3.408731460767593e-05, "epoch": 2.2844827586206895, "percentage": 32.64, "elapsed_time": "1:17:39", "remaining_time": "2:40:17"}
{"current_steps": 1330, "total_steps": 4060, "loss": 0.3668, "lr": 3.402615508638134e-05, "epoch": 2.293103448275862, "percentage": 32.76, "elapsed_time": "1:17:56", "remaining_time": "2:39:58"}
{"current_steps": 1335, "total_steps": 4060, "loss": 0.3765, "lr": 3.396473636172146e-05, "epoch": 2.3017241379310347, "percentage": 32.88, "elapsed_time": "1:18:23", "remaining_time": "2:40:00"}
{"current_steps": 1340, "total_steps": 4060, "loss": 0.3458, "lr": 3.3903059568714406e-05, "epoch": 2.310344827586207, "percentage": 33.0, "elapsed_time": "1:18:46", "remaining_time": "2:39:54"}
{"current_steps": 1345, "total_steps": 4060, "loss": 0.3588, "lr": 3.384112584714739e-05, "epoch": 2.3189655172413794, "percentage": 33.13, "elapsed_time": "1:19:01", "remaining_time": "2:39:31"}
{"current_steps": 1350, "total_steps": 4060, "loss": 0.3104, "lr": 3.377893634155568e-05, "epoch": 2.3275862068965516, "percentage": 33.25, "elapsed_time": "1:19:19", "remaining_time": "2:39:14"}
{"current_steps": 1355, "total_steps": 4060, "loss": 0.329, "lr": 3.371649220120143e-05, "epoch": 2.336206896551724, "percentage": 33.37, "elapsed_time": "1:19:32", "remaining_time": "2:38:47"}
{"current_steps": 1360, "total_steps": 4060, "loss": 0.3431, "lr": 3.365379458005243e-05, "epoch": 2.344827586206897, "percentage": 33.5, "elapsed_time": "1:19:53", "remaining_time": "2:38:36"}
{"current_steps": 1365, "total_steps": 4060, "loss": 0.3438, "lr": 3.35908446367608e-05, "epoch": 2.353448275862069, "percentage": 33.62, "elapsed_time": "1:20:05", "remaining_time": "2:38:07"}
{"current_steps": 1370, "total_steps": 4060, "loss": 0.362, "lr": 3.35276435346416e-05, "epoch": 2.3620689655172415, "percentage": 33.74, "elapsed_time": "1:20:19", "remaining_time": "2:37:42"}
{"current_steps": 1375, "total_steps": 4060, "loss": 0.3362, "lr": 3.346419244165127e-05, "epoch": 2.3706896551724137, "percentage": 33.87, "elapsed_time": "1:20:33", "remaining_time": "2:37:18"}
{"current_steps": 1380, "total_steps": 4060, "loss": 0.3449, "lr": 3.3400492530366086e-05, "epoch": 2.3793103448275863, "percentage": 33.99, "elapsed_time": "1:20:50", "remaining_time": "2:36:59"}
{"current_steps": 1385, "total_steps": 4060, "loss": 0.4314, "lr": 3.333654497796051e-05, "epoch": 2.3879310344827585, "percentage": 34.11, "elapsed_time": "1:21:07", "remaining_time": "2:36:40"}
{"current_steps": 1390, "total_steps": 4060, "loss": 0.3426, "lr": 3.32723509661854e-05, "epoch": 2.396551724137931, "percentage": 34.24, "elapsed_time": "1:21:19", "remaining_time": "2:36:12"}
{"current_steps": 1395, "total_steps": 4060, "loss": 0.3141, "lr": 3.320791168134617e-05, "epoch": 2.405172413793103, "percentage": 34.36, "elapsed_time": "1:21:35", "remaining_time": "2:35:51"}
{"current_steps": 1400, "total_steps": 4060, "loss": 0.3496, "lr": 3.31432283142809e-05, "epoch": 2.413793103448276, "percentage": 34.48, "elapsed_time": "1:21:50", "remaining_time": "2:35:30"}
{"current_steps": 1405, "total_steps": 4060, "loss": 0.3538, "lr": 3.307830206033831e-05, "epoch": 2.4224137931034484, "percentage": 34.61, "elapsed_time": "1:22:15", "remaining_time": "2:35:27"}
{"current_steps": 1410, "total_steps": 4060, "loss": 0.3563, "lr": 3.301313411935565e-05, "epoch": 2.4310344827586206, "percentage": 34.73, "elapsed_time": "1:22:35", "remaining_time": "2:35:13"}
{"current_steps": 1415, "total_steps": 4060, "loss": 0.4188, "lr": 3.294772569563656e-05, "epoch": 2.439655172413793, "percentage": 34.85, "elapsed_time": "1:22:52", "remaining_time": "2:34:54"}
{"current_steps": 1420, "total_steps": 4060, "loss": 0.3751, "lr": 3.28820779979288e-05, "epoch": 2.4482758620689653, "percentage": 34.98, "elapsed_time": "1:23:07", "remaining_time": "2:34:33"}
{"current_steps": 1425, "total_steps": 4060, "loss": 0.3543, "lr": 3.281619223940192e-05, "epoch": 2.456896551724138, "percentage": 35.1, "elapsed_time": "1:23:21", "remaining_time": "2:34:09"}
{"current_steps": 1430, "total_steps": 4060, "loss": 0.3663, "lr": 3.2750069637624826e-05, "epoch": 2.4655172413793105, "percentage": 35.22, "elapsed_time": "1:23:40", "remaining_time": "2:33:52"}
{"current_steps": 1435, "total_steps": 4060, "loss": 0.3458, "lr": 3.2683711414543295e-05, "epoch": 2.4741379310344827, "percentage": 35.34, "elapsed_time": "1:23:52", "remaining_time": "2:33:26"}
{"current_steps": 1440, "total_steps": 4060, "loss": 0.3602, "lr": 3.261711879645737e-05, "epoch": 2.4827586206896552, "percentage": 35.47, "elapsed_time": "1:24:13", "remaining_time": "2:33:14"}
{"current_steps": 1445, "total_steps": 4060, "loss": 0.3541, "lr": 3.255029301399873e-05, "epoch": 2.4913793103448274, "percentage": 35.59, "elapsed_time": "1:24:34", "remaining_time": "2:33:02"}
{"current_steps": 1450, "total_steps": 4060, "loss": 0.3567, "lr": 3.248323530210793e-05, "epoch": 2.5, "percentage": 35.71, "elapsed_time": "1:24:53", "remaining_time": "2:32:47"}
{"current_steps": 1455, "total_steps": 4060, "loss": 0.3903, "lr": 3.241594690001157e-05, "epoch": 2.5086206896551726, "percentage": 35.84, "elapsed_time": "1:25:13", "remaining_time": "2:32:35"}
{"current_steps": 1460, "total_steps": 4060, "loss": 0.3409, "lr": 3.2348429051199424e-05, "epoch": 2.5172413793103448, "percentage": 35.96, "elapsed_time": "1:25:25", "remaining_time": "2:32:08"}
{"current_steps": 1465, "total_steps": 4060, "loss": 0.3638, "lr": 3.228068300340142e-05, "epoch": 2.5258620689655173, "percentage": 36.08, "elapsed_time": "1:25:41", "remaining_time": "2:31:48"}
{"current_steps": 1470, "total_steps": 4060, "loss": 0.3229, "lr": 3.221271000856462e-05, "epoch": 2.5344827586206895, "percentage": 36.21, "elapsed_time": "1:25:58", "remaining_time": "2:31:28"}
{"current_steps": 1475, "total_steps": 4060, "loss": 0.3808, "lr": 3.214451132283006e-05, "epoch": 2.543103448275862, "percentage": 36.33, "elapsed_time": "1:26:19", "remaining_time": "2:31:16"}
{"current_steps": 1480, "total_steps": 4060, "loss": 0.3579, "lr": 3.207608820650955e-05, "epoch": 2.5517241379310347, "percentage": 36.45, "elapsed_time": "1:26:38", "remaining_time": "2:31:02"}
{"current_steps": 1485, "total_steps": 4060, "loss": 0.3216, "lr": 3.2007441924062374e-05, "epoch": 2.560344827586207, "percentage": 36.58, "elapsed_time": "1:26:59", "remaining_time": "2:30:50"}
{"current_steps": 1490, "total_steps": 4060, "loss": 0.3498, "lr": 3.193857374407192e-05, "epoch": 2.5689655172413794, "percentage": 36.7, "elapsed_time": "1:27:11", "remaining_time": "2:30:22"}
{"current_steps": 1495, "total_steps": 4060, "loss": 0.3421, "lr": 3.186948493922225e-05, "epoch": 2.5775862068965516, "percentage": 36.82, "elapsed_time": "1:27:27", "remaining_time": "2:30:04"}
{"current_steps": 1500, "total_steps": 4060, "loss": 0.3531, "lr": 3.180017678627458e-05, "epoch": 2.586206896551724, "percentage": 36.95, "elapsed_time": "1:27:40", "remaining_time": "2:29:37"}
{"current_steps": 1505, "total_steps": 4060, "loss": 0.3533, "lr": 3.173065056604366e-05, "epoch": 2.594827586206897, "percentage": 37.07, "elapsed_time": "1:28:07", "remaining_time": "2:29:36"}
{"current_steps": 1510, "total_steps": 4060, "loss": 0.3767, "lr": 3.166090756337415e-05, "epoch": 2.603448275862069, "percentage": 37.19, "elapsed_time": "1:28:22", "remaining_time": "2:29:15"}
{"current_steps": 1515, "total_steps": 4060, "loss": 0.3825, "lr": 3.159094906711683e-05, "epoch": 2.612068965517241, "percentage": 37.32, "elapsed_time": "1:28:35", "remaining_time": "2:28:48"}
{"current_steps": 1520, "total_steps": 4060, "loss": 0.3609, "lr": 3.15207763701048e-05, "epoch": 2.6206896551724137, "percentage": 37.44, "elapsed_time": "1:28:51", "remaining_time": "2:28:29"}
{"current_steps": 1525, "total_steps": 4060, "loss": 0.3642, "lr": 3.14503907691296e-05, "epoch": 2.6293103448275863, "percentage": 37.56, "elapsed_time": "1:29:07", "remaining_time": "2:28:09"}
{"current_steps": 1530, "total_steps": 4060, "loss": 0.3431, "lr": 3.1379793564917235e-05, "epoch": 2.637931034482759, "percentage": 37.68, "elapsed_time": "1:29:23", "remaining_time": "2:27:48"}
{"current_steps": 1535, "total_steps": 4060, "loss": 0.3044, "lr": 3.130898606210414e-05, "epoch": 2.646551724137931, "percentage": 37.81, "elapsed_time": "1:29:39", "remaining_time": "2:27:29"}
{"current_steps": 1540, "total_steps": 4060, "loss": 0.3493, "lr": 3.1237969569213056e-05, "epoch": 2.655172413793103, "percentage": 37.93, "elapsed_time": "1:29:53", "remaining_time": "2:27:05"}
{"current_steps": 1545, "total_steps": 4060, "loss": 0.3395, "lr": 3.1166745398628874e-05, "epoch": 2.663793103448276, "percentage": 38.05, "elapsed_time": "1:30:06", "remaining_time": "2:26:41"}
{"current_steps": 1550, "total_steps": 4060, "loss": 0.3165, "lr": 3.109531486657437e-05, "epoch": 2.6724137931034484, "percentage": 38.18, "elapsed_time": "1:30:21", "remaining_time": "2:26:18"}
{"current_steps": 1555, "total_steps": 4060, "loss": 0.3994, "lr": 3.102367929308586e-05, "epoch": 2.6810344827586206, "percentage": 38.3, "elapsed_time": "1:30:44", "remaining_time": "2:26:10"}
{"current_steps": 1560, "total_steps": 4060, "loss": 0.3133, "lr": 3.0951840001988854e-05, "epoch": 2.689655172413793, "percentage": 38.42, "elapsed_time": "1:31:00", "remaining_time": "2:25:51"}
{"current_steps": 1565, "total_steps": 4060, "loss": 0.4048, "lr": 3.0879798320873546e-05, "epoch": 2.6982758620689653, "percentage": 38.55, "elapsed_time": "1:31:30", "remaining_time": "2:25:53"}
{"current_steps": 1570, "total_steps": 4060, "loss": 0.304, "lr": 3.0807555581070304e-05, "epoch": 2.706896551724138, "percentage": 38.67, "elapsed_time": "1:31:49", "remaining_time": "2:25:37"}
{"current_steps": 1575, "total_steps": 4060, "loss": 0.3513, "lr": 3.0735113117625045e-05, "epoch": 2.7155172413793105, "percentage": 38.79, "elapsed_time": "1:32:13", "remaining_time": "2:25:31"}
{"current_steps": 1580, "total_steps": 4060, "loss": 0.3359, "lr": 3.0662472269274617e-05, "epoch": 2.7241379310344827, "percentage": 38.92, "elapsed_time": "1:32:27", "remaining_time": "2:25:07"}
{"current_steps": 1585, "total_steps": 4060, "loss": 0.3351, "lr": 3.058963437842198e-05, "epoch": 2.7327586206896552, "percentage": 39.04, "elapsed_time": "1:32:44", "remaining_time": "2:24:48"}
{"current_steps": 1590, "total_steps": 4060, "loss": 0.3303, "lr": 3.0516600791111465e-05, "epoch": 2.7413793103448274, "percentage": 39.16, "elapsed_time": "1:33:05", "remaining_time": "2:24:36"}
{"current_steps": 1595, "total_steps": 4060, "loss": 0.3378, "lr": 3.0443372857003857e-05, "epoch": 2.75, "percentage": 39.29, "elapsed_time": "1:33:20", "remaining_time": "2:24:15"}
{"current_steps": 1600, "total_steps": 4060, "loss": 0.3478, "lr": 3.036995192935149e-05, "epoch": 2.7586206896551726, "percentage": 39.41, "elapsed_time": "1:33:33", "remaining_time": "2:23:51"}
{"current_steps": 1605, "total_steps": 4060, "loss": 0.4064, "lr": 3.029633936497321e-05, "epoch": 2.7672413793103448, "percentage": 39.53, "elapsed_time": "1:33:47", "remaining_time": "2:23:28"}
{"current_steps": 1610, "total_steps": 4060, "loss": 0.3488, "lr": 3.0222536524229293e-05, "epoch": 2.7758620689655173, "percentage": 39.66, "elapsed_time": "1:34:09", "remaining_time": "2:23:17"}
{"current_steps": 1615, "total_steps": 4060, "loss": 0.3156, "lr": 3.0148544770996343e-05, "epoch": 2.7844827586206895, "percentage": 39.78, "elapsed_time": "1:34:32", "remaining_time": "2:23:08"}
{"current_steps": 1620, "total_steps": 4060, "loss": 0.3785, "lr": 3.007436547264207e-05, "epoch": 2.793103448275862, "percentage": 39.9, "elapsed_time": "1:34:49", "remaining_time": "2:22:49"}
{"current_steps": 1625, "total_steps": 4060, "loss": 0.3625, "lr": 3.0000000000000004e-05, "epoch": 2.8017241379310347, "percentage": 40.02, "elapsed_time": "1:35:06", "remaining_time": "2:22:30"}
{"current_steps": 1630, "total_steps": 4060, "loss": 0.3809, "lr": 2.9925449727344184e-05, "epoch": 2.810344827586207, "percentage": 40.15, "elapsed_time": "1:35:31", "remaining_time": "2:22:25"}
{"current_steps": 1635, "total_steps": 4060, "loss": 0.3375, "lr": 2.985071603236374e-05, "epoch": 2.8189655172413794, "percentage": 40.27, "elapsed_time": "1:35:47", "remaining_time": "2:22:04"}
{"current_steps": 1640, "total_steps": 4060, "loss": 0.3783, "lr": 2.9775800296137474e-05, "epoch": 2.8275862068965516, "percentage": 40.39, "elapsed_time": "1:36:05", "remaining_time": "2:21:47"}
{"current_steps": 1645, "total_steps": 4060, "loss": 0.3573, "lr": 2.970070390310828e-05, "epoch": 2.836206896551724, "percentage": 40.52, "elapsed_time": "1:36:19", "remaining_time": "2:21:24"}
{"current_steps": 1650, "total_steps": 4060, "loss": 0.3168, "lr": 2.962542824105762e-05, "epoch": 2.844827586206897, "percentage": 40.64, "elapsed_time": "1:36:36", "remaining_time": "2:21:06"}
{"current_steps": 1655, "total_steps": 4060, "loss": 0.3008, "lr": 2.954997470107982e-05, "epoch": 2.853448275862069, "percentage": 40.76, "elapsed_time": "1:36:55", "remaining_time": "2:20:50"}
{"current_steps": 1660, "total_steps": 4060, "loss": 0.3209, "lr": 2.947434467755641e-05, "epoch": 2.862068965517241, "percentage": 40.89, "elapsed_time": "1:37:11", "remaining_time": "2:20:30"}
{"current_steps": 1665, "total_steps": 4060, "loss": 0.3683, "lr": 2.9398539568130327e-05, "epoch": 2.8706896551724137, "percentage": 41.01, "elapsed_time": "1:37:32", "remaining_time": "2:20:18"}
{"current_steps": 1670, "total_steps": 4060, "loss": 0.3384, "lr": 2.9322560773680087e-05, "epoch": 2.8793103448275863, "percentage": 41.13, "elapsed_time": "1:37:45", "remaining_time": "2:19:54"}
{"current_steps": 1675, "total_steps": 4060, "loss": 0.2976, "lr": 2.924640969829393e-05, "epoch": 2.887931034482759, "percentage": 41.26, "elapsed_time": "1:38:03", "remaining_time": "2:19:37"}
{"current_steps": 1680, "total_steps": 4060, "loss": 0.3428, "lr": 2.9170087749243832e-05, "epoch": 2.896551724137931, "percentage": 41.38, "elapsed_time": "1:38:18", "remaining_time": "2:19:15"}
{"current_steps": 1685, "total_steps": 4060, "loss": 0.3961, "lr": 2.9093596336959513e-05, "epoch": 2.905172413793103, "percentage": 41.5, "elapsed_time": "1:38:35", "remaining_time": "2:18:57"}
{"current_steps": 1690, "total_steps": 4060, "loss": 0.3575, "lr": 2.9016936875002377e-05, "epoch": 2.913793103448276, "percentage": 41.63, "elapsed_time": "1:38:57", "remaining_time": "2:18:45"}
{"current_steps": 1695, "total_steps": 4060, "loss": 0.3016, "lr": 2.8940110780039385e-05, "epoch": 2.9224137931034484, "percentage": 41.75, "elapsed_time": "1:39:12", "remaining_time": "2:18:25"}
{"current_steps": 1700, "total_steps": 4060, "loss": 0.3878, "lr": 2.8863119471816878e-05, "epoch": 2.9310344827586206, "percentage": 41.87, "elapsed_time": "1:39:24", "remaining_time": "2:18:00"}
{"current_steps": 1705, "total_steps": 4060, "loss": 0.3377, "lr": 2.878596437313434e-05, "epoch": 2.939655172413793, "percentage": 42.0, "elapsed_time": "1:39:43", "remaining_time": "2:17:44"}
{"current_steps": 1710, "total_steps": 4060, "loss": 0.3652, "lr": 2.87086469098181e-05, "epoch": 2.9482758620689653, "percentage": 42.12, "elapsed_time": "1:40:01", "remaining_time": "2:17:28"}
{"current_steps": 1715, "total_steps": 4060, "loss": 0.3815, "lr": 2.863116851069499e-05, "epoch": 2.956896551724138, "percentage": 42.24, "elapsed_time": "1:40:17", "remaining_time": "2:17:07"}
{"current_steps": 1720, "total_steps": 4060, "loss": 0.3345, "lr": 2.855353060756593e-05, "epoch": 2.9655172413793105, "percentage": 42.36, "elapsed_time": "1:40:36", "remaining_time": "2:16:51"}
{"current_steps": 1725, "total_steps": 4060, "loss": 0.3766, "lr": 2.8475734635179472e-05, "epoch": 2.9741379310344827, "percentage": 42.49, "elapsed_time": "1:40:49", "remaining_time": "2:16:29"}
{"current_steps": 1730, "total_steps": 4060, "loss": 0.3333, "lr": 2.8397782031205295e-05, "epoch": 2.9827586206896552, "percentage": 42.61, "elapsed_time": "1:41:08", "remaining_time": "2:16:13"}
{"current_steps": 1735, "total_steps": 4060, "loss": 0.3373, "lr": 2.8319674236207634e-05, "epoch": 2.9913793103448274, "percentage": 42.73, "elapsed_time": "1:41:24", "remaining_time": "2:15:53"}
{"current_steps": 1740, "total_steps": 4060, "loss": 0.374, "lr": 2.8241412693618638e-05, "epoch": 3.0, "percentage": 42.86, "elapsed_time": "1:41:37", "remaining_time": "2:15:29"}
{"current_steps": 1745, "total_steps": 4060, "loss": 0.3105, "lr": 2.816299884971173e-05, "epoch": 3.0086206896551726, "percentage": 42.98, "elapsed_time": "1:41:52", "remaining_time": "2:15:09"}
{"current_steps": 1750, "total_steps": 4060, "loss": 0.312, "lr": 2.8084434153574847e-05, "epoch": 3.0172413793103448, "percentage": 43.1, "elapsed_time": "1:42:04", "remaining_time": "2:14:44"}
{"current_steps": 1755, "total_steps": 4060, "loss": 0.319, "lr": 2.8005720057083685e-05, "epoch": 3.0258620689655173, "percentage": 43.23, "elapsed_time": "1:42:21", "remaining_time": "2:14:25"}
{"current_steps": 1760, "total_steps": 4060, "loss": 0.2936, "lr": 2.792685801487486e-05, "epoch": 3.0344827586206895, "percentage": 43.35, "elapsed_time": "1:42:41", "remaining_time": "2:14:12"}
{"current_steps": 1765, "total_steps": 4060, "loss": 0.3171, "lr": 2.7847849484319008e-05, "epoch": 3.043103448275862, "percentage": 43.47, "elapsed_time": "1:42:52", "remaining_time": "2:13:46"}
{"current_steps": 1770, "total_steps": 4060, "loss": 0.2905, "lr": 2.7768695925493897e-05, "epoch": 3.0517241379310347, "percentage": 43.6, "elapsed_time": "1:43:03", "remaining_time": "2:13:20"}
{"current_steps": 1775, "total_steps": 4060, "loss": 0.3208, "lr": 2.7689398801157393e-05, "epoch": 3.060344827586207, "percentage": 43.72, "elapsed_time": "1:43:23", "remaining_time": "2:13:05"}
{"current_steps": 1780, "total_steps": 4060, "loss": 0.2768, "lr": 2.7609959576720467e-05, "epoch": 3.0689655172413794, "percentage": 43.84, "elapsed_time": "1:43:46", "remaining_time": "2:12:55"}
{"current_steps": 1785, "total_steps": 4060, "loss": 0.3087, "lr": 2.7530379720220096e-05, "epoch": 3.0775862068965516, "percentage": 43.97, "elapsed_time": "1:44:04", "remaining_time": "2:12:38"}
{"current_steps": 1790, "total_steps": 4060, "loss": 0.3371, "lr": 2.7450660702292132e-05, "epoch": 3.086206896551724, "percentage": 44.09, "elapsed_time": "1:44:20", "remaining_time": "2:12:19"}
{"current_steps": 1795, "total_steps": 4060, "loss": 0.3354, "lr": 2.7370803996144143e-05, "epoch": 3.0948275862068964, "percentage": 44.21, "elapsed_time": "1:44:37", "remaining_time": "2:12:01"}
{"current_steps": 1800, "total_steps": 4060, "loss": 0.3155, "lr": 2.7290811077528166e-05, "epoch": 3.103448275862069, "percentage": 44.33, "elapsed_time": "1:44:51", "remaining_time": "2:11:38"}
{"current_steps": 1805, "total_steps": 4060, "loss": 0.2928, "lr": 2.7210683424713447e-05, "epoch": 3.1120689655172415, "percentage": 44.46, "elapsed_time": "1:45:05", "remaining_time": "2:11:17"}
{"current_steps": 1810, "total_steps": 4060, "loss": 0.2923, "lr": 2.7130422518459113e-05, "epoch": 3.1206896551724137, "percentage": 44.58, "elapsed_time": "1:45:24", "remaining_time": "2:11:02"}
{"current_steps": 1815, "total_steps": 4060, "loss": 0.2816, "lr": 2.705002984198684e-05, "epoch": 3.1293103448275863, "percentage": 44.7, "elapsed_time": "1:45:38", "remaining_time": "2:10:40"}
{"current_steps": 1820, "total_steps": 4060, "loss": 0.3179, "lr": 2.6969506880953384e-05, "epoch": 3.1379310344827585, "percentage": 44.83, "elapsed_time": "1:45:58", "remaining_time": "2:10:25"}
{"current_steps": 1825, "total_steps": 4060, "loss": 0.2852, "lr": 2.688885512342318e-05, "epoch": 3.146551724137931, "percentage": 44.95, "elapsed_time": "1:46:12", "remaining_time": "2:10:04"}
{"current_steps": 1830, "total_steps": 4060, "loss": 0.2745, "lr": 2.680807605984082e-05, "epoch": 3.1551724137931036, "percentage": 45.07, "elapsed_time": "1:46:29", "remaining_time": "2:09:45"}
{"current_steps": 1835, "total_steps": 4060, "loss": 0.3037, "lr": 2.6727171183003502e-05, "epoch": 3.163793103448276, "percentage": 45.2, "elapsed_time": "1:46:41", "remaining_time": "2:09:21"}
{"current_steps": 1840, "total_steps": 4060, "loss": 0.3106, "lr": 2.6646141988033475e-05, "epoch": 3.1724137931034484, "percentage": 45.32, "elapsed_time": "1:46:58", "remaining_time": "2:09:04"}
{"current_steps": 1845, "total_steps": 4060, "loss": 0.2762, "lr": 2.6564989972350364e-05, "epoch": 3.1810344827586206, "percentage": 45.44, "elapsed_time": "1:47:14", "remaining_time": "2:08:44"}
{"current_steps": 1850, "total_steps": 4060, "loss": 0.3212, "lr": 2.6483716635643535e-05, "epoch": 3.189655172413793, "percentage": 45.57, "elapsed_time": "1:47:35", "remaining_time": "2:08:32"}
{"current_steps": 1855, "total_steps": 4060, "loss": 0.3001, "lr": 2.6402323479844364e-05, "epoch": 3.1982758620689653, "percentage": 45.69, "elapsed_time": "1:47:52", "remaining_time": "2:08:13"}
{"current_steps": 1860, "total_steps": 4060, "loss": 0.284, "lr": 2.6320812009098472e-05, "epoch": 3.206896551724138, "percentage": 45.81, "elapsed_time": "1:48:05", "remaining_time": "2:07:50"}
{"current_steps": 1865, "total_steps": 4060, "loss": 0.37, "lr": 2.6239183729737957e-05, "epoch": 3.2155172413793105, "percentage": 45.94, "elapsed_time": "1:48:31", "remaining_time": "2:07:43"}
{"current_steps": 1870, "total_steps": 4060, "loss": 0.3062, "lr": 2.6157440150253535e-05, "epoch": 3.2241379310344827, "percentage": 46.06, "elapsed_time": "1:48:43", "remaining_time": "2:07:19"}
{"current_steps": 1875, "total_steps": 4060, "loss": 0.2597, "lr": 2.6075582781266665e-05, "epoch": 3.2327586206896552, "percentage": 46.18, "elapsed_time": "1:49:03", "remaining_time": "2:07:05"}
{"current_steps": 1880, "total_steps": 4060, "loss": 0.2981, "lr": 2.5993613135501643e-05, "epoch": 3.2413793103448274, "percentage": 46.31, "elapsed_time": "1:49:20", "remaining_time": "2:06:47"}
{"current_steps": 1885, "total_steps": 4060, "loss": 0.3171, "lr": 2.5911532727757625e-05, "epoch": 3.25, "percentage": 46.43, "elapsed_time": "1:49:40", "remaining_time": "2:06:32"}
{"current_steps": 1890, "total_steps": 4060, "loss": 0.2704, "lr": 2.582934307488067e-05, "epoch": 3.2586206896551726, "percentage": 46.55, "elapsed_time": "1:49:57", "remaining_time": "2:06:14"}
{"current_steps": 1895, "total_steps": 4060, "loss": 0.2713, "lr": 2.5747045695735674e-05, "epoch": 3.2672413793103448, "percentage": 46.67, "elapsed_time": "1:50:20", "remaining_time": "2:06:03"}
{"current_steps": 1900, "total_steps": 4060, "loss": 0.2758, "lr": 2.5664642111178312e-05, "epoch": 3.2758620689655173, "percentage": 46.8, "elapsed_time": "1:50:39", "remaining_time": "2:05:48"}
{"current_steps": 1905, "total_steps": 4060, "loss": 0.2886, "lr": 2.5582133844026943e-05, "epoch": 3.2844827586206895, "percentage": 46.92, "elapsed_time": "1:50:53", "remaining_time": "2:05:26"}
{"current_steps": 1910, "total_steps": 4060, "loss": 0.2619, "lr": 2.5499522419034462e-05, "epoch": 3.293103448275862, "percentage": 47.04, "elapsed_time": "1:51:11", "remaining_time": "2:05:10"}
{"current_steps": 1915, "total_steps": 4060, "loss": 0.3082, "lr": 2.5416809362860107e-05, "epoch": 3.3017241379310347, "percentage": 47.17, "elapsed_time": "1:51:26", "remaining_time": "2:04:49"}
{"current_steps": 1920, "total_steps": 4060, "loss": 0.3104, "lr": 2.5333996204041276e-05, "epoch": 3.310344827586207, "percentage": 47.29, "elapsed_time": "1:51:41", "remaining_time": "2:04:29"}
{"current_steps": 1925, "total_steps": 4060, "loss": 0.2787, "lr": 2.5251084472965257e-05, "epoch": 3.3189655172413794, "percentage": 47.41, "elapsed_time": "1:51:56", "remaining_time": "2:04:09"}
{"current_steps": 1930, "total_steps": 4060, "loss": 0.2821, "lr": 2.5168075701840948e-05, "epoch": 3.3275862068965516, "percentage": 47.54, "elapsed_time": "1:52:13", "remaining_time": "2:03:51"}
{"current_steps": 1935, "total_steps": 4060, "loss": 0.2936, "lr": 2.5084971424670568e-05, "epoch": 3.336206896551724, "percentage": 47.66, "elapsed_time": "1:52:38", "remaining_time": "2:03:42"}
{"current_steps": 1940, "total_steps": 4060, "loss": 0.2938, "lr": 2.500177317722126e-05, "epoch": 3.344827586206897, "percentage": 47.78, "elapsed_time": "1:52:51", "remaining_time": "2:03:20"}
{"current_steps": 1945, "total_steps": 4060, "loss": 0.2802, "lr": 2.4918482496996757e-05, "epoch": 3.353448275862069, "percentage": 47.91, "elapsed_time": "1:53:07", "remaining_time": "2:03:00"}
{"current_steps": 1950, "total_steps": 4060, "loss": 0.3046, "lr": 2.483510092320895e-05, "epoch": 3.3620689655172415, "percentage": 48.03, "elapsed_time": "1:53:27", "remaining_time": "2:02:45"}
{"current_steps": 1955, "total_steps": 4060, "loss": 0.3201, "lr": 2.4751629996749427e-05, "epoch": 3.3706896551724137, "percentage": 48.15, "elapsed_time": "1:53:48", "remaining_time": "2:02:31"}
{"current_steps": 1960, "total_steps": 4060, "loss": 0.3211, "lr": 2.4668071260161022e-05, "epoch": 3.3793103448275863, "percentage": 48.28, "elapsed_time": "1:54:09", "remaining_time": "2:02:18"}
{"current_steps": 1965, "total_steps": 4060, "loss": 0.2863, "lr": 2.4584426257609315e-05, "epoch": 3.3879310344827585, "percentage": 48.4, "elapsed_time": "1:54:23", "remaining_time": "2:01:57"}
{"current_steps": 1970, "total_steps": 4060, "loss": 0.3134, "lr": 2.4500696534854062e-05, "epoch": 3.396551724137931, "percentage": 48.52, "elapsed_time": "1:54:36", "remaining_time": "2:01:35"}
{"current_steps": 1975, "total_steps": 4060, "loss": 0.3325, "lr": 2.4416883639220647e-05, "epoch": 3.405172413793103, "percentage": 48.65, "elapsed_time": "1:54:56", "remaining_time": "2:01:20"}
{"current_steps": 1980, "total_steps": 4060, "loss": 0.3449, "lr": 2.4332989119571506e-05, "epoch": 3.413793103448276, "percentage": 48.77, "elapsed_time": "1:55:08", "remaining_time": "2:00:57"}
{"current_steps": 1985, "total_steps": 4060, "loss": 0.3181, "lr": 2.4249014526277473e-05, "epoch": 3.4224137931034484, "percentage": 48.89, "elapsed_time": "1:55:23", "remaining_time": "2:00:37"}
{"current_steps": 1990, "total_steps": 4060, "loss": 0.2906, "lr": 2.416496141118915e-05, "epoch": 3.4310344827586206, "percentage": 49.01, "elapsed_time": "1:55:35", "remaining_time": "2:00:14"}
{"current_steps": 1995, "total_steps": 4060, "loss": 0.2867, "lr": 2.4080831327608224e-05, "epoch": 3.439655172413793, "percentage": 49.14, "elapsed_time": "1:55:52", "remaining_time": "1:59:56"}
{"current_steps": 2000, "total_steps": 4060, "loss": 0.3165, "lr": 2.3996625830258742e-05, "epoch": 3.4482758620689653, "percentage": 49.26, "elapsed_time": "1:56:16", "remaining_time": "1:59:46"}
{"current_steps": 2005, "total_steps": 4060, "loss": 0.2961, "lr": 2.3912346475258424e-05, "epoch": 3.456896551724138, "percentage": 49.38, "elapsed_time": "1:56:32", "remaining_time": "1:59:26"}
{"current_steps": 2010, "total_steps": 4060, "loss": 0.3004, "lr": 2.3827994820089856e-05, "epoch": 3.4655172413793105, "percentage": 49.51, "elapsed_time": "1:56:51", "remaining_time": "1:59:10"}
{"current_steps": 2015, "total_steps": 4060, "loss": 0.2986, "lr": 2.3743572423571752e-05, "epoch": 3.4741379310344827, "percentage": 49.63, "elapsed_time": "1:57:08", "remaining_time": "1:58:53"}
{"current_steps": 2020, "total_steps": 4060, "loss": 0.2833, "lr": 2.365908084583011e-05, "epoch": 3.4827586206896552, "percentage": 49.75, "elapsed_time": "1:57:23", "remaining_time": "1:58:33"}
{"current_steps": 2025, "total_steps": 4060, "loss": 0.3176, "lr": 2.3574521648269406e-05, "epoch": 3.4913793103448274, "percentage": 49.88, "elapsed_time": "1:57:38", "remaining_time": "1:58:13"}
{"current_steps": 2030, "total_steps": 4060, "loss": 0.3221, "lr": 2.3489896393543717e-05, "epoch": 3.5, "percentage": 50.0, "elapsed_time": "1:57:54", "remaining_time": "1:57:54"}
{"current_steps": 2035, "total_steps": 4060, "loss": 0.2995, "lr": 2.340520664552788e-05, "epoch": 3.5086206896551726, "percentage": 50.12, "elapsed_time": "1:58:10", "remaining_time": "1:57:35"}
{"current_steps": 2040, "total_steps": 4060, "loss": 0.3381, "lr": 2.3320453969288553e-05, "epoch": 3.5172413793103448, "percentage": 50.25, "elapsed_time": "1:58:30", "remaining_time": "1:57:20"}
{"current_steps": 2045, "total_steps": 4060, "loss": 0.2943, "lr": 2.32356399310553e-05, "epoch": 3.5258620689655173, "percentage": 50.37, "elapsed_time": "1:58:43", "remaining_time": "1:56:59"}
{"current_steps": 2050, "total_steps": 4060, "loss": 0.2877, "lr": 2.3150766098191667e-05, "epoch": 3.5344827586206895, "percentage": 50.49, "elapsed_time": "1:59:00", "remaining_time": "1:56:40"}
{"current_steps": 2055, "total_steps": 4060, "loss": 0.2921, "lr": 2.3065834039166212e-05, "epoch": 3.543103448275862, "percentage": 50.62, "elapsed_time": "1:59:19", "remaining_time": "1:56:25"}
{"current_steps": 2060, "total_steps": 4060, "loss": 0.2969, "lr": 2.2980845323523487e-05, "epoch": 3.5517241379310347, "percentage": 50.74, "elapsed_time": "1:59:34", "remaining_time": "1:56:05"}
{"current_steps": 2065, "total_steps": 4060, "loss": 0.2854, "lr": 2.2895801521855096e-05, "epoch": 3.560344827586207, "percentage": 50.86, "elapsed_time": "1:59:51", "remaining_time": "1:55:47"}
{"current_steps": 2070, "total_steps": 4060, "loss": 0.2596, "lr": 2.2810704205770587e-05, "epoch": 3.5689655172413794, "percentage": 50.99, "elapsed_time": "2:00:14", "remaining_time": "1:55:35"}
{"current_steps": 2075, "total_steps": 4060, "loss": 0.2835, "lr": 2.2725554947868495e-05, "epoch": 3.5775862068965516, "percentage": 51.11, "elapsed_time": "2:00:24", "remaining_time": "1:55:11"}
{"current_steps": 2080, "total_steps": 4060, "loss": 0.2776, "lr": 2.2640355321707218e-05, "epoch": 3.586206896551724, "percentage": 51.23, "elapsed_time": "2:00:41", "remaining_time": "1:54:52"}
{"current_steps": 2085, "total_steps": 4060, "loss": 0.3753, "lr": 2.2555106901775955e-05, "epoch": 3.594827586206897, "percentage": 51.35, "elapsed_time": "2:00:53", "remaining_time": "1:54:30"}
{"current_steps": 2090, "total_steps": 4060, "loss": 0.3496, "lr": 2.246981126346564e-05, "epoch": 3.603448275862069, "percentage": 51.48, "elapsed_time": "2:01:04", "remaining_time": "1:54:07"}
{"current_steps": 2095, "total_steps": 4060, "loss": 0.3238, "lr": 2.238446998303977e-05, "epoch": 3.612068965517241, "percentage": 51.6, "elapsed_time": "2:01:19", "remaining_time": "1:53:47"}
{"current_steps": 2100, "total_steps": 4060, "loss": 0.2631, "lr": 2.2299084637605343e-05, "epoch": 3.6206896551724137, "percentage": 51.72, "elapsed_time": "2:01:35", "remaining_time": "1:53:28"}
{"current_steps": 2105, "total_steps": 4060, "loss": 0.2621, "lr": 2.221365680508364e-05, "epoch": 3.6293103448275863, "percentage": 51.85, "elapsed_time": "2:01:55", "remaining_time": "1:53:14"}
{"current_steps": 2110, "total_steps": 4060, "loss": 0.3036, "lr": 2.2128188064181143e-05, "epoch": 3.637931034482759, "percentage": 51.97, "elapsed_time": "2:02:12", "remaining_time": "1:52:56"}
{"current_steps": 2115, "total_steps": 4060, "loss": 0.3115, "lr": 2.2042679994360296e-05, "epoch": 3.646551724137931, "percentage": 52.09, "elapsed_time": "2:02:28", "remaining_time": "1:52:38"}
{"current_steps": 2120, "total_steps": 4060, "loss": 0.2941, "lr": 2.195713417581033e-05, "epoch": 3.655172413793103, "percentage": 52.22, "elapsed_time": "2:02:44", "remaining_time": "1:52:19"}
{"current_steps": 2125, "total_steps": 4060, "loss": 0.2995, "lr": 2.1871552189418113e-05, "epoch": 3.663793103448276, "percentage": 52.34, "elapsed_time": "2:03:10", "remaining_time": "1:52:10"}
{"current_steps": 2130, "total_steps": 4060, "loss": 0.3911, "lr": 2.1785935616738855e-05, "epoch": 3.6724137931034484, "percentage": 52.46, "elapsed_time": "2:03:30", "remaining_time": "1:51:54"}
{"current_steps": 2135, "total_steps": 4060, "loss": 0.3385, "lr": 2.170028603996695e-05, "epoch": 3.6810344827586206, "percentage": 52.59, "elapsed_time": "2:03:53", "remaining_time": "1:51:42"}
{"current_steps": 2140, "total_steps": 4060, "loss": 0.3331, "lr": 2.161460504190668e-05, "epoch": 3.689655172413793, "percentage": 52.71, "elapsed_time": "2:04:11", "remaining_time": "1:51:25"}
{"current_steps": 2145, "total_steps": 4060, "loss": 0.3372, "lr": 2.1528894205943017e-05, "epoch": 3.6982758620689653, "percentage": 52.83, "elapsed_time": "2:04:29", "remaining_time": "1:51:08"}
{"current_steps": 2150, "total_steps": 4060, "loss": 0.3117, "lr": 2.1443155116012328e-05, "epoch": 3.706896551724138, "percentage": 52.96, "elapsed_time": "2:04:46", "remaining_time": "1:50:50"}
{"current_steps": 2155, "total_steps": 4060, "loss": 0.286, "lr": 2.1357389356573098e-05, "epoch": 3.7155172413793105, "percentage": 53.08, "elapsed_time": "2:05:03", "remaining_time": "1:50:32"}
{"current_steps": 2160, "total_steps": 4060, "loss": 0.3153, "lr": 2.1271598512576705e-05, "epoch": 3.7241379310344827, "percentage": 53.2, "elapsed_time": "2:05:19", "remaining_time": "1:50:14"}
{"current_steps": 2165, "total_steps": 4060, "loss": 0.3009, "lr": 2.1185784169438047e-05, "epoch": 3.7327586206896552, "percentage": 53.33, "elapsed_time": "2:05:36", "remaining_time": "1:49:56"}
{"current_steps": 2170, "total_steps": 4060, "loss": 0.3166, "lr": 2.1099947913006303e-05, "epoch": 3.7413793103448274, "percentage": 53.45, "elapsed_time": "2:05:54", "remaining_time": "1:49:39"}
{"current_steps": 2175, "total_steps": 4060, "loss": 0.2763, "lr": 2.1014091329535618e-05, "epoch": 3.75, "percentage": 53.57, "elapsed_time": "2:06:10", "remaining_time": "1:49:20"}
{"current_steps": 2180, "total_steps": 4060, "loss": 0.2965, "lr": 2.0928216005655762e-05, "epoch": 3.7586206896551726, "percentage": 53.69, "elapsed_time": "2:06:30", "remaining_time": "1:49:05"}
{"current_steps": 2185, "total_steps": 4060, "loss": 0.3292, "lr": 2.084232352834285e-05, "epoch": 3.7672413793103448, "percentage": 53.82, "elapsed_time": "2:06:48", "remaining_time": "1:48:48"}
{"current_steps": 2190, "total_steps": 4060, "loss": 0.291, "lr": 2.0756415484889975e-05, "epoch": 3.7758620689655173, "percentage": 53.94, "elapsed_time": "2:07:10", "remaining_time": "1:48:35"}
{"current_steps": 2195, "total_steps": 4060, "loss": 0.3169, "lr": 2.0670493462877897e-05, "epoch": 3.7844827586206895, "percentage": 54.06, "elapsed_time": "2:07:22", "remaining_time": "1:48:13"}
{"current_steps": 2200, "total_steps": 4060, "loss": 0.329, "lr": 2.0584559050145706e-05, "epoch": 3.793103448275862, "percentage": 54.19, "elapsed_time": "2:07:40", "remaining_time": "1:47:56"}
{"current_steps": 2205, "total_steps": 4060, "loss": 0.3085, "lr": 2.0498613834761462e-05, "epoch": 3.8017241379310347, "percentage": 54.31, "elapsed_time": "2:07:53", "remaining_time": "1:47:35"}
{"current_steps": 2210, "total_steps": 4060, "loss": 0.2971, "lr": 2.0412659404992862e-05, "epoch": 3.810344827586207, "percentage": 54.43, "elapsed_time": "2:08:09", "remaining_time": "1:47:16"}
{"current_steps": 2215, "total_steps": 4060, "loss": 0.2971, "lr": 2.0326697349277893e-05, "epoch": 3.8189655172413794, "percentage": 54.56, "elapsed_time": "2:08:31", "remaining_time": "1:47:03"}
{"current_steps": 2220, "total_steps": 4060, "loss": 0.3069, "lr": 2.024072925619546e-05, "epoch": 3.8275862068965516, "percentage": 54.68, "elapsed_time": "2:08:46", "remaining_time": "1:46:43"}
{"current_steps": 2225, "total_steps": 4060, "loss": 0.3114, "lr": 2.0154756714436043e-05, "epoch": 3.836206896551724, "percentage": 54.8, "elapsed_time": "2:09:07", "remaining_time": "1:46:29"}
{"current_steps": 2230, "total_steps": 4060, "loss": 0.2882, "lr": 2.006878131277233e-05, "epoch": 3.844827586206897, "percentage": 54.93, "elapsed_time": "2:09:20", "remaining_time": "1:46:08"}
{"current_steps": 2235, "total_steps": 4060, "loss": 0.3146, "lr": 1.9982804640029864e-05, "epoch": 3.853448275862069, "percentage": 55.05, "elapsed_time": "2:09:36", "remaining_time": "1:45:49"}
{"current_steps": 2240, "total_steps": 4060, "loss": 0.3303, "lr": 1.989682828505767e-05, "epoch": 3.862068965517241, "percentage": 55.17, "elapsed_time": "2:09:49", "remaining_time": "1:45:29"}
{"current_steps": 2245, "total_steps": 4060, "loss": 0.2725, "lr": 1.9810853836698913e-05, "epoch": 3.8706896551724137, "percentage": 55.3, "elapsed_time": "2:10:02", "remaining_time": "1:45:08"}
{"current_steps": 2250, "total_steps": 4060, "loss": 0.3018, "lr": 1.972488288376151e-05, "epoch": 3.8793103448275863, "percentage": 55.42, "elapsed_time": "2:10:21", "remaining_time": "1:44:52"}
{"current_steps": 2255, "total_steps": 4060, "loss": 0.312, "lr": 1.963891701498879e-05, "epoch": 3.887931034482759, "percentage": 55.54, "elapsed_time": "2:10:32", "remaining_time": "1:44:29"}
{"current_steps": 2260, "total_steps": 4060, "loss": 0.3228, "lr": 1.955295781903014e-05, "epoch": 3.896551724137931, "percentage": 55.67, "elapsed_time": "2:10:45", "remaining_time": "1:44:08"}
{"current_steps": 2265, "total_steps": 4060, "loss": 0.3004, "lr": 1.9467006884411605e-05, "epoch": 3.905172413793103, "percentage": 55.79, "elapsed_time": "2:11:02", "remaining_time": "1:43:51"}
{"current_steps": 2270, "total_steps": 4060, "loss": 0.281, "lr": 1.9381065799506583e-05, "epoch": 3.913793103448276, "percentage": 55.91, "elapsed_time": "2:11:18", "remaining_time": "1:43:32"}
{"current_steps": 2275, "total_steps": 4060, "loss": 0.2768, "lr": 1.929513615250643e-05, "epoch": 3.9224137931034484, "percentage": 56.03, "elapsed_time": "2:11:32", "remaining_time": "1:43:12"}
{"current_steps": 2280, "total_steps": 4060, "loss": 0.2909, "lr": 1.9209219531391155e-05, "epoch": 3.9310344827586206, "percentage": 56.16, "elapsed_time": "2:11:50", "remaining_time": "1:42:56"}
{"current_steps": 2285, "total_steps": 4060, "loss": 0.2966, "lr": 1.9123317523900015e-05, "epoch": 3.939655172413793, "percentage": 56.28, "elapsed_time": "2:12:11", "remaining_time": "1:42:41"}
{"current_steps": 2290, "total_steps": 4060, "loss": 0.3214, "lr": 1.9037431717502253e-05, "epoch": 3.9482758620689653, "percentage": 56.4, "elapsed_time": "2:12:37", "remaining_time": "1:42:30"}
{"current_steps": 2295, "total_steps": 4060, "loss": 0.3097, "lr": 1.8951563699367673e-05, "epoch": 3.956896551724138, "percentage": 56.53, "elapsed_time": "2:12:53", "remaining_time": "1:42:11"}
{"current_steps": 2300, "total_steps": 4060, "loss": 0.3055, "lr": 1.886571505633737e-05, "epoch": 3.9655172413793105, "percentage": 56.65, "elapsed_time": "2:13:06", "remaining_time": "1:41:51"}
{"current_steps": 2305, "total_steps": 4060, "loss": 0.3062, "lr": 1.8779887374894384e-05, "epoch": 3.9741379310344827, "percentage": 56.77, "elapsed_time": "2:13:21", "remaining_time": "1:41:32"}
{"current_steps": 2310, "total_steps": 4060, "loss": 0.3484, "lr": 1.8694082241134385e-05, "epoch": 3.9827586206896552, "percentage": 56.9, "elapsed_time": "2:13:36", "remaining_time": "1:41:13"}
{"current_steps": 2315, "total_steps": 4060, "loss": 0.2905, "lr": 1.8608301240736378e-05, "epoch": 3.9913793103448274, "percentage": 57.02, "elapsed_time": "2:13:50", "remaining_time": "1:40:53"}
{"current_steps": 2320, "total_steps": 4060, "loss": 0.3191, "lr": 1.852254595893335e-05, "epoch": 4.0, "percentage": 57.14, "elapsed_time": "2:14:04", "remaining_time": "1:40:33"}
{"current_steps": 2325, "total_steps": 4060, "loss": 0.2731, "lr": 1.8436817980483035e-05, "epoch": 4.008620689655173, "percentage": 57.27, "elapsed_time": "2:14:19", "remaining_time": "1:40:14"}
{"current_steps": 2330, "total_steps": 4060, "loss": 0.2828, "lr": 1.835111888963859e-05, "epoch": 4.017241379310345, "percentage": 57.39, "elapsed_time": "2:14:32", "remaining_time": "1:39:53"}
{"current_steps": 2335, "total_steps": 4060, "loss": 0.2168, "lr": 1.8265450270119335e-05, "epoch": 4.025862068965517, "percentage": 57.51, "elapsed_time": "2:14:51", "remaining_time": "1:39:37"}
{"current_steps": 2340, "total_steps": 4060, "loss": 0.2683, "lr": 1.8179813705081468e-05, "epoch": 4.0344827586206895, "percentage": 57.64, "elapsed_time": "2:15:08", "remaining_time": "1:39:20"}
{"current_steps": 2345, "total_steps": 4060, "loss": 0.2811, "lr": 1.8094210777088833e-05, "epoch": 4.043103448275862, "percentage": 57.76, "elapsed_time": "2:15:25", "remaining_time": "1:39:02"}
{"current_steps": 2350, "total_steps": 4060, "loss": 0.2341, "lr": 1.800864306808367e-05, "epoch": 4.051724137931035, "percentage": 57.88, "elapsed_time": "2:15:45", "remaining_time": "1:38:47"}
{"current_steps": 2355, "total_steps": 4060, "loss": 0.2411, "lr": 1.7923112159357344e-05, "epoch": 4.060344827586207, "percentage": 58.0, "elapsed_time": "2:16:00", "remaining_time": "1:38:27"}
{"current_steps": 2360, "total_steps": 4060, "loss": 0.2544, "lr": 1.783761963152117e-05, "epoch": 4.068965517241379, "percentage": 58.13, "elapsed_time": "2:16:17", "remaining_time": "1:38:10"}
{"current_steps": 2365, "total_steps": 4060, "loss": 0.2629, "lr": 1.7752167064477173e-05, "epoch": 4.077586206896552, "percentage": 58.25, "elapsed_time": "2:16:31", "remaining_time": "1:37:51"}
{"current_steps": 2370, "total_steps": 4060, "loss": 0.2739, "lr": 1.7666756037388923e-05, "epoch": 4.086206896551724, "percentage": 58.37, "elapsed_time": "2:16:46", "remaining_time": "1:37:31"}
{"current_steps": 2375, "total_steps": 4060, "loss": 0.2972, "lr": 1.7581388128652315e-05, "epoch": 4.094827586206897, "percentage": 58.5, "elapsed_time": "2:17:03", "remaining_time": "1:37:14"}
{"current_steps": 2380, "total_steps": 4060, "loss": 0.2526, "lr": 1.7496064915866414e-05, "epoch": 4.103448275862069, "percentage": 58.62, "elapsed_time": "2:17:18", "remaining_time": "1:36:55"}
{"current_steps": 2385, "total_steps": 4060, "loss": 0.2637, "lr": 1.7410787975804314e-05, "epoch": 4.112068965517241, "percentage": 58.74, "elapsed_time": "2:17:30", "remaining_time": "1:36:34"}
{"current_steps": 2390, "total_steps": 4060, "loss": 0.2801, "lr": 1.732555888438398e-05, "epoch": 4.120689655172414, "percentage": 58.87, "elapsed_time": "2:17:43", "remaining_time": "1:36:14"}
{"current_steps": 2395, "total_steps": 4060, "loss": 0.3059, "lr": 1.7240379216639136e-05, "epoch": 4.129310344827586, "percentage": 58.99, "elapsed_time": "2:17:59", "remaining_time": "1:35:56"}
{"current_steps": 2400, "total_steps": 4060, "loss": 0.2314, "lr": 1.7155250546690173e-05, "epoch": 4.137931034482759, "percentage": 59.11, "elapsed_time": "2:18:14", "remaining_time": "1:35:37"}
{"current_steps": 2405, "total_steps": 4060, "loss": 0.2487, "lr": 1.707017444771502e-05, "epoch": 4.146551724137931, "percentage": 59.24, "elapsed_time": "2:18:27", "remaining_time": "1:35:17"}
{"current_steps": 2410, "total_steps": 4060, "loss": 0.2962, "lr": 1.6985152491920103e-05, "epoch": 4.155172413793103, "percentage": 59.36, "elapsed_time": "2:18:43", "remaining_time": "1:34:58"}
{"current_steps": 2415, "total_steps": 4060, "loss": 0.2365, "lr": 1.690018625051128e-05, "epoch": 4.163793103448276, "percentage": 59.48, "elapsed_time": "2:19:02", "remaining_time": "1:34:42"}
{"current_steps": 2420, "total_steps": 4060, "loss": 0.2694, "lr": 1.681527729366481e-05, "epoch": 4.172413793103448, "percentage": 59.61, "elapsed_time": "2:19:16", "remaining_time": "1:34:23"}
{"current_steps": 2425, "total_steps": 4060, "loss": 0.2529, "lr": 1.673042719049834e-05, "epoch": 4.181034482758621, "percentage": 59.73, "elapsed_time": "2:19:31", "remaining_time": "1:34:04"}
{"current_steps": 2430, "total_steps": 4060, "loss": 0.2526, "lr": 1.664563750904188e-05, "epoch": 4.189655172413793, "percentage": 59.85, "elapsed_time": "2:19:49", "remaining_time": "1:33:47"}
{"current_steps": 2435, "total_steps": 4060, "loss": 0.2801, "lr": 1.656090981620888e-05, "epoch": 4.198275862068965, "percentage": 59.98, "elapsed_time": "2:20:05", "remaining_time": "1:33:29"}
{"current_steps": 2440, "total_steps": 4060, "loss": 0.2471, "lr": 1.64762456777672e-05, "epoch": 4.206896551724138, "percentage": 60.1, "elapsed_time": "2:20:20", "remaining_time": "1:33:10"}
{"current_steps": 2445, "total_steps": 4060, "loss": 0.2776, "lr": 1.6391646658310242e-05, "epoch": 4.2155172413793105, "percentage": 60.22, "elapsed_time": "2:20:34", "remaining_time": "1:32:51"}
{"current_steps": 2450, "total_steps": 4060, "loss": 0.2705, "lr": 1.6307114321227996e-05, "epoch": 4.224137931034483, "percentage": 60.34, "elapsed_time": "2:20:54", "remaining_time": "1:32:35"}
{"current_steps": 2455, "total_steps": 4060, "loss": 0.3203, "lr": 1.622265022867818e-05, "epoch": 4.232758620689655, "percentage": 60.47, "elapsed_time": "2:21:11", "remaining_time": "1:32:18"}
{"current_steps": 2460, "total_steps": 4060, "loss": 0.275, "lr": 1.6138255941557336e-05, "epoch": 4.241379310344827, "percentage": 60.59, "elapsed_time": "2:21:25", "remaining_time": "1:31:59"}
{"current_steps": 2465, "total_steps": 4060, "loss": 0.2459, "lr": 1.6053933019472003e-05, "epoch": 4.25, "percentage": 60.71, "elapsed_time": "2:21:43", "remaining_time": "1:31:42"}
{"current_steps": 2470, "total_steps": 4060, "loss": 0.2323, "lr": 1.5969683020709902e-05, "epoch": 4.258620689655173, "percentage": 60.84, "elapsed_time": "2:22:00", "remaining_time": "1:31:24"}
{"current_steps": 2475, "total_steps": 4060, "loss": 0.242, "lr": 1.5885507502211108e-05, "epoch": 4.267241379310345, "percentage": 60.96, "elapsed_time": "2:22:23", "remaining_time": "1:31:11"}
{"current_steps": 2480, "total_steps": 4060, "loss": 0.254, "lr": 1.5801408019539345e-05, "epoch": 4.275862068965517, "percentage": 61.08, "elapsed_time": "2:22:44", "remaining_time": "1:30:56"}
{"current_steps": 2485, "total_steps": 4060, "loss": 0.2334, "lr": 1.5717386126853156e-05, "epoch": 4.2844827586206895, "percentage": 61.21, "elapsed_time": "2:22:56", "remaining_time": "1:30:35"}
{"current_steps": 2490, "total_steps": 4060, "loss": 0.2797, "lr": 1.5633443376877236e-05, "epoch": 4.293103448275862, "percentage": 61.33, "elapsed_time": "2:23:17", "remaining_time": "1:30:20"}
{"current_steps": 2495, "total_steps": 4060, "loss": 0.2734, "lr": 1.5549581320873715e-05, "epoch": 4.301724137931035, "percentage": 61.45, "elapsed_time": "2:23:32", "remaining_time": "1:30:02"}
{"current_steps": 2500, "total_steps": 4060, "loss": 0.3097, "lr": 1.546580150861351e-05, "epoch": 4.310344827586207, "percentage": 61.58, "elapsed_time": "2:23:45", "remaining_time": "1:29:42"}
{"current_steps": 2505, "total_steps": 4060, "loss": 0.2556, "lr": 1.5382105488347654e-05, "epoch": 4.318965517241379, "percentage": 61.7, "elapsed_time": "2:24:00", "remaining_time": "1:29:23"}
{"current_steps": 2510, "total_steps": 4060, "loss": 0.2651, "lr": 1.5298494806778733e-05, "epoch": 4.327586206896552, "percentage": 61.82, "elapsed_time": "2:24:17", "remaining_time": "1:29:05"}
{"current_steps": 2515, "total_steps": 4060, "loss": 0.2789, "lr": 1.5214971009032251e-05, "epoch": 4.336206896551724, "percentage": 61.95, "elapsed_time": "2:24:40", "remaining_time": "1:28:52"}
{"current_steps": 2520, "total_steps": 4060, "loss": 0.26, "lr": 1.51315356386281e-05, "epoch": 4.344827586206897, "percentage": 62.07, "elapsed_time": "2:24:55", "remaining_time": "1:28:33"}
{"current_steps": 2525, "total_steps": 4060, "loss": 0.2502, "lr": 1.5048190237452052e-05, "epoch": 4.353448275862069, "percentage": 62.19, "elapsed_time": "2:25:06", "remaining_time": "1:28:13"}
{"current_steps": 2530, "total_steps": 4060, "loss": 0.2738, "lr": 1.4964936345727217e-05, "epoch": 4.362068965517241, "percentage": 62.32, "elapsed_time": "2:25:23", "remaining_time": "1:27:55"}
{"current_steps": 2535, "total_steps": 4060, "loss": 0.2738, "lr": 1.4881775501985645e-05, "epoch": 4.370689655172414, "percentage": 62.44, "elapsed_time": "2:25:45", "remaining_time": "1:27:41"}
{"current_steps": 2540, "total_steps": 4060, "loss": 0.2526, "lr": 1.4798709243039842e-05, "epoch": 4.379310344827586, "percentage": 62.56, "elapsed_time": "2:25:59", "remaining_time": "1:27:21"}
{"current_steps": 2545, "total_steps": 4060, "loss": 0.2643, "lr": 1.4715739103954375e-05, "epoch": 4.387931034482759, "percentage": 62.68, "elapsed_time": "2:26:10", "remaining_time": "1:27:00"}
{"current_steps": 2550, "total_steps": 4060, "loss": 0.2447, "lr": 1.4632866618017543e-05, "epoch": 4.396551724137931, "percentage": 62.81, "elapsed_time": "2:26:32", "remaining_time": "1:26:46"}
{"current_steps": 2555, "total_steps": 4060, "loss": 0.2577, "lr": 1.4550093316712987e-05, "epoch": 4.405172413793103, "percentage": 62.93, "elapsed_time": "2:26:45", "remaining_time": "1:26:26"}
{"current_steps": 2560, "total_steps": 4060, "loss": 0.2498, "lr": 1.4467420729691433e-05, "epoch": 4.413793103448276, "percentage": 63.05, "elapsed_time": "2:26:59", "remaining_time": "1:26:07"}
{"current_steps": 2565, "total_steps": 4060, "loss": 0.2555, "lr": 1.4384850384742412e-05, "epoch": 4.422413793103448, "percentage": 63.18, "elapsed_time": "2:27:17", "remaining_time": "1:25:50"}
{"current_steps": 2570, "total_steps": 4060, "loss": 0.2598, "lr": 1.4302383807766003e-05, "epoch": 4.431034482758621, "percentage": 63.3, "elapsed_time": "2:27:39", "remaining_time": "1:25:36"}
{"current_steps": 2575, "total_steps": 4060, "loss": 0.2878, "lr": 1.4220022522744667e-05, "epoch": 4.439655172413793, "percentage": 63.42, "elapsed_time": "2:27:54", "remaining_time": "1:25:17"}
{"current_steps": 2580, "total_steps": 4060, "loss": 0.2535, "lr": 1.4137768051715059e-05, "epoch": 4.448275862068965, "percentage": 63.55, "elapsed_time": "2:28:07", "remaining_time": "1:24:58"}
{"current_steps": 2585, "total_steps": 4060, "loss": 0.2634, "lr": 1.4055621914739915e-05, "epoch": 4.456896551724138, "percentage": 63.67, "elapsed_time": "2:28:31", "remaining_time": "1:24:44"}
{"current_steps": 2590, "total_steps": 4060, "loss": 0.2558, "lr": 1.3973585629879973e-05, "epoch": 4.4655172413793105, "percentage": 63.79, "elapsed_time": "2:28:56", "remaining_time": "1:24:32"}
{"current_steps": 2595, "total_steps": 4060, "loss": 0.2441, "lr": 1.3891660713165873e-05, "epoch": 4.474137931034483, "percentage": 63.92, "elapsed_time": "2:29:16", "remaining_time": "1:24:16"}
{"current_steps": 2600, "total_steps": 4060, "loss": 0.2609, "lr": 1.3809848678570204e-05, "epoch": 4.482758620689655, "percentage": 64.04, "elapsed_time": "2:29:30", "remaining_time": "1:23:57"}
{"current_steps": 2605, "total_steps": 4060, "loss": 0.2256, "lr": 1.3728151037979468e-05, "epoch": 4.491379310344827, "percentage": 64.16, "elapsed_time": "2:29:53", "remaining_time": "1:23:43"}
{"current_steps": 2610, "total_steps": 4060, "loss": 0.302, "lr": 1.3646569301166177e-05, "epoch": 4.5, "percentage": 64.29, "elapsed_time": "2:30:11", "remaining_time": "1:23:26"}
{"current_steps": 2615, "total_steps": 4060, "loss": 0.2386, "lr": 1.3565104975760936e-05, "epoch": 4.508620689655173, "percentage": 64.41, "elapsed_time": "2:30:28", "remaining_time": "1:23:08"}
{"current_steps": 2620, "total_steps": 4060, "loss": 0.2951, "lr": 1.34837595672246e-05, "epoch": 4.517241379310345, "percentage": 64.53, "elapsed_time": "2:30:46", "remaining_time": "1:22:52"}
{"current_steps": 2625, "total_steps": 4060, "loss": 0.2379, "lr": 1.3402534578820428e-05, "epoch": 4.525862068965517, "percentage": 64.66, "elapsed_time": "2:30:58", "remaining_time": "1:22:31"}
{"current_steps": 2630, "total_steps": 4060, "loss": 0.2518, "lr": 1.3321431511586308e-05, "epoch": 4.5344827586206895, "percentage": 64.78, "elapsed_time": "2:31:18", "remaining_time": "1:22:15"}
{"current_steps": 2635, "total_steps": 4060, "loss": 0.2694, "lr": 1.3240451864307048e-05, "epoch": 4.543103448275862, "percentage": 64.9, "elapsed_time": "2:31:36", "remaining_time": "1:21:59"}
{"current_steps": 2640, "total_steps": 4060, "loss": 0.2678, "lr": 1.3159597133486628e-05, "epoch": 4.551724137931035, "percentage": 65.02, "elapsed_time": "2:31:50", "remaining_time": "1:21:40"}
{"current_steps": 2645, "total_steps": 4060, "loss": 0.2679, "lr": 1.3078868813320594e-05, "epoch": 4.560344827586206, "percentage": 65.15, "elapsed_time": "2:32:06", "remaining_time": "1:21:22"}
{"current_steps": 2650, "total_steps": 4060, "loss": 0.2767, "lr": 1.2998268395668412e-05, "epoch": 4.568965517241379, "percentage": 65.27, "elapsed_time": "2:32:17", "remaining_time": "1:21:01"}
{"current_steps": 2655, "total_steps": 4060, "loss": 0.2494, "lr": 1.2917797370025908e-05, "epoch": 4.577586206896552, "percentage": 65.39, "elapsed_time": "2:32:35", "remaining_time": "1:20:45"}
{"current_steps": 2660, "total_steps": 4060, "loss": 0.2494, "lr": 1.2837457223497754e-05, "epoch": 4.586206896551724, "percentage": 65.52, "elapsed_time": "2:32:48", "remaining_time": "1:20:25"}
{"current_steps": 2665, "total_steps": 4060, "loss": 0.2622, "lr": 1.2757249440769957e-05, "epoch": 4.594827586206897, "percentage": 65.64, "elapsed_time": "2:33:05", "remaining_time": "1:20:08"}
{"current_steps": 2670, "total_steps": 4060, "loss": 0.2457, "lr": 1.2677175504082452e-05, "epoch": 4.603448275862069, "percentage": 65.76, "elapsed_time": "2:33:18", "remaining_time": "1:19:48"}
{"current_steps": 2675, "total_steps": 4060, "loss": 0.3195, "lr": 1.2597236893201712e-05, "epoch": 4.612068965517241, "percentage": 65.89, "elapsed_time": "2:33:39", "remaining_time": "1:19:33"}
{"current_steps": 2680, "total_steps": 4060, "loss": 0.2708, "lr": 1.2517435085393373e-05, "epoch": 4.620689655172414, "percentage": 66.01, "elapsed_time": "2:34:03", "remaining_time": "1:19:19"}
{"current_steps": 2685, "total_steps": 4060, "loss": 0.2867, "lr": 1.2437771555394944e-05, "epoch": 4.629310344827586, "percentage": 66.13, "elapsed_time": "2:34:21", "remaining_time": "1:19:02"}
{"current_steps": 2690, "total_steps": 4060, "loss": 0.2619, "lr": 1.2358247775388578e-05, "epoch": 4.637931034482759, "percentage": 66.26, "elapsed_time": "2:34:41", "remaining_time": "1:18:47"}
{"current_steps": 2695, "total_steps": 4060, "loss": 0.2894, "lr": 1.227886521497383e-05, "epoch": 4.646551724137931, "percentage": 66.38, "elapsed_time": "2:34:55", "remaining_time": "1:18:28"}
{"current_steps": 2700, "total_steps": 4060, "loss": 0.2897, "lr": 1.2199625341140533e-05, "epoch": 4.655172413793103, "percentage": 66.5, "elapsed_time": "2:35:12", "remaining_time": "1:18:10"}
{"current_steps": 2705, "total_steps": 4060, "loss": 0.2671, "lr": 1.2120529618241665e-05, "epoch": 4.663793103448276, "percentage": 66.63, "elapsed_time": "2:35:26", "remaining_time": "1:17:51"}
{"current_steps": 2710, "total_steps": 4060, "loss": 0.2532, "lr": 1.2041579507966288e-05, "epoch": 4.672413793103448, "percentage": 66.75, "elapsed_time": "2:35:41", "remaining_time": "1:17:33"}
{"current_steps": 2715, "total_steps": 4060, "loss": 0.2814, "lr": 1.1962776469312556e-05, "epoch": 4.681034482758621, "percentage": 66.87, "elapsed_time": "2:35:53", "remaining_time": "1:17:13"}
{"current_steps": 2720, "total_steps": 4060, "loss": 0.2354, "lr": 1.1884121958560721e-05, "epoch": 4.689655172413794, "percentage": 67.0, "elapsed_time": "2:36:06", "remaining_time": "1:16:54"}
{"current_steps": 2725, "total_steps": 4060, "loss": 0.3175, "lr": 1.1805617429246254e-05, "epoch": 4.698275862068965, "percentage": 67.12, "elapsed_time": "2:36:23", "remaining_time": "1:16:37"}
{"current_steps": 2730, "total_steps": 4060, "loss": 0.2971, "lr": 1.1727264332132978e-05, "epoch": 4.706896551724138, "percentage": 67.24, "elapsed_time": "2:36:43", "remaining_time": "1:16:21"}
{"current_steps": 2735, "total_steps": 4060, "loss": 0.2445, "lr": 1.1649064115186216e-05, "epoch": 4.7155172413793105, "percentage": 67.36, "elapsed_time": "2:36:56", "remaining_time": "1:16:01"}
{"current_steps": 2740, "total_steps": 4060, "loss": 0.2482, "lr": 1.1571018223546095e-05, "epoch": 4.724137931034483, "percentage": 67.49, "elapsed_time": "2:37:10", "remaining_time": "1:15:43"}
{"current_steps": 2745, "total_steps": 4060, "loss": 0.2526, "lr": 1.1493128099500806e-05, "epoch": 4.732758620689655, "percentage": 67.61, "elapsed_time": "2:37:27", "remaining_time": "1:15:25"}
{"current_steps": 2750, "total_steps": 4060, "loss": 0.2621, "lr": 1.1415395182459925e-05, "epoch": 4.741379310344827, "percentage": 67.73, "elapsed_time": "2:37:44", "remaining_time": "1:15:08"}
{"current_steps": 2755, "total_steps": 4060, "loss": 0.2598, "lr": 1.1337820908927891e-05, "epoch": 4.75, "percentage": 67.86, "elapsed_time": "2:38:07", "remaining_time": "1:14:53"}
{"current_steps": 2760, "total_steps": 4060, "loss": 0.3048, "lr": 1.126040671247738e-05, "epoch": 4.758620689655173, "percentage": 67.98, "elapsed_time": "2:38:24", "remaining_time": "1:14:36"}
{"current_steps": 2765, "total_steps": 4060, "loss": 0.3053, "lr": 1.1183154023722839e-05, "epoch": 4.767241379310345, "percentage": 68.1, "elapsed_time": "2:38:55", "remaining_time": "1:14:26"}
{"current_steps": 2770, "total_steps": 4060, "loss": 0.2311, "lr": 1.1106064270294068e-05, "epoch": 4.775862068965517, "percentage": 68.23, "elapsed_time": "2:39:12", "remaining_time": "1:14:08"}
{"current_steps": 2775, "total_steps": 4060, "loss": 0.2629, "lr": 1.1029138876809818e-05, "epoch": 4.7844827586206895, "percentage": 68.35, "elapsed_time": "2:39:30", "remaining_time": "1:13:51"}
{"current_steps": 2780, "total_steps": 4060, "loss": 0.2354, "lr": 1.0952379264851464e-05, "epoch": 4.793103448275862, "percentage": 68.47, "elapsed_time": "2:39:43", "remaining_time": "1:13:32"}
{"current_steps": 2785, "total_steps": 4060, "loss": 0.2372, "lr": 1.087578685293674e-05, "epoch": 4.801724137931035, "percentage": 68.6, "elapsed_time": "2:40:00", "remaining_time": "1:13:15"}
{"current_steps": 2790, "total_steps": 4060, "loss": 0.274, "lr": 1.0799363056493529e-05, "epoch": 4.810344827586206, "percentage": 68.72, "elapsed_time": "2:40:14", "remaining_time": "1:12:56"}
{"current_steps": 2795, "total_steps": 4060, "loss": 0.3045, "lr": 1.0723109287833697e-05, "epoch": 4.818965517241379, "percentage": 68.84, "elapsed_time": "2:40:31", "remaining_time": "1:12:38"}
{"current_steps": 2800, "total_steps": 4060, "loss": 0.2538, "lr": 1.0647026956126979e-05, "epoch": 4.827586206896552, "percentage": 68.97, "elapsed_time": "2:40:46", "remaining_time": "1:12:20"}
{"current_steps": 2805, "total_steps": 4060, "loss": 0.2661, "lr": 1.0571117467374972e-05, "epoch": 4.836206896551724, "percentage": 69.09, "elapsed_time": "2:41:03", "remaining_time": "1:12:03"}
{"current_steps": 2810, "total_steps": 4060, "loss": 0.2769, "lr": 1.0495382224385154e-05, "epoch": 4.844827586206897, "percentage": 69.21, "elapsed_time": "2:41:19", "remaining_time": "1:11:45"}
{"current_steps": 2815, "total_steps": 4060, "loss": 0.2634, "lr": 1.0419822626744894e-05, "epoch": 4.853448275862069, "percentage": 69.33, "elapsed_time": "2:41:33", "remaining_time": "1:11:27"}
{"current_steps": 2820, "total_steps": 4060, "loss": 0.2723, "lr": 1.0344440070795671e-05, "epoch": 4.862068965517241, "percentage": 69.46, "elapsed_time": "2:41:46", "remaining_time": "1:11:07"}
{"current_steps": 2825, "total_steps": 4060, "loss": 0.2912, "lr": 1.0269235949607223e-05, "epoch": 4.870689655172414, "percentage": 69.58, "elapsed_time": "2:42:00", "remaining_time": "1:10:49"}
{"current_steps": 2830, "total_steps": 4060, "loss": 0.2504, "lr": 1.019421165295182e-05, "epoch": 4.879310344827586, "percentage": 69.7, "elapsed_time": "2:42:23", "remaining_time": "1:10:34"}
{"current_steps": 2835, "total_steps": 4060, "loss": 0.2727, "lr": 1.0119368567278545e-05, "epoch": 4.887931034482759, "percentage": 69.83, "elapsed_time": "2:42:48", "remaining_time": "1:10:20"}
{"current_steps": 2840, "total_steps": 4060, "loss": 0.2757, "lr": 1.0044708075687746e-05, "epoch": 4.896551724137931, "percentage": 69.95, "elapsed_time": "2:42:59", "remaining_time": "1:10:00"}
{"current_steps": 2845, "total_steps": 4060, "loss": 0.2975, "lr": 9.97023155790541e-06, "epoch": 4.905172413793103, "percentage": 70.07, "elapsed_time": "2:43:18", "remaining_time": "1:09:44"}
{"current_steps": 2850, "total_steps": 4060, "loss": 0.2632, "lr": 9.895940390257675e-06, "epoch": 4.913793103448276, "percentage": 70.2, "elapsed_time": "2:43:32", "remaining_time": "1:09:26"}
{"current_steps": 2855, "total_steps": 4060, "loss": 0.2852, "lr": 9.821835945645426e-06, "epoch": 4.922413793103448, "percentage": 70.32, "elapsed_time": "2:43:56", "remaining_time": "1:09:11"}
{"current_steps": 2860, "total_steps": 4060, "loss": 0.2819, "lr": 9.747919593518897e-06, "epoch": 4.931034482758621, "percentage": 70.44, "elapsed_time": "2:44:11", "remaining_time": "1:08:53"}
{"current_steps": 2865, "total_steps": 4060, "loss": 0.3151, "lr": 9.674192699852397e-06, "epoch": 4.939655172413794, "percentage": 70.57, "elapsed_time": "2:44:30", "remaining_time": "1:08:36"}
{"current_steps": 2870, "total_steps": 4060, "loss": 0.2808, "lr": 9.600656627119e-06, "epoch": 4.948275862068965, "percentage": 70.69, "elapsed_time": "2:44:48", "remaining_time": "1:08:20"}
{"current_steps": 2875, "total_steps": 4060, "loss": 0.3068, "lr": 9.52731273426544e-06, "epoch": 4.956896551724138, "percentage": 70.81, "elapsed_time": "2:45:11", "remaining_time": "1:08:05"}
{"current_steps": 2880, "total_steps": 4060, "loss": 0.2727, "lr": 9.454162376686959e-06, "epoch": 4.9655172413793105, "percentage": 70.94, "elapsed_time": "2:45:26", "remaining_time": "1:07:47"}
{"current_steps": 2885, "total_steps": 4060, "loss": 0.2938, "lr": 9.381206906202268e-06, "epoch": 4.974137931034483, "percentage": 71.06, "elapsed_time": "2:45:47", "remaining_time": "1:07:31"}
{"current_steps": 2890, "total_steps": 4060, "loss": 0.2573, "lr": 9.308447671028546e-06, "epoch": 4.982758620689655, "percentage": 71.18, "elapsed_time": "2:46:03", "remaining_time": "1:07:13"}
{"current_steps": 2895, "total_steps": 4060, "loss": 0.2651, "lr": 9.235886015756579e-06, "epoch": 4.991379310344827, "percentage": 71.31, "elapsed_time": "2:46:23", "remaining_time": "1:06:57"}
{"current_steps": 2900, "total_steps": 4060, "loss": 0.2606, "lr": 9.163523281325855e-06, "epoch": 5.0, "percentage": 71.43, "elapsed_time": "2:46:39", "remaining_time": "1:06:39"}
{"current_steps": 2905, "total_steps": 4060, "loss": 0.2309, "lr": 9.09136080499979e-06, "epoch": 5.008620689655173, "percentage": 71.55, "elapsed_time": "2:46:55", "remaining_time": "1:06:22"}
{"current_steps": 2910, "total_steps": 4060, "loss": 0.2241, "lr": 9.019399920341056e-06, "epoch": 5.017241379310345, "percentage": 71.67, "elapsed_time": "2:47:19", "remaining_time": "1:06:07"}
{"current_steps": 2915, "total_steps": 4060, "loss": 0.2091, "lr": 8.947641957186901e-06, "epoch": 5.025862068965517, "percentage": 71.8, "elapsed_time": "2:47:43", "remaining_time": "1:05:52"}
{"current_steps": 2920, "total_steps": 4060, "loss": 0.2387, "lr": 8.876088241624581e-06, "epoch": 5.0344827586206895, "percentage": 71.92, "elapsed_time": "2:48:00", "remaining_time": "1:05:35"}
{"current_steps": 2925, "total_steps": 4060, "loss": 0.2243, "lr": 8.804740095966854e-06, "epoch": 5.043103448275862, "percentage": 72.04, "elapsed_time": "2:48:12", "remaining_time": "1:05:16"}
{"current_steps": 2930, "total_steps": 4060, "loss": 0.2794, "lr": 8.733598838727559e-06, "epoch": 5.051724137931035, "percentage": 72.17, "elapsed_time": "2:48:25", "remaining_time": "1:04:57"}
{"current_steps": 2935, "total_steps": 4060, "loss": 0.231, "lr": 8.662665784597229e-06, "epoch": 5.060344827586207, "percentage": 72.29, "elapsed_time": "2:48:37", "remaining_time": "1:04:38"}
{"current_steps": 2940, "total_steps": 4060, "loss": 0.2198, "lr": 8.591942244418787e-06, "epoch": 5.068965517241379, "percentage": 72.41, "elapsed_time": "2:48:49", "remaining_time": "1:04:19"}
{"current_steps": 2945, "total_steps": 4060, "loss": 0.1996, "lr": 8.521429525163353e-06, "epoch": 5.077586206896552, "percentage": 72.54, "elapsed_time": "2:49:13", "remaining_time": "1:04:04"}
{"current_steps": 2950, "total_steps": 4060, "loss": 0.2051, "lr": 8.451128929906103e-06, "epoch": 5.086206896551724, "percentage": 72.66, "elapsed_time": "2:49:27", "remaining_time": "1:03:45"}
{"current_steps": 2955, "total_steps": 4060, "loss": 0.2243, "lr": 8.381041757802104e-06, "epoch": 5.094827586206897, "percentage": 72.78, "elapsed_time": "2:49:40", "remaining_time": "1:03:26"}
{"current_steps": 2960, "total_steps": 4060, "loss": 0.2369, "lr": 8.311169304062408e-06, "epoch": 5.103448275862069, "percentage": 72.91, "elapsed_time": "2:49:50", "remaining_time": "1:03:07"}
{"current_steps": 2965, "total_steps": 4060, "loss": 0.2416, "lr": 8.24151285993005e-06, "epoch": 5.112068965517241, "percentage": 73.03, "elapsed_time": "2:50:07", "remaining_time": "1:02:49"}
{"current_steps": 2970, "total_steps": 4060, "loss": 0.2527, "lr": 8.172073712656217e-06, "epoch": 5.120689655172414, "percentage": 73.15, "elapsed_time": "2:50:23", "remaining_time": "1:02:32"}
{"current_steps": 2975, "total_steps": 4060, "loss": 0.2402, "lr": 8.102853145476443e-06, "epoch": 5.129310344827586, "percentage": 73.28, "elapsed_time": "2:50:36", "remaining_time": "1:02:13"}
{"current_steps": 2980, "total_steps": 4060, "loss": 0.241, "lr": 8.033852437586909e-06, "epoch": 5.137931034482759, "percentage": 73.4, "elapsed_time": "2:50:54", "remaining_time": "1:01:56"}
{"current_steps": 2985, "total_steps": 4060, "loss": 0.2321, "lr": 7.965072864120795e-06, "epoch": 5.146551724137931, "percentage": 73.52, "elapsed_time": "2:51:09", "remaining_time": "1:01:38"}
{"current_steps": 2990, "total_steps": 4060, "loss": 0.2139, "lr": 7.896515696124703e-06, "epoch": 5.155172413793103, "percentage": 73.65, "elapsed_time": "2:51:23", "remaining_time": "1:01:20"}
{"current_steps": 2995, "total_steps": 4060, "loss": 0.2276, "lr": 7.828182200535192e-06, "epoch": 5.163793103448276, "percentage": 73.77, "elapsed_time": "2:51:35", "remaining_time": "1:01:01"}
{"current_steps": 3000, "total_steps": 4060, "loss": 0.2685, "lr": 7.760073640155363e-06, "epoch": 5.172413793103448, "percentage": 73.89, "elapsed_time": "2:52:01", "remaining_time": "1:00:46"}
{"current_steps": 3005, "total_steps": 4060, "loss": 0.2457, "lr": 7.6921912736315e-06, "epoch": 5.181034482758621, "percentage": 74.01, "elapsed_time": "2:52:31", "remaining_time": "1:00:34"}
{"current_steps": 3010, "total_steps": 4060, "loss": 0.2438, "lr": 7.624536355429832e-06, "epoch": 5.189655172413793, "percentage": 74.14, "elapsed_time": "2:52:46", "remaining_time": "1:00:16"}
{"current_steps": 3015, "total_steps": 4060, "loss": 0.2495, "lr": 7.557110135813341e-06, "epoch": 5.198275862068965, "percentage": 74.26, "elapsed_time": "2:53:08", "remaining_time": "1:00:00"}
{"current_steps": 3020, "total_steps": 4060, "loss": 0.2085, "lr": 7.489913860818662e-06, "epoch": 5.206896551724138, "percentage": 74.38, "elapsed_time": "2:53:22", "remaining_time": "0:59:42"}
{"current_steps": 3025, "total_steps": 4060, "loss": 0.2721, "lr": 7.4229487722330315e-06, "epoch": 5.2155172413793105, "percentage": 74.51, "elapsed_time": "2:53:49", "remaining_time": "0:59:28"}
{"current_steps": 3030, "total_steps": 4060, "loss": 0.2534, "lr": 7.356216107571399e-06, "epoch": 5.224137931034483, "percentage": 74.63, "elapsed_time": "2:54:14", "remaining_time": "0:59:13"}
{"current_steps": 3035, "total_steps": 4060, "loss": 0.2441, "lr": 7.289717100053497e-06, "epoch": 5.232758620689655, "percentage": 74.75, "elapsed_time": "2:54:31", "remaining_time": "0:58:56"}
{"current_steps": 3040, "total_steps": 4060, "loss": 0.2324, "lr": 7.2234529785810645e-06, "epoch": 5.241379310344827, "percentage": 74.88, "elapsed_time": "2:54:47", "remaining_time": "0:58:38"}
{"current_steps": 3045, "total_steps": 4060, "loss": 0.2561, "lr": 7.157424967715163e-06, "epoch": 5.25, "percentage": 75.0, "elapsed_time": "2:55:02", "remaining_time": "0:58:20"}
{"current_steps": 3050, "total_steps": 4060, "loss": 0.2467, "lr": 7.091634287653526e-06, "epoch": 5.258620689655173, "percentage": 75.12, "elapsed_time": "2:55:16", "remaining_time": "0:58:02"}
{"current_steps": 3055, "total_steps": 4060, "loss": 0.2251, "lr": 7.026082154208012e-06, "epoch": 5.267241379310345, "percentage": 75.25, "elapsed_time": "2:55:35", "remaining_time": "0:57:45"}
{"current_steps": 3060, "total_steps": 4060, "loss": 0.24, "lr": 6.960769778782133e-06, "epoch": 5.275862068965517, "percentage": 75.37, "elapsed_time": "2:55:56", "remaining_time": "0:57:29"}
{"current_steps": 3065, "total_steps": 4060, "loss": 0.2249, "lr": 6.89569836834868e-06, "epoch": 5.2844827586206895, "percentage": 75.49, "elapsed_time": "2:56:15", "remaining_time": "0:57:13"}
{"current_steps": 3070, "total_steps": 4060, "loss": 0.2669, "lr": 6.830869125427406e-06, "epoch": 5.293103448275862, "percentage": 75.62, "elapsed_time": "2:56:35", "remaining_time": "0:56:56"}
{"current_steps": 3075, "total_steps": 4060, "loss": 0.2456, "lr": 6.766283248062817e-06, "epoch": 5.301724137931035, "percentage": 75.74, "elapsed_time": "2:56:49", "remaining_time": "0:56:38"}
{"current_steps": 3080, "total_steps": 4060, "loss": 0.2394, "lr": 6.701941929801996e-06, "epoch": 5.310344827586207, "percentage": 75.86, "elapsed_time": "2:57:02", "remaining_time": "0:56:20"}
{"current_steps": 3085, "total_steps": 4060, "loss": 0.2123, "lr": 6.637846359672611e-06, "epoch": 5.318965517241379, "percentage": 75.99, "elapsed_time": "2:57:19", "remaining_time": "0:56:02"}
{"current_steps": 3090, "total_steps": 4060, "loss": 0.2953, "lr": 6.57399772216089e-06, "epoch": 5.327586206896552, "percentage": 76.11, "elapsed_time": "2:57:36", "remaining_time": "0:55:45"}
{"current_steps": 3095, "total_steps": 4060, "loss": 0.2464, "lr": 6.510397197189724e-06, "epoch": 5.336206896551724, "percentage": 76.23, "elapsed_time": "2:57:53", "remaining_time": "0:55:27"}
{"current_steps": 3100, "total_steps": 4060, "loss": 0.2404, "lr": 6.447045960096909e-06, "epoch": 5.344827586206897, "percentage": 76.35, "elapsed_time": "2:58:08", "remaining_time": "0:55:09"}
{"current_steps": 3105, "total_steps": 4060, "loss": 0.2518, "lr": 6.383945181613398e-06, "epoch": 5.353448275862069, "percentage": 76.48, "elapsed_time": "2:58:24", "remaining_time": "0:54:52"}
{"current_steps": 3110, "total_steps": 4060, "loss": 0.2227, "lr": 6.32109602784166e-06, "epoch": 5.362068965517241, "percentage": 76.6, "elapsed_time": "2:58:38", "remaining_time": "0:54:34"}
{"current_steps": 3115, "total_steps": 4060, "loss": 0.2853, "lr": 6.258499660234147e-06, "epoch": 5.370689655172414, "percentage": 76.72, "elapsed_time": "2:59:00", "remaining_time": "0:54:18"}
{"current_steps": 3120, "total_steps": 4060, "loss": 0.2708, "lr": 6.196157235571813e-06, "epoch": 5.379310344827586, "percentage": 76.85, "elapsed_time": "2:59:24", "remaining_time": "0:54:03"}
{"current_steps": 3125, "total_steps": 4060, "loss": 0.2353, "lr": 6.134069905942764e-06, "epoch": 5.387931034482759, "percentage": 76.97, "elapsed_time": "2:59:36", "remaining_time": "0:53:44"}
{"current_steps": 3130, "total_steps": 4060, "loss": 0.2381, "lr": 6.072238818720919e-06, "epoch": 5.396551724137931, "percentage": 77.09, "elapsed_time": "3:00:01", "remaining_time": "0:53:29"}
{"current_steps": 3135, "total_steps": 4060, "loss": 0.2319, "lr": 6.010665116544858e-06, "epoch": 5.405172413793103, "percentage": 77.22, "elapsed_time": "3:00:17", "remaining_time": "0:53:11"}
{"current_steps": 3140, "total_steps": 4060, "loss": 0.2294, "lr": 5.9493499372967e-06, "epoch": 5.413793103448276, "percentage": 77.34, "elapsed_time": "3:00:33", "remaining_time": "0:52:54"}
{"current_steps": 3145, "total_steps": 4060, "loss": 0.2439, "lr": 5.888294414081024e-06, "epoch": 5.422413793103448, "percentage": 77.46, "elapsed_time": "3:00:45", "remaining_time": "0:52:35"}
{"current_steps": 3150, "total_steps": 4060, "loss": 0.2635, "lr": 5.827499675203987e-06, "epoch": 5.431034482758621, "percentage": 77.59, "elapsed_time": "3:00:59", "remaining_time": "0:52:17"}
{"current_steps": 3155, "total_steps": 4060, "loss": 0.191, "lr": 5.76696684415245e-06, "epoch": 5.439655172413793, "percentage": 77.71, "elapsed_time": "3:01:21", "remaining_time": "0:52:01"}
{"current_steps": 3160, "total_steps": 4060, "loss": 0.2421, "lr": 5.706697039573217e-06, "epoch": 5.448275862068965, "percentage": 77.83, "elapsed_time": "3:01:41", "remaining_time": "0:51:44"}
{"current_steps": 3165, "total_steps": 4060, "loss": 0.243, "lr": 5.646691375252344e-06, "epoch": 5.456896551724138, "percentage": 77.96, "elapsed_time": "3:02:03", "remaining_time": "0:51:28"}
{"current_steps": 3170, "total_steps": 4060, "loss": 0.2189, "lr": 5.586950960094606e-06, "epoch": 5.4655172413793105, "percentage": 78.08, "elapsed_time": "3:02:19", "remaining_time": "0:51:11"}
{"current_steps": 3175, "total_steps": 4060, "loss": 0.2347, "lr": 5.527476898102959e-06, "epoch": 5.474137931034483, "percentage": 78.2, "elapsed_time": "3:02:34", "remaining_time": "0:50:53"}
{"current_steps": 3180, "total_steps": 4060, "loss": 0.1949, "lr": 5.4682702883581395e-06, "epoch": 5.482758620689655, "percentage": 78.33, "elapsed_time": "3:02:51", "remaining_time": "0:50:36"}
{"current_steps": 3185, "total_steps": 4060, "loss": 0.264, "lr": 5.40933222499838e-06, "epoch": 5.491379310344827, "percentage": 78.45, "elapsed_time": "3:03:08", "remaining_time": "0:50:18"}
{"current_steps": 3190, "total_steps": 4060, "loss": 0.2186, "lr": 5.350663797199174e-06, "epoch": 5.5, "percentage": 78.57, "elapsed_time": "3:03:27", "remaining_time": "0:50:02"}
{"current_steps": 3195, "total_steps": 4060, "loss": 0.2218, "lr": 5.292266089153149e-06, "epoch": 5.508620689655173, "percentage": 78.69, "elapsed_time": "3:03:45", "remaining_time": "0:49:44"}
{"current_steps": 3200, "total_steps": 4060, "loss": 0.2312, "lr": 5.234140180050029e-06, "epoch": 5.517241379310345, "percentage": 78.82, "elapsed_time": "3:03:59", "remaining_time": "0:49:26"}
{"current_steps": 3205, "total_steps": 4060, "loss": 0.2455, "lr": 5.1762871440566935e-06, "epoch": 5.525862068965517, "percentage": 78.94, "elapsed_time": "3:04:11", "remaining_time": "0:49:08"}
{"current_steps": 3210, "total_steps": 4060, "loss": 0.2752, "lr": 5.118708050297332e-06, "epoch": 5.5344827586206895, "percentage": 79.06, "elapsed_time": "3:04:34", "remaining_time": "0:48:52"}
{"current_steps": 3215, "total_steps": 4060, "loss": 0.247, "lr": 5.061403962833669e-06, "epoch": 5.543103448275862, "percentage": 79.19, "elapsed_time": "3:04:59", "remaining_time": "0:48:37"}
{"current_steps": 3220, "total_steps": 4060, "loss": 0.2891, "lr": 5.004375940645314e-06, "epoch": 5.551724137931035, "percentage": 79.31, "elapsed_time": "3:05:20", "remaining_time": "0:48:21"}
{"current_steps": 3225, "total_steps": 4060, "loss": 0.2172, "lr": 4.947625037610219e-06, "epoch": 5.560344827586206, "percentage": 79.43, "elapsed_time": "3:05:43", "remaining_time": "0:48:05"}
{"current_steps": 3230, "total_steps": 4060, "loss": 0.2469, "lr": 4.8911523024851295e-06, "epoch": 5.568965517241379, "percentage": 79.56, "elapsed_time": "3:05:59", "remaining_time": "0:47:47"}
{"current_steps": 3235, "total_steps": 4060, "loss": 0.2305, "lr": 4.834958778886271e-06, "epoch": 5.577586206896552, "percentage": 79.68, "elapsed_time": "3:06:12", "remaining_time": "0:47:29"}
{"current_steps": 3240, "total_steps": 4060, "loss": 0.2435, "lr": 4.779045505270043e-06, "epoch": 5.586206896551724, "percentage": 79.8, "elapsed_time": "3:06:29", "remaining_time": "0:47:11"}
{"current_steps": 3245, "total_steps": 4060, "loss": 0.2293, "lr": 4.723413514913817e-06, "epoch": 5.594827586206897, "percentage": 79.93, "elapsed_time": "3:06:51", "remaining_time": "0:46:55"}
{"current_steps": 3250, "total_steps": 4060, "loss": 0.265, "lr": 4.66806383589685e-06, "epoch": 5.603448275862069, "percentage": 80.05, "elapsed_time": "3:07:07", "remaining_time": "0:46:38"}
{"current_steps": 3255, "total_steps": 4060, "loss": 0.2273, "lr": 4.6129974910812855e-06, "epoch": 5.612068965517241, "percentage": 80.17, "elapsed_time": "3:07:19", "remaining_time": "0:46:19"}
{"current_steps": 3260, "total_steps": 4060, "loss": 0.2677, "lr": 4.558215498093252e-06, "epoch": 5.620689655172414, "percentage": 80.3, "elapsed_time": "3:07:34", "remaining_time": "0:46:01"}
{"current_steps": 3265, "total_steps": 4060, "loss": 0.2408, "lr": 4.503718869304063e-06, "epoch": 5.629310344827586, "percentage": 80.42, "elapsed_time": "3:07:53", "remaining_time": "0:45:44"}
{"current_steps": 3270, "total_steps": 4060, "loss": 0.3139, "lr": 4.449508611811482e-06, "epoch": 5.637931034482759, "percentage": 80.54, "elapsed_time": "3:08:14", "remaining_time": "0:45:28"}
{"current_steps": 3275, "total_steps": 4060, "loss": 0.2727, "lr": 4.395585727421139e-06, "epoch": 5.646551724137931, "percentage": 80.67, "elapsed_time": "3:08:30", "remaining_time": "0:45:11"}
{"current_steps": 3280, "total_steps": 4060, "loss": 0.2462, "lr": 4.341951212628031e-06, "epoch": 5.655172413793103, "percentage": 80.79, "elapsed_time": "3:08:48", "remaining_time": "0:44:53"}
{"current_steps": 3285, "total_steps": 4060, "loss": 0.2297, "lr": 4.288606058598048e-06, "epoch": 5.663793103448276, "percentage": 80.91, "elapsed_time": "3:09:03", "remaining_time": "0:44:36"}
{"current_steps": 3290, "total_steps": 4060, "loss": 0.2428, "lr": 4.235551251149714e-06, "epoch": 5.672413793103448, "percentage": 81.03, "elapsed_time": "3:09:21", "remaining_time": "0:44:18"}
{"current_steps": 3295, "total_steps": 4060, "loss": 0.2371, "lr": 4.1827877707359474e-06, "epoch": 5.681034482758621, "percentage": 81.16, "elapsed_time": "3:09:36", "remaining_time": "0:44:01"}
{"current_steps": 3300, "total_steps": 4060, "loss": 0.2379, "lr": 4.130316592425934e-06, "epoch": 5.689655172413794, "percentage": 81.28, "elapsed_time": "3:09:50", "remaining_time": "0:43:43"}
{"current_steps": 3305, "total_steps": 4060, "loss": 0.2095, "lr": 4.078138685887125e-06, "epoch": 5.698275862068965, "percentage": 81.4, "elapsed_time": "3:10:12", "remaining_time": "0:43:27"}
{"current_steps": 3310, "total_steps": 4060, "loss": 0.2235, "lr": 4.026255015367302e-06, "epoch": 5.706896551724138, "percentage": 81.53, "elapsed_time": "3:10:29", "remaining_time": "0:43:09"}
{"current_steps": 3315, "total_steps": 4060, "loss": 0.2597, "lr": 3.974666539676774e-06, "epoch": 5.7155172413793105, "percentage": 81.65, "elapsed_time": "3:10:42", "remaining_time": "0:42:51"}
{"current_steps": 3320, "total_steps": 4060, "loss": 0.2431, "lr": 3.923374212170634e-06, "epoch": 5.724137931034483, "percentage": 81.77, "elapsed_time": "3:10:59", "remaining_time": "0:42:34"}
{"current_steps": 3325, "total_steps": 4060, "loss": 0.2225, "lr": 3.872378980731168e-06, "epoch": 5.732758620689655, "percentage": 81.9, "elapsed_time": "3:11:18", "remaining_time": "0:42:17"}
{"current_steps": 3330, "total_steps": 4060, "loss": 0.2548, "lr": 3.821681787750327e-06, "epoch": 5.741379310344827, "percentage": 82.02, "elapsed_time": "3:11:33", "remaining_time": "0:41:59"}
{"current_steps": 3335, "total_steps": 4060, "loss": 0.2263, "lr": 3.7712835701122985e-06, "epoch": 5.75, "percentage": 82.14, "elapsed_time": "3:11:47", "remaining_time": "0:41:41"}
{"current_steps": 3340, "total_steps": 4060, "loss": 0.2894, "lr": 3.721185259176223e-06, "epoch": 5.758620689655173, "percentage": 82.27, "elapsed_time": "3:12:10", "remaining_time": "0:41:25"}
{"current_steps": 3345, "total_steps": 4060, "loss": 0.2406, "lr": 3.6713877807589503e-06, "epoch": 5.767241379310345, "percentage": 82.39, "elapsed_time": "3:12:22", "remaining_time": "0:41:07"}
{"current_steps": 3350, "total_steps": 4060, "loss": 0.2462, "lr": 3.621892055117955e-06, "epoch": 5.775862068965517, "percentage": 82.51, "elapsed_time": "3:12:43", "remaining_time": "0:40:50"}
{"current_steps": 3355, "total_steps": 4060, "loss": 0.2559, "lr": 3.572698996934303e-06, "epoch": 5.7844827586206895, "percentage": 82.64, "elapsed_time": "3:12:57", "remaining_time": "0:40:32"}
{"current_steps": 3360, "total_steps": 4060, "loss": 0.2423, "lr": 3.5238095152957906e-06, "epoch": 5.793103448275862, "percentage": 82.76, "elapsed_time": "3:13:09", "remaining_time": "0:40:14"}
{"current_steps": 3365, "total_steps": 4060, "loss": 0.2497, "lr": 3.4752245136801065e-06, "epoch": 5.801724137931035, "percentage": 82.88, "elapsed_time": "3:13:26", "remaining_time": "0:39:57"}
{"current_steps": 3370, "total_steps": 4060, "loss": 0.218, "lr": 3.4269448899381354e-06, "epoch": 5.810344827586206, "percentage": 83.0, "elapsed_time": "3:13:43", "remaining_time": "0:39:39"}
{"current_steps": 3375, "total_steps": 4060, "loss": 0.2209, "lr": 3.3789715362773955e-06, "epoch": 5.818965517241379, "percentage": 83.13, "elapsed_time": "3:14:00", "remaining_time": "0:39:22"}
{"current_steps": 3380, "total_steps": 4060, "loss": 0.2409, "lr": 3.3313053392455317e-06, "epoch": 5.827586206896552, "percentage": 83.25, "elapsed_time": "3:14:15", "remaining_time": "0:39:04"}
{"current_steps": 3385, "total_steps": 4060, "loss": 0.2342, "lr": 3.2839471797139287e-06, "epoch": 5.836206896551724, "percentage": 83.37, "elapsed_time": "3:14:32", "remaining_time": "0:38:47"}
{"current_steps": 3390, "total_steps": 4060, "loss": 0.2294, "lr": 3.236897932861438e-06, "epoch": 5.844827586206897, "percentage": 83.5, "elapsed_time": "3:14:49", "remaining_time": "0:38:30"}
{"current_steps": 3395, "total_steps": 4060, "loss": 0.271, "lr": 3.190158468158209e-06, "epoch": 5.853448275862069, "percentage": 83.62, "elapsed_time": "3:15:08", "remaining_time": "0:38:13"}
{"current_steps": 3400, "total_steps": 4060, "loss": 0.218, "lr": 3.1437296493496183e-06, "epoch": 5.862068965517241, "percentage": 83.74, "elapsed_time": "3:15:22", "remaining_time": "0:37:55"}
{"current_steps": 3405, "total_steps": 4060, "loss": 0.2312, "lr": 3.0976123344402897e-06, "epoch": 5.870689655172414, "percentage": 83.87, "elapsed_time": "3:15:39", "remaining_time": "0:37:38"}
{"current_steps": 3410, "total_steps": 4060, "loss": 0.2295, "lr": 3.0518073756782683e-06, "epoch": 5.879310344827586, "percentage": 83.99, "elapsed_time": "3:15:53", "remaining_time": "0:37:20"}
{"current_steps": 3415, "total_steps": 4060, "loss": 0.243, "lr": 3.0063156195392685e-06, "epoch": 5.887931034482759, "percentage": 84.11, "elapsed_time": "3:16:08", "remaining_time": "0:37:02"}
{"current_steps": 3420, "total_steps": 4060, "loss": 0.2541, "lr": 2.9611379067109914e-06, "epoch": 5.896551724137931, "percentage": 84.24, "elapsed_time": "3:16:22", "remaining_time": "0:36:44"}
{"current_steps": 3425, "total_steps": 4060, "loss": 0.2338, "lr": 2.9162750720776366e-06, "epoch": 5.905172413793103, "percentage": 84.36, "elapsed_time": "3:16:44", "remaining_time": "0:36:28"}
{"current_steps": 3430, "total_steps": 4060, "loss": 0.2439, "lr": 2.871727944704452e-06, "epoch": 5.913793103448276, "percentage": 84.48, "elapsed_time": "3:17:03", "remaining_time": "0:36:11"}
{"current_steps": 3435, "total_steps": 4060, "loss": 0.2621, "lr": 2.8274973478224167e-06, "epoch": 5.922413793103448, "percentage": 84.61, "elapsed_time": "3:17:17", "remaining_time": "0:35:53"}
{"current_steps": 3440, "total_steps": 4060, "loss": 0.2391, "lr": 2.783584098813006e-06, "epoch": 5.931034482758621, "percentage": 84.73, "elapsed_time": "3:17:33", "remaining_time": "0:35:36"}
{"current_steps": 3445, "total_steps": 4060, "loss": 0.2637, "lr": 2.739989009193138e-06, "epoch": 5.939655172413794, "percentage": 84.85, "elapsed_time": "3:17:47", "remaining_time": "0:35:18"}
{"current_steps": 3450, "total_steps": 4060, "loss": 0.24, "lr": 2.6967128846001234e-06, "epoch": 5.948275862068965, "percentage": 84.98, "elapsed_time": "3:17:59", "remaining_time": "0:35:00"}
{"current_steps": 3455, "total_steps": 4060, "loss": 0.2738, "lr": 2.6537565247768094e-06, "epoch": 5.956896551724138, "percentage": 85.1, "elapsed_time": "3:18:18", "remaining_time": "0:34:43"}
{"current_steps": 3460, "total_steps": 4060, "loss": 0.2392, "lr": 2.611120723556775e-06, "epoch": 5.9655172413793105, "percentage": 85.22, "elapsed_time": "3:18:31", "remaining_time": "0:34:25"}
{"current_steps": 3465, "total_steps": 4060, "loss": 0.27, "lr": 2.568806268849684e-06, "epoch": 5.974137931034483, "percentage": 85.34, "elapsed_time": "3:18:49", "remaining_time": "0:34:08"}
{"current_steps": 3470, "total_steps": 4060, "loss": 0.2301, "lr": 2.526813942626736e-06, "epoch": 5.982758620689655, "percentage": 85.47, "elapsed_time": "3:19:00", "remaining_time": "0:33:50"}
{"current_steps": 3475, "total_steps": 4060, "loss": 0.2525, "lr": 2.4851445209061574e-06, "epoch": 5.991379310344827, "percentage": 85.59, "elapsed_time": "3:19:19", "remaining_time": "0:33:33"}
{"current_steps": 3480, "total_steps": 4060, "loss": 0.2906, "lr": 2.4437987737389277e-06, "epoch": 6.0, "percentage": 85.71, "elapsed_time": "3:19:34", "remaining_time": "0:33:15"}
{"current_steps": 3485, "total_steps": 4060, "loss": 0.2246, "lr": 2.40277746519451e-06, "epoch": 6.008620689655173, "percentage": 85.84, "elapsed_time": "3:19:47", "remaining_time": "0:32:57"}
{"current_steps": 3490, "total_steps": 4060, "loss": 0.234, "lr": 2.362081353346746e-06, "epoch": 6.017241379310345, "percentage": 85.96, "elapsed_time": "3:20:01", "remaining_time": "0:32:40"}
{"current_steps": 3495, "total_steps": 4060, "loss": 0.2488, "lr": 2.3217111902598298e-06, "epoch": 6.025862068965517, "percentage": 86.08, "elapsed_time": "3:20:15", "remaining_time": "0:32:22"}
{"current_steps": 3500, "total_steps": 4060, "loss": 0.2225, "lr": 2.2816677219744388e-06, "epoch": 6.0344827586206895, "percentage": 86.21, "elapsed_time": "3:20:27", "remaining_time": "0:32:04"}
{"current_steps": 3505, "total_steps": 4060, "loss": 0.267, "lr": 2.241951688493924e-06, "epoch": 6.043103448275862, "percentage": 86.33, "elapsed_time": "3:20:51", "remaining_time": "0:31:48"}
{"current_steps": 3510, "total_steps": 4060, "loss": 0.2241, "lr": 2.2025638237706294e-06, "epoch": 6.051724137931035, "percentage": 86.45, "elapsed_time": "3:21:08", "remaining_time": "0:31:31"}
{"current_steps": 3515, "total_steps": 4060, "loss": 0.2314, "lr": 2.1635048556923555e-06, "epoch": 6.060344827586207, "percentage": 86.58, "elapsed_time": "3:21:28", "remaining_time": "0:31:14"}
{"current_steps": 3520, "total_steps": 4060, "loss": 0.2125, "lr": 2.1247755060688856e-06, "epoch": 6.068965517241379, "percentage": 86.7, "elapsed_time": "3:21:41", "remaining_time": "0:30:56"}
{"current_steps": 3525, "total_steps": 4060, "loss": 0.2403, "lr": 2.0863764906186514e-06, "epoch": 6.077586206896552, "percentage": 86.82, "elapsed_time": "3:22:00", "remaining_time": "0:30:39"}
{"current_steps": 3530, "total_steps": 4060, "loss": 0.2031, "lr": 2.048308518955515e-06, "epoch": 6.086206896551724, "percentage": 86.95, "elapsed_time": "3:22:22", "remaining_time": "0:30:23"}
{"current_steps": 3535, "total_steps": 4060, "loss": 0.2636, "lr": 2.010572294575641e-06, "epoch": 6.094827586206897, "percentage": 87.07, "elapsed_time": "3:22:41", "remaining_time": "0:30:06"}
{"current_steps": 3540, "total_steps": 4060, "loss": 0.2792, "lr": 1.9731685148445168e-06, "epoch": 6.103448275862069, "percentage": 87.19, "elapsed_time": "3:23:08", "remaining_time": "0:29:50"}
{"current_steps": 3545, "total_steps": 4060, "loss": 0.2133, "lr": 1.9360978709840304e-06, "epoch": 6.112068965517241, "percentage": 87.32, "elapsed_time": "3:23:23", "remaining_time": "0:29:32"}
{"current_steps": 3550, "total_steps": 4060, "loss": 0.2409, "lr": 1.8993610480597359e-06, "epoch": 6.120689655172414, "percentage": 87.44, "elapsed_time": "3:23:38", "remaining_time": "0:29:15"}
{"current_steps": 3555, "total_steps": 4060, "loss": 0.1985, "lr": 1.8629587249681802e-06, "epoch": 6.129310344827586, "percentage": 87.56, "elapsed_time": "3:23:52", "remaining_time": "0:28:57"}
{"current_steps": 3560, "total_steps": 4060, "loss": 0.2246, "lr": 1.8268915744243321e-06, "epoch": 6.137931034482759, "percentage": 87.68, "elapsed_time": "3:24:11", "remaining_time": "0:28:40"}
{"current_steps": 3565, "total_steps": 4060, "loss": 0.2997, "lr": 1.7911602629491876e-06, "epoch": 6.146551724137931, "percentage": 87.81, "elapsed_time": "3:24:32", "remaining_time": "0:28:24"}
{"current_steps": 3570, "total_steps": 4060, "loss": 0.2505, "lr": 1.7557654508574339e-06, "epoch": 6.155172413793103, "percentage": 87.93, "elapsed_time": "3:24:44", "remaining_time": "0:28:06"}
{"current_steps": 3575, "total_steps": 4060, "loss": 0.2367, "lr": 1.7207077922452465e-06, "epoch": 6.163793103448276, "percentage": 88.05, "elapsed_time": "3:24:57", "remaining_time": "0:27:48"}
{"current_steps": 3580, "total_steps": 4060, "loss": 0.2092, "lr": 1.6859879349782016e-06, "epoch": 6.172413793103448, "percentage": 88.18, "elapsed_time": "3:25:08", "remaining_time": "0:27:30"}
{"current_steps": 3585, "total_steps": 4060, "loss": 0.2474, "lr": 1.6516065206793142e-06, "epoch": 6.181034482758621, "percentage": 88.3, "elapsed_time": "3:25:23", "remaining_time": "0:27:12"}
{"current_steps": 3590, "total_steps": 4060, "loss": 0.1913, "lr": 1.6175641847171687e-06, "epoch": 6.189655172413793, "percentage": 88.42, "elapsed_time": "3:25:38", "remaining_time": "0:26:55"}
{"current_steps": 3595, "total_steps": 4060, "loss": 0.2323, "lr": 1.5838615561941705e-06, "epoch": 6.198275862068965, "percentage": 88.55, "elapsed_time": "3:25:56", "remaining_time": "0:26:38"}
{"current_steps": 3600, "total_steps": 4060, "loss": 0.2557, "lr": 1.550499257934952e-06, "epoch": 6.206896551724138, "percentage": 88.67, "elapsed_time": "3:26:17", "remaining_time": "0:26:21"}
{"current_steps": 3605, "total_steps": 4060, "loss": 0.2743, "lr": 1.5174779064748246e-06, "epoch": 6.2155172413793105, "percentage": 88.79, "elapsed_time": "3:26:32", "remaining_time": "0:26:04"}
{"current_steps": 3610, "total_steps": 4060, "loss": 0.2045, "lr": 1.4847981120484089e-06, "epoch": 6.224137931034483, "percentage": 88.92, "elapsed_time": "3:26:49", "remaining_time": "0:25:46"}
{"current_steps": 3615, "total_steps": 4060, "loss": 0.2456, "lr": 1.4524604785783548e-06, "epoch": 6.232758620689655, "percentage": 89.04, "elapsed_time": "3:27:02", "remaining_time": "0:25:29"}
{"current_steps": 3620, "total_steps": 4060, "loss": 0.235, "lr": 1.4204656036641717e-06, "epoch": 6.241379310344827, "percentage": 89.16, "elapsed_time": "3:27:24", "remaining_time": "0:25:12"}
{"current_steps": 3625, "total_steps": 4060, "loss": 0.2274, "lr": 1.3888140785711945e-06, "epoch": 6.25, "percentage": 89.29, "elapsed_time": "3:27:44", "remaining_time": "0:24:55"}
{"current_steps": 3630, "total_steps": 4060, "loss": 0.2241, "lr": 1.3575064882196398e-06, "epoch": 6.258620689655173, "percentage": 89.41, "elapsed_time": "3:28:07", "remaining_time": "0:24:39"}
{"current_steps": 3635, "total_steps": 4060, "loss": 0.2764, "lr": 1.326543411173833e-06, "epoch": 6.267241379310345, "percentage": 89.53, "elapsed_time": "3:28:27", "remaining_time": "0:24:22"}
{"current_steps": 3640, "total_steps": 4060, "loss": 0.2381, "lr": 1.295925419631474e-06, "epoch": 6.275862068965517, "percentage": 89.66, "elapsed_time": "3:28:41", "remaining_time": "0:24:04"}
{"current_steps": 3645, "total_steps": 4060, "loss": 0.2255, "lr": 1.265653079413094e-06, "epoch": 6.2844827586206895, "percentage": 89.78, "elapsed_time": "3:28:52", "remaining_time": "0:23:46"}
{"current_steps": 3650, "total_steps": 4060, "loss": 0.2098, "lr": 1.2357269499515745e-06, "epoch": 6.293103448275862, "percentage": 89.9, "elapsed_time": "3:29:05", "remaining_time": "0:23:29"}
{"current_steps": 3655, "total_steps": 4060, "loss": 0.1923, "lr": 1.2061475842818337e-06, "epoch": 6.301724137931035, "percentage": 90.02, "elapsed_time": "3:29:21", "remaining_time": "0:23:11"}
{"current_steps": 3660, "total_steps": 4060, "loss": 0.2001, "lr": 1.176915529030589e-06, "epoch": 6.310344827586207, "percentage": 90.15, "elapsed_time": "3:29:39", "remaining_time": "0:22:54"}
{"current_steps": 3665, "total_steps": 4060, "loss": 0.224, "lr": 1.1480313244062603e-06, "epoch": 6.318965517241379, "percentage": 90.27, "elapsed_time": "3:29:55", "remaining_time": "0:22:37"}
{"current_steps": 3670, "total_steps": 4060, "loss": 0.2747, "lr": 1.1194955041889898e-06, "epoch": 6.327586206896552, "percentage": 90.39, "elapsed_time": "3:30:15", "remaining_time": "0:22:20"}
{"current_steps": 3675, "total_steps": 4060, "loss": 0.2437, "lr": 1.0913085957207748e-06, "epoch": 6.336206896551724, "percentage": 90.52, "elapsed_time": "3:30:34", "remaining_time": "0:22:03"}
{"current_steps": 3680, "total_steps": 4060, "loss": 0.2513, "lr": 1.063471119895727e-06, "epoch": 6.344827586206897, "percentage": 90.64, "elapsed_time": "3:30:46", "remaining_time": "0:21:45"}
{"current_steps": 3685, "total_steps": 4060, "loss": 0.2238, "lr": 1.0359835911504246e-06, "epoch": 6.353448275862069, "percentage": 90.76, "elapsed_time": "3:31:03", "remaining_time": "0:21:28"}
{"current_steps": 3690, "total_steps": 4060, "loss": 0.2239, "lr": 1.0088465174544514e-06, "epoch": 6.362068965517241, "percentage": 90.89, "elapsed_time": "3:31:19", "remaining_time": "0:21:11"}
{"current_steps": 3695, "total_steps": 4060, "loss": 0.2394, "lr": 9.820604003009614e-07, "epoch": 6.370689655172414, "percentage": 91.01, "elapsed_time": "3:31:36", "remaining_time": "0:20:54"}
{"current_steps": 3700, "total_steps": 4060, "loss": 0.2327, "lr": 9.556257346974319e-07, "epoch": 6.379310344827586, "percentage": 91.13, "elapsed_time": "3:31:53", "remaining_time": "0:20:36"}
{"current_steps": 3705, "total_steps": 4060, "loss": 0.225, "lr": 9.295430091565261e-07, "epoch": 6.387931034482759, "percentage": 91.26, "elapsed_time": "3:32:07", "remaining_time": "0:20:19"}
{"current_steps": 3710, "total_steps": 4060, "loss": 0.1977, "lr": 9.038127056870416e-07, "epoch": 6.396551724137931, "percentage": 91.38, "elapsed_time": "3:32:26", "remaining_time": "0:20:02"}
{"current_steps": 3715, "total_steps": 4060, "loss": 0.2323, "lr": 8.784352997850277e-07, "epoch": 6.405172413793103, "percentage": 91.5, "elapsed_time": "3:32:41", "remaining_time": "0:19:45"}
{"current_steps": 3720, "total_steps": 4060, "loss": 0.2296, "lr": 8.534112604249789e-07, "epoch": 6.413793103448276, "percentage": 91.63, "elapsed_time": "3:32:54", "remaining_time": "0:19:27"}
{"current_steps": 3725, "total_steps": 4060, "loss": 0.2189, "lr": 8.287410500511739e-07, "epoch": 6.422413793103448, "percentage": 91.75, "elapsed_time": "3:33:10", "remaining_time": "0:19:10"}
{"current_steps": 3730, "total_steps": 4060, "loss": 0.2151, "lr": 8.044251245691393e-07, "epoch": 6.431034482758621, "percentage": 91.87, "elapsed_time": "3:33:37", "remaining_time": "0:18:53"}
{"current_steps": 3735, "total_steps": 4060, "loss": 0.2285, "lr": 7.804639333372077e-07, "epoch": 6.439655172413793, "percentage": 92.0, "elapsed_time": "3:33:54", "remaining_time": "0:18:36"}
{"current_steps": 3740, "total_steps": 4060, "loss": 0.2448, "lr": 7.568579191582248e-07, "epoch": 6.448275862068965, "percentage": 92.12, "elapsed_time": "3:34:09", "remaining_time": "0:18:19"}
{"current_steps": 3745, "total_steps": 4060, "loss": 0.229, "lr": 7.336075182713708e-07, "epoch": 6.456896551724138, "percentage": 92.24, "elapsed_time": "3:34:30", "remaining_time": "0:18:02"}
{"current_steps": 3750, "total_steps": 4060, "loss": 0.2076, "lr": 7.107131603440809e-07, "epoch": 6.4655172413793105, "percentage": 92.36, "elapsed_time": "3:34:47", "remaining_time": "0:17:45"}
{"current_steps": 3755, "total_steps": 4060, "loss": 0.2216, "lr": 6.881752684641219e-07, "epoch": 6.474137931034483, "percentage": 92.49, "elapsed_time": "3:35:02", "remaining_time": "0:17:27"}
{"current_steps": 3760, "total_steps": 4060, "loss": 0.1965, "lr": 6.659942591317703e-07, "epoch": 6.482758620689655, "percentage": 92.61, "elapsed_time": "3:35:19", "remaining_time": "0:17:10"}
{"current_steps": 3765, "total_steps": 4060, "loss": 0.2343, "lr": 6.441705422521072e-07, "epoch": 6.491379310344827, "percentage": 92.73, "elapsed_time": "3:35:40", "remaining_time": "0:16:53"}
{"current_steps": 3770, "total_steps": 4060, "loss": 0.2119, "lr": 6.22704521127444e-07, "epoch": 6.5, "percentage": 92.86, "elapsed_time": "3:35:58", "remaining_time": "0:16:36"}
{"current_steps": 3775, "total_steps": 4060, "loss": 0.2217, "lr": 6.015965924498912e-07, "epoch": 6.508620689655173, "percentage": 92.98, "elapsed_time": "3:36:10", "remaining_time": "0:16:19"}
{"current_steps": 3780, "total_steps": 4060, "loss": 0.2165, "lr": 5.808471462939946e-07, "epoch": 6.517241379310345, "percentage": 93.1, "elapsed_time": "3:36:24", "remaining_time": "0:16:01"}
{"current_steps": 3785, "total_steps": 4060, "loss": 0.2125, "lr": 5.604565661095484e-07, "epoch": 6.525862068965517, "percentage": 93.23, "elapsed_time": "3:36:42", "remaining_time": "0:15:44"}
{"current_steps": 3790, "total_steps": 4060, "loss": 0.2221, "lr": 5.404252287145006e-07, "epoch": 6.5344827586206895, "percentage": 93.35, "elapsed_time": "3:36:58", "remaining_time": "0:15:27"}
{"current_steps": 3795, "total_steps": 4060, "loss": 0.2277, "lr": 5.207535042879963e-07, "epoch": 6.543103448275862, "percentage": 93.47, "elapsed_time": "3:37:17", "remaining_time": "0:15:10"}
{"current_steps": 3800, "total_steps": 4060, "loss": 0.2514, "lr": 5.014417563635276e-07, "epoch": 6.551724137931035, "percentage": 93.6, "elapsed_time": "3:37:38", "remaining_time": "0:14:53"}
{"current_steps": 3805, "total_steps": 4060, "loss": 0.2293, "lr": 4.824903418222259e-07, "epoch": 6.560344827586206, "percentage": 93.72, "elapsed_time": "3:37:57", "remaining_time": "0:14:36"}
{"current_steps": 3810, "total_steps": 4060, "loss": 0.2457, "lr": 4.638996108862559e-07, "epoch": 6.568965517241379, "percentage": 93.84, "elapsed_time": "3:38:15", "remaining_time": "0:14:19"}
{"current_steps": 3815, "total_steps": 4060, "loss": 0.2297, "lr": 4.456699071123538e-07, "epoch": 6.577586206896552, "percentage": 93.97, "elapsed_time": "3:38:30", "remaining_time": "0:14:01"}
{"current_steps": 3820, "total_steps": 4060, "loss": 0.2499, "lr": 4.2780156738546407e-07, "epoch": 6.586206896551724, "percentage": 94.09, "elapsed_time": "3:38:45", "remaining_time": "0:13:44"}
{"current_steps": 3825, "total_steps": 4060, "loss": 0.2736, "lr": 4.1029492191253296e-07, "epoch": 6.594827586206897, "percentage": 94.21, "elapsed_time": "3:39:04", "remaining_time": "0:13:27"}
{"current_steps": 3830, "total_steps": 4060, "loss": 0.2305, "lr": 3.931502942163956e-07, "epoch": 6.603448275862069, "percentage": 94.33, "elapsed_time": "3:39:17", "remaining_time": "0:13:10"}
{"current_steps": 3835, "total_steps": 4060, "loss": 0.2294, "lr": 3.763680011297921e-07, "epoch": 6.612068965517241, "percentage": 94.46, "elapsed_time": "3:39:36", "remaining_time": "0:12:53"}
{"current_steps": 3840, "total_steps": 4060, "loss": 0.1897, "lr": 3.599483527895231e-07, "epoch": 6.620689655172414, "percentage": 94.58, "elapsed_time": "3:40:07", "remaining_time": "0:12:36"}
{"current_steps": 3845, "total_steps": 4060, "loss": 0.2054, "lr": 3.4389165263071233e-07, "epoch": 6.629310344827586, "percentage": 94.7, "elapsed_time": "3:40:20", "remaining_time": "0:12:19"}
{"current_steps": 3850, "total_steps": 4060, "loss": 0.2692, "lr": 3.2819819738119983e-07, "epoch": 6.637931034482759, "percentage": 94.83, "elapsed_time": "3:40:45", "remaining_time": "0:12:02"}
{"current_steps": 3855, "total_steps": 4060, "loss": 0.2167, "lr": 3.1286827705605984e-07, "epoch": 6.646551724137931, "percentage": 94.95, "elapsed_time": "3:41:01", "remaining_time": "0:11:45"}
{"current_steps": 3860, "total_steps": 4060, "loss": 0.1909, "lr": 2.979021749522448e-07, "epoch": 6.655172413793103, "percentage": 95.07, "elapsed_time": "3:41:21", "remaining_time": "0:11:28"}
{"current_steps": 3865, "total_steps": 4060, "loss": 0.2157, "lr": 2.833001676433367e-07, "epoch": 6.663793103448276, "percentage": 95.2, "elapsed_time": "3:41:40", "remaining_time": "0:11:11"}
{"current_steps": 3870, "total_steps": 4060, "loss": 0.2458, "lr": 2.690625249744572e-07, "epoch": 6.672413793103448, "percentage": 95.32, "elapsed_time": "3:41:54", "remaining_time": "0:10:53"}
{"current_steps": 3875, "total_steps": 4060, "loss": 0.2078, "lr": 2.551895100572566e-07, "epoch": 6.681034482758621, "percentage": 95.44, "elapsed_time": "3:42:09", "remaining_time": "0:10:36"}
{"current_steps": 3880, "total_steps": 4060, "loss": 0.2288, "lr": 2.4168137926506854e-07, "epoch": 6.689655172413794, "percentage": 95.57, "elapsed_time": "3:42:22", "remaining_time": "0:10:18"}
{"current_steps": 3885, "total_steps": 4060, "loss": 0.2171, "lr": 2.2853838222817616e-07, "epoch": 6.698275862068965, "percentage": 95.69, "elapsed_time": "3:42:37", "remaining_time": "0:10:01"}
{"current_steps": 3890, "total_steps": 4060, "loss": 0.2266, "lr": 2.1576076182917794e-07, "epoch": 6.706896551724138, "percentage": 95.81, "elapsed_time": "3:42:51", "remaining_time": "0:09:44"}
{"current_steps": 3895, "total_steps": 4060, "loss": 0.2324, "lr": 2.0334875419851573e-07, "epoch": 6.7155172413793105, "percentage": 95.94, "elapsed_time": "3:43:05", "remaining_time": "0:09:27"}
{"current_steps": 3900, "total_steps": 4060, "loss": 0.2558, "lr": 1.9130258871011165e-07, "epoch": 6.724137931034483, "percentage": 96.06, "elapsed_time": "3:43:16", "remaining_time": "0:09:09"}
{"current_steps": 3905, "total_steps": 4060, "loss": 0.2185, "lr": 1.7962248797711356e-07, "epoch": 6.732758620689655, "percentage": 96.18, "elapsed_time": "3:43:31", "remaining_time": "0:08:52"}
{"current_steps": 3910, "total_steps": 4060, "loss": 0.2359, "lr": 1.683086678478074e-07, "epoch": 6.741379310344827, "percentage": 96.31, "elapsed_time": "3:43:51", "remaining_time": "0:08:35"}
{"current_steps": 3915, "total_steps": 4060, "loss": 0.2452, "lr": 1.573613374015981e-07, "epoch": 6.75, "percentage": 96.43, "elapsed_time": "3:44:10", "remaining_time": "0:08:18"}
{"current_steps": 3920, "total_steps": 4060, "loss": 0.2113, "lr": 1.4678069894517033e-07, "epoch": 6.758620689655173, "percentage": 96.55, "elapsed_time": "3:44:29", "remaining_time": "0:08:01"}
{"current_steps": 3925, "total_steps": 4060, "loss": 0.2195, "lr": 1.3656694800873614e-07, "epoch": 6.767241379310345, "percentage": 96.67, "elapsed_time": "3:44:44", "remaining_time": "0:07:43"}
{"current_steps": 3930, "total_steps": 4060, "loss": 0.2235, "lr": 1.2672027334242887e-07, "epoch": 6.775862068965517, "percentage": 96.8, "elapsed_time": "3:44:58", "remaining_time": "0:07:26"}
{"current_steps": 3935, "total_steps": 4060, "loss": 0.2128, "lr": 1.1724085691280806e-07, "epoch": 6.7844827586206895, "percentage": 96.92, "elapsed_time": "3:45:14", "remaining_time": "0:07:09"}
{"current_steps": 3940, "total_steps": 4060, "loss": 0.2473, "lr": 1.0812887389950233e-07, "epoch": 6.793103448275862, "percentage": 97.04, "elapsed_time": "3:45:34", "remaining_time": "0:06:52"}
{"current_steps": 3945, "total_steps": 4060, "loss": 0.2282, "lr": 9.938449269197181e-08, "epoch": 6.801724137931035, "percentage": 97.17, "elapsed_time": "3:45:48", "remaining_time": "0:06:34"}
{"current_steps": 3950, "total_steps": 4060, "loss": 0.2283, "lr": 9.100787488639295e-08, "epoch": 6.810344827586206, "percentage": 97.29, "elapsed_time": "3:46:01", "remaining_time": "0:06:17"}
{"current_steps": 3955, "total_steps": 4060, "loss": 0.2147, "lr": 8.299917528267198e-08, "epoch": 6.818965517241379, "percentage": 97.41, "elapsed_time": "3:46:14", "remaining_time": "0:06:00"}
{"current_steps": 3960, "total_steps": 4060, "loss": 0.2234, "lr": 7.535854188159164e-08, "epoch": 6.827586206896552, "percentage": 97.54, "elapsed_time": "3:46:27", "remaining_time": "0:05:43"}
{"current_steps": 3965, "total_steps": 4060, "loss": 0.2269, "lr": 6.808611588206448e-08, "epoch": 6.836206896551724, "percentage": 97.66, "elapsed_time": "3:46:50", "remaining_time": "0:05:26"}
{"current_steps": 3970, "total_steps": 4060, "loss": 0.2337, "lr": 6.11820316785372e-08, "epoch": 6.844827586206897, "percentage": 97.78, "elapsed_time": "3:47:09", "remaining_time": "0:05:08"}
{"current_steps": 3975, "total_steps": 4060, "loss": 0.2573, "lr": 5.464641685849259e-08, "epoch": 6.853448275862069, "percentage": 97.91, "elapsed_time": "3:47:28", "remaining_time": "0:04:51"}
{"current_steps": 3980, "total_steps": 4060, "loss": 0.2287, "lr": 4.8479392200100336e-08, "epoch": 6.862068965517241, "percentage": 98.03, "elapsed_time": "3:47:42", "remaining_time": "0:04:34"}
{"current_steps": 3985, "total_steps": 4060, "loss": 0.2561, "lr": 4.268107166998769e-08, "epoch": 6.870689655172414, "percentage": 98.15, "elapsed_time": "3:48:02", "remaining_time": "0:04:17"}
{"current_steps": 3990, "total_steps": 4060, "loss": 0.2197, "lr": 3.7251562421123375e-08, "epoch": 6.879310344827586, "percentage": 98.28, "elapsed_time": "3:48:23", "remaining_time": "0:04:00"}
{"current_steps": 3995, "total_steps": 4060, "loss": 0.2174, "lr": 3.219096479084804e-08, "epoch": 6.887931034482759, "percentage": 98.4, "elapsed_time": "3:48:39", "remaining_time": "0:03:43"}
{"current_steps": 4000, "total_steps": 4060, "loss": 0.2369, "lr": 2.749937229901134e-08, "epoch": 6.896551724137931, "percentage": 98.52, "elapsed_time": "3:48:58", "remaining_time": "0:03:26"}
{"current_steps": 4005, "total_steps": 4060, "loss": 0.2656, "lr": 2.317687164624882e-08, "epoch": 6.905172413793103, "percentage": 98.65, "elapsed_time": "3:49:10", "remaining_time": "0:03:08"}
{"current_steps": 4010, "total_steps": 4060, "loss": 0.2477, "lr": 1.9223542712381026e-08, "epoch": 6.913793103448276, "percentage": 98.77, "elapsed_time": "3:49:27", "remaining_time": "0:02:51"}
{"current_steps": 4015, "total_steps": 4060, "loss": 0.2504, "lr": 1.563945855492799e-08, "epoch": 6.922413793103448, "percentage": 98.89, "elapsed_time": "3:49:54", "remaining_time": "0:02:34"}
{"current_steps": 4020, "total_steps": 4060, "loss": 0.2139, "lr": 1.242468540777253e-08, "epoch": 6.931034482758621, "percentage": 99.01, "elapsed_time": "3:50:07", "remaining_time": "0:02:17"}
{"current_steps": 4025, "total_steps": 4060, "loss": 0.2623, "lr": 9.579282679927915e-09, "epoch": 6.939655172413794, "percentage": 99.14, "elapsed_time": "3:50:27", "remaining_time": "0:02:00"}
{"current_steps": 4030, "total_steps": 4060, "loss": 0.2419, "lr": 7.1033029544365085e-09, "epoch": 6.948275862068965, "percentage": 99.26, "elapsed_time": "3:50:43", "remaining_time": "0:01:43"}
{"current_steps": 4035, "total_steps": 4060, "loss": 0.1926, "lr": 4.996791987410543e-09, "epoch": 6.956896551724138, "percentage": 99.38, "elapsed_time": "3:51:00", "remaining_time": "0:01:25"}
{"current_steps": 4040, "total_steps": 4060, "loss": 0.2278, "lr": 3.2597887071750266e-09, "epoch": 6.9655172413793105, "percentage": 99.51, "elapsed_time": "3:51:17", "remaining_time": "0:01:08"}
{"current_steps": 4045, "total_steps": 4060, "loss": 0.2269, "lr": 1.892325213552759e-09, "epoch": 6.974137931034483, "percentage": 99.63, "elapsed_time": "3:51:34", "remaining_time": "0:00:51"}
{"current_steps": 4050, "total_steps": 4060, "loss": 0.1967, "lr": 8.944267772692527e-10, "epoch": 6.982758620689655, "percentage": 99.75, "elapsed_time": "3:51:47", "remaining_time": "0:00:34"}
{"current_steps": 4055, "total_steps": 4060, "loss": 0.1941, "lr": 2.66111839490879e-10, "epoch": 6.991379310344827, "percentage": 99.88, "elapsed_time": "3:52:03", "remaining_time": "0:00:17"}
{"current_steps": 4060, "total_steps": 4060, "loss": 0.2152, "lr": 7.392011478479787e-12, "epoch": 7.0, "percentage": 100.0, "elapsed_time": "3:52:18", "remaining_time": "0:00:00"}
{"current_steps": 4060, "total_steps": 4060, "epoch": 7.0, "percentage": 100.0, "elapsed_time": "3:52:28", "remaining_time": "0:00:00"}
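The records above are JSONL trainer-log lines: one JSON object per logging step, each carrying `current_steps`, `loss`, `lr`, `epoch`, and timing fields, with a final summary record that omits `loss`. A minimal sketch of extracting the loss curve from such lines, using only the Python standard library; `parse_log_lines` is a hypothetical helper for illustration, not part of the training framework that produced this log:

```python
import json

def parse_log_lines(lines):
    """Return (steps, losses) from JSONL trainer-log lines.

    Records without a "loss" key (e.g. the final summary record)
    are skipped, as are blank lines.
    """
    steps, losses = [], []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        record = json.loads(line)
        if "loss" in record:
            steps.append(record["current_steps"])
            losses.append(record["loss"])
    return steps, losses

# Sample records copied from the log above.
sample = [
    '{"current_steps": 4055, "total_steps": 4060, "loss": 0.1941, "epoch": 6.991379310344827}',
    '{"current_steps": 4060, "total_steps": 4060, "loss": 0.2152, "epoch": 7.0}',
    '{"current_steps": 4060, "total_steps": 4060, "epoch": 7.0, "percentage": 100.0}',
]
steps, losses = parse_log_lines(sample)
print(steps, losses)  # the summary record has no "loss" and is dropped
```

The resulting `steps`/`losses` lists can be fed directly to any plotting tool to reproduce a curve like the `training_loss.png` committed below.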
8979
trainer_state.json
Normal file
File diff suppressed because it is too large
3
training_args.bin
Normal file
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f7cee162e9aca776db0bd2870e099b596a2e4dd2cd73b91e43ab5e53f5292803
size 8593
BIN
training_loss.png
Normal file
Binary file not shown.
After: Size: 50 KiB
1
vocab.json
Normal file
File diff suppressed because one or more lines are too long