Initialize project; model provided by the ModelHub XC community
Model: laion/r2egym-31600-opt100k__Qwen3-8B Source: Original Platform
.gitattributes (vendored, new file, 36 lines)
@@ -0,0 +1,36 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
tokenizer.json filter=lfs diff=lfs merge=lfs -text
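The entries above are standard glob patterns routed through the LFS filter. A quick way to sanity-check which filenames would be LFS-tracked is to match them with Python's `fnmatch` (a minimal sketch using a subset of the patterns; it does not reproduce Git's full .gitattributes precedence or `**` semantics):

```python
from fnmatch import fnmatch

# Subset of the LFS patterns from the .gitattributes above
lfs_patterns = ["*.safetensors", "*.bin", "*.gz", "tokenizer.json", "*tfevents*"]

def is_lfs_tracked(path, patterns=lfs_patterns):
    """Return True if the path matches any of the LFS glob patterns."""
    return any(fnmatch(path, p) for p in patterns)
```

For example, `is_lfs_tracked("model-00001-of-00004.safetensors")` is True, while `config.json` matches nothing and stays a regular Git blob.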
README.md (new file, 61 lines)
@@ -0,0 +1,61 @@
---
library_name: transformers
license: other
base_model: Qwen/Qwen3-8B
tags:
- llama-factory
- full
- generated_from_trainer
model-index:
- name: r2egym-31600__Qwen3-8B
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# r2egym-31600__Qwen3-8B

This model is a fine-tuned version of [Qwen/Qwen3-8B](https://huggingface.co/Qwen/Qwen3-8B) on the /e/data1/datasets/playground/ot/hf_hub/datasets--laion--r2egym-unified-31600/snapshots/68e1b38fd891a5a7c593dfcf25d1109f2dec75a5_thinking_preprocessed dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 4e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 32
- gradient_accumulation_steps: 3
- total_train_batch_size: 96
- total_eval_batch_size: 256
- optimizer: ADAMW_TORCH_FUSED with betas=(0.9, 0.98) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5.0

### Training results

### Framework versions

- Transformers 4.57.6
- Pytorch 2.9.1+cu130
- Datasets 4.7.0
- Tokenizers 0.22.2
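The `total_train_batch_size: 96` above is not an independent setting; it follows from the per-device batch size, gradient accumulation, and device count:

```python
# Effective global batch size implied by the hyperparameters above
train_batch_size = 1           # per-device micro-batch
gradient_accumulation_steps = 3
num_devices = 32

total_train_batch_size = train_batch_size * gradient_accumulation_steps * num_devices
# 1 * 3 * 32 = 96, matching the README
```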
added_tokens.json (new file, 28 lines)
@@ -0,0 +1,28 @@
{
  "</think>": 151668,
  "</tool_call>": 151658,
  "</tool_response>": 151666,
  "<think>": 151667,
  "<tool_call>": 151657,
  "<tool_response>": 151665,
  "<|box_end|>": 151649,
  "<|box_start|>": 151648,
  "<|endoftext|>": 151643,
  "<|file_sep|>": 151664,
  "<|fim_middle|>": 151660,
  "<|fim_pad|>": 151662,
  "<|fim_prefix|>": 151659,
  "<|fim_suffix|>": 151661,
  "<|im_end|>": 151645,
  "<|im_start|>": 151644,
  "<|image_pad|>": 151655,
  "<|object_ref_end|>": 151647,
  "<|object_ref_start|>": 151646,
  "<|quad_end|>": 151651,
  "<|quad_start|>": 151650,
  "<|repo_name|>": 151663,
  "<|video_pad|>": 151656,
  "<|vision_end|>": 151653,
  "<|vision_pad|>": 151654,
  "<|vision_start|>": 151652
}
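A useful property of this file is that the 26 special-token ids form one contiguous block at the top of the vocabulary, which the following sketch verifies from the mapping shown above:

```python
# The added-token mapping from added_tokens.json above
added_tokens = {
    "</think>": 151668, "</tool_call>": 151658, "</tool_response>": 151666,
    "<think>": 151667, "<tool_call>": 151657, "<tool_response>": 151665,
    "<|box_end|>": 151649, "<|box_start|>": 151648, "<|endoftext|>": 151643,
    "<|file_sep|>": 151664, "<|fim_middle|>": 151660, "<|fim_pad|>": 151662,
    "<|fim_prefix|>": 151659, "<|fim_suffix|>": 151661, "<|im_end|>": 151645,
    "<|im_start|>": 151644, "<|image_pad|>": 151655, "<|object_ref_end|>": 151647,
    "<|object_ref_start|>": 151646, "<|quad_end|>": 151651, "<|quad_start|>": 151650,
    "<|repo_name|>": 151663, "<|video_pad|>": 151656, "<|vision_end|>": 151653,
    "<|vision_pad|>": 151654, "<|vision_start|>": 151652,
}

ids = sorted(added_tokens.values())
# 26 ids covering 151643..151668 with no gaps
contiguous = ids == list(range(ids[0], ids[-1] + 1))
```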
all_results.json (new file, 16 lines)
@@ -0,0 +1,16 @@
{
  "achieved_tflops_per_gpu": 78127.88678515423,
  "achieved_tflops_per_gpu_theoretical": 6570244.6283924775,
  "epoch": 5.0,
  "loss_nan_ranks": 0,
  "loss_rank_avg": 0.00010255213419441134,
  "mfu_percent": 5521.405426512666,
  "mfu_percent_theoretical": 464328.242289221,
  "total_flos": 2.127078594457895e+18,
  "train_loss": 0.0,
  "train_runtime": 0.8508,
  "train_samples_per_second": 185705.011,
  "train_steps_per_second": 1939.324,
  "valid_targets_mean": 4467.4,
  "valid_targets_min": 1993
}
chat_template.jinja (new file, 89 lines)
@@ -0,0 +1,89 @@
{%- if tools %}
    {{- '<|im_start|>system\n' }}
    {%- if messages[0].role == 'system' %}
        {{- messages[0].content + '\n\n' }}
    {%- endif %}
    {{- "# Tools\n\nYou may call one or more functions to assist with the user query.\n\nYou are provided with function signatures within <tools></tools> XML tags:\n<tools>" }}
    {%- for tool in tools %}
        {{- "\n" }}
        {{- tool | tojson }}
    {%- endfor %}
    {{- "\n</tools>\n\nFor each function call, return a json object with function name and arguments within <tool_call></tool_call> XML tags:\n<tool_call>\n{\"name\": <function-name>, \"arguments\": <args-json-object>}\n</tool_call><|im_end|>\n" }}
{%- else %}
    {%- if messages[0].role == 'system' %}
        {{- '<|im_start|>system\n' + messages[0].content + '<|im_end|>\n' }}
    {%- endif %}
{%- endif %}
{%- set ns = namespace(multi_step_tool=true, last_query_index=messages|length - 1) %}
{%- for message in messages[::-1] %}
    {%- set index = (messages|length - 1) - loop.index0 %}
    {%- if ns.multi_step_tool and message.role == "user" and message.content is string and not(message.content.startswith('<tool_response>') and message.content.endswith('</tool_response>')) %}
        {%- set ns.multi_step_tool = false %}
        {%- set ns.last_query_index = index %}
    {%- endif %}
{%- endfor %}
{%- for message in messages %}
    {%- if message.content is string %}
        {%- set content = message.content %}
    {%- else %}
        {%- set content = '' %}
    {%- endif %}
    {%- if (message.role == "user") or (message.role == "system" and not loop.first) %}
        {{- '<|im_start|>' + message.role + '\n' + content + '<|im_end|>' + '\n' }}
    {%- elif message.role == "assistant" %}
        {%- set reasoning_content = '' %}
        {%- if message.reasoning_content is string %}
            {%- set reasoning_content = message.reasoning_content %}
        {%- else %}
            {%- if '</think>' in content %}
                {%- set reasoning_content = content.split('</think>')[0].rstrip('\n').split('<think>')[-1].lstrip('\n') %}
                {%- set content = content.split('</think>')[-1].lstrip('\n') %}
            {%- endif %}
        {%- endif %}
        {%- if loop.index0 > ns.last_query_index %}
            {%- if loop.last or (not loop.last and reasoning_content) %}
                {{- '<|im_start|>' + message.role + '\n<think>\n' + reasoning_content.strip('\n') + '\n</think>\n\n' + content.lstrip('\n') }}
            {%- else %}
                {{- '<|im_start|>' + message.role + '\n' + content }}
            {%- endif %}
        {%- else %}
            {{- '<|im_start|>' + message.role + '\n' + content }}
        {%- endif %}
        {%- if message.tool_calls %}
            {%- for tool_call in message.tool_calls %}
                {%- if (loop.first and content) or (not loop.first) %}
                    {{- '\n' }}
                {%- endif %}
                {%- if tool_call.function %}
                    {%- set tool_call = tool_call.function %}
                {%- endif %}
                {{- '<tool_call>\n{"name": "' }}
                {{- tool_call.name }}
                {{- '", "arguments": ' }}
                {%- if tool_call.arguments is string %}
                    {{- tool_call.arguments }}
                {%- else %}
                    {{- tool_call.arguments | tojson }}
                {%- endif %}
                {{- '}\n</tool_call>' }}
            {%- endfor %}
        {%- endif %}
        {{- '<|im_end|>\n' }}
    {%- elif message.role == "tool" %}
        {%- if loop.first or (messages[loop.index0 - 1].role != "tool") %}
            {{- '<|im_start|>user' }}
        {%- endif %}
        {{- '\n<tool_response>\n' }}
        {{- content }}
        {{- '\n</tool_response>' }}
        {%- if loop.last or (messages[loop.index0 + 1].role != "tool") %}
            {{- '<|im_end|>\n' }}
        {%- endif %}
    {%- endif %}
{%- endfor %}
{%- if add_generation_prompt %}
    {{- '<|im_start|>assistant\n' }}
    {%- if enable_thinking is defined and enable_thinking is false %}
        {{- '<think>\n\n</think>\n\n' }}
    {%- endif %}
{%- endif %}
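For messages without tools, tool calls, or reasoning extraction, the template above reduces to plain ChatML framing. The following is a minimal hand-rolled sketch of that simple path only (it does not implement the tool or `<think>`-splitting branches):

```python
def render_chatml(messages, add_generation_prompt=True, enable_thinking=True):
    """Sketch of the chat template's no-tools path: ChatML turn framing."""
    out = []
    for m in messages:
        out.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    if add_generation_prompt:
        out.append("<|im_start|>assistant\n")
        if not enable_thinking:
            # Mirrors the template's empty-think block when thinking is disabled
            out.append("<think>\n\n</think>\n\n")
    return "".join(out)

prompt = render_chatml([
    {"role": "system", "content": "You are helpful."},
    {"role": "user", "content": "Hi"},
])
```

In practice you would call `tokenizer.apply_chat_template(...)`, which renders the real template; this sketch only illustrates the output shape.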
config.json (new file, 68 lines)
@@ -0,0 +1,68 @@
{
  "architectures": [
    "Qwen3ForCausalLM"
  ],
  "attention_bias": false,
  "attention_dropout": 0.0,
  "dtype": "bfloat16",
  "eos_token_id": 151645,
  "head_dim": 128,
  "hidden_act": "silu",
  "hidden_size": 4096,
  "initializer_range": 0.02,
  "intermediate_size": 12288,
  "layer_types": [
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention"
  ],
  "max_position_embeddings": 40960,
  "max_window_layers": 36,
  "model_type": "qwen3",
  "num_attention_heads": 32,
  "num_hidden_layers": 36,
  "num_key_value_heads": 8,
  "pad_token_id": 151643,
  "rms_norm_eps": 1e-06,
  "rope_scaling": null,
  "rope_theta": 1000000,
  "sliding_window": null,
  "tie_word_embeddings": false,
  "transformers_version": "4.57.6",
  "use_cache": false,
  "use_sliding_window": false,
  "vocab_size": 151936
}
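The config above pins down the parameter count. A back-of-the-envelope tally (assuming the standard Qwen3 dense layout, with an untied `lm_head` per `tie_word_embeddings: false`) gives ~8.19B parameters, and at 2 bytes each in bfloat16 that is exactly the 16,381,470,720 bytes reported as `total_size` in model.safetensors.index.json below:

```python
# Parameter count derived from the config.json fields above
vocab, hidden, inter = 151936, 4096, 12288
layers, heads, kv_heads, head_dim = 36, 32, 8, 128

embed = vocab * hidden                            # token embeddings
attn = hidden * heads * head_dim * 2              # q_proj + o_proj
attn += hidden * kv_heads * head_dim * 2          # k_proj + v_proj (GQA: 8 KV heads)
attn += head_dim * 2                              # q_norm + k_norm (per-head RMSNorm)
mlp = 3 * hidden * inter                          # gate_proj, up_proj, down_proj
norms = 2 * hidden                                # input + post-attention layernorm
per_layer = attn + mlp + norms

total = 2 * embed + layers * per_layer + hidden   # untied lm_head + final RMSNorm
bf16_bytes = total * 2                            # bfloat16 = 2 bytes/param
# total == 8,190,735,360 params; bf16_bytes == 16,381,470,720
```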
generation_config.json (new file, 12 lines)
@@ -0,0 +1,12 @@
{
  "do_sample": true,
  "eos_token_id": [
    151645,
    151643
  ],
  "pad_token_id": 151643,
  "temperature": 0.6,
  "top_k": 20,
  "top_p": 0.95,
  "transformers_version": "4.57.6"
}
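The `temperature`, `top_k`, and `top_p` values above control how the next-token distribution is filtered before sampling. A self-contained sketch of that filtering pipeline (illustrative only; real decoders apply it over the full vocabulary in tensor form):

```python
import math

def sample_filter(logits, temperature=0.6, top_k=20, top_p=0.95):
    """Temperature-scale logits, then apply top-k and top-p (nucleus) filtering.

    Returns a renormalized {token_index: probability} dict over kept tokens.
    """
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]          # stable softmax
    z = sum(exps)
    probs = [e / z for e in exps]
    # top-k: keep only the k most probable tokens
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:top_k]
    # top-p: within those, keep the smallest prefix whose cumulative mass >= top_p
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break
    total = sum(probs[i] for i in kept)
    return {i: probs[i] / total for i in kept}
```

With the repo's defaults (0.6 / 20 / 0.95), sharper distributions survive the temperature scaling and low-probability tails are cut before sampling.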
merges.txt (new file, 151388 lines)
File diff suppressed because it is too large
model-00001-of-00004.safetensors (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:66086aefcfa8f98e0110210e562d707544051177577f74d20796b51145417ade
size 4902257696
model-00002-of-00004.safetensors (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:19371c898ad782ca0ffa4115d97923c7a80e42205808b94fc53359115a77e938
size 4915960368
model-00003-of-00004.safetensors (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:507421f90a4b7a7e595ffba37ece8c2c16a9fd4750e3a5cae9f4174a4339ae2a
size 4983068496
model-00004-of-00004.safetensors (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ffa605a2f2406955f1b5d88bf78b893dd46fd08209ed30919f1061a687218f14
size 1580230264
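The four shard files above are stored in Git as LFS pointer files: three `key value` lines each, with the real bytes living in LFS storage. A small parser makes the structure explicit (sketch; the format is defined by the Git LFS pointer spec at the `version` URL):

```python
def parse_lfs_pointer(text):
    """Parse a Git LFS pointer file into a {key: value} dict."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The pointer for model-00001-of-00004.safetensors above
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:66086aefcfa8f98e0110210e562d707544051177577f74d20796b51145417ade
size 4902257696"""

info = parse_lfs_pointer(pointer)
# info["oid"] names the content hash; info["size"] is the payload size in bytes
```

After `git lfs pull`, the checked-out file's SHA-256 must equal the `oid`, and its byte length the `size`.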
model.safetensors.index.json (new file, 407 lines)
@@ -0,0 +1,407 @@
{
  "metadata": {
    "total_parameters": 308224,
    "total_size": 16381470720
  },
  "weight_map": {
    "lm_head.weight": "model-00004-of-00004.safetensors",
    "model.embed_tokens.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.input_layernorm.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.input_layernorm.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.10.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.2.input_layernorm.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.20.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.22.input_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.22.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.22.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.22.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.22.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.22.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.22.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.22.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.22.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.22.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.22.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.23.input_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.input_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.input_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.26.input_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.26.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.26.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.26.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.26.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.26.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.26.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.26.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.26.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.26.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.26.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.27.input_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.27.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.27.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.27.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.27.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.27.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.27.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.27.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.27.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.27.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.27.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.28.input_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.28.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.28.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.28.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.28.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.28.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.28.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.28.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.28.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.28.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.28.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.29.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.29.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.29.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.29.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.29.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.29.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.29.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.29.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.29.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.29.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.29.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.3.input_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.30.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.30.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.30.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.30.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.30.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.30.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.30.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.30.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.30.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.30.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.30.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.31.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.31.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.31.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.31.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.31.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.31.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.31.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.31.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.31.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.31.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.31.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.32.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.32.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.32.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.32.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.32.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.32.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.32.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.32.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.32.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.32.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.32.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.33.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.33.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.33.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.33.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.33.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.33.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.33.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.33.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.33.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.33.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.33.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.34.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.34.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.34.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.34.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.34.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.34.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.34.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.34.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.34.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.34.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.34.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.35.input_layernorm.weight": "model-00004-of-00004.safetensors",
|
||||
"model.layers.35.mlp.down_proj.weight": "model-00004-of-00004.safetensors",
|
||||
"model.layers.35.mlp.gate_proj.weight": "model-00004-of-00004.safetensors",
|
||||
"model.layers.35.mlp.up_proj.weight": "model-00004-of-00004.safetensors",
|
||||
"model.layers.35.post_attention_layernorm.weight": "model-00004-of-00004.safetensors",
|
||||
"model.layers.35.self_attn.k_norm.weight": "model-00004-of-00004.safetensors",
|
||||
"model.layers.35.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.35.self_attn.o_proj.weight": "model-00004-of-00004.safetensors",
|
||||
"model.layers.35.self_attn.q_norm.weight": "model-00004-of-00004.safetensors",
|
||||
"model.layers.35.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.35.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.4.input_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.input_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.input_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.input_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.input_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.9.input_layernorm.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.9.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.9.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.9.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.9.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.9.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.9.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.9.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.9.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.9.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.9.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.norm.weight": "model-00004-of-00004.safetensors"
|
||||
}
|
||||
}
|
||||
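The mapping above is the tail of a standard Hugging Face `model.safetensors.index.json`: its `weight_map` sends each parameter name to the shard file that stores it. Note that shards are split by size rather than by layer, which is why layer 35's attention projections sit in shard 3 while the rest of that layer sits in shard 4. A minimal sketch of resolving a tensor's shard from such an index (the two-entry index here is an in-memory stand-in for the real file):

```python
def shard_for(index: dict, tensor_name: str) -> str:
    """Return the shard file that stores `tensor_name`, per the index's weight_map."""
    return index["weight_map"][tensor_name]

# Tiny in-memory index mirroring two entries from the file above; a real index
# would be loaded with json.load(open("model.safetensors.index.json")).
index = {
    "weight_map": {
        "model.layers.35.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
        "model.layers.35.input_layernorm.weight": "model-00004-of-00004.safetensors",
    }
}
```

`transformers` resolves shards through exactly this kind of lookup when loading a sharded checkpoint, so the index must cover every tensor the model expects.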
12
run_summary.json
Normal file
@@ -0,0 +1,12 @@
{
"agent_name": "68e1b38fd891a5a7c593dfcf25d1109f2dec75a5_thinking_preprocessed",
"training_start": null,
"training_end": null,
"created_by": "DCAgent",
"base_model_name": "Qwen/Qwen3-8B",
"dataset_name": "/e/data1/datasets/playground/ot/hf_hub/datasets--laion--r2egym-unified-31600/snapshots/68e1b38fd891a5a7c593dfcf25d1109f2dec75a5_thinking_preprocessed",
"training_type": "SFT",
"training_parameters": "https://huggingface.co/laion/r2egym-unified-31600-opt100k__Qwen3-8B/blob/main/config.json",
"wandb_link": null,
"traces_location_s3": null
}
31
special_tokens_map.json
Normal file
@@ -0,0 +1,31 @@
{
"additional_special_tokens": [
"<|im_start|>",
"<|im_end|>",
"<|object_ref_start|>",
"<|object_ref_end|>",
"<|box_start|>",
"<|box_end|>",
"<|quad_start|>",
"<|quad_end|>",
"<|vision_start|>",
"<|vision_end|>",
"<|vision_pad|>",
"<|image_pad|>",
"<|video_pad|>"
],
"eos_token": {
"content": "<|im_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false
},
"pad_token": {
"content": "<|endoftext|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false
}
}
3
tokenizer.json
Normal file
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:aeb13307a71acd8fe81861d94ad54ab689df773318809eed3cbe794b4492dae4
size 11422654
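`tokenizer.json` is committed as a Git LFS pointer rather than as the real 11 MB file (matching the `tokenizer.json filter=lfs` rule in `.gitattributes`); the actual content is fetched by its sha256 oid at checkout. A small sketch of parsing such a v1 pointer into its fields:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a git-lfs v1 pointer file into a key -> value dict."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")  # each line is "key value"
        fields[key] = value
    return fields

# The pointer committed above, verbatim.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:aeb13307a71acd8fe81861d94ad54ab689df773318809eed3cbe794b4492dae4
size 11422654"""
```

If a clone ends up with this three-line file where the tokenizer should be, LFS smudging did not run (`git lfs pull` fetches the real object).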
240
tokenizer_config.json
Normal file
@@ -0,0 +1,240 @@
{
"add_bos_token": false,
"add_prefix_space": false,
"added_tokens_decoder": {
"151643": {
"content": "<|endoftext|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151644": {
"content": "<|im_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151645": {
"content": "<|im_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151646": {
"content": "<|object_ref_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151647": {
"content": "<|object_ref_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151648": {
"content": "<|box_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151649": {
"content": "<|box_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151650": {
"content": "<|quad_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151651": {
"content": "<|quad_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151652": {
"content": "<|vision_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151653": {
"content": "<|vision_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151654": {
"content": "<|vision_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151655": {
"content": "<|image_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151656": {
"content": "<|video_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151657": {
"content": "<tool_call>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151658": {
"content": "</tool_call>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151659": {
"content": "<|fim_prefix|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151660": {
"content": "<|fim_middle|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151661": {
"content": "<|fim_suffix|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151662": {
"content": "<|fim_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151663": {
"content": "<|repo_name|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151664": {
"content": "<|file_sep|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151665": {
"content": "<tool_response>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151666": {
"content": "</tool_response>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151667": {
"content": "<think>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151668": {
"content": "</think>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
}
},
"additional_special_tokens": [
"<|im_start|>",
"<|im_end|>",
"<|object_ref_start|>",
"<|object_ref_end|>",
"<|box_start|>",
"<|box_end|>",
"<|quad_start|>",
"<|quad_end|>",
"<|vision_start|>",
"<|vision_end|>",
"<|vision_pad|>",
"<|image_pad|>",
"<|video_pad|>"
],
"bos_token": null,
"clean_up_tokenization_spaces": false,
"eos_token": "<|im_end|>",
"errors": "replace",
"extra_special_tokens": {},
"model_max_length": 32768,
"pad_token": "<|endoftext|>",
"padding_side": "right",
"split_special_tokens": false,
"tokenizer_class": "Qwen2Tokenizer",
"unk_token": null
}
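In `tokenizer_config.json`, `added_tokens_decoder` maps string token ids to token metadata: ids 151643–151656 are marked `"special": true` (control tokens like `<|im_end|>`), while later additions such as `<tool_call>` and `<think>` are not. A small sketch of inverting an excerpt of that mapping into a content-to-id lookup (the three-entry dict is a hypothetical excerpt, not the full table):

```python
# Excerpt of added_tokens_decoder (string id -> metadata), as in the config above.
added_tokens_decoder = {
    "151643": {"content": "<|endoftext|>", "special": True},
    "151645": {"content": "<|im_end|>", "special": True},
    "151667": {"content": "<think>", "special": False},
}

def token_to_id(decoder: dict) -> dict:
    """Invert the decoder into a token-content -> integer-id lookup."""
    return {meta["content"]: int(token_id) for token_id, meta in decoder.items()}
```

This is the same information a loaded tokenizer exposes via `convert_tokens_to_ids`; keeping the two consistent is what the config file is for.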
12
train_results.json
Normal file
@@ -0,0 +1,12 @@
{
"achieved_tflops_per_gpu": 78127.88678515423,
"achieved_tflops_per_gpu_theoretical": 6570244.6283924775,
"epoch": 5.0,
"mfu_percent": 5521.405426512666,
"mfu_percent_theoretical": 464328.242289221,
"total_flos": 2.127078594457895e+18,
"train_loss": 0.0,
"train_runtime": 0.8508,
"train_samples_per_second": 185705.011,
"train_steps_per_second": 1939.324
}
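The throughput figures in `train_results.json` are not physically plausible (an MFU above 100% is impossible): `train_runtime` is only 0.8508 s, apparently reset on a resumed run, while `total_flos` covers the whole run, so every derived rate is inflated. As a sketch, assuming the run used 32 GPUs (a guess consistent with the logged numbers, not stated in the file), the reported per-GPU TFLOPs is reproducible as total FLOPs over runtime over GPU count:

```python
# Values copied from train_results.json above.
total_flos = 2.127078594457895e+18
train_runtime = 0.8508  # seconds; suspiciously small for a 1650-step run
num_gpus = 32           # assumption: GPU count is not recorded in the file

# FLOPs per second per GPU, expressed in TFLOPs.
achieved_tflops_per_gpu = total_flos / train_runtime / num_gpus / 1e12
```

With a correct wall-clock runtime (the trainer log below suggests several hours), the same formula would yield a sane MFU.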
333
trainer_log.jsonl
Normal file
@@ -0,0 +1,333 @@
|
||||
{"current_steps": 5, "total_steps": 1650, "loss": 0.384, "lr": 9.696969696969698e-07, "epoch": 0.015182186234817813, "percentage": 0.3, "elapsed_time": "0:01:17", "remaining_time": "7:04:04"}
|
||||
{"current_steps": 10, "total_steps": 1650, "loss": 0.351, "lr": 2.181818181818182e-06, "epoch": 0.030364372469635626, "percentage": 0.61, "elapsed_time": "0:02:19", "remaining_time": "6:22:30"}
|
||||
{"current_steps": 15, "total_steps": 1650, "loss": 0.3127, "lr": 3.3939393939393946e-06, "epoch": 0.04554655870445344, "percentage": 0.91, "elapsed_time": "0:03:21", "remaining_time": "6:05:34"}
|
||||
{"current_steps": 20, "total_steps": 1650, "loss": 0.2616, "lr": 4.606060606060606e-06, "epoch": 0.06072874493927125, "percentage": 1.21, "elapsed_time": "0:04:22", "remaining_time": "5:56:20"}
|
||||
{"current_steps": 25, "total_steps": 1650, "loss": 0.2401, "lr": 5.8181818181818185e-06, "epoch": 0.07591093117408906, "percentage": 1.52, "elapsed_time": "0:05:19", "remaining_time": "5:46:10"}
|
||||
{"current_steps": 30, "total_steps": 1650, "loss": 0.1989, "lr": 7.030303030303031e-06, "epoch": 0.09109311740890688, "percentage": 1.82, "elapsed_time": "0:06:17", "remaining_time": "5:40:00"}
|
||||
{"current_steps": 35, "total_steps": 1650, "loss": 0.1716, "lr": 8.242424242424243e-06, "epoch": 0.1062753036437247, "percentage": 2.12, "elapsed_time": "0:07:17", "remaining_time": "5:36:21"}
|
||||
{"current_steps": 40, "total_steps": 1650, "loss": 0.1592, "lr": 9.454545454545456e-06, "epoch": 0.1214574898785425, "percentage": 2.42, "elapsed_time": "0:08:17", "remaining_time": "5:33:51"}
|
||||
{"current_steps": 45, "total_steps": 1650, "loss": 0.1538, "lr": 1.0666666666666667e-05, "epoch": 0.13663967611336034, "percentage": 2.73, "elapsed_time": "0:09:15", "remaining_time": "5:30:09"}
|
||||
{"current_steps": 50, "total_steps": 1650, "loss": 0.1416, "lr": 1.187878787878788e-05, "epoch": 0.15182186234817813, "percentage": 3.03, "elapsed_time": "0:10:13", "remaining_time": "5:27:06"}
|
||||
{"current_steps": 55, "total_steps": 1650, "loss": 0.1317, "lr": 1.3090909090909092e-05, "epoch": 0.16700404858299595, "percentage": 3.33, "elapsed_time": "0:11:16", "remaining_time": "5:27:02"}
|
||||
{"current_steps": 60, "total_steps": 1650, "loss": 0.1237, "lr": 1.4303030303030305e-05, "epoch": 0.18218623481781376, "percentage": 3.64, "elapsed_time": "0:12:20", "remaining_time": "5:26:52"}
|
||||
{"current_steps": 65, "total_steps": 1650, "loss": 0.1189, "lr": 1.5515151515151516e-05, "epoch": 0.19736842105263158, "percentage": 3.94, "elapsed_time": "0:13:19", "remaining_time": "5:25:02"}
|
||||
{"current_steps": 70, "total_steps": 1650, "loss": 0.1159, "lr": 1.672727272727273e-05, "epoch": 0.2125506072874494, "percentage": 4.24, "elapsed_time": "0:14:23", "remaining_time": "5:24:46"}
|
||||
{"current_steps": 75, "total_steps": 1650, "loss": 0.1114, "lr": 1.7939393939393942e-05, "epoch": 0.22773279352226722, "percentage": 4.55, "elapsed_time": "0:15:37", "remaining_time": "5:27:58"}
|
||||
{"current_steps": 80, "total_steps": 1650, "loss": 0.1089, "lr": 1.9151515151515152e-05, "epoch": 0.242914979757085, "percentage": 4.85, "elapsed_time": "0:16:38", "remaining_time": "5:26:37"}
|
||||
{"current_steps": 85, "total_steps": 1650, "loss": 0.1055, "lr": 2.0363636363636365e-05, "epoch": 0.25809716599190285, "percentage": 5.15, "elapsed_time": "0:17:35", "remaining_time": "5:23:55"}
|
||||
{"current_steps": 90, "total_steps": 1650, "loss": 0.1052, "lr": 2.1575757575757578e-05, "epoch": 0.2732793522267207, "percentage": 5.45, "elapsed_time": "0:18:35", "remaining_time": "5:22:11"}
|
||||
{"current_steps": 95, "total_steps": 1650, "loss": 0.0898, "lr": 2.278787878787879e-05, "epoch": 0.28846153846153844, "percentage": 5.76, "elapsed_time": "0:19:41", "remaining_time": "5:22:14"}
|
||||
{"current_steps": 100, "total_steps": 1650, "loss": 0.0932, "lr": 2.4e-05, "epoch": 0.30364372469635625, "percentage": 6.06, "elapsed_time": "0:20:36", "remaining_time": "5:19:21"}
|
||||
{"current_steps": 105, "total_steps": 1650, "loss": 0.0847, "lr": 2.5212121212121214e-05, "epoch": 0.3188259109311741, "percentage": 6.36, "elapsed_time": "0:21:34", "remaining_time": "5:17:24"}
|
||||
{"current_steps": 110, "total_steps": 1650, "loss": 0.0797, "lr": 2.6424242424242427e-05, "epoch": 0.3340080971659919, "percentage": 6.67, "elapsed_time": "0:22:34", "remaining_time": "5:15:57"}
|
||||
{"current_steps": 115, "total_steps": 1650, "loss": 0.079, "lr": 2.763636363636364e-05, "epoch": 0.3491902834008097, "percentage": 6.97, "elapsed_time": "0:23:36", "remaining_time": "5:15:05"}
|
||||
{"current_steps": 120, "total_steps": 1650, "loss": 0.0743, "lr": 2.884848484848485e-05, "epoch": 0.3643724696356275, "percentage": 7.27, "elapsed_time": "0:24:35", "remaining_time": "5:13:37"}
|
||||
{"current_steps": 125, "total_steps": 1650, "loss": 0.0674, "lr": 3.0060606060606062e-05, "epoch": 0.37955465587044535, "percentage": 7.58, "elapsed_time": "0:25:34", "remaining_time": "5:12:06"}
|
||||
{"current_steps": 130, "total_steps": 1650, "loss": 0.0648, "lr": 3.127272727272728e-05, "epoch": 0.39473684210526316, "percentage": 7.88, "elapsed_time": "0:26:37", "remaining_time": "5:11:14"}
|
||||
{"current_steps": 135, "total_steps": 1650, "loss": 0.0609, "lr": 3.2484848484848485e-05, "epoch": 0.409919028340081, "percentage": 8.18, "elapsed_time": "0:27:40", "remaining_time": "5:10:36"}
|
||||
{"current_steps": 140, "total_steps": 1650, "loss": 0.0617, "lr": 3.36969696969697e-05, "epoch": 0.4251012145748988, "percentage": 8.48, "elapsed_time": "0:28:42", "remaining_time": "5:09:35"}
|
||||
{"current_steps": 145, "total_steps": 1650, "loss": 0.0555, "lr": 3.490909090909091e-05, "epoch": 0.4402834008097166, "percentage": 8.79, "elapsed_time": "0:29:41", "remaining_time": "5:08:05"}
|
||||
{"current_steps": 150, "total_steps": 1650, "loss": 0.0528, "lr": 3.6121212121212124e-05, "epoch": 0.45546558704453444, "percentage": 9.09, "elapsed_time": "0:30:34", "remaining_time": "5:05:48"}
|
||||
{"current_steps": 155, "total_steps": 1650, "loss": 0.0512, "lr": 3.733333333333334e-05, "epoch": 0.4706477732793522, "percentage": 9.39, "elapsed_time": "0:31:34", "remaining_time": "5:04:33"}
|
||||
{"current_steps": 160, "total_steps": 1650, "loss": 0.0464, "lr": 3.854545454545455e-05, "epoch": 0.48582995951417, "percentage": 9.7, "elapsed_time": "0:32:43", "remaining_time": "5:04:44"}
|
||||
{"current_steps": 165, "total_steps": 1650, "loss": 0.0421, "lr": 3.9757575757575757e-05, "epoch": 0.5010121457489879, "percentage": 10.0, "elapsed_time": "0:33:43", "remaining_time": "5:03:30"}
|
||||
{"current_steps": 170, "total_steps": 1650, "loss": 0.0423, "lr": 3.999928391557286e-05, "epoch": 0.5161943319838057, "percentage": 10.3, "elapsed_time": "0:34:44", "remaining_time": "5:02:30"}
|
||||
{"current_steps": 175, "total_steps": 1650, "loss": 0.0406, "lr": 3.999637491047052e-05, "epoch": 0.5313765182186235, "percentage": 10.61, "elapsed_time": "0:35:47", "remaining_time": "5:01:43"}
|
||||
{"current_steps": 180, "total_steps": 1650, "loss": 0.0363, "lr": 3.999122855464813e-05, "epoch": 0.5465587044534413, "percentage": 10.91, "elapsed_time": "0:36:46", "remaining_time": "5:00:20"}
|
||||
{"current_steps": 185, "total_steps": 1650, "loss": 0.036, "lr": 3.998384542392021e-05, "epoch": 0.5617408906882592, "percentage": 11.21, "elapsed_time": "0:37:40", "remaining_time": "4:58:18"}
|
||||
{"current_steps": 190, "total_steps": 1650, "loss": 0.0337, "lr": 3.9974226344369124e-05, "epoch": 0.5769230769230769, "percentage": 11.52, "elapsed_time": "0:38:38", "remaining_time": "4:56:58"}
|
||||
{"current_steps": 195, "total_steps": 1650, "loss": 0.0308, "lr": 3.996237239225268e-05, "epoch": 0.5921052631578947, "percentage": 11.82, "elapsed_time": "0:39:38", "remaining_time": "4:55:47"}
|
||||
{"current_steps": 200, "total_steps": 1650, "loss": 0.0303, "lr": 3.994828489388371e-05, "epoch": 0.6072874493927125, "percentage": 12.12, "elapsed_time": "0:40:36", "remaining_time": "4:54:21"}
|
||||
{"current_steps": 205, "total_steps": 1650, "loss": 0.0293, "lr": 3.993196542548162e-05, "epoch": 0.6224696356275303, "percentage": 12.42, "elapsed_time": "0:41:34", "remaining_time": "4:53:02"}
{"current_steps": 210, "total_steps": 1650, "loss": 0.0263, "lr": 3.991341581299609e-05, "epoch": 0.6376518218623481, "percentage": 12.73, "elapsed_time": "0:42:37", "remaining_time": "4:52:18"}
{"current_steps": 215, "total_steps": 1650, "loss": 0.0246, "lr": 3.9892638131902765e-05, "epoch": 0.652834008097166, "percentage": 13.03, "elapsed_time": "0:43:37", "remaining_time": "4:51:10"}
{"current_steps": 220, "total_steps": 1650, "loss": 0.0227, "lr": 3.9869634706971e-05, "epoch": 0.6680161943319838, "percentage": 13.33, "elapsed_time": "0:44:40", "remaining_time": "4:50:21"}
{"current_steps": 225, "total_steps": 1650, "loss": 0.0229, "lr": 3.984440811200379e-05, "epoch": 0.6831983805668016, "percentage": 13.64, "elapsed_time": "0:45:35", "remaining_time": "4:48:44"}
{"current_steps": 230, "total_steps": 1650, "loss": 0.0204, "lr": 3.981696116954973e-05, "epoch": 0.6983805668016194, "percentage": 13.94, "elapsed_time": "0:46:34", "remaining_time": "4:47:30"}
{"current_steps": 235, "total_steps": 1650, "loss": 0.0209, "lr": 3.978729695058729e-05, "epoch": 0.7135627530364372, "percentage": 14.24, "elapsed_time": "0:47:30", "remaining_time": "4:46:02"}
{"current_steps": 240, "total_steps": 1650, "loss": 0.0198, "lr": 3.9755418774181146e-05, "epoch": 0.728744939271255, "percentage": 14.55, "elapsed_time": "0:48:28", "remaining_time": "4:44:49"}
{"current_steps": 245, "total_steps": 1650, "loss": 0.0182, "lr": 3.9721330207110835e-05, "epoch": 0.7439271255060729, "percentage": 14.85, "elapsed_time": "0:49:26", "remaining_time": "4:43:33"}
{"current_steps": 250, "total_steps": 1650, "loss": 0.0181, "lr": 3.9685035063471675e-05, "epoch": 0.7591093117408907, "percentage": 15.15, "elapsed_time": "0:50:22", "remaining_time": "4:42:04"}
{"current_steps": 255, "total_steps": 1650, "loss": 0.0166, "lr": 3.964653740424804e-05, "epoch": 0.7742914979757085, "percentage": 15.45, "elapsed_time": "0:51:20", "remaining_time": "4:40:52"}
{"current_steps": 260, "total_steps": 1650, "loss": 0.0158, "lr": 3.960584153685895e-05, "epoch": 0.7894736842105263, "percentage": 15.76, "elapsed_time": "0:52:20", "remaining_time": "4:39:47"}
{"current_steps": 265, "total_steps": 1650, "loss": 0.0158, "lr": 3.9562952014676116e-05, "epoch": 0.8046558704453441, "percentage": 16.06, "elapsed_time": "0:53:13", "remaining_time": "4:38:07"}
{"current_steps": 270, "total_steps": 1650, "loss": 0.0136, "lr": 3.9517873636514525e-05, "epoch": 0.819838056680162, "percentage": 16.36, "elapsed_time": "0:54:11", "remaining_time": "4:37:01"}
{"current_steps": 275, "total_steps": 1650, "loss": 0.0132, "lr": 3.947061144609546e-05, "epoch": 0.8350202429149798, "percentage": 16.67, "elapsed_time": "0:55:10", "remaining_time": "4:35:53"}
{"current_steps": 280, "total_steps": 1650, "loss": 0.0127, "lr": 3.942117073148221e-05, "epoch": 0.8502024291497976, "percentage": 16.97, "elapsed_time": "0:56:11", "remaining_time": "4:34:54"}
{"current_steps": 285, "total_steps": 1650, "loss": 0.0125, "lr": 3.9369557024488345e-05, "epoch": 0.8653846153846154, "percentage": 17.27, "elapsed_time": "0:57:08", "remaining_time": "4:33:42"}
{"current_steps": 290, "total_steps": 1650, "loss": 0.0114, "lr": 3.931577610005883e-05, "epoch": 0.8805668016194332, "percentage": 17.58, "elapsed_time": "0:58:12", "remaining_time": "4:32:57"}
{"current_steps": 295, "total_steps": 1650, "loss": 0.0124, "lr": 3.925983397562385e-05, "epoch": 0.895748987854251, "percentage": 17.88, "elapsed_time": "0:59:07", "remaining_time": "4:31:33"}
{"current_steps": 300, "total_steps": 1650, "loss": 0.0115, "lr": 3.920173691042554e-05, "epoch": 0.9109311740890689, "percentage": 18.18, "elapsed_time": "1:00:05", "remaining_time": "4:30:25"}
{"current_steps": 305, "total_steps": 1650, "loss": 0.0104, "lr": 3.914149140481766e-05, "epoch": 0.9261133603238867, "percentage": 18.48, "elapsed_time": "1:01:13", "remaining_time": "4:29:59"}
{"current_steps": 310, "total_steps": 1650, "loss": 0.0111, "lr": 3.9079104199538256e-05, "epoch": 0.9412955465587044, "percentage": 18.79, "elapsed_time": "1:02:14", "remaining_time": "4:29:04"}
{"current_steps": 315, "total_steps": 1650, "loss": 0.0105, "lr": 3.901458227495549e-05, "epoch": 0.9564777327935222, "percentage": 19.09, "elapsed_time": "1:03:13", "remaining_time": "4:27:59"}
{"current_steps": 320, "total_steps": 1650, "loss": 0.0091, "lr": 3.8947932850286585e-05, "epoch": 0.97165991902834, "percentage": 19.39, "elapsed_time": "1:04:13", "remaining_time": "4:26:55"}
{"current_steps": 325, "total_steps": 1650, "loss": 0.0103, "lr": 3.887916338279014e-05, "epoch": 0.9868421052631579, "percentage": 19.7, "elapsed_time": "1:05:07", "remaining_time": "4:25:29"}
{"current_steps": 330, "total_steps": 1650, "loss": 0.0091, "lr": 3.8808281566931675e-05, "epoch": 1.0, "percentage": 20.0, "elapsed_time": "1:05:57", "remaining_time": "4:23:48"}
{"current_steps": 335, "total_steps": 1650, "loss": 0.0087, "lr": 3.873529533352277e-05, "epoch": 1.0151821862348178, "percentage": 20.3, "elapsed_time": "1:06:54", "remaining_time": "4:22:39"}
{"current_steps": 340, "total_steps": 1650, "loss": 0.0084, "lr": 3.8660212848833705e-05, "epoch": 1.0303643724696356, "percentage": 20.61, "elapsed_time": "1:07:48", "remaining_time": "4:21:14"}
{"current_steps": 345, "total_steps": 1650, "loss": 0.008, "lr": 3.858304251367972e-05, "epoch": 1.0455465587044535, "percentage": 20.91, "elapsed_time": "1:08:45", "remaining_time": "4:20:04"}
{"current_steps": 350, "total_steps": 1650, "loss": 0.0078, "lr": 3.850379296248107e-05, "epoch": 1.0607287449392713, "percentage": 21.21, "elapsed_time": "1:09:44", "remaining_time": "4:19:03"}
{"current_steps": 355, "total_steps": 1650, "loss": 0.0076, "lr": 3.8422473062297e-05, "epoch": 1.075910931174089, "percentage": 21.52, "elapsed_time": "1:10:40", "remaining_time": "4:17:47"}
{"current_steps": 360, "total_steps": 1650, "loss": 0.0076, "lr": 3.8339091911833545e-05, "epoch": 1.091093117408907, "percentage": 21.82, "elapsed_time": "1:11:37", "remaining_time": "4:16:40"}
{"current_steps": 365, "total_steps": 1650, "loss": 0.0074, "lr": 3.825365884042553e-05, "epoch": 1.1062753036437247, "percentage": 22.12, "elapsed_time": "1:12:33", "remaining_time": "4:15:27"}
{"current_steps": 370, "total_steps": 1650, "loss": 0.0065, "lr": 3.8166183406992745e-05, "epoch": 1.1214574898785425, "percentage": 22.42, "elapsed_time": "1:13:29", "remaining_time": "4:14:13"}
{"current_steps": 375, "total_steps": 1650, "loss": 0.0068, "lr": 3.807667539897041e-05, "epoch": 1.1366396761133604, "percentage": 22.73, "elapsed_time": "1:14:26", "remaining_time": "4:13:07"}
{"current_steps": 380, "total_steps": 1650, "loss": 0.0065, "lr": 3.798514483121408e-05, "epoch": 1.1518218623481782, "percentage": 23.03, "elapsed_time": "1:15:23", "remaining_time": "4:11:58"}
{"current_steps": 385, "total_steps": 1650, "loss": 0.0067, "lr": 3.789160194487908e-05, "epoch": 1.167004048582996, "percentage": 23.33, "elapsed_time": "1:16:22", "remaining_time": "4:10:57"}
{"current_steps": 390, "total_steps": 1650, "loss": 0.0063, "lr": 3.7796057206274686e-05, "epoch": 1.1821862348178138, "percentage": 23.64, "elapsed_time": "1:17:13", "remaining_time": "4:09:31"}
{"current_steps": 395, "total_steps": 1650, "loss": 0.0057, "lr": 3.769852130569304e-05, "epoch": 1.1973684210526316, "percentage": 23.94, "elapsed_time": "1:18:14", "remaining_time": "4:08:36"}
{"current_steps": 400, "total_steps": 1650, "loss": 0.006, "lr": 3.7599005156213066e-05, "epoch": 1.2125506072874495, "percentage": 24.24, "elapsed_time": "1:19:09", "remaining_time": "4:07:23"}
{"current_steps": 405, "total_steps": 1650, "loss": 0.0052, "lr": 3.74975198924794e-05, "epoch": 1.2277327935222673, "percentage": 24.55, "elapsed_time": "1:20:10", "remaining_time": "4:06:26"}
{"current_steps": 410, "total_steps": 1650, "loss": 0.0054, "lr": 3.739407686945658e-05, "epoch": 1.242914979757085, "percentage": 24.85, "elapsed_time": "1:21:05", "remaining_time": "4:05:14"}
{"current_steps": 415, "total_steps": 1650, "loss": 0.0059, "lr": 3.728868766115854e-05, "epoch": 1.258097165991903, "percentage": 25.15, "elapsed_time": "1:22:02", "remaining_time": "4:04:09"}
{"current_steps": 420, "total_steps": 1650, "loss": 0.0049, "lr": 3.718136405935365e-05, "epoch": 1.2732793522267207, "percentage": 25.45, "elapsed_time": "1:22:58", "remaining_time": "4:03:00"}
{"current_steps": 425, "total_steps": 1650, "loss": 0.005, "lr": 3.707211807224534e-05, "epoch": 1.2884615384615383, "percentage": 25.76, "elapsed_time": "1:23:54", "remaining_time": "4:01:51"}
{"current_steps": 430, "total_steps": 1650, "loss": 0.0051, "lr": 3.696096192312852e-05, "epoch": 1.3036437246963564, "percentage": 26.06, "elapsed_time": "1:24:52", "remaining_time": "4:00:49"}
{"current_steps": 435, "total_steps": 1650, "loss": 0.0054, "lr": 3.684790804902199e-05, "epoch": 1.318825910931174, "percentage": 26.36, "elapsed_time": "1:25:48", "remaining_time": "3:59:41"}
{"current_steps": 440, "total_steps": 1650, "loss": 0.0051, "lr": 3.673296909927682e-05, "epoch": 1.334008097165992, "percentage": 26.67, "elapsed_time": "1:26:52", "remaining_time": "3:58:53"}
{"current_steps": 445, "total_steps": 1650, "loss": 0.0045, "lr": 3.661615793416109e-05, "epoch": 1.3491902834008096, "percentage": 26.97, "elapsed_time": "1:27:51", "remaining_time": "3:57:54"}
{"current_steps": 450, "total_steps": 1650, "loss": 0.0042, "lr": 3.649748762342098e-05, "epoch": 1.3643724696356276, "percentage": 27.27, "elapsed_time": "1:28:49", "remaining_time": "3:56:52"}
{"current_steps": 455, "total_steps": 1650, "loss": 0.0048, "lr": 3.637697144481839e-05, "epoch": 1.3795546558704452, "percentage": 27.58, "elapsed_time": "1:29:44", "remaining_time": "3:55:40"}
{"current_steps": 460, "total_steps": 1650, "loss": 0.0043, "lr": 3.625462288264536e-05, "epoch": 1.3947368421052633, "percentage": 27.88, "elapsed_time": "1:30:45", "remaining_time": "3:54:47"}
{"current_steps": 465, "total_steps": 1650, "loss": 0.0046, "lr": 3.613045562621533e-05, "epoch": 1.4099190283400809, "percentage": 28.18, "elapsed_time": "1:31:44", "remaining_time": "3:53:47"}
{"current_steps": 470, "total_steps": 1650, "loss": 0.004, "lr": 3.600448356833146e-05, "epoch": 1.425101214574899, "percentage": 28.48, "elapsed_time": "1:32:41", "remaining_time": "3:52:42"}
{"current_steps": 475, "total_steps": 1650, "loss": 0.0039, "lr": 3.587672080373219e-05, "epoch": 1.4402834008097165, "percentage": 28.79, "elapsed_time": "1:33:43", "remaining_time": "3:51:51"}
{"current_steps": 480, "total_steps": 1650, "loss": 0.0039, "lr": 3.574718162751426e-05, "epoch": 1.4554655870445345, "percentage": 29.09, "elapsed_time": "1:34:41", "remaining_time": "3:50:48"}
{"current_steps": 485, "total_steps": 1650, "loss": 0.0043, "lr": 3.561588053353319e-05, "epoch": 1.4706477732793521, "percentage": 29.39, "elapsed_time": "1:35:34", "remaining_time": "3:49:35"}
{"current_steps": 490, "total_steps": 1650, "loss": 0.0037, "lr": 3.5482832212781655e-05, "epoch": 1.48582995951417, "percentage": 29.7, "elapsed_time": "1:36:27", "remaining_time": "3:48:22"}
{"current_steps": 495, "total_steps": 1650, "loss": 0.0043, "lr": 3.53480515517457e-05, "epoch": 1.5010121457489878, "percentage": 30.0, "elapsed_time": "1:37:28", "remaining_time": "3:47:25"}
{"current_steps": 500, "total_steps": 1650, "loss": 0.0042, "lr": 3.5211553630739166e-05, "epoch": 1.5161943319838058, "percentage": 30.3, "elapsed_time": "1:38:27", "remaining_time": "3:46:26"}
{"current_steps": 505, "total_steps": 1650, "loss": 0.0035, "lr": 3.5073353722216334e-05, "epoch": 1.5313765182186234, "percentage": 30.61, "elapsed_time": "1:39:22", "remaining_time": "3:45:18"}
{"current_steps": 510, "total_steps": 1650, "loss": 0.0039, "lr": 3.4933467289063156e-05, "epoch": 1.5465587044534415, "percentage": 30.91, "elapsed_time": "1:40:21", "remaining_time": "3:44:18"}
{"current_steps": 515, "total_steps": 1650, "loss": 0.0036, "lr": 3.4791909982867175e-05, "epoch": 1.561740890688259, "percentage": 31.21, "elapsed_time": "1:41:16", "remaining_time": "3:43:12"}
{"current_steps": 520, "total_steps": 1650, "loss": 0.0039, "lr": 3.464869764216622e-05, "epoch": 1.5769230769230769, "percentage": 31.52, "elapsed_time": "1:42:08", "remaining_time": "3:41:56"}
{"current_steps": 525, "total_steps": 1650, "loss": 0.0034, "lr": 3.450384629067635e-05, "epoch": 1.5921052631578947, "percentage": 31.82, "elapsed_time": "1:43:04", "remaining_time": "3:40:51"}
{"current_steps": 530, "total_steps": 1650, "loss": 0.0033, "lr": 3.435737213549896e-05, "epoch": 1.6072874493927125, "percentage": 32.12, "elapsed_time": "1:43:59", "remaining_time": "3:39:44"}
{"current_steps": 535, "total_steps": 1650, "loss": 0.0036, "lr": 3.420929156530738e-05, "epoch": 1.6224696356275303, "percentage": 32.42, "elapsed_time": "1:44:52", "remaining_time": "3:38:33"}
{"current_steps": 540, "total_steps": 1650, "loss": 0.0034, "lr": 3.405962114851324e-05, "epoch": 1.6376518218623481, "percentage": 32.73, "elapsed_time": "1:45:50", "remaining_time": "3:37:32"}
{"current_steps": 545, "total_steps": 1650, "loss": 0.0035, "lr": 3.390837763141261e-05, "epoch": 1.652834008097166, "percentage": 33.03, "elapsed_time": "1:46:45", "remaining_time": "3:36:26"}
{"current_steps": 550, "total_steps": 1650, "loss": 0.0032, "lr": 3.3755577936312344e-05, "epoch": 1.6680161943319838, "percentage": 33.33, "elapsed_time": "1:47:42", "remaining_time": "3:35:25"}
{"current_steps": 555, "total_steps": 1650, "loss": 0.003, "lr": 3.360123915963662e-05, "epoch": 1.6831983805668016, "percentage": 33.64, "elapsed_time": "1:48:35", "remaining_time": "3:34:15"}
{"current_steps": 560, "total_steps": 1650, "loss": 0.0031, "lr": 3.3445378570014125e-05, "epoch": 1.6983805668016194, "percentage": 33.94, "elapsed_time": "1:49:30", "remaining_time": "3:33:09"}
{"current_steps": 565, "total_steps": 1650, "loss": 0.003, "lr": 3.328801360634585e-05, "epoch": 1.7135627530364372, "percentage": 34.24, "elapsed_time": "1:50:26", "remaining_time": "3:32:05"}
{"current_steps": 570, "total_steps": 1650, "loss": 0.0032, "lr": 3.312916187585392e-05, "epoch": 1.728744939271255, "percentage": 34.55, "elapsed_time": "1:51:21", "remaining_time": "3:30:58"}
{"current_steps": 575, "total_steps": 1650, "loss": 0.003, "lr": 3.296884115211157e-05, "epoch": 1.7439271255060729, "percentage": 34.85, "elapsed_time": "1:52:13", "remaining_time": "3:29:49"}
{"current_steps": 580, "total_steps": 1650, "loss": 0.0031, "lr": 3.280706937305445e-05, "epoch": 1.7591093117408907, "percentage": 35.15, "elapsed_time": "1:53:12", "remaining_time": "3:28:50"}
{"current_steps": 585, "total_steps": 1650, "loss": 0.003, "lr": 3.2643864638973645e-05, "epoch": 1.7742914979757085, "percentage": 35.45, "elapsed_time": "1:54:08", "remaining_time": "3:27:47"}
{"current_steps": 590, "total_steps": 1650, "loss": 0.0029, "lr": 3.2479245210490434e-05, "epoch": 1.7894736842105263, "percentage": 35.76, "elapsed_time": "1:55:05", "remaining_time": "3:26:46"}
{"current_steps": 595, "total_steps": 1650, "loss": 0.0027, "lr": 3.2313229506513167e-05, "epoch": 1.8046558704453441, "percentage": 36.06, "elapsed_time": "1:56:01", "remaining_time": "3:25:43"}
{"current_steps": 600, "total_steps": 1650, "loss": 0.0025, "lr": 3.2145836102176424e-05, "epoch": 1.819838056680162, "percentage": 36.36, "elapsed_time": "1:56:58", "remaining_time": "3:24:42"}
{"current_steps": 605, "total_steps": 1650, "loss": 0.0028, "lr": 3.197708372676265e-05, "epoch": 1.8350202429149798, "percentage": 36.67, "elapsed_time": "1:58:02", "remaining_time": "3:23:54"}
{"current_steps": 610, "total_steps": 1650, "loss": 0.0024, "lr": 3.1806991261606604e-05, "epoch": 1.8502024291497976, "percentage": 36.97, "elapsed_time": "1:58:57", "remaining_time": "3:22:48"}
{"current_steps": 615, "total_steps": 1650, "loss": 0.0025, "lr": 3.163557773798276e-05, "epoch": 1.8653846153846154, "percentage": 37.27, "elapsed_time": "1:59:58", "remaining_time": "3:21:53"}
{"current_steps": 620, "total_steps": 1650, "loss": 0.0027, "lr": 3.146286233497593e-05, "epoch": 1.8805668016194332, "percentage": 37.58, "elapsed_time": "2:00:53", "remaining_time": "3:20:49"}
{"current_steps": 625, "total_steps": 1650, "loss": 0.0026, "lr": 3.128886437733539e-05, "epoch": 1.895748987854251, "percentage": 37.88, "elapsed_time": "2:01:45", "remaining_time": "3:19:41"}
{"current_steps": 630, "total_steps": 1650, "loss": 0.0026, "lr": 3.111360333331263e-05, "epoch": 1.9109311740890689, "percentage": 38.18, "elapsed_time": "2:02:41", "remaining_time": "3:18:37"}
{"current_steps": 635, "total_steps": 1650, "loss": 0.0025, "lr": 3.093709881248312e-05, "epoch": 1.9261133603238867, "percentage": 38.48, "elapsed_time": "2:03:38", "remaining_time": "3:17:37"}
{"current_steps": 640, "total_steps": 1650, "loss": 0.0027, "lr": 3.075937056355225e-05, "epoch": 1.9412955465587043, "percentage": 38.79, "elapsed_time": "2:04:28", "remaining_time": "3:16:25"}
{"current_steps": 645, "total_steps": 1650, "loss": 0.0024, "lr": 3.0580438472145665e-05, "epoch": 1.9564777327935223, "percentage": 39.09, "elapsed_time": "2:05:22", "remaining_time": "3:15:20"}
{"current_steps": 650, "total_steps": 1650, "loss": 0.0022, "lr": 3.0400322558584308e-05, "epoch": 1.97165991902834, "percentage": 39.39, "elapsed_time": "2:06:23", "remaining_time": "3:14:26"}
{"current_steps": 655, "total_steps": 1650, "loss": 0.0022, "lr": 3.0219042975644415e-05, "epoch": 1.986842105263158, "percentage": 39.7, "elapsed_time": "2:07:23", "remaining_time": "3:13:31"}
{"current_steps": 660, "total_steps": 1650, "loss": 0.002, "lr": 3.0036620006302624e-05, "epoch": 2.0, "percentage": 40.0, "elapsed_time": "2:08:18", "remaining_time": "3:12:27"}
{"current_steps": 665, "total_steps": 1650, "loss": 0.002, "lr": 2.9853074061466602e-05, "epoch": 2.0151821862348176, "percentage": 40.3, "elapsed_time": "2:09:13", "remaining_time": "3:11:24"}
{"current_steps": 670, "total_steps": 1650, "loss": 0.002, "lr": 2.9668425677691278e-05, "epoch": 2.0303643724696356, "percentage": 40.61, "elapsed_time": "2:10:07", "remaining_time": "3:10:19"}
{"current_steps": 675, "total_steps": 1650, "loss": 0.0021, "lr": 2.948269551488108e-05, "epoch": 2.0455465587044532, "percentage": 40.91, "elapsed_time": "2:11:01", "remaining_time": "3:09:15"}
{"current_steps": 680, "total_steps": 1650, "loss": 0.0019, "lr": 2.929590435397832e-05, "epoch": 2.0607287449392713, "percentage": 41.21, "elapsed_time": "2:11:59", "remaining_time": "3:08:17"}
{"current_steps": 685, "total_steps": 1650, "loss": 0.0021, "lr": 2.9108073094638066e-05, "epoch": 2.075910931174089, "percentage": 41.52, "elapsed_time": "2:12:59", "remaining_time": "3:07:21"}
{"current_steps": 690, "total_steps": 1650, "loss": 0.002, "lr": 2.8919222752889727e-05, "epoch": 2.091093117408907, "percentage": 41.82, "elapsed_time": "2:13:56", "remaining_time": "3:06:21"}
{"current_steps": 695, "total_steps": 1650, "loss": 0.0019, "lr": 2.8729374458785647e-05, "epoch": 2.1062753036437245, "percentage": 42.12, "elapsed_time": "2:14:53", "remaining_time": "3:05:20"}
{"current_steps": 700, "total_steps": 1650, "loss": 0.0021, "lr": 2.8538549454036838e-05, "epoch": 2.1214574898785425, "percentage": 42.42, "elapsed_time": "2:15:48", "remaining_time": "3:04:18"}
{"current_steps": 705, "total_steps": 1650, "loss": 0.002, "lr": 2.834676908963636e-05, "epoch": 2.13663967611336, "percentage": 42.73, "elapsed_time": "2:16:42", "remaining_time": "3:03:14"}
{"current_steps": 710, "total_steps": 1650, "loss": 0.0018, "lr": 2.815405482347037e-05, "epoch": 2.151821862348178, "percentage": 43.03, "elapsed_time": "2:17:39", "remaining_time": "3:02:14"}
{"current_steps": 715, "total_steps": 1650, "loss": 0.002, "lr": 2.796042821791725e-05, "epoch": 2.167004048582996, "percentage": 43.33, "elapsed_time": "2:18:33", "remaining_time": "3:01:11"}
{"current_steps": 720, "total_steps": 1650, "loss": 0.0017, "lr": 2.776591093743505e-05, "epoch": 2.182186234817814, "percentage": 43.64, "elapsed_time": "2:19:31", "remaining_time": "3:00:12"}
{"current_steps": 725, "total_steps": 1650, "loss": 0.0016, "lr": 2.7570524746137485e-05, "epoch": 2.1973684210526314, "percentage": 43.94, "elapsed_time": "2:20:27", "remaining_time": "2:59:12"}
{"current_steps": 730, "total_steps": 1650, "loss": 0.0018, "lr": 2.7374291505358818e-05, "epoch": 2.2125506072874495, "percentage": 44.24, "elapsed_time": "2:21:19", "remaining_time": "2:58:06"}
{"current_steps": 735, "total_steps": 1650, "loss": 0.0021, "lr": 2.7177233171207817e-05, "epoch": 2.227732793522267, "percentage": 44.55, "elapsed_time": "2:22:11", "remaining_time": "2:57:00"}
{"current_steps": 740, "total_steps": 1650, "loss": 0.0017, "lr": 2.6979371792111147e-05, "epoch": 2.242914979757085, "percentage": 44.85, "elapsed_time": "2:23:06", "remaining_time": "2:55:59"}
{"current_steps": 745, "total_steps": 1650, "loss": 0.0018, "lr": 2.678072950634641e-05, "epoch": 2.2580971659919027, "percentage": 45.15, "elapsed_time": "2:24:05", "remaining_time": "2:55:02"}
{"current_steps": 750, "total_steps": 1650, "loss": 0.0017, "lr": 2.6581328539565184e-05, "epoch": 2.2732793522267207, "percentage": 45.45, "elapsed_time": "2:25:02", "remaining_time": "2:54:02"}
{"current_steps": 755, "total_steps": 1650, "loss": 0.0017, "lr": 2.638119120230616e-05, "epoch": 2.2884615384615383, "percentage": 45.76, "elapsed_time": "2:25:56", "remaining_time": "2:53:00"}
{"current_steps": 760, "total_steps": 1650, "loss": 0.0018, "lr": 2.618033988749895e-05, "epoch": 2.3036437246963564, "percentage": 46.06, "elapsed_time": "2:26:47", "remaining_time": "2:51:53"}
{"current_steps": 765, "total_steps": 1650, "loss": 0.0015, "lr": 2.5978797067958542e-05, "epoch": 2.318825910931174, "percentage": 46.36, "elapsed_time": "2:27:37", "remaining_time": "2:50:47"}
{"current_steps": 770, "total_steps": 1650, "loss": 0.0016, "lr": 2.5776585293870877e-05, "epoch": 2.334008097165992, "percentage": 46.67, "elapsed_time": "2:28:31", "remaining_time": "2:49:44"}
{"current_steps": 775, "total_steps": 1650, "loss": 0.0017, "lr": 2.557372719026976e-05, "epoch": 2.3491902834008096, "percentage": 46.97, "elapsed_time": "2:29:25", "remaining_time": "2:48:42"}
{"current_steps": 780, "total_steps": 1650, "loss": 0.0015, "lr": 2.537024545450539e-05, "epoch": 2.3643724696356276, "percentage": 47.27, "elapsed_time": "2:30:21", "remaining_time": "2:47:42"}
{"current_steps": 785, "total_steps": 1650, "loss": 0.0017, "lr": 2.5166162853704825e-05, "epoch": 2.3795546558704452, "percentage": 47.58, "elapsed_time": "2:31:15", "remaining_time": "2:46:40"}
{"current_steps": 790, "total_steps": 1650, "loss": 0.0017, "lr": 2.496150222222458e-05, "epoch": 2.3947368421052633, "percentage": 47.88, "elapsed_time": "2:32:13", "remaining_time": "2:45:42"}
{"current_steps": 795, "total_steps": 1650, "loss": 0.0015, "lr": 2.475628645909576e-05, "epoch": 2.409919028340081, "percentage": 48.18, "elapsed_time": "2:33:13", "remaining_time": "2:44:47"}
{"current_steps": 800, "total_steps": 1650, "loss": 0.0014, "lr": 2.4550538525461963e-05, "epoch": 2.425101214574899, "percentage": 48.48, "elapsed_time": "2:34:11", "remaining_time": "2:43:49"}
{"current_steps": 805, "total_steps": 1650, "loss": 0.0013, "lr": 2.434428144201016e-05, "epoch": 2.4402834008097165, "percentage": 48.79, "elapsed_time": "2:35:10", "remaining_time": "2:42:52"}
{"current_steps": 810, "total_steps": 1650, "loss": 0.0014, "lr": 2.4137538286394976e-05, "epoch": 2.4554655870445345, "percentage": 49.09, "elapsed_time": "2:36:01", "remaining_time": "2:41:48"}
{"current_steps": 815, "total_steps": 1650, "loss": 0.0013, "lr": 2.3930332190656604e-05, "epoch": 2.470647773279352, "percentage": 49.39, "elapsed_time": "2:36:56", "remaining_time": "2:40:47"}
{"current_steps": 820, "total_steps": 1650, "loss": 0.0014, "lr": 2.3722686338632602e-05, "epoch": 2.48582995951417, "percentage": 49.7, "elapsed_time": "2:37:51", "remaining_time": "2:39:46"}
{"current_steps": 825, "total_steps": 1650, "loss": 0.0012, "lr": 2.3514623963363886e-05, "epoch": 2.501012145748988, "percentage": 50.0, "elapsed_time": "2:38:50", "remaining_time": "2:38:50"}
{"current_steps": 830, "total_steps": 1650, "loss": 0.0012, "lr": 2.330616834449525e-05, "epoch": 2.516194331983806, "percentage": 50.3, "elapsed_time": "2:39:46", "remaining_time": "2:37:50"}
{"current_steps": 835, "total_steps": 1650, "loss": 0.0013, "lr": 2.309734280567065e-05, "epoch": 2.5313765182186234, "percentage": 50.61, "elapsed_time": "2:40:43", "remaining_time": "2:36:52"}
{"current_steps": 840, "total_steps": 1650, "loss": 0.0013, "lr": 2.28881707119236e-05, "epoch": 2.5465587044534415, "percentage": 50.91, "elapsed_time": "2:41:44", "remaining_time": "2:35:57"}
{"current_steps": 845, "total_steps": 1650, "loss": 0.0012, "lr": 2.267867546706287e-05, "epoch": 2.561740890688259, "percentage": 51.21, "elapsed_time": "2:42:41", "remaining_time": "2:34:59"}
{"current_steps": 850, "total_steps": 1650, "loss": 0.0013, "lr": 2.2468880511053896e-05, "epoch": 2.5769230769230766, "percentage": 51.52, "elapsed_time": "2:43:37", "remaining_time": "2:33:59"}
{"current_steps": 855, "total_steps": 1650, "loss": 0.0013, "lr": 2.2258809317396163e-05, "epoch": 2.5921052631578947, "percentage": 51.82, "elapsed_time": "2:44:30", "remaining_time": "2:32:57"}
{"current_steps": 860, "total_steps": 1650, "loss": 0.0012, "lr": 2.2048485390496757e-05, "epoch": 2.6072874493927127, "percentage": 52.12, "elapsed_time": "2:45:19", "remaining_time": "2:31:52"}
{"current_steps": 865, "total_steps": 1650, "loss": 0.001, "lr": 2.1837932263040553e-05, "epoch": 2.6224696356275303, "percentage": 52.42, "elapsed_time": "2:46:20", "remaining_time": "2:30:57"}
{"current_steps": 870, "total_steps": 1650, "loss": 0.0011, "lr": 2.1627173493357167e-05, "epoch": 2.637651821862348, "percentage": 52.73, "elapsed_time": "2:47:17", "remaining_time": "2:29:59"}
{"current_steps": 875, "total_steps": 1650, "loss": 0.0011, "lr": 2.1416232662785084e-05, "epoch": 2.652834008097166, "percentage": 53.03, "elapsed_time": "2:48:12", "remaining_time": "2:28:58"}
{"current_steps": 880, "total_steps": 1650, "loss": 0.0011, "lr": 2.1205133373033173e-05, "epoch": 2.668016194331984, "percentage": 53.33, "elapsed_time": "2:49:05", "remaining_time": "2:27:57"}
{"current_steps": 885, "total_steps": 1650, "loss": 0.0011, "lr": 2.0993899243539953e-05, "epoch": 2.6831983805668016, "percentage": 53.64, "elapsed_time": "2:50:03", "remaining_time": "2:27:00"}
{"current_steps": 890, "total_steps": 1650, "loss": 0.001, "lr": 2.0782553908830887e-05, "epoch": 2.698380566801619, "percentage": 53.94, "elapsed_time": "2:51:00", "remaining_time": "2:26:02"}
{"current_steps": 895, "total_steps": 1650, "loss": 0.001, "lr": 2.0571121015873924e-05, "epoch": 2.7135627530364372, "percentage": 54.24, "elapsed_time": "2:51:52", "remaining_time": "2:24:59"}
{"current_steps": 900, "total_steps": 1650, "loss": 0.001, "lr": 2.0359624221433728e-05, "epoch": 2.7287449392712553, "percentage": 54.55, "elapsed_time": "2:52:52", "remaining_time": "2:24:03"}
{"current_steps": 905, "total_steps": 1650, "loss": 0.0009, "lr": 2.014808718942476e-05, "epoch": 2.743927125506073, "percentage": 54.85, "elapsed_time": "2:54:00", "remaining_time": "2:23:15"}
{"current_steps": 910, "total_steps": 1650, "loss": 0.001, "lr": 1.9936533588263557e-05, "epoch": 2.7591093117408905, "percentage": 55.15, "elapsed_time": "2:54:56", "remaining_time": "2:22:15"}
{"current_steps": 915, "total_steps": 1650, "loss": 0.0009, "lr": 1.9724987088220565e-05, "epoch": 2.7742914979757085, "percentage": 55.45, "elapsed_time": "2:55:53", "remaining_time": "2:21:17"}
{"current_steps": 920, "total_steps": 1650, "loss": 0.001, "lr": 1.951347135877169e-05, "epoch": 2.7894736842105265, "percentage": 55.76, "elapsed_time": "2:56:45", "remaining_time": "2:20:15"}
{"current_steps": 925, "total_steps": 1650, "loss": 0.0008, "lr": 1.930201006594999e-05, "epoch": 2.804655870445344, "percentage": 56.06, "elapsed_time": "2:57:42", "remaining_time": "2:19:16"}
{"current_steps": 930, "total_steps": 1650, "loss": 0.0009, "lr": 1.9090626869697714e-05, "epoch": 2.8198380566801617, "percentage": 56.36, "elapsed_time": "2:58:41", "remaining_time": "2:18:20"}
{"current_steps": 935, "total_steps": 1650, "loss": 0.0008, "lr": 1.8879345421219063e-05, "epoch": 2.83502024291498, "percentage": 56.67, "elapsed_time": "2:59:37", "remaining_time": "2:17:21"}
{"current_steps": 940, "total_steps": 1650, "loss": 0.0008, "lr": 1.8668189360333923e-05, "epoch": 2.850202429149798, "percentage": 56.97, "elapsed_time": "3:00:33", "remaining_time": "2:16:22"}
{"current_steps": 945, "total_steps": 1650, "loss": 0.0008, "lr": 1.845718231283281e-05, "epoch": 2.8653846153846154, "percentage": 57.27, "elapsed_time": "3:01:33", "remaining_time": "2:15:26"}
{"current_steps": 950, "total_steps": 1650, "loss": 0.0008, "lr": 1.8246347887833457e-05, "epoch": 2.880566801619433, "percentage": 57.58, "elapsed_time": "3:02:24", "remaining_time": "2:14:24"}
{"current_steps": 955, "total_steps": 1650, "loss": 0.0008, "lr": 1.8035709675139258e-05, "epoch": 2.895748987854251, "percentage": 57.88, "elapsed_time": "3:03:19", "remaining_time": "2:13:24"}
{"current_steps": 960, "total_steps": 1650, "loss": 0.0009, "lr": 1.7825291242599837e-05, "epoch": 2.910931174089069, "percentage": 58.18, "elapsed_time": "3:04:12", "remaining_time": "2:12:23"}
{"current_steps": 965, "total_steps": 1650, "loss": 0.0008, "lr": 1.7615116133474084e-05, "epoch": 2.9261133603238867, "percentage": 58.48, "elapsed_time": "3:05:07", "remaining_time": "2:11:24"}
{"current_steps": 970, "total_steps": 1650, "loss": 0.0007, "lr": 1.7405207863795966e-05, "epoch": 2.9412955465587043, "percentage": 58.79, "elapsed_time": "3:06:09", "remaining_time": "2:10:30"}
{"current_steps": 975, "total_steps": 1650, "loss": 0.0007, "lr": 1.719558991974339e-05, "epoch": 2.9564777327935223, "percentage": 59.09, "elapsed_time": "3:07:05", "remaining_time": "2:09:31"}
{"current_steps": 980, "total_steps": 1650, "loss": 0.0007, "lr": 1.698628575501034e-05, "epoch": 2.97165991902834, "percentage": 59.39, "elapsed_time": "3:08:02", "remaining_time": "2:08:33"}
{"current_steps": 985, "total_steps": 1650, "loss": 0.0008, "lr": 1.6777318788182723e-05, "epoch": 2.986842105263158, "percentage": 59.7, "elapsed_time": "3:09:00", "remaining_time": "2:07:36"}
{"current_steps": 990, "total_steps": 1650, "loss": 0.0007, "lr": 1.6568712400118102e-05, "epoch": 3.0, "percentage": 60.0, "elapsed_time": "3:09:47", "remaining_time": "2:06:31"}
{"current_steps": 995, "total_steps": 1650, "loss": 0.0006, "lr": 1.636048993132969e-05, "epoch": 3.0151821862348176, "percentage": 60.3, "elapsed_time": "3:10:48", "remaining_time": "2:05:36"}
{"current_steps": 1000, "total_steps": 1650, "loss": 0.0006, "lr": 1.615267467937479e-05, "epoch": 3.0303643724696356, "percentage": 60.61, "elapsed_time": "3:11:42", "remaining_time": "2:04:36"}
|
{"current_steps": 1005, "total_steps": 1650, "loss": 0.0007, "lr": 1.59452898962481e-05, "epoch": 3.0455465587044532, "percentage": 60.91, "elapsed_time": "3:12:39", "remaining_time": "2:03:38"}
{"current_steps": 1010, "total_steps": 1650, "loss": 0.0006, "lr": 1.573835878578013e-05, "epoch": 3.0607287449392713, "percentage": 61.21, "elapsed_time": "3:13:34", "remaining_time": "2:02:39"}
{"current_steps": 1015, "total_steps": 1650, "loss": 0.0007, "lr": 1.5531904501040917e-05, "epoch": 3.075910931174089, "percentage": 61.52, "elapsed_time": "3:14:28", "remaining_time": "2:01:40"}
{"current_steps": 1020, "total_steps": 1650, "loss": 0.0006, "lr": 1.5325950141749522e-05, "epoch": 3.091093117408907, "percentage": 61.82, "elapsed_time": "3:15:27", "remaining_time": "2:00:43"}
{"current_steps": 1025, "total_steps": 1650, "loss": 0.0007, "lr": 1.5120518751689438e-05, "epoch": 3.1062753036437245, "percentage": 62.12, "elapsed_time": "3:16:21", "remaining_time": "1:59:44"}
{"current_steps": 1030, "total_steps": 1650, "loss": 0.0005, "lr": 1.4915633316130267e-05, "epoch": 3.1214574898785425, "percentage": 62.42, "elapsed_time": "3:17:13", "remaining_time": "1:58:43"}
{"current_steps": 1035, "total_steps": 1650, "loss": 0.0006, "lr": 1.4711316759255963e-05, "epoch": 3.13663967611336, "percentage": 62.73, "elapsed_time": "3:18:07", "remaining_time": "1:57:43"}
{"current_steps": 1040, "total_steps": 1650, "loss": 0.0006, "lr": 1.450759194159987e-05, "epoch": 3.151821862348178, "percentage": 63.03, "elapsed_time": "3:19:02", "remaining_time": "1:56:44"}
{"current_steps": 1045, "total_steps": 1650, "loss": 0.0005, "lr": 1.4304481657486955e-05, "epoch": 3.167004048582996, "percentage": 63.33, "elapsed_time": "3:19:58", "remaining_time": "1:55:46"}
{"current_steps": 1050, "total_steps": 1650, "loss": 0.0005, "lr": 1.4102008632483344e-05, "epoch": 3.182186234817814, "percentage": 63.64, "elapsed_time": "3:20:55", "remaining_time": "1:54:48"}
{"current_steps": 1055, "total_steps": 1650, "loss": 0.0006, "lr": 1.3900195520853628e-05, "epoch": 3.1973684210526314, "percentage": 63.94, "elapsed_time": "3:21:50", "remaining_time": "1:53:50"}
{"current_steps": 1060, "total_steps": 1650, "loss": 0.0005, "lr": 1.3699064903026149e-05, "epoch": 3.2125506072874495, "percentage": 64.24, "elapsed_time": "3:22:41", "remaining_time": "1:52:49"}
{"current_steps": 1065, "total_steps": 1650, "loss": 0.0006, "lr": 1.34986392830665e-05, "epoch": 3.227732793522267, "percentage": 64.55, "elapsed_time": "3:23:36", "remaining_time": "1:51:50"}
{"current_steps": 1070, "total_steps": 1650, "loss": 0.0005, "lr": 1.3298941086159598e-05, "epoch": 3.242914979757085, "percentage": 64.85, "elapsed_time": "3:24:31", "remaining_time": "1:50:51"}
{"current_steps": 1075, "total_steps": 1650, "loss": 0.0004, "lr": 1.3099992656100592e-05, "epoch": 3.2580971659919027, "percentage": 65.15, "elapsed_time": "3:25:29", "remaining_time": "1:49:54"}
{"current_steps": 1080, "total_steps": 1650, "loss": 0.0006, "lr": 1.2901816252794848e-05, "epoch": 3.2732793522267207, "percentage": 65.45, "elapsed_time": "3:26:23", "remaining_time": "1:48:55"}
{"current_steps": 1085, "total_steps": 1650, "loss": 0.0004, "lr": 1.2704434049767356e-05, "epoch": 3.2884615384615383, "percentage": 65.76, "elapsed_time": "3:27:21", "remaining_time": "1:47:58"}
{"current_steps": 1090, "total_steps": 1650, "loss": 0.0005, "lr": 1.250786813168176e-05, "epoch": 3.3036437246963564, "percentage": 66.06, "elapsed_time": "3:28:10", "remaining_time": "1:46:57"}
{"current_steps": 1095, "total_steps": 1650, "loss": 0.0004, "lr": 1.2312140491869369e-05, "epoch": 3.318825910931174, "percentage": 66.36, "elapsed_time": "3:29:00", "remaining_time": "1:45:56"}
{"current_steps": 1100, "total_steps": 1650, "loss": 0.0005, "lr": 1.2117273029868362e-05, "epoch": 3.334008097165992, "percentage": 66.67, "elapsed_time": "3:29:58", "remaining_time": "1:44:59"}
{"current_steps": 1105, "total_steps": 1650, "loss": 0.0005, "lr": 1.1923287548973508e-05, "epoch": 3.3491902834008096, "percentage": 66.97, "elapsed_time": "3:30:53", "remaining_time": "1:44:00"}
{"current_steps": 1110, "total_steps": 1650, "loss": 0.0004, "lr": 1.1730205753796631e-05, "epoch": 3.3643724696356276, "percentage": 67.27, "elapsed_time": "3:31:52", "remaining_time": "1:43:04"}
{"current_steps": 1115, "total_steps": 1650, "loss": 0.0004, "lr": 1.1538049247838128e-05, "epoch": 3.3795546558704452, "percentage": 67.58, "elapsed_time": "3:32:49", "remaining_time": "1:42:07"}
{"current_steps": 1120, "total_steps": 1650, "loss": 0.0003, "lr": 1.134683953106983e-05, "epoch": 3.3947368421052633, "percentage": 67.88, "elapsed_time": "3:33:47", "remaining_time": "1:41:10"}
{"current_steps": 1125, "total_steps": 1650, "loss": 0.0005, "lr": 1.115659799752938e-05, "epoch": 3.409919028340081, "percentage": 68.18, "elapsed_time": "3:34:48", "remaining_time": "1:40:14"}
{"current_steps": 1130, "total_steps": 1650, "loss": 0.0004, "lr": 1.096734593292649e-05, "epoch": 3.425101214574899, "percentage": 68.48, "elapsed_time": "3:35:47", "remaining_time": "1:39:17"}
{"current_steps": 1135, "total_steps": 1650, "loss": 0.0004, "lr": 1.077910451226138e-05, "epoch": 3.4402834008097165, "percentage": 68.79, "elapsed_time": "3:36:45", "remaining_time": "1:38:21"}
{"current_steps": 1140, "total_steps": 1650, "loss": 0.0004, "lr": 1.0591894797455526e-05, "epoch": 3.4554655870445345, "percentage": 69.09, "elapsed_time": "3:37:42", "remaining_time": "1:37:23"}
{"current_steps": 1145, "total_steps": 1650, "loss": 0.0005, "lr": 1.0405737734995083e-05, "epoch": 3.470647773279352, "percentage": 69.39, "elapsed_time": "3:38:38", "remaining_time": "1:36:25"}
{"current_steps": 1150, "total_steps": 1650, "loss": 0.0003, "lr": 1.0220654153587225e-05, "epoch": 3.48582995951417, "percentage": 69.7, "elapsed_time": "3:39:34", "remaining_time": "1:35:28"}
{"current_steps": 1155, "total_steps": 1650, "loss": 0.0004, "lr": 1.00366647618297e-05, "epoch": 3.501012145748988, "percentage": 70.0, "elapsed_time": "3:40:28", "remaining_time": "1:34:29"}
{"current_steps": 1160, "total_steps": 1650, "loss": 0.0004, "lr": 9.853790145893742e-06, "epoch": 3.516194331983806, "percentage": 70.3, "elapsed_time": "3:41:24", "remaining_time": "1:33:31"}
{"current_steps": 1165, "total_steps": 1650, "loss": 0.0003, "lr": 9.672050767220765e-06, "epoch": 3.5313765182186234, "percentage": 70.61, "elapsed_time": "3:42:26", "remaining_time": "1:32:36"}
{"current_steps": 1170, "total_steps": 1650, "loss": 0.0004, "lr": 9.491466960232955e-06, "epoch": 3.5465587044534415, "percentage": 70.91, "elapsed_time": "3:43:14", "remaining_time": "1:31:35"}
{"current_steps": 1175, "total_steps": 1650, "loss": 0.0003, "lr": 9.312058930058114e-06, "epoch": 3.561740890688259, "percentage": 71.21, "elapsed_time": "3:44:16", "remaining_time": "1:30:39"}
{"current_steps": 1180, "total_steps": 1650, "loss": 0.0003, "lr": 9.133846750268945e-06, "epoch": 3.5769230769230766, "percentage": 71.52, "elapsed_time": "3:45:12", "remaining_time": "1:29:42"}
{"current_steps": 1185, "total_steps": 1650, "loss": 0.0003, "lr": 8.956850360637046e-06, "epoch": 3.5921052631578947, "percentage": 71.82, "elapsed_time": "3:46:09", "remaining_time": "1:28:44"}
{"current_steps": 1190, "total_steps": 1650, "loss": 0.0003, "lr": 8.78108956490194e-06, "epoch": 3.6072874493927127, "percentage": 72.12, "elapsed_time": "3:47:01", "remaining_time": "1:27:45"}
{"current_steps": 1195, "total_steps": 1650, "loss": 0.0003, "lr": 8.606584028555225e-06, "epoch": 3.6224696356275303, "percentage": 72.42, "elapsed_time": "3:47:58", "remaining_time": "1:26:48"}
{"current_steps": 1200, "total_steps": 1650, "loss": 0.0003, "lr": 8.43335327664027e-06, "epoch": 3.637651821862348, "percentage": 72.73, "elapsed_time": "3:48:57", "remaining_time": "1:25:51"}
{"current_steps": 1205, "total_steps": 1650, "loss": 0.0003, "lr": 8.261416691567601e-06, "epoch": 3.652834008097166, "percentage": 73.03, "elapsed_time": "3:50:04", "remaining_time": "1:24:57"}
{"current_steps": 1210, "total_steps": 1650, "loss": 0.0003, "lr": 8.090793510946242e-06, "epoch": 3.668016194331984, "percentage": 73.33, "elapsed_time": "3:51:02", "remaining_time": "1:24:01"}
{"current_steps": 1215, "total_steps": 1650, "loss": 0.0003, "lr": 7.921502825431258e-06, "epoch": 3.6831983805668016, "percentage": 73.64, "elapsed_time": "3:51:58", "remaining_time": "1:23:03"}
{"current_steps": 1220, "total_steps": 1650, "loss": 0.0002, "lr": 7.753563576587753e-06, "epoch": 3.698380566801619, "percentage": 73.94, "elapsed_time": "3:52:56", "remaining_time": "1:22:06"}
{"current_steps": 1225, "total_steps": 1650, "loss": 0.0003, "lr": 7.5869945547715275e-06, "epoch": 3.7135627530364372, "percentage": 74.24, "elapsed_time": "3:53:53", "remaining_time": "1:21:08"}
{"current_steps": 1230, "total_steps": 1650, "loss": 0.0003, "lr": 7.421814397026674e-06, "epoch": 3.7287449392712553, "percentage": 74.55, "elapsed_time": "3:54:50", "remaining_time": "1:20:11"}
{"current_steps": 1235, "total_steps": 1650, "loss": 0.0002, "lr": 7.258041585000317e-06, "epoch": 3.743927125506073, "percentage": 74.85, "elapsed_time": "3:55:44", "remaining_time": "1:19:13"}
{"current_steps": 1240, "total_steps": 1650, "loss": 0.0002, "lr": 7.095694442874743e-06, "epoch": 3.7591093117408905, "percentage": 75.15, "elapsed_time": "3:56:40", "remaining_time": "1:18:15"}
{"current_steps": 1245, "total_steps": 1650, "loss": 0.0002, "lr": 6.934791135317147e-06, "epoch": 3.7742914979757085, "percentage": 75.45, "elapsed_time": "3:57:37", "remaining_time": "1:17:18"}
{"current_steps": 1250, "total_steps": 1650, "loss": 0.0003, "lr": 6.775349665447222e-06, "epoch": 3.7894736842105265, "percentage": 75.76, "elapsed_time": "3:58:30", "remaining_time": "1:16:19"}
{"current_steps": 1255, "total_steps": 1650, "loss": 0.0003, "lr": 6.617387872822842e-06, "epoch": 3.804655870445344, "percentage": 76.06, "elapsed_time": "3:59:23", "remaining_time": "1:15:20"}
{"current_steps": 1260, "total_steps": 1650, "loss": 0.0002, "lr": 6.460923431444015e-06, "epoch": 3.8198380566801617, "percentage": 76.36, "elapsed_time": "4:00:23", "remaining_time": "1:14:24"}
{"current_steps": 1265, "total_steps": 1650, "loss": 0.0002, "lr": 6.305973847775406e-06, "epoch": 3.83502024291498, "percentage": 76.67, "elapsed_time": "4:01:18", "remaining_time": "1:13:26"}
{"current_steps": 1270, "total_steps": 1650, "loss": 0.0003, "lr": 6.152556458787546e-06, "epoch": 3.850202429149798, "percentage": 76.97, "elapsed_time": "4:02:11", "remaining_time": "1:12:27"}
{"current_steps": 1275, "total_steps": 1650, "loss": 0.0002, "lr": 6.000688430017048e-06, "epoch": 3.8653846153846154, "percentage": 77.27, "elapsed_time": "4:03:03", "remaining_time": "1:11:29"}
{"current_steps": 1280, "total_steps": 1650, "loss": 0.0002, "lr": 5.850386753645998e-06, "epoch": 3.880566801619433, "percentage": 77.58, "elapsed_time": "4:04:04", "remaining_time": "1:10:33"}
{"current_steps": 1285, "total_steps": 1650, "loss": 0.0002, "lr": 5.701668246600731e-06, "epoch": 3.895748987854251, "percentage": 77.88, "elapsed_time": "4:05:06", "remaining_time": "1:09:37"}
{"current_steps": 1290, "total_steps": 1650, "loss": 0.0002, "lr": 5.554549548670227e-06, "epoch": 3.910931174089069, "percentage": 78.18, "elapsed_time": "4:05:55", "remaining_time": "1:08:37"}
{"current_steps": 1295, "total_steps": 1650, "loss": 0.0002, "lr": 5.409047120644307e-06, "epoch": 3.9261133603238867, "percentage": 78.48, "elapsed_time": "4:06:49", "remaining_time": "1:07:39"}
{"current_steps": 1300, "total_steps": 1650, "loss": 0.0002, "lr": 5.265177242471899e-06, "epoch": 3.9412955465587043, "percentage": 78.79, "elapsed_time": "4:07:47", "remaining_time": "1:06:42"}
{"current_steps": 1305, "total_steps": 1650, "loss": 0.0002, "lr": 5.122956011439486e-06, "epoch": 3.9564777327935223, "percentage": 79.09, "elapsed_time": "4:08:48", "remaining_time": "1:05:46"}
{"current_steps": 1310, "total_steps": 1650, "loss": 0.0002, "lr": 4.982399340370017e-06, "epoch": 3.97165991902834, "percentage": 79.39, "elapsed_time": "4:09:39", "remaining_time": "1:04:47"}
{"current_steps": 1315, "total_steps": 1650, "loss": 0.0002, "lr": 4.843522955842464e-06, "epoch": 3.986842105263158, "percentage": 79.7, "elapsed_time": "4:10:34", "remaining_time": "1:03:50"}
{"current_steps": 1320, "total_steps": 1650, "loss": 0.0002, "lr": 4.706342396432213e-06, "epoch": 4.0, "percentage": 80.0, "elapsed_time": "4:11:21", "remaining_time": "1:02:50"}
{"current_steps": 1325, "total_steps": 1650, "loss": 0.0002, "lr": 4.570873010972477e-06, "epoch": 4.015182186234818, "percentage": 80.3, "elapsed_time": "4:12:18", "remaining_time": "1:01:53"}
{"current_steps": 1330, "total_steps": 1650, "loss": 0.0002, "lr": 4.43712995683695e-06, "epoch": 4.030364372469635, "percentage": 80.61, "elapsed_time": "4:13:19", "remaining_time": "1:00:57"}
{"current_steps": 1335, "total_steps": 1650, "loss": 0.0002, "lr": 4.305128198243888e-06, "epoch": 4.045546558704453, "percentage": 80.91, "elapsed_time": "4:14:16", "remaining_time": "0:59:59"}
{"current_steps": 1340, "total_steps": 1650, "loss": 0.0001, "lr": 4.174882504581794e-06, "epoch": 4.060728744939271, "percentage": 81.21, "elapsed_time": "4:15:17", "remaining_time": "0:59:03"}
{"current_steps": 1345, "total_steps": 1650, "loss": 0.0002, "lr": 4.046407448756895e-06, "epoch": 4.075910931174089, "percentage": 81.52, "elapsed_time": "4:16:13", "remaining_time": "0:58:06"}
{"current_steps": 1350, "total_steps": 1650, "loss": 0.0002, "lr": 3.91971740556262e-06, "epoch": 4.0910931174089065, "percentage": 81.82, "elapsed_time": "4:17:13", "remaining_time": "0:57:09"}
{"current_steps": 1355, "total_steps": 1650, "loss": 0.0002, "lr": 3.7948265500712313e-06, "epoch": 4.1062753036437245, "percentage": 82.12, "elapsed_time": "4:18:06", "remaining_time": "0:56:11"}
{"current_steps": 1360, "total_steps": 1650, "loss": 0.0002, "lr": 3.6717488560478096e-06, "epoch": 4.1214574898785425, "percentage": 82.42, "elapsed_time": "4:19:00", "remaining_time": "0:55:13"}
{"current_steps": 1365, "total_steps": 1650, "loss": 0.0002, "lr": 3.5504980943867538e-06, "epoch": 4.136639676113361, "percentage": 82.73, "elapsed_time": "4:19:54", "remaining_time": "0:54:15"}
{"current_steps": 1370, "total_steps": 1650, "loss": 0.0002, "lr": 3.4310878315710074e-06, "epoch": 4.151821862348178, "percentage": 83.03, "elapsed_time": "4:20:47", "remaining_time": "0:53:18"}
{"current_steps": 1375, "total_steps": 1650, "loss": 0.0002, "lr": 3.3135314281540954e-06, "epoch": 4.167004048582996, "percentage": 83.33, "elapsed_time": "4:21:44", "remaining_time": "0:52:20"}
{"current_steps": 1380, "total_steps": 1650, "loss": 0.0002, "lr": 3.1978420372652776e-06, "epoch": 4.182186234817814, "percentage": 83.64, "elapsed_time": "4:22:41", "remaining_time": "0:51:23"}
{"current_steps": 1385, "total_steps": 1650, "loss": 0.0002, "lr": 3.084032603137852e-06, "epoch": 4.197368421052632, "percentage": 83.94, "elapsed_time": "4:23:36", "remaining_time": "0:50:26"}
{"current_steps": 1390, "total_steps": 1650, "loss": 0.0002, "lr": 2.9721158596608622e-06, "epoch": 4.212550607287449, "percentage": 84.24, "elapsed_time": "4:24:31", "remaining_time": "0:49:28"}
{"current_steps": 1395, "total_steps": 1650, "loss": 0.0002, "lr": 2.8621043289543314e-06, "epoch": 4.227732793522267, "percentage": 84.55, "elapsed_time": "4:25:27", "remaining_time": "0:48:31"}
{"current_steps": 1400, "total_steps": 1650, "loss": 0.0002, "lr": 2.754010319968181e-06, "epoch": 4.242914979757085, "percentage": 84.85, "elapsed_time": "4:26:26", "remaining_time": "0:47:34"}
{"current_steps": 1405, "total_steps": 1650, "loss": 0.0001, "lr": 2.647845927105015e-06, "epoch": 4.258097165991903, "percentage": 85.15, "elapsed_time": "4:27:26", "remaining_time": "0:46:38"}
{"current_steps": 1410, "total_steps": 1650, "loss": 0.0002, "lr": 2.543623028866915e-06, "epoch": 4.27327935222672, "percentage": 85.45, "elapsed_time": "4:28:27", "remaining_time": "0:45:41"}
{"current_steps": 1415, "total_steps": 1650, "loss": 0.0002, "lr": 2.4413532865263533e-06, "epoch": 4.288461538461538, "percentage": 85.76, "elapsed_time": "4:29:21", "remaining_time": "0:44:44"}
{"current_steps": 1420, "total_steps": 1650, "loss": 0.0002, "lr": 2.3410481428214602e-06, "epoch": 4.303643724696356, "percentage": 86.06, "elapsed_time": "4:30:18", "remaining_time": "0:43:46"}
{"current_steps": 1425, "total_steps": 1650, "loss": 0.0001, "lr": 2.242718820675718e-06, "epoch": 4.318825910931174, "percentage": 86.36, "elapsed_time": "4:31:14", "remaining_time": "0:42:49"}
{"current_steps": 1430, "total_steps": 1650, "loss": 0.0001, "lr": 2.1463763219422495e-06, "epoch": 4.334008097165992, "percentage": 86.67, "elapsed_time": "4:32:08", "remaining_time": "0:41:52"}
{"current_steps": 1435, "total_steps": 1650, "loss": 0.0001, "lr": 2.0520314261728357e-06, "epoch": 4.34919028340081, "percentage": 86.97, "elapsed_time": "4:32:59", "remaining_time": "0:40:54"}
{"current_steps": 1440, "total_steps": 1650, "loss": 0.0001, "lr": 1.9596946894118306e-06, "epoch": 4.364372469635628, "percentage": 87.27, "elapsed_time": "4:33:54", "remaining_time": "0:39:56"}
{"current_steps": 1445, "total_steps": 1650, "loss": 0.0002, "lr": 1.8693764430150696e-06, "epoch": 4.379554655870446, "percentage": 87.58, "elapsed_time": "4:34:51", "remaining_time": "0:38:59"}
{"current_steps": 1450, "total_steps": 1650, "loss": 0.0001, "lr": 1.7810867924938978e-06, "epoch": 4.394736842105263, "percentage": 87.88, "elapsed_time": "4:35:41", "remaining_time": "0:38:01"}
{"current_steps": 1455, "total_steps": 1650, "loss": 0.0001, "lr": 1.6948356163845048e-06, "epoch": 4.409919028340081, "percentage": 88.18, "elapsed_time": "4:36:34", "remaining_time": "0:37:03"}
{"current_steps": 1460, "total_steps": 1650, "loss": 0.0001, "lr": 1.610632565142627e-06, "epoch": 4.425101214574899, "percentage": 88.48, "elapsed_time": "4:37:24", "remaining_time": "0:36:06"}
{"current_steps": 1465, "total_steps": 1650, "loss": 0.0001, "lr": 1.5284870600637813e-06, "epoch": 4.440283400809717, "percentage": 88.79, "elapsed_time": "4:38:23", "remaining_time": "0:35:09"}
{"current_steps": 1470, "total_steps": 1650, "loss": 0.0001, "lr": 1.4484082922291376e-06, "epoch": 4.455465587044534, "percentage": 89.09, "elapsed_time": "4:39:20", "remaining_time": "0:34:12"}
{"current_steps": 1475, "total_steps": 1650, "loss": 0.0001, "lr": 1.3704052214771513e-06, "epoch": 4.470647773279352, "percentage": 89.39, "elapsed_time": "4:40:17", "remaining_time": "0:33:15"}
{"current_steps": 1480, "total_steps": 1650, "loss": 0.0001, "lr": 1.2944865754010682e-06, "epoch": 4.48582995951417, "percentage": 89.7, "elapsed_time": "4:41:17", "remaining_time": "0:32:18"}
{"current_steps": 1485, "total_steps": 1650, "loss": 0.0001, "lr": 1.2206608483724013e-06, "epoch": 4.501012145748988, "percentage": 90.0, "elapsed_time": "4:42:09", "remaining_time": "0:31:21"}
{"current_steps": 1490, "total_steps": 1650, "loss": 0.0002, "lr": 1.1489363005905241e-06, "epoch": 4.516194331983805, "percentage": 90.3, "elapsed_time": "4:43:05", "remaining_time": "0:30:23"}
{"current_steps": 1495, "total_steps": 1650, "loss": 0.0001, "lr": 1.0793209571584562e-06, "epoch": 4.531376518218623, "percentage": 90.61, "elapsed_time": "4:43:58", "remaining_time": "0:29:26"}
{"current_steps": 1500, "total_steps": 1650, "loss": 0.0001, "lr": 1.0118226071849424e-06, "epoch": 4.5465587044534415, "percentage": 90.91, "elapsed_time": "4:44:50", "remaining_time": "0:28:29"}
{"current_steps": 1505, "total_steps": 1650, "loss": 0.0001, "lr": 9.464488029129581e-07, "epoch": 4.5617408906882595, "percentage": 91.21, "elapsed_time": "4:45:55", "remaining_time": "0:27:32"}
{"current_steps": 1510, "total_steps": 1650, "loss": 0.0001, "lr": 8.832068588746945e-07, "epoch": 4.576923076923077, "percentage": 91.52, "elapsed_time": "4:46:47", "remaining_time": "0:26:35"}
{"current_steps": 1515, "total_steps": 1650, "loss": 0.0001, "lr": 8.221038510731704e-07, "epoch": 4.592105263157895, "percentage": 91.82, "elapsed_time": "4:47:46", "remaining_time": "0:25:38"}
{"current_steps": 1520, "total_steps": 1650, "loss": 0.0001, "lr": 7.631466161904821e-07, "epoch": 4.607287449392713, "percentage": 92.12, "elapsed_time": "4:48:42", "remaining_time": "0:24:41"}
{"current_steps": 1525, "total_steps": 1650, "loss": 0.0001, "lr": 7.063417508228876e-07, "epoch": 4.62246963562753, "percentage": 92.42, "elapsed_time": "4:49:38", "remaining_time": "0:23:44"}
{"current_steps": 1530, "total_steps": 1650, "loss": 0.0001, "lr": 6.516956107427241e-07, "epoch": 4.637651821862348, "percentage": 92.73, "elapsed_time": "4:50:37", "remaining_time": "0:22:47"}
{"current_steps": 1535, "total_steps": 1650, "loss": 0.0001, "lr": 5.992143101872638e-07, "epoch": 4.652834008097166, "percentage": 93.03, "elapsed_time": "4:51:34", "remaining_time": "0:21:50"}
{"current_steps": 1540, "total_steps": 1650, "loss": 0.0001, "lr": 5.489037211746184e-07, "epoch": 4.668016194331984, "percentage": 93.33, "elapsed_time": "4:52:26", "remaining_time": "0:20:53"}
{"current_steps": 1545, "total_steps": 1650, "loss": 0.0001, "lr": 5.007694728467228e-07, "epoch": 4.683198380566802, "percentage": 93.64, "elapsed_time": "4:53:18", "remaining_time": "0:19:55"}
{"current_steps": 1550, "total_steps": 1650, "loss": 0.0001, "lr": 4.548169508395028e-07, "epoch": 4.698380566801619, "percentage": 93.94, "elapsed_time": "4:54:14", "remaining_time": "0:18:59"}
{"current_steps": 1555, "total_steps": 1650, "loss": 0.0001, "lr": 4.1105129668029595e-07, "epoch": 4.713562753036437, "percentage": 94.24, "elapsed_time": "4:55:10", "remaining_time": "0:18:02"}
{"current_steps": 1560, "total_steps": 1650, "loss": 0.0001, "lr": 3.6947740721257066e-07, "epoch": 4.728744939271255, "percentage": 94.55, "elapsed_time": "4:56:09", "remaining_time": "0:17:05"}
{"current_steps": 1565, "total_steps": 1650, "loss": 0.0001, "lr": 3.3009993404802486e-07, "epoch": 4.743927125506072, "percentage": 94.85, "elapsed_time": "4:56:59", "remaining_time": "0:16:07"}
{"current_steps": 1570, "total_steps": 1650, "loss": 0.0001, "lr": 2.929232830461404e-07, "epoch": 4.7591093117408905, "percentage": 95.15, "elapsed_time": "4:57:55", "remaining_time": "0:15:10"}
{"current_steps": 1575, "total_steps": 1650, "loss": 0.0001, "lr": 2.579516138212101e-07, "epoch": 4.7742914979757085, "percentage": 95.45, "elapsed_time": "4:58:54", "remaining_time": "0:14:14"}
{"current_steps": 1580, "total_steps": 1650, "loss": 0.0001, "lr": 2.2518883927692857e-07, "epoch": 4.7894736842105265, "percentage": 95.76, "elapsed_time": "4:59:46", "remaining_time": "0:13:16"}
{"current_steps": 1585, "total_steps": 1650, "loss": 0.0001, "lr": 1.9463862516859277e-07, "epoch": 4.804655870445345, "percentage": 96.06, "elapsed_time": "5:00:43", "remaining_time": "0:12:19"}
{"current_steps": 1590, "total_steps": 1650, "loss": 0.0001, "lr": 1.6630438969294615e-07, "epoch": 4.819838056680162, "percentage": 96.36, "elapsed_time": "5:01:39", "remaining_time": "0:11:23"}
{"current_steps": 1595, "total_steps": 1650, "loss": 0.0001, "lr": 1.4018930310571553e-07, "epoch": 4.83502024291498, "percentage": 96.67, "elapsed_time": "5:02:32", "remaining_time": "0:10:25"}
{"current_steps": 1600, "total_steps": 1650, "loss": 0.0001, "lr": 1.1629628736690824e-07, "epoch": 4.850202429149798, "percentage": 96.97, "elapsed_time": "5:03:30", "remaining_time": "0:09:29"}
{"current_steps": 1605, "total_steps": 1650, "loss": 0.0001, "lr": 9.46280158138757e-08, "epoch": 4.865384615384615, "percentage": 97.27, "elapsed_time": "5:04:23", "remaining_time": "0:08:32"}
{"current_steps": 1610, "total_steps": 1650, "loss": 0.0001, "lr": 7.518691286220625e-08, "epoch": 4.880566801619433, "percentage": 97.58, "elapsed_time": "5:05:22", "remaining_time": "0:07:35"}
{"current_steps": 1615, "total_steps": 1650, "loss": 0.0001, "lr": 5.797515373445084e-08, "epoch": 4.895748987854251, "percentage": 97.88, "elapsed_time": "5:06:14", "remaining_time": "0:06:38"}
{"current_steps": 1620, "total_steps": 1650, "loss": 0.0001, "lr": 4.299466421675113e-08, "epoch": 4.910931174089069, "percentage": 98.18, "elapsed_time": "5:07:08", "remaining_time": "0:05:41"}
{"current_steps": 1625, "total_steps": 1650, "loss": 0.0001, "lr": 3.0247120443362976e-08, "epoch": 4.926113360323887, "percentage": 98.48, "elapsed_time": "5:08:02", "remaining_time": "0:04:44"}
{"current_steps": 1630, "total_steps": 1650, "loss": 0.0001, "lr": 1.973394870912193e-08, "epoch": 4.941295546558704, "percentage": 98.79, "elapsed_time": "5:09:03", "remaining_time": "0:03:47"}
{"current_steps": 1635, "total_steps": 1650, "loss": 0.0001, "lr": 1.145632530985541e-08, "epoch": 4.956477732793522, "percentage": 99.09, "elapsed_time": "5:10:05", "remaining_time": "0:02:50"}
{"current_steps": 1640, "total_steps": 1650, "loss": 0.0001, "lr": 5.415176410765721e-09, "epoch": 4.97165991902834, "percentage": 99.39, "elapsed_time": "5:11:06", "remaining_time": "0:01:53"}
{"current_steps": 1645, "total_steps": 1650, "loss": 0.0001, "lr": 1.611177942812958e-09, "epoch": 4.9868421052631575, "percentage": 99.7, "elapsed_time": "5:12:03", "remaining_time": "0:00:56"}
{"current_steps": 1650, "total_steps": 1650, "loss": 0.0001, "lr": 4.475552707772224e-11, "epoch": 5.0, "percentage": 100.0, "elapsed_time": "5:12:52", "remaining_time": "0:00:00"}
{"current_steps": 1650, "total_steps": 1650, "epoch": 5.0, "percentage": 100.0, "elapsed_time": "5:13:00", "remaining_time": "0:00:00"}
{"current_steps": 1650, "total_steps": 1650, "epoch": 5.0, "percentage": 100.0, "elapsed_time": "0:00:00", "remaining_time": "0:00:00"}
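The rows above are the tail of the trainer's JSONL progress log: one JSON object per logging step with `current_steps`, `loss`, `lr`, and timing fields, plus final summary rows that omit `loss`. A minimal sketch of reading such a log (the helper name `parse_trainer_log` is ours, not part of the repository):

```python
import json

def parse_trainer_log(lines):
    """Collect (step, loss, lr) tuples from trainer-log JSONL lines,
    skipping blank lines and summary rows that carry no loss."""
    records = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        rec = json.loads(line)
        if "loss" in rec:
            records.append((rec["current_steps"], rec["loss"], rec["lr"]))
    return records

# Two real log rows and one loss-less summary row from this run:
sample = [
    '{"current_steps": 1645, "total_steps": 1650, "loss": 0.0001, "lr": 1.611177942812958e-09, "epoch": 4.9868421052631575, "percentage": 99.7, "elapsed_time": "5:12:03", "remaining_time": "0:00:56"}',
    '{"current_steps": 1650, "total_steps": 1650, "loss": 0.0001, "lr": 4.475552707772224e-11, "epoch": 5.0, "percentage": 100.0, "elapsed_time": "5:12:52", "remaining_time": "0:00:00"}',
    '{"current_steps": 1650, "total_steps": 1650, "epoch": 5.0, "percentage": 100.0, "elapsed_time": "5:13:00", "remaining_time": "0:00:00"}',
]
steps = parse_trainer_log(sample)
```

The same loop, pointed at the full `trainer_log.jsonl`, yields the series behind `training_loss.png`.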
3673
trainer_state.json
Normal file
File diff suppressed because it is too large
3
training_args.bin
Normal file
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:11b96207b8ad36ef7e1c3e6845e45a666ecc716120e99cfdd70f374f74395f95
size 8657
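`training_args.bin` is checked in as a Git LFS pointer: a three-line text stub giving the spec version, the blob's sha256 `oid`, and its `size` in bytes, while the actual content lives in LFS storage (the `.gitattributes` rules above route `*.bin` through this filter). A sketch of reading such a stub (the function name `parse_lfs_pointer` is ours):

```python
def parse_lfs_pointer(text):
    """Split a Git LFS pointer file into a dict of its key/value lines."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The pointer content of training_args.bin from this commit:
pointer = (
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:11b96207b8ad36ef7e1c3e6845e45a666ecc716120e99cfdd70f374f74395f95\n"
    "size 8657\n"
)
info = parse_lfs_pointer(pointer)
```

`info["size"]` then reports the tracked file's true byte count, even though the pointer itself is only a few lines long.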
BIN
training_loss.png
Normal file
Binary file not shown.
Size: 36 KiB
1
vocab.json
Normal file
File diff suppressed because one or more lines are too long