Initialize the project; model provided by the ModelHub XC community
Model: laion/nemotron-terminal-adapters_swe__Qwen3-8B Source: Original Platform
.gitattributes (vendored), Normal file, 36 lines
@@ -0,0 +1,36 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
tokenizer.json filter=lfs diff=lfs merge=lfs -text
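Each entry in this .gitattributes binds a glob pattern to Git LFS attributes, so matching paths are stored as small LFS pointer files instead of raw blobs. A minimal, stdlib-only sketch of how such rules can be parsed and a path checked against them (the helper names are hypothetical, and `fnmatch` only approximates Git's wildmatch semantics):

```python
import fnmatch

# A few of the rules from the file above.
GITATTRIBUTES = """\
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
tokenizer.json filter=lfs diff=lfs merge=lfs -text
"""

def parse_gitattributes(text):
    """Yield (pattern, set_of_attributes) for each non-empty line."""
    for line in text.splitlines():
        parts = line.split()
        if parts:
            yield parts[0], set(parts[1:])

def is_lfs_tracked(path, rules):
    """True if any rule carrying filter=lfs matches the path."""
    return any(
        "filter=lfs" in attrs and fnmatch.fnmatch(path, pattern)
        for pattern, attrs in rules
    )

rules = list(parse_gitattributes(GITATTRIBUTES))
print(is_lfs_tracked("model-00001-of-00004.safetensors", rules))  # True
print(is_lfs_tracked("config.json", rules))                       # False
```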
README.md, Normal file, 61 lines
@@ -0,0 +1,61 @@
---
library_name: transformers
license: other
base_model: Qwen/Qwen3-8B
tags:
- llama-factory
- full
- generated_from_trainer
model-index:
- name: nemotron-adapters-swe__Qwen3-8B
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# nemotron-adapters-swe__Qwen3-8B

This model is a fine-tuned version of [Qwen/Qwen3-8B](https://huggingface.co/Qwen/Qwen3-8B) on the /e/data1/datasets/playground/ot/hf_hub/datasets--laion--nemotron-terminal-adapters_swe/snapshots/297112e289bfaea4f73e193a41f860e868850e05_thinking_preprocessed dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 4e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 32
- gradient_accumulation_steps: 3
- total_train_batch_size: 96
- total_eval_batch_size: 256
- optimizer: Use OptimizerNames.ADAMW_TORCH_FUSED with betas=(0.9,0.98) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5.0

### Training results

### Framework versions

- Transformers 4.57.6
- Pytorch 2.9.1+cu130
- Datasets 4.7.0
- Tokenizers 0.22.2
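The batch-size figures in the hyperparameter list are internally consistent, and the schedule is cosine decay with a 10% linear warmup. A quick sketch of both (the exact trainer implementation may differ in rounding and minimum-LR details):

```python
import math

# Effective batch size implied by the hyperparameters above.
per_device_batch = 1   # train_batch_size
num_devices = 32       # multi-GPU world size
grad_accum = 3         # gradient_accumulation_steps
total_train_batch = per_device_batch * num_devices * grad_accum
print(total_train_batch)  # 96, matching total_train_batch_size

def cosine_lr(step, total_steps, base_lr=4e-5, warmup_ratio=0.1):
    """Linear warmup over the first warmup_ratio of steps, then cosine decay to 0."""
    warmup = int(total_steps * warmup_ratio)
    if step < warmup:
        return base_lr * step / max(warmup, 1)
    progress = (step - warmup) / max(total_steps - warmup, 1)
    return 0.5 * base_lr * (1.0 + math.cos(math.pi * progress))
```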
added_tokens.json, Normal file, 28 lines
@@ -0,0 +1,28 @@
{
  "</think>": 151668,
  "</tool_call>": 151658,
  "</tool_response>": 151666,
  "<think>": 151667,
  "<tool_call>": 151657,
  "<tool_response>": 151665,
  "<|box_end|>": 151649,
  "<|box_start|>": 151648,
  "<|endoftext|>": 151643,
  "<|file_sep|>": 151664,
  "<|fim_middle|>": 151660,
  "<|fim_pad|>": 151662,
  "<|fim_prefix|>": 151659,
  "<|fim_suffix|>": 151661,
  "<|im_end|>": 151645,
  "<|im_start|>": 151644,
  "<|image_pad|>": 151655,
  "<|object_ref_end|>": 151647,
  "<|object_ref_start|>": 151646,
  "<|quad_end|>": 151651,
  "<|quad_start|>": 151650,
  "<|repo_name|>": 151663,
  "<|video_pad|>": 151656,
  "<|vision_end|>": 151653,
  "<|vision_pad|>": 151654,
  "<|vision_start|>": 151652
}
all_results.json, Normal file, 16 lines
@@ -0,0 +1,16 @@
{
  "achieved_tflops_per_gpu": 21.65962676868645,
  "achieved_tflops_per_gpu_theoretical": 406.5485475387801,
  "epoch": 5.0,
  "loss_nan_ranks": 0,
  "loss_rank_avg": 0.12205812335014343,
  "mfu_percent": 1.530715672698689,
  "mfu_percent_theoretical": 28.731346115814848,
  "total_flos": 9.560237722870743e+18,
  "train_loss": 0.10292447277993867,
  "train_runtime": 13793.2861,
  "train_samples_per_second": 11.477,
  "train_steps_per_second": 0.12,
  "valid_targets_mean": 15604.5,
  "valid_targets_min": 4811
}
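The MFU metrics in this file follow mfu_percent = achieved_tflops_per_gpu / peak_tflops * 100. The peak implied by both reported pairs works out to 1415 TFLOPS per GPU; that figure is recovered from the numbers, not stated in the file, so treat it as an assumption:

```python
# Values copied from all_results.json above.
achieved = 21.65962676868645
achieved_theoretical = 406.5485475387801
peak_tflops = 1415.0  # assumption: per-GPU peak implied by both metric pairs

mfu = achieved / peak_tflops * 100
mfu_theoretical = achieved_theoretical / peak_tflops * 100
print(mfu, mfu_theoretical)  # matches mfu_percent and mfu_percent_theoretical
```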
chat_template.jinja, Normal file, 89 lines
@@ -0,0 +1,89 @@
{%- if tools %}
    {{- '<|im_start|>system\n' }}
    {%- if messages[0].role == 'system' %}
        {{- messages[0].content + '\n\n' }}
    {%- endif %}
    {{- "# Tools\n\nYou may call one or more functions to assist with the user query.\n\nYou are provided with function signatures within <tools></tools> XML tags:\n<tools>" }}
    {%- for tool in tools %}
        {{- "\n" }}
        {{- tool | tojson }}
    {%- endfor %}
    {{- "\n</tools>\n\nFor each function call, return a json object with function name and arguments within <tool_call></tool_call> XML tags:\n<tool_call>\n{\"name\": <function-name>, \"arguments\": <args-json-object>}\n</tool_call><|im_end|>\n" }}
{%- else %}
    {%- if messages[0].role == 'system' %}
        {{- '<|im_start|>system\n' + messages[0].content + '<|im_end|>\n' }}
    {%- endif %}
{%- endif %}
{%- set ns = namespace(multi_step_tool=true, last_query_index=messages|length - 1) %}
{%- for message in messages[::-1] %}
    {%- set index = (messages|length - 1) - loop.index0 %}
    {%- if ns.multi_step_tool and message.role == "user" and message.content is string and not(message.content.startswith('<tool_response>') and message.content.endswith('</tool_response>')) %}
        {%- set ns.multi_step_tool = false %}
        {%- set ns.last_query_index = index %}
    {%- endif %}
{%- endfor %}
{%- for message in messages %}
    {%- if message.content is string %}
        {%- set content = message.content %}
    {%- else %}
        {%- set content = '' %}
    {%- endif %}
    {%- if (message.role == "user") or (message.role == "system" and not loop.first) %}
        {{- '<|im_start|>' + message.role + '\n' + content + '<|im_end|>' + '\n' }}
    {%- elif message.role == "assistant" %}
        {%- set reasoning_content = '' %}
        {%- if message.reasoning_content is string %}
            {%- set reasoning_content = message.reasoning_content %}
        {%- else %}
            {%- if '</think>' in content %}
                {%- set reasoning_content = content.split('</think>')[0].rstrip('\n').split('<think>')[-1].lstrip('\n') %}
                {%- set content = content.split('</think>')[-1].lstrip('\n') %}
            {%- endif %}
        {%- endif %}
        {%- if loop.index0 > ns.last_query_index %}
            {%- if loop.last or (not loop.last and reasoning_content) %}
                {{- '<|im_start|>' + message.role + '\n<think>\n' + reasoning_content.strip('\n') + '\n</think>\n\n' + content.lstrip('\n') }}
            {%- else %}
                {{- '<|im_start|>' + message.role + '\n' + content }}
            {%- endif %}
        {%- else %}
            {{- '<|im_start|>' + message.role + '\n' + content }}
        {%- endif %}
        {%- if message.tool_calls %}
            {%- for tool_call in message.tool_calls %}
                {%- if (loop.first and content) or (not loop.first) %}
                    {{- '\n' }}
                {%- endif %}
                {%- if tool_call.function %}
                    {%- set tool_call = tool_call.function %}
                {%- endif %}
                {{- '<tool_call>\n{"name": "' }}
                {{- tool_call.name }}
                {{- '", "arguments": ' }}
                {%- if tool_call.arguments is string %}
                    {{- tool_call.arguments }}
                {%- else %}
                    {{- tool_call.arguments | tojson }}
                {%- endif %}
                {{- '}\n</tool_call>' }}
            {%- endfor %}
        {%- endif %}
        {{- '<|im_end|>\n' }}
    {%- elif message.role == "tool" %}
        {%- if loop.first or (messages[loop.index0 - 1].role != "tool") %}
            {{- '<|im_start|>user' }}
        {%- endif %}
        {{- '\n<tool_response>\n' }}
        {{- content }}
        {{- '\n</tool_response>' }}
        {%- if loop.last or (messages[loop.index0 + 1].role != "tool") %}
            {{- '<|im_end|>\n' }}
        {%- endif %}
    {%- endif %}
{%- endfor %}
{%- if add_generation_prompt %}
    {{- '<|im_start|>assistant\n' }}
    {%- if enable_thinking is defined and enable_thinking is false %}
        {{- '<think>\n\n</think>\n\n' }}
    {%- endif %}
{%- endif %}
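The template above serializes a conversation into the ChatML-style format Qwen3 expects. A dependency-free sketch of what it produces on its simplest path, a system + user exchange with no tools, assistant turns and tool handling omitted (`render_simple` is a hypothetical helper, not part of the repo):

```python
def render_simple(messages, add_generation_prompt=True, enable_thinking=True):
    """Mimic the template's no-tools path for system/user messages only."""
    out = []
    for m in messages:
        out.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    if add_generation_prompt:
        out.append("<|im_start|>assistant\n")
        if enable_thinking is False:
            # The template emits an empty think block when thinking is disabled.
            out.append("<think>\n\n</think>\n\n")
    return "".join(out)

prompt = render_simple(
    [{"role": "system", "content": "You are helpful."},
     {"role": "user", "content": "Hi"}],
    enable_thinking=False,
)
print(prompt)
```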
config.json, Normal file, 68 lines
@@ -0,0 +1,68 @@
{
  "architectures": [
    "Qwen3ForCausalLM"
  ],
  "attention_bias": false,
  "attention_dropout": 0.0,
  "dtype": "bfloat16",
  "eos_token_id": 151645,
  "head_dim": 128,
  "hidden_act": "silu",
  "hidden_size": 4096,
  "initializer_range": 0.02,
  "intermediate_size": 12288,
  "layer_types": [
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention",
    "full_attention"
  ],
  "max_position_embeddings": 40960,
  "max_window_layers": 36,
  "model_type": "qwen3",
  "num_attention_heads": 32,
  "num_hidden_layers": 36,
  "num_key_value_heads": 8,
  "pad_token_id": 151643,
  "rms_norm_eps": 1e-06,
  "rope_scaling": null,
  "rope_theta": 1000000,
  "sliding_window": null,
  "tie_word_embeddings": false,
  "transformers_version": "4.57.6",
  "use_cache": false,
  "use_sliding_window": false,
  "vocab_size": 151936
}
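The config above pins down the KV-cache footprint at inference time: 36 layers, 8 KV heads (GQA), head_dim 128, bfloat16. A back-of-the-envelope sketch of the arithmetic:

```python
# KV-cache footprint implied by config.json above (bfloat16 = 2 bytes/value).
num_layers = 36      # num_hidden_layers
num_kv_heads = 8     # num_key_value_heads (grouped-query attention)
head_dim = 128
bytes_per_value = 2  # bfloat16
kv = 2               # one key and one value tensor per layer

bytes_per_token = num_layers * num_kv_heads * head_dim * kv * bytes_per_value
full_context = bytes_per_token * 40960  # max_position_embeddings
print(bytes_per_token // 1024)  # 144 KiB per cached token
print(full_context / 2**30)     # 5.625 GiB for one full-length sequence
```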
generation_config.json, Normal file, 12 lines
@@ -0,0 +1,12 @@
{
  "do_sample": true,
  "eos_token_id": [
    151645,
    151643
  ],
  "pad_token_id": 151643,
  "temperature": 0.6,
  "top_k": 20,
  "top_p": 0.95,
  "transformers_version": "4.57.6"
}
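These defaults enable stochastic decoding with temperature 0.6, top-k 20, and nucleus (top-p) 0.95. A dependency-free sketch of that filtering pipeline (a simplified stand-in, not the actual transformers logits processors; `sample_next` is hypothetical):

```python
import math
import random

def sample_next(logits, temperature=0.6, top_k=20, top_p=0.95, rng=random):
    """Temperature scaling, then top-k, then nucleus truncation, then sampling."""
    scaled = [l / temperature for l in logits]
    # top-k: keep the k highest-scoring token ids, best first
    order = sorted(range(len(scaled)), key=lambda i: scaled[i], reverse=True)[:top_k]
    m = max(scaled[i] for i in order)
    probs = [(i, math.exp(scaled[i] - m)) for i in order]  # stable softmax
    z = sum(p for _, p in probs)
    probs = [(i, p / z) for i, p in probs]
    # top-p: smallest prefix whose cumulative mass reaches top_p
    kept, cum = [], 0.0
    for i, p in probs:
        kept.append((i, p))
        cum += p
        if cum >= top_p:
            break
    # renormalize and draw
    z = sum(p for _, p in kept)
    r, acc = rng.random() * z, 0.0
    for i, p in kept:
        acc += p
        if acc >= r:
            return i
    return kept[-1][0]

# A sharply peaked distribution collapses to the argmax under these settings.
print(sample_next([0.0, 0.0, 0.0, 0.0, 0.0, 100.0]))  # 5
```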
merges.txt, Normal file, 151388 lines
File diff suppressed because it is too large
model-00001-of-00004.safetensors, Normal file, 3 lines
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ac87959dbd803d2fe69fc312f5bf59d853c9783f5b2787814084239373628565
size 4902257696
model-00002-of-00004.safetensors, Normal file, 3 lines
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:9a1347e14b2fb3184a21b223e0a2cccbaa56525bdff20db45aa2880158839ef0
size 4915960368
model-00003-of-00004.safetensors, Normal file, 3 lines
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:6f353c813af5a0ce7cd471916f1dd5aee82b9225e582f34b914fb5ac4ff7927f
size 4983068496
model-00004-of-00004.safetensors, Normal file, 3 lines
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:737d3234a750489e16957bc2ced5fa7c4cdc40c49076cd0c39dc5b1104dda4d5
size 1580230264
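Each of the four .safetensors diffs above is a Git LFS pointer file: three key/value lines (spec version, content-addressed oid, byte size) standing in for the real weight shard. A small sketch of parsing one (hypothetical helper name; real tooling should use the git-lfs client):

```python
def parse_lfs_pointer(text):
    """Parse a Git LFS pointer blob into algo/digest/size; reject non-pointers."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    if fields.get("version") != "https://git-lfs.github.com/spec/v1":
        raise ValueError("not a Git LFS pointer")
    algo, _, digest = fields["oid"].partition(":")
    return {"algo": algo, "digest": digest, "size": int(fields["size"])}

ptr = parse_lfs_pointer(
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:737d3234a750489e16957bc2ced5fa7c4cdc40c49076cd0c39dc5b1104dda4d5\n"
    "size 1580230264\n"
)
print(ptr["algo"], ptr["size"])  # sha256 1580230264
```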
model.safetensors.index.json, Normal file, 407 lines
@@ -0,0 +1,407 @@
{
  "metadata": {
    "total_parameters": 308224,
    "total_size": 16381470720
  },
  "weight_map": {
    "lm_head.weight": "model-00004-of-00004.safetensors",
    "model.embed_tokens.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.input_layernorm.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.input_layernorm.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.10.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.10.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.11.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.12.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.13.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.14.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.15.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.16.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.17.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.18.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.19.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.2.input_layernorm.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.2.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.20.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.20.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.input_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.21.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.22.input_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.22.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.22.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.22.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.22.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.22.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.22.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.22.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.22.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
    "model.layers.22.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.22.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
    "model.layers.23.input_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.23.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.input_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.24.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.input_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.25.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.26.input_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.26.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.26.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.26.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.26.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.26.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.26.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.26.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.26.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.26.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.26.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.27.input_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.27.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.27.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.27.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.27.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.27.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.27.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.27.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.27.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.27.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.27.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.28.input_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.28.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.28.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.28.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.28.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
    "model.layers.28.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
    "model.layers.28.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
    "model.layers.28.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.28.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.28.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.28.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.29.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.29.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.29.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.29.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.29.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.29.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.29.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.29.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.29.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.29.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.29.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.3.input_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.3.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.30.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.30.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.30.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.30.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.30.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.30.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.30.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.30.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.30.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.30.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.30.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.31.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.31.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.31.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.31.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.31.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.31.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.31.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.31.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.31.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.31.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.31.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.32.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.32.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.32.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.32.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.32.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.32.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.32.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.32.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.32.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.32.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.32.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.33.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.33.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.33.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.33.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.33.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.33.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.33.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.33.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.33.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.33.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.33.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.34.input_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.34.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.34.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.34.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.34.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.34.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.34.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.34.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.34.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.34.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.34.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.35.input_layernorm.weight": "model-00004-of-00004.safetensors",
|
||||
"model.layers.35.mlp.down_proj.weight": "model-00004-of-00004.safetensors",
|
||||
"model.layers.35.mlp.gate_proj.weight": "model-00004-of-00004.safetensors",
|
||||
"model.layers.35.mlp.up_proj.weight": "model-00004-of-00004.safetensors",
|
||||
"model.layers.35.post_attention_layernorm.weight": "model-00004-of-00004.safetensors",
|
||||
"model.layers.35.self_attn.k_norm.weight": "model-00004-of-00004.safetensors",
|
||||
"model.layers.35.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.35.self_attn.o_proj.weight": "model-00004-of-00004.safetensors",
|
||||
"model.layers.35.self_attn.q_norm.weight": "model-00004-of-00004.safetensors",
|
||||
"model.layers.35.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.35.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
|
||||
"model.layers.4.input_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.4.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.input_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.5.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.input_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.6.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.input_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.7.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.input_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.8.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.9.input_layernorm.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.9.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.9.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.9.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.9.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
|
||||
"model.layers.9.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.9.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.9.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.9.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.9.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.layers.9.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
|
||||
"model.norm.weight": "model-00004-of-00004.safetensors"
|
||||
}
|
||||
}
|
||||
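The `weight_map` above pairs every tensor name with the shard file that stores it; note that a single layer can straddle a shard boundary (layer 35's `k_proj`/`q_proj`/`v_proj` live in shard 3 while the rest of the layer sits in shard 4). A minimal sketch of how a loader can group tensors by shard so each file is opened only once — the miniature `index` dict here is an illustrative excerpt, not the full file:

```python
from collections import defaultdict

# Hypothetical excerpt mirroring entries from model.safetensors.index.json.
index = {
    "weight_map": {
        "model.layers.35.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
        "model.layers.35.mlp.up_proj.weight": "model-00004-of-00004.safetensors",
        "model.norm.weight": "model-00004-of-00004.safetensors",
    }
}

def tensors_per_shard(weight_map):
    """Group tensor names by the shard file that stores them."""
    shards = defaultdict(list)
    for name, shard in weight_map.items():
        shards[shard].append(name)
    return dict(shards)

shards = tensors_per_shard(index["weight_map"])
```

With the real index, each shard's tensor list would then be read in one pass (e.g. via `safetensors.safe_open` on that shard file).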
12
run_summary.json
Normal file
@@ -0,0 +1,12 @@
{
  "agent_name": "297112e289bfaea4f73e193a41f860e868850e05_thinking_preprocessed",
  "training_start": null,
  "training_end": null,
  "created_by": "DCAgent",
  "base_model_name": "Qwen/Qwen3-8B",
  "dataset_name": "/e/data1/datasets/playground/ot/hf_hub/datasets--laion--nemotron-terminal-adapters_swe/snapshots/297112e289bfaea4f73e193a41f860e868850e05_thinking_preprocessed",
  "training_type": "SFT",
  "training_parameters": "https://huggingface.co/laion/nemotron-terminal-adapters_swe__Qwen3-8B/blob/main/config.json",
  "wandb_link": null,
  "traces_location_s3": null
}
31
special_tokens_map.json
Normal file
@@ -0,0 +1,31 @@
{
  "additional_special_tokens": [
    "<|im_start|>",
    "<|im_end|>",
    "<|object_ref_start|>",
    "<|object_ref_end|>",
    "<|box_start|>",
    "<|box_end|>",
    "<|quad_start|>",
    "<|quad_end|>",
    "<|vision_start|>",
    "<|vision_end|>",
    "<|vision_pad|>",
    "<|image_pad|>",
    "<|video_pad|>"
  ],
  "eos_token": {
    "content": "<|im_end|>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "pad_token": {
    "content": "<|endoftext|>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  }
}
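The `<|im_start|>`/`<|im_end|>` pair in `special_tokens_map.json` is the ChatML delimiter set used by Qwen-family chat formats. A simplified sketch of how a conversation is serialized with these markers (illustrative only; the authoritative formatting lives in the tokenizer's chat template, which is not shown in this commit):

```python
IM_START, IM_END = "<|im_start|>", "<|im_end|>"

def render_chatml(messages):
    """Serialize role/content messages in ChatML form (simplified sketch)."""
    parts = []
    for m in messages:
        parts.append(f"{IM_START}{m['role']}\n{m['content']}{IM_END}\n")
    # Leave the assistant turn open so the model generates its reply.
    parts.append(f"{IM_START}assistant\n")
    return "".join(parts)

prompt = render_chatml([{"role": "user", "content": "ls -la"}])
```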
3
tokenizer.json
Normal file
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:aeb13307a71acd8fe81861d94ad54ab689df773318809eed3cbe794b4492dae4
size 11422654
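`tokenizer.json` is committed as a Git LFS pointer rather than the ~11 MB payload; the three `key value` lines above follow the LFS pointer-file format. A small sketch that parses such a pointer into its fields (the string literal simply mirrors the pointer above):

```python
def parse_lfs_pointer(text):
    """Parse a git-lfs pointer file into a dict of its key/value lines."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = parse_lfs_pointer(
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:aeb13307a71acd8fe81861d94ad54ab689df773318809eed3cbe794b4492dae4\n"
    "size 11422654\n"
)
```

A client resolves the `oid` against the LFS store and verifies the downloaded blob's SHA-256 and `size` against these fields.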
240
tokenizer_config.json
Normal file
@@ -0,0 +1,240 @@
{
  "add_bos_token": false,
  "add_prefix_space": false,
  "added_tokens_decoder": {
    "151643": {
      "content": "<|endoftext|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "151644": {
      "content": "<|im_start|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "151645": {
      "content": "<|im_end|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "151646": {
      "content": "<|object_ref_start|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "151647": {
      "content": "<|object_ref_end|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "151648": {
      "content": "<|box_start|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "151649": {
      "content": "<|box_end|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "151650": {
      "content": "<|quad_start|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "151651": {
      "content": "<|quad_end|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "151652": {
      "content": "<|vision_start|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "151653": {
      "content": "<|vision_end|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "151654": {
      "content": "<|vision_pad|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "151655": {
      "content": "<|image_pad|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "151656": {
      "content": "<|video_pad|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "151657": {
      "content": "<tool_call>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "151658": {
      "content": "</tool_call>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "151659": {
      "content": "<|fim_prefix|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "151660": {
      "content": "<|fim_middle|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "151661": {
      "content": "<|fim_suffix|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "151662": {
      "content": "<|fim_pad|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "151663": {
      "content": "<|repo_name|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "151664": {
      "content": "<|file_sep|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "151665": {
      "content": "<tool_response>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "151666": {
      "content": "</tool_response>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "151667": {
      "content": "<think>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "151668": {
      "content": "</think>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": false
    }
  },
  "additional_special_tokens": [
    "<|im_start|>",
    "<|im_end|>",
    "<|object_ref_start|>",
    "<|object_ref_end|>",
    "<|box_start|>",
    "<|box_end|>",
    "<|quad_start|>",
    "<|quad_end|>",
    "<|vision_start|>",
    "<|vision_end|>",
    "<|vision_pad|>",
    "<|image_pad|>",
    "<|video_pad|>"
  ],
  "bos_token": null,
  "clean_up_tokenization_spaces": false,
  "eos_token": "<|im_end|>",
  "errors": "replace",
  "extra_special_tokens": {},
  "model_max_length": 32768,
  "pad_token": "<|endoftext|>",
  "padding_side": "right",
  "split_special_tokens": false,
  "tokenizer_class": "Qwen2Tokenizer",
  "unk_token": null
}
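In `added_tokens_decoder` above, only the ChatML and vision markers carry `"special": true`; the tool-call, FIM, and `<think>` markers are ordinary added tokens, so they survive decoding with `skip_special_tokens=True`. A sketch splitting a miniature decoder table by that flag — the table is a hand-copied excerpt, not the full mapping:

```python
# Excerpt of added_tokens_decoder from tokenizer_config.json (illustrative).
added_tokens_decoder = {
    "151643": {"content": "<|endoftext|>", "special": True},
    "151645": {"content": "<|im_end|>", "special": True},
    "151657": {"content": "<tool_call>", "special": False},
    "151667": {"content": "<think>", "special": False},
}

special = {int(k): v["content"] for k, v in added_tokens_decoder.items() if v["special"]}
regular = {int(k): v["content"] for k, v in added_tokens_decoder.items() if not v["special"]}
```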
16
train_results.json
Normal file
@@ -0,0 +1,16 @@
{
  "achieved_tflops_per_gpu": 21.65962676868645,
  "achieved_tflops_per_gpu_theoretical": 406.5485475387801,
  "epoch": 5.0,
  "loss_nan_ranks": 0,
  "loss_rank_avg": 0.12205812335014343,
  "mfu_percent": 1.530715672698689,
  "mfu_percent_theoretical": 28.731346115814848,
  "total_flos": 9.560237722870743e+18,
  "train_loss": 0.10292447277993867,
  "train_runtime": 13793.2861,
  "train_samples_per_second": 11.477,
  "train_steps_per_second": 0.12,
  "valid_targets_mean": 15604.5,
  "valid_targets_min": 4811
}
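The summary's throughput figures are mutually consistent: runtime times samples/second recovers the total sample count, and `mfu_percent` is achieved TFLOPs over the peak the run assumed. A quick sanity check using the values above (the implied peak of roughly 1415 TFLOPs/GPU is derived here, not stated in the file):

```python
# Values copied from train_results.json.
train_runtime = 13793.2861            # seconds
samples_per_second = 11.477
achieved_tflops = 21.65962676868645   # per GPU
mfu_percent = 1.530715672698689

total_samples = train_runtime * samples_per_second
implied_peak_tflops = achieved_tflops / (mfu_percent / 100)
```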
373
trainer_log.jsonl
Normal file
@@ -0,0 +1,373 @@
{"current_steps": 5, "total_steps": 1650, "loss": 0.9617, "lr": 9.696969696969698e-07, "epoch": 0.015151515151515152, "percentage": 0.3, "elapsed_time": "0:02:40", "remaining_time": "14:40:18"}
|
||||
{"current_steps": 10, "total_steps": 1650, "loss": 0.9303, "lr": 2.181818181818182e-06, "epoch": 0.030303030303030304, "percentage": 0.61, "elapsed_time": "0:05:13", "remaining_time": "14:18:07"}
|
||||
{"current_steps": 15, "total_steps": 1650, "loss": 0.8651, "lr": 3.3939393939393946e-06, "epoch": 0.045454545454545456, "percentage": 0.91, "elapsed_time": "0:07:46", "remaining_time": "14:07:51"}
|
||||
{"current_steps": 20, "total_steps": 1650, "loss": 0.8108, "lr": 4.606060606060606e-06, "epoch": 0.06060606060606061, "percentage": 1.21, "elapsed_time": "0:10:20", "remaining_time": "14:02:37"}
|
||||
{"current_steps": 25, "total_steps": 1650, "loss": 0.7799, "lr": 5.8181818181818185e-06, "epoch": 0.07575757575757576, "percentage": 1.52, "elapsed_time": "0:12:53", "remaining_time": "13:58:01"}
|
||||
{"current_steps": 30, "total_steps": 1650, "loss": 0.7383, "lr": 7.030303030303031e-06, "epoch": 0.09090909090909091, "percentage": 1.82, "elapsed_time": "0:15:27", "remaining_time": "13:54:33"}
|
||||
{"current_steps": 35, "total_steps": 1650, "loss": 0.6976, "lr": 8.242424242424243e-06, "epoch": 0.10606060606060606, "percentage": 2.12, "elapsed_time": "0:18:00", "remaining_time": "13:50:46"}
|
||||
{"current_steps": 40, "total_steps": 1650, "loss": 0.6631, "lr": 9.454545454545456e-06, "epoch": 0.12121212121212122, "percentage": 2.42, "elapsed_time": "0:20:34", "remaining_time": "13:47:49"}
|
||||
{"current_steps": 45, "total_steps": 1650, "loss": 0.6355, "lr": 1.0666666666666667e-05, "epoch": 0.13636363636363635, "percentage": 2.73, "elapsed_time": "0:23:07", "remaining_time": "13:44:34"}
|
||||
{"current_steps": 50, "total_steps": 1650, "loss": 0.6043, "lr": 1.187878787878788e-05, "epoch": 0.15151515151515152, "percentage": 3.03, "elapsed_time": "0:25:40", "remaining_time": "13:41:21"}
|
||||
{"current_steps": 55, "total_steps": 1650, "loss": 0.5897, "lr": 1.3090909090909092e-05, "epoch": 0.16666666666666666, "percentage": 3.33, "elapsed_time": "0:28:13", "remaining_time": "13:38:34"}
|
||||
{"current_steps": 60, "total_steps": 1650, "loss": 0.5724, "lr": 1.4303030303030305e-05, "epoch": 0.18181818181818182, "percentage": 3.64, "elapsed_time": "0:30:46", "remaining_time": "13:35:42"}
|
||||
{"current_steps": 65, "total_steps": 1650, "loss": 0.5586, "lr": 1.5515151515151516e-05, "epoch": 0.19696969696969696, "percentage": 3.94, "elapsed_time": "0:33:21", "remaining_time": "13:33:29"}
|
||||
{"current_steps": 70, "total_steps": 1650, "loss": 0.5451, "lr": 1.672727272727273e-05, "epoch": 0.21212121212121213, "percentage": 4.24, "elapsed_time": "0:35:55", "remaining_time": "13:30:59"}
|
||||
{"current_steps": 75, "total_steps": 1650, "loss": 0.5295, "lr": 1.7939393939393942e-05, "epoch": 0.22727272727272727, "percentage": 4.55, "elapsed_time": "0:38:31", "remaining_time": "13:28:55"}
|
||||
{"current_steps": 80, "total_steps": 1650, "loss": 0.5232, "lr": 1.9151515151515152e-05, "epoch": 0.24242424242424243, "percentage": 4.85, "elapsed_time": "0:41:05", "remaining_time": "13:26:17"}
|
||||
{"current_steps": 85, "total_steps": 1650, "loss": 0.5194, "lr": 2.0363636363636365e-05, "epoch": 0.25757575757575757, "percentage": 5.15, "elapsed_time": "0:43:39", "remaining_time": "13:23:43"}
|
||||
{"current_steps": 90, "total_steps": 1650, "loss": 0.515, "lr": 2.1575757575757578e-05, "epoch": 0.2727272727272727, "percentage": 5.45, "elapsed_time": "0:46:11", "remaining_time": "13:20:47"}
|
||||
{"current_steps": 95, "total_steps": 1650, "loss": 0.5038, "lr": 2.278787878787879e-05, "epoch": 0.2878787878787879, "percentage": 5.76, "elapsed_time": "0:48:44", "remaining_time": "13:17:44"}
|
||||
{"current_steps": 100, "total_steps": 1650, "loss": 0.5042, "lr": 2.4e-05, "epoch": 0.30303030303030304, "percentage": 6.06, "elapsed_time": "0:51:15", "remaining_time": "13:14:33"}
|
||||
{"current_steps": 105, "total_steps": 1650, "loss": 0.5005, "lr": 2.5212121212121214e-05, "epoch": 0.3181818181818182, "percentage": 6.36, "elapsed_time": "0:59:12", "remaining_time": "14:31:05"}
|
||||
{"current_steps": 110, "total_steps": 1650, "loss": 0.4915, "lr": 2.6424242424242427e-05, "epoch": 0.3333333333333333, "percentage": 6.67, "elapsed_time": "1:01:44", "remaining_time": "14:24:19"}
|
||||
{"current_steps": 115, "total_steps": 1650, "loss": 0.4901, "lr": 2.763636363636364e-05, "epoch": 0.3484848484848485, "percentage": 6.97, "elapsed_time": "1:04:16", "remaining_time": "14:17:51"}
|
||||
{"current_steps": 120, "total_steps": 1650, "loss": 0.4865, "lr": 2.884848484848485e-05, "epoch": 0.36363636363636365, "percentage": 7.27, "elapsed_time": "1:06:47", "remaining_time": "14:11:33"}
|
||||
{"current_steps": 125, "total_steps": 1650, "loss": 0.4898, "lr": 3.0060606060606062e-05, "epoch": 0.3787878787878788, "percentage": 7.58, "elapsed_time": "1:09:19", "remaining_time": "14:05:41"}
|
||||
{"current_steps": 130, "total_steps": 1650, "loss": 0.4813, "lr": 3.127272727272728e-05, "epoch": 0.3939393939393939, "percentage": 7.88, "elapsed_time": "1:11:50", "remaining_time": "14:00:02"}
|
||||
{"current_steps": 135, "total_steps": 1650, "loss": 0.4827, "lr": 3.2484848484848485e-05, "epoch": 0.4090909090909091, "percentage": 8.18, "elapsed_time": "1:14:23", "remaining_time": "13:54:47"}
|
||||
{"current_steps": 140, "total_steps": 1650, "loss": 0.4809, "lr": 3.36969696969697e-05, "epoch": 0.42424242424242425, "percentage": 8.48, "elapsed_time": "1:16:54", "remaining_time": "13:49:35"}
|
||||
{"current_steps": 145, "total_steps": 1650, "loss": 0.4823, "lr": 3.490909090909091e-05, "epoch": 0.4393939393939394, "percentage": 8.79, "elapsed_time": "1:19:26", "remaining_time": "13:44:34"}
|
||||
{"current_steps": 150, "total_steps": 1650, "loss": 0.4726, "lr": 3.6121212121212124e-05, "epoch": 0.45454545454545453, "percentage": 9.09, "elapsed_time": "1:21:57", "remaining_time": "13:39:32"}
|
||||
{"current_steps": 155, "total_steps": 1650, "loss": 0.476, "lr": 3.733333333333334e-05, "epoch": 0.4696969696969697, "percentage": 9.39, "elapsed_time": "1:24:28", "remaining_time": "13:34:48"}
|
||||
{"current_steps": 160, "total_steps": 1650, "loss": 0.469, "lr": 3.854545454545455e-05, "epoch": 0.48484848484848486, "percentage": 9.7, "elapsed_time": "1:27:00", "remaining_time": "13:30:15"}
|
||||
{"current_steps": 165, "total_steps": 1650, "loss": 0.4689, "lr": 3.9757575757575757e-05, "epoch": 0.5, "percentage": 10.0, "elapsed_time": "1:29:32", "remaining_time": "13:25:51"}
|
||||
{"current_steps": 170, "total_steps": 1650, "loss": 0.4709, "lr": 3.999928391557286e-05, "epoch": 0.5151515151515151, "percentage": 10.3, "elapsed_time": "1:32:04", "remaining_time": "13:21:32"}
|
||||
{"current_steps": 175, "total_steps": 1650, "loss": 0.4693, "lr": 3.999637491047052e-05, "epoch": 0.5303030303030303, "percentage": 10.61, "elapsed_time": "1:34:35", "remaining_time": "13:17:19"}
|
||||
{"current_steps": 180, "total_steps": 1650, "loss": 0.4647, "lr": 3.999122855464813e-05, "epoch": 0.5454545454545454, "percentage": 10.91, "elapsed_time": "1:37:07", "remaining_time": "13:13:07"}
|
||||
{"current_steps": 185, "total_steps": 1650, "loss": 0.4688, "lr": 3.998384542392021e-05, "epoch": 0.5606060606060606, "percentage": 11.21, "elapsed_time": "1:39:39", "remaining_time": "13:09:07"}
|
||||
{"current_steps": 190, "total_steps": 1650, "loss": 0.4676, "lr": 3.9974226344369124e-05, "epoch": 0.5757575757575758, "percentage": 11.52, "elapsed_time": "1:42:11", "remaining_time": "13:05:11"}
|
||||
{"current_steps": 195, "total_steps": 1650, "loss": 0.4641, "lr": 3.996237239225268e-05, "epoch": 0.5909090909090909, "percentage": 11.82, "elapsed_time": "1:44:42", "remaining_time": "13:01:15"}
|
||||
{"current_steps": 200, "total_steps": 1650, "loss": 0.4606, "lr": 3.994828489388371e-05, "epoch": 0.6060606060606061, "percentage": 12.12, "elapsed_time": "1:47:13", "remaining_time": "12:57:20"}
{"current_steps": 205, "total_steps": 1650, "loss": 0.4659, "lr": 3.993196542548162e-05, "epoch": 0.6212121212121212, "percentage": 12.42, "elapsed_time": "1:49:45", "remaining_time": "12:53:37"}
{"current_steps": 210, "total_steps": 1650, "loss": 0.4614, "lr": 3.991341581299609e-05, "epoch": 0.6363636363636364, "percentage": 12.73, "elapsed_time": "1:52:18", "remaining_time": "12:50:09"}
{"current_steps": 215, "total_steps": 1650, "loss": 0.4546, "lr": 3.9892638131902765e-05, "epoch": 0.6515151515151515, "percentage": 13.03, "elapsed_time": "1:54:52", "remaining_time": "12:46:40"}
{"current_steps": 220, "total_steps": 1650, "loss": 0.4597, "lr": 3.9869634706971e-05, "epoch": 0.6666666666666666, "percentage": 13.33, "elapsed_time": "1:57:25", "remaining_time": "12:43:14"}
{"current_steps": 225, "total_steps": 1650, "loss": 0.4574, "lr": 3.984440811200379e-05, "epoch": 0.6818181818181818, "percentage": 13.64, "elapsed_time": "1:59:58", "remaining_time": "12:39:51"}
{"current_steps": 230, "total_steps": 1650, "loss": 0.4533, "lr": 3.981696116954973e-05, "epoch": 0.696969696969697, "percentage": 13.94, "elapsed_time": "2:02:30", "remaining_time": "12:36:23"}
{"current_steps": 235, "total_steps": 1650, "loss": 0.4534, "lr": 3.978729695058729e-05, "epoch": 0.7121212121212122, "percentage": 14.24, "elapsed_time": "2:05:03", "remaining_time": "12:32:59"}
{"current_steps": 240, "total_steps": 1650, "loss": 0.456, "lr": 3.9755418774181146e-05, "epoch": 0.7272727272727273, "percentage": 14.55, "elapsed_time": "2:07:36", "remaining_time": "12:29:43"}
{"current_steps": 245, "total_steps": 1650, "loss": 0.4509, "lr": 3.9721330207110835e-05, "epoch": 0.7424242424242424, "percentage": 14.85, "elapsed_time": "2:10:10", "remaining_time": "12:26:29"}
{"current_steps": 250, "total_steps": 1650, "loss": 0.4494, "lr": 3.9685035063471675e-05, "epoch": 0.7575757575757576, "percentage": 15.15, "elapsed_time": "2:12:41", "remaining_time": "12:23:06"}
{"current_steps": 255, "total_steps": 1650, "loss": 0.449, "lr": 3.964653740424804e-05, "epoch": 0.7727272727272727, "percentage": 15.45, "elapsed_time": "2:15:13", "remaining_time": "12:19:43"}
{"current_steps": 260, "total_steps": 1650, "loss": 0.4535, "lr": 3.960584153685895e-05, "epoch": 0.7878787878787878, "percentage": 15.76, "elapsed_time": "2:17:44", "remaining_time": "12:16:22"}
{"current_steps": 265, "total_steps": 1650, "loss": 0.4507, "lr": 3.9562952014676116e-05, "epoch": 0.803030303030303, "percentage": 16.06, "elapsed_time": "2:20:15", "remaining_time": "12:13:04"}
{"current_steps": 270, "total_steps": 1650, "loss": 0.4545, "lr": 3.9517873636514525e-05, "epoch": 0.8181818181818182, "percentage": 16.36, "elapsed_time": "2:22:47", "remaining_time": "12:09:48"}
{"current_steps": 275, "total_steps": 1650, "loss": 0.4421, "lr": 3.947061144609546e-05, "epoch": 0.8333333333333334, "percentage": 16.67, "elapsed_time": "2:25:19", "remaining_time": "12:06:35"}
{"current_steps": 280, "total_steps": 1650, "loss": 0.4457, "lr": 3.942117073148221e-05, "epoch": 0.8484848484848485, "percentage": 16.97, "elapsed_time": "2:27:49", "remaining_time": "12:03:17"}
{"current_steps": 285, "total_steps": 1650, "loss": 0.4494, "lr": 3.9369557024488345e-05, "epoch": 0.8636363636363636, "percentage": 17.27, "elapsed_time": "2:30:19", "remaining_time": "11:59:59"}
{"current_steps": 290, "total_steps": 1650, "loss": 0.4482, "lr": 3.931577610005883e-05, "epoch": 0.8787878787878788, "percentage": 17.58, "elapsed_time": "2:32:50", "remaining_time": "11:56:47"}
{"current_steps": 295, "total_steps": 1650, "loss": 0.4483, "lr": 3.925983397562385e-05, "epoch": 0.8939393939393939, "percentage": 17.88, "elapsed_time": "2:35:22", "remaining_time": "11:53:39"}
{"current_steps": 300, "total_steps": 1650, "loss": 0.4486, "lr": 3.920173691042554e-05, "epoch": 0.9090909090909091, "percentage": 18.18, "elapsed_time": "2:37:53", "remaining_time": "11:50:31"}
{"current_steps": 305, "total_steps": 1650, "loss": 0.4445, "lr": 3.914149140481766e-05, "epoch": 0.9242424242424242, "percentage": 18.48, "elapsed_time": "2:40:33", "remaining_time": "11:48:02"}
{"current_steps": 310, "total_steps": 1650, "loss": 0.4468, "lr": 3.9079104199538256e-05, "epoch": 0.9393939393939394, "percentage": 18.79, "elapsed_time": "2:43:03", "remaining_time": "11:44:52"}
{"current_steps": 315, "total_steps": 1650, "loss": 0.4455, "lr": 3.901458227495549e-05, "epoch": 0.9545454545454546, "percentage": 19.09, "elapsed_time": "2:45:35", "remaining_time": "11:41:45"}
{"current_steps": 320, "total_steps": 1650, "loss": 0.4453, "lr": 3.8947932850286585e-05, "epoch": 0.9696969696969697, "percentage": 19.39, "elapsed_time": "2:48:05", "remaining_time": "11:38:39"}
{"current_steps": 325, "total_steps": 1650, "loss": 0.44, "lr": 3.887916338279014e-05, "epoch": 0.9848484848484849, "percentage": 19.7, "elapsed_time": "2:50:36", "remaining_time": "11:35:34"}
{"current_steps": 330, "total_steps": 1650, "loss": 0.4371, "lr": 3.8808281566931675e-05, "epoch": 1.0, "percentage": 20.0, "elapsed_time": "2:53:07", "remaining_time": "11:32:28"}
{"current_steps": 335, "total_steps": 1650, "loss": 0.4349, "lr": 3.873529533352277e-05, "epoch": 1.0151515151515151, "percentage": 20.3, "elapsed_time": "2:55:37", "remaining_time": "11:29:25"}
{"current_steps": 340, "total_steps": 1650, "loss": 0.4208, "lr": 3.8660212848833705e-05, "epoch": 1.0303030303030303, "percentage": 20.61, "elapsed_time": "2:58:07", "remaining_time": "11:26:18"}
{"current_steps": 345, "total_steps": 1650, "loss": 0.4322, "lr": 3.858304251367972e-05, "epoch": 1.0454545454545454, "percentage": 20.91, "elapsed_time": "3:00:38", "remaining_time": "11:23:17"}
{"current_steps": 350, "total_steps": 1650, "loss": 0.4323, "lr": 3.850379296248107e-05, "epoch": 1.0606060606060606, "percentage": 21.21, "elapsed_time": "3:03:08", "remaining_time": "11:20:14"}
{"current_steps": 355, "total_steps": 1650, "loss": 0.4289, "lr": 3.8422473062297e-05, "epoch": 1.0757575757575757, "percentage": 21.52, "elapsed_time": "3:05:38", "remaining_time": "11:17:13"}
{"current_steps": 360, "total_steps": 1650, "loss": 0.4337, "lr": 3.8339091911833545e-05, "epoch": 1.0909090909090908, "percentage": 21.82, "elapsed_time": "3:08:09", "remaining_time": "11:14:13"}
{"current_steps": 365, "total_steps": 1650, "loss": 0.4375, "lr": 3.825365884042553e-05, "epoch": 1.106060606060606, "percentage": 22.12, "elapsed_time": "3:10:39", "remaining_time": "11:11:13"}
{"current_steps": 370, "total_steps": 1650, "loss": 0.4302, "lr": 3.8166183406992745e-05, "epoch": 1.121212121212121, "percentage": 22.42, "elapsed_time": "3:13:09", "remaining_time": "11:08:14"}
{"current_steps": 375, "total_steps": 1650, "loss": 0.4305, "lr": 3.807667539897041e-05, "epoch": 1.1363636363636362, "percentage": 22.73, "elapsed_time": "3:15:39", "remaining_time": "11:05:15"}
{"current_steps": 380, "total_steps": 1650, "loss": 0.4309, "lr": 3.798514483121408e-05, "epoch": 1.1515151515151516, "percentage": 23.03, "elapsed_time": "3:18:10", "remaining_time": "11:02:18"}
{"current_steps": 385, "total_steps": 1650, "loss": 0.4273, "lr": 3.789160194487908e-05, "epoch": 1.1666666666666667, "percentage": 23.33, "elapsed_time": "3:20:40", "remaining_time": "10:59:22"}
{"current_steps": 390, "total_steps": 1650, "loss": 0.4272, "lr": 3.7796057206274686e-05, "epoch": 1.1818181818181819, "percentage": 23.64, "elapsed_time": "3:23:10", "remaining_time": "10:56:25"}
{"current_steps": 395, "total_steps": 1650, "loss": 0.4341, "lr": 3.769852130569304e-05, "epoch": 1.196969696969697, "percentage": 23.94, "elapsed_time": "3:25:40", "remaining_time": "10:53:28"}
{"current_steps": 400, "total_steps": 1650, "loss": 0.4299, "lr": 3.7599005156213066e-05, "epoch": 1.2121212121212122, "percentage": 24.24, "elapsed_time": "3:28:11", "remaining_time": "10:50:35"}
{"current_steps": 405, "total_steps": 1650, "loss": 0.4297, "lr": 3.74975198924794e-05, "epoch": 1.2272727272727273, "percentage": 24.55, "elapsed_time": "3:30:41", "remaining_time": "10:47:41"}
{"current_steps": 410, "total_steps": 1650, "loss": 0.4277, "lr": 3.739407686945658e-05, "epoch": 1.2424242424242424, "percentage": 24.85, "elapsed_time": "3:33:11", "remaining_time": "10:44:45"}
{"current_steps": 415, "total_steps": 1650, "loss": 0.4296, "lr": 3.728868766115854e-05, "epoch": 1.2575757575757576, "percentage": 25.15, "elapsed_time": "3:35:41", "remaining_time": "10:41:51"}
{"current_steps": 420, "total_steps": 1650, "loss": 0.4278, "lr": 3.718136405935365e-05, "epoch": 1.2727272727272727, "percentage": 25.45, "elapsed_time": "3:38:10", "remaining_time": "10:38:57"}
{"current_steps": 425, "total_steps": 1650, "loss": 0.4276, "lr": 3.707211807224534e-05, "epoch": 1.2878787878787878, "percentage": 25.76, "elapsed_time": "3:40:39", "remaining_time": "10:36:02"}
{"current_steps": 430, "total_steps": 1650, "loss": 0.4309, "lr": 3.696096192312852e-05, "epoch": 1.303030303030303, "percentage": 26.06, "elapsed_time": "3:43:10", "remaining_time": "10:33:11"}
{"current_steps": 435, "total_steps": 1650, "loss": 0.4266, "lr": 3.684790804902199e-05, "epoch": 1.3181818181818181, "percentage": 26.36, "elapsed_time": "3:45:41", "remaining_time": "10:30:22"}
{"current_steps": 440, "total_steps": 1650, "loss": 0.4253, "lr": 3.673296909927682e-05, "epoch": 1.3333333333333333, "percentage": 26.67, "elapsed_time": "3:48:10", "remaining_time": "10:27:28"}
{"current_steps": 445, "total_steps": 1650, "loss": 0.4241, "lr": 3.661615793416109e-05, "epoch": 1.3484848484848486, "percentage": 26.97, "elapsed_time": "3:50:39", "remaining_time": "10:24:35"}
{"current_steps": 450, "total_steps": 1650, "loss": 0.429, "lr": 3.649748762342098e-05, "epoch": 1.3636363636363638, "percentage": 27.27, "elapsed_time": "3:53:10", "remaining_time": "10:21:46"}
{"current_steps": 455, "total_steps": 1650, "loss": 0.4246, "lr": 3.637697144481839e-05, "epoch": 1.378787878787879, "percentage": 27.58, "elapsed_time": "3:55:39", "remaining_time": "10:18:55"}
{"current_steps": 460, "total_steps": 1650, "loss": 0.4174, "lr": 3.625462288264536e-05, "epoch": 1.393939393939394, "percentage": 27.88, "elapsed_time": "3:58:09", "remaining_time": "10:16:05"}
{"current_steps": 465, "total_steps": 1650, "loss": 0.4223, "lr": 3.613045562621533e-05, "epoch": 1.4090909090909092, "percentage": 28.18, "elapsed_time": "4:00:38", "remaining_time": "10:13:15"}
{"current_steps": 470, "total_steps": 1650, "loss": 0.4242, "lr": 3.600448356833146e-05, "epoch": 1.4242424242424243, "percentage": 28.48, "elapsed_time": "4:03:08", "remaining_time": "10:10:26"}
{"current_steps": 475, "total_steps": 1650, "loss": 0.4253, "lr": 3.587672080373219e-05, "epoch": 1.4393939393939394, "percentage": 28.79, "elapsed_time": "4:05:38", "remaining_time": "10:07:38"}
{"current_steps": 480, "total_steps": 1650, "loss": 0.4245, "lr": 3.574718162751426e-05, "epoch": 1.4545454545454546, "percentage": 29.09, "elapsed_time": "4:08:10", "remaining_time": "10:04:56"}
{"current_steps": 485, "total_steps": 1650, "loss": 0.4248, "lr": 3.561588053353319e-05, "epoch": 1.4696969696969697, "percentage": 29.39, "elapsed_time": "4:10:39", "remaining_time": "10:02:06"}
{"current_steps": 490, "total_steps": 1650, "loss": 0.4208, "lr": 3.5482832212781655e-05, "epoch": 1.4848484848484849, "percentage": 29.7, "elapsed_time": "4:13:09", "remaining_time": "9:59:19"}
{"current_steps": 495, "total_steps": 1650, "loss": 0.4254, "lr": 3.53480515517457e-05, "epoch": 1.5, "percentage": 30.0, "elapsed_time": "4:15:39", "remaining_time": "9:56:33"}
{"current_steps": 500, "total_steps": 1650, "loss": 0.4262, "lr": 3.5211553630739166e-05, "epoch": 1.5151515151515151, "percentage": 30.3, "elapsed_time": "4:18:10", "remaining_time": "9:53:48"}
{"current_steps": 505, "total_steps": 1650, "loss": 0.4184, "lr": 3.5073353722216334e-05, "epoch": 1.5303030303030303, "percentage": 30.61, "elapsed_time": "4:20:40", "remaining_time": "9:51:01"}
{"current_steps": 510, "total_steps": 1650, "loss": 0.4205, "lr": 3.4933467289063156e-05, "epoch": 1.5454545454545454, "percentage": 30.91, "elapsed_time": "4:23:10", "remaining_time": "9:48:15"}
{"current_steps": 515, "total_steps": 1650, "loss": 0.4188, "lr": 3.4791909982867175e-05, "epoch": 1.5606060606060606, "percentage": 31.21, "elapsed_time": "4:25:40", "remaining_time": "9:45:30"}
{"current_steps": 520, "total_steps": 1650, "loss": 0.4238, "lr": 3.464869764216622e-05, "epoch": 1.5757575757575757, "percentage": 31.52, "elapsed_time": "4:28:09", "remaining_time": "9:42:44"}
{"current_steps": 525, "total_steps": 1650, "loss": 0.4249, "lr": 3.450384629067635e-05, "epoch": 1.5909090909090908, "percentage": 31.82, "elapsed_time": "4:30:38", "remaining_time": "9:39:57"}
{"current_steps": 530, "total_steps": 1650, "loss": 0.4188, "lr": 3.435737213549896e-05, "epoch": 1.606060606060606, "percentage": 32.12, "elapsed_time": "4:33:10", "remaining_time": "9:37:15"}
{"current_steps": 535, "total_steps": 1650, "loss": 0.4191, "lr": 3.420929156530738e-05, "epoch": 1.621212121212121, "percentage": 32.42, "elapsed_time": "4:35:40", "remaining_time": "9:34:31"}
{"current_steps": 540, "total_steps": 1650, "loss": 0.4188, "lr": 3.405962114851324e-05, "epoch": 1.6363636363636362, "percentage": 32.73, "elapsed_time": "4:38:10", "remaining_time": "9:31:48"}
{"current_steps": 545, "total_steps": 1650, "loss": 0.425, "lr": 3.390837763141261e-05, "epoch": 1.6515151515151514, "percentage": 33.03, "elapsed_time": "4:40:40", "remaining_time": "9:29:05"}
{"current_steps": 550, "total_steps": 1650, "loss": 0.4184, "lr": 3.3755577936312344e-05, "epoch": 1.6666666666666665, "percentage": 33.33, "elapsed_time": "4:43:11", "remaining_time": "9:26:23"}
{"current_steps": 555, "total_steps": 1650, "loss": 0.4202, "lr": 3.360123915963662e-05, "epoch": 1.6818181818181817, "percentage": 33.64, "elapsed_time": "4:45:40", "remaining_time": "9:23:38"}
{"current_steps": 560, "total_steps": 1650, "loss": 0.4188, "lr": 3.3445378570014125e-05, "epoch": 1.696969696969697, "percentage": 33.94, "elapsed_time": "4:48:10", "remaining_time": "9:20:55"}
{"current_steps": 565, "total_steps": 1650, "loss": 0.4231, "lr": 3.328801360634585e-05, "epoch": 1.7121212121212122, "percentage": 34.24, "elapsed_time": "4:50:41", "remaining_time": "9:18:13"}
{"current_steps": 570, "total_steps": 1650, "loss": 0.424, "lr": 3.312916187585392e-05, "epoch": 1.7272727272727273, "percentage": 34.55, "elapsed_time": "4:53:10", "remaining_time": "9:15:29"}
{"current_steps": 575, "total_steps": 1650, "loss": 0.4202, "lr": 3.296884115211157e-05, "epoch": 1.7424242424242424, "percentage": 34.85, "elapsed_time": "4:55:40", "remaining_time": "9:12:46"}
{"current_steps": 580, "total_steps": 1650, "loss": 0.4233, "lr": 3.280706937305445e-05, "epoch": 1.7575757575757576, "percentage": 35.15, "elapsed_time": "4:58:09", "remaining_time": "9:10:03"}
{"current_steps": 585, "total_steps": 1650, "loss": 0.4175, "lr": 3.2643864638973645e-05, "epoch": 1.7727272727272727, "percentage": 35.45, "elapsed_time": "5:00:39", "remaining_time": "9:07:21"}
{"current_steps": 590, "total_steps": 1650, "loss": 0.4133, "lr": 3.2479245210490434e-05, "epoch": 1.7878787878787878, "percentage": 35.76, "elapsed_time": "5:03:09", "remaining_time": "9:04:40"}
{"current_steps": 595, "total_steps": 1650, "loss": 0.4096, "lr": 3.2313229506513167e-05, "epoch": 1.803030303030303, "percentage": 36.06, "elapsed_time": "5:05:39", "remaining_time": "9:01:57"}
{"current_steps": 600, "total_steps": 1650, "loss": 0.4217, "lr": 3.2145836102176424e-05, "epoch": 1.8181818181818183, "percentage": 36.36, "elapsed_time": "5:08:08", "remaining_time": "8:59:15"}
{"current_steps": 605, "total_steps": 1650, "loss": 0.4177, "lr": 3.197708372676265e-05, "epoch": 1.8333333333333335, "percentage": 36.67, "elapsed_time": "5:10:46", "remaining_time": "8:56:47"}
{"current_steps": 610, "total_steps": 1650, "loss": 0.4236, "lr": 3.1806991261606604e-05, "epoch": 1.8484848484848486, "percentage": 36.97, "elapsed_time": "5:13:15", "remaining_time": "8:54:05"}
{"current_steps": 615, "total_steps": 1650, "loss": 0.4151, "lr": 3.163557773798276e-05, "epoch": 1.8636363636363638, "percentage": 37.27, "elapsed_time": "5:15:46", "remaining_time": "8:51:25"}
{"current_steps": 620, "total_steps": 1650, "loss": 0.4151, "lr": 3.146286233497593e-05, "epoch": 1.878787878787879, "percentage": 37.58, "elapsed_time": "5:18:17", "remaining_time": "8:48:45"}
{"current_steps": 625, "total_steps": 1650, "loss": 0.4176, "lr": 3.128886437733539e-05, "epoch": 1.893939393939394, "percentage": 37.88, "elapsed_time": "5:20:47", "remaining_time": "8:46:05"}
{"current_steps": 630, "total_steps": 1650, "loss": 0.4189, "lr": 3.111360333331263e-05, "epoch": 1.9090909090909092, "percentage": 38.18, "elapsed_time": "5:23:16", "remaining_time": "8:43:23"}
{"current_steps": 635, "total_steps": 1650, "loss": 0.4137, "lr": 3.093709881248312e-05, "epoch": 1.9242424242424243, "percentage": 38.48, "elapsed_time": "5:25:46", "remaining_time": "8:40:44"}
{"current_steps": 640, "total_steps": 1650, "loss": 0.4187, "lr": 3.075937056355225e-05, "epoch": 1.9393939393939394, "percentage": 38.79, "elapsed_time": "5:28:16", "remaining_time": "8:38:03"}
{"current_steps": 645, "total_steps": 1650, "loss": 0.419, "lr": 3.0580438472145665e-05, "epoch": 1.9545454545454546, "percentage": 39.09, "elapsed_time": "5:30:46", "remaining_time": "8:35:23"}
{"current_steps": 650, "total_steps": 1650, "loss": 0.4143, "lr": 3.0400322558584308e-05, "epoch": 1.9696969696969697, "percentage": 39.39, "elapsed_time": "5:33:15", "remaining_time": "8:32:42"}
{"current_steps": 655, "total_steps": 1650, "loss": 0.4134, "lr": 3.0219042975644415e-05, "epoch": 1.9848484848484849, "percentage": 39.7, "elapsed_time": "5:35:44", "remaining_time": "8:30:01"}
{"current_steps": 660, "total_steps": 1650, "loss": 0.4168, "lr": 3.0036620006302624e-05, "epoch": 2.0, "percentage": 40.0, "elapsed_time": "5:38:14", "remaining_time": "8:27:21"}
{"current_steps": 665, "total_steps": 1650, "loss": 0.4032, "lr": 2.9853074061466602e-05, "epoch": 2.015151515151515, "percentage": 40.3, "elapsed_time": "5:40:44", "remaining_time": "8:24:42"}
{"current_steps": 670, "total_steps": 1650, "loss": 0.4022, "lr": 2.9668425677691278e-05, "epoch": 2.0303030303030303, "percentage": 40.61, "elapsed_time": "5:43:14", "remaining_time": "8:22:03"}
{"current_steps": 675, "total_steps": 1650, "loss": 0.403, "lr": 2.948269551488108e-05, "epoch": 2.0454545454545454, "percentage": 40.91, "elapsed_time": "5:45:44", "remaining_time": "8:19:23"}
{"current_steps": 680, "total_steps": 1650, "loss": 0.4098, "lr": 2.929590435397832e-05, "epoch": 2.0606060606060606, "percentage": 41.21, "elapsed_time": "5:48:13", "remaining_time": "8:16:43"}
{"current_steps": 685, "total_steps": 1650, "loss": 0.399, "lr": 2.9108073094638066e-05, "epoch": 2.0757575757575757, "percentage": 41.52, "elapsed_time": "5:50:43", "remaining_time": "8:14:05"}
{"current_steps": 690, "total_steps": 1650, "loss": 0.4038, "lr": 2.8919222752889727e-05, "epoch": 2.090909090909091, "percentage": 41.82, "elapsed_time": "5:53:12", "remaining_time": "8:11:25"}
{"current_steps": 695, "total_steps": 1650, "loss": 0.4027, "lr": 2.8729374458785647e-05, "epoch": 2.106060606060606, "percentage": 42.12, "elapsed_time": "5:55:42", "remaining_time": "8:08:47"}
{"current_steps": 700, "total_steps": 1650, "loss": 0.4043, "lr": 2.8538549454036838e-05, "epoch": 2.121212121212121, "percentage": 42.42, "elapsed_time": "5:58:11", "remaining_time": "8:06:07"}
{"current_steps": 705, "total_steps": 1650, "loss": 0.3985, "lr": 2.834676908963636e-05, "epoch": 2.1363636363636362, "percentage": 42.73, "elapsed_time": "6:00:41", "remaining_time": "8:03:28"}
{"current_steps": 710, "total_steps": 1650, "loss": 0.4034, "lr": 2.815405482347037e-05, "epoch": 2.1515151515151514, "percentage": 43.03, "elapsed_time": "6:03:10", "remaining_time": "8:00:49"}
{"current_steps": 715, "total_steps": 1650, "loss": 0.408, "lr": 2.796042821791725e-05, "epoch": 2.1666666666666665, "percentage": 43.33, "elapsed_time": "6:05:40", "remaining_time": "7:58:11"}
{"current_steps": 720, "total_steps": 1650, "loss": 0.4032, "lr": 2.776591093743505e-05, "epoch": 2.1818181818181817, "percentage": 43.64, "elapsed_time": "6:08:09", "remaining_time": "7:55:31"}
{"current_steps": 725, "total_steps": 1650, "loss": 0.4031, "lr": 2.7570524746137485e-05, "epoch": 2.196969696969697, "percentage": 43.94, "elapsed_time": "6:10:38", "remaining_time": "7:52:53"}
{"current_steps": 730, "total_steps": 1650, "loss": 0.4014, "lr": 2.7374291505358818e-05, "epoch": 2.212121212121212, "percentage": 44.24, "elapsed_time": "6:13:07", "remaining_time": "7:50:14"}
{"current_steps": 735, "total_steps": 1650, "loss": 0.3995, "lr": 2.7177233171207817e-05, "epoch": 2.227272727272727, "percentage": 44.55, "elapsed_time": "6:15:36", "remaining_time": "7:47:35"}
{"current_steps": 740, "total_steps": 1650, "loss": 0.3992, "lr": 2.6979371792111147e-05, "epoch": 2.242424242424242, "percentage": 44.85, "elapsed_time": "6:18:05", "remaining_time": "7:44:57"}
{"current_steps": 745, "total_steps": 1650, "loss": 0.4036, "lr": 2.678072950634641e-05, "epoch": 2.257575757575758, "percentage": 45.15, "elapsed_time": "6:20:35", "remaining_time": "7:42:20"}
{"current_steps": 750, "total_steps": 1650, "loss": 0.3975, "lr": 2.6581328539565184e-05, "epoch": 2.2727272727272725, "percentage": 45.45, "elapsed_time": "6:23:05", "remaining_time": "7:39:42"}
{"current_steps": 755, "total_steps": 1650, "loss": 0.4014, "lr": 2.638119120230616e-05, "epoch": 2.287878787878788, "percentage": 45.76, "elapsed_time": "6:25:35", "remaining_time": "7:37:05"}
{"current_steps": 760, "total_steps": 1650, "loss": 0.4014, "lr": 2.618033988749895e-05, "epoch": 2.303030303030303, "percentage": 46.06, "elapsed_time": "6:28:05", "remaining_time": "7:34:28"}
{"current_steps": 765, "total_steps": 1650, "loss": 0.4036, "lr": 2.5978797067958542e-05, "epoch": 2.3181818181818183, "percentage": 46.36, "elapsed_time": "6:30:34", "remaining_time": "7:31:50"}
{"current_steps": 770, "total_steps": 1650, "loss": 0.407, "lr": 2.5776585293870877e-05, "epoch": 2.3333333333333335, "percentage": 46.67, "elapsed_time": "6:33:04", "remaining_time": "7:29:14"}
{"current_steps": 775, "total_steps": 1650, "loss": 0.4031, "lr": 2.557372719026976e-05, "epoch": 2.3484848484848486, "percentage": 46.97, "elapsed_time": "6:35:34", "remaining_time": "7:26:36"}
{"current_steps": 780, "total_steps": 1650, "loss": 0.4047, "lr": 2.537024545450539e-05, "epoch": 2.3636363636363638, "percentage": 47.27, "elapsed_time": "6:38:04", "remaining_time": "7:24:00"}
{"current_steps": 785, "total_steps": 1650, "loss": 0.4016, "lr": 2.5166162853704825e-05, "epoch": 2.378787878787879, "percentage": 47.58, "elapsed_time": "6:40:32", "remaining_time": "7:21:22"}
{"current_steps": 790, "total_steps": 1650, "loss": 0.3987, "lr": 2.496150222222458e-05, "epoch": 2.393939393939394, "percentage": 47.88, "elapsed_time": "6:43:02", "remaining_time": "7:18:45"}
{"current_steps": 795, "total_steps": 1650, "loss": 0.3967, "lr": 2.475628645909576e-05, "epoch": 2.409090909090909, "percentage": 48.18, "elapsed_time": "6:45:33", "remaining_time": "7:16:10"}
{"current_steps": 800, "total_steps": 1650, "loss": 0.4029, "lr": 2.4550538525461963e-05, "epoch": 2.4242424242424243, "percentage": 48.48, "elapsed_time": "6:48:03", "remaining_time": "7:13:33"}
{"current_steps": 805, "total_steps": 1650, "loss": 0.3994, "lr": 2.434428144201016e-05, "epoch": 2.4393939393939394, "percentage": 48.79, "elapsed_time": "6:50:32", "remaining_time": "7:10:56"}
{"current_steps": 810, "total_steps": 1650, "loss": 0.4021, "lr": 2.4137538286394976e-05, "epoch": 2.4545454545454546, "percentage": 49.09, "elapsed_time": "6:53:02", "remaining_time": "7:08:20"}
{"current_steps": 815, "total_steps": 1650, "loss": 0.4019, "lr": 2.3930332190656604e-05, "epoch": 2.4696969696969697, "percentage": 49.39, "elapsed_time": "6:55:32", "remaining_time": "7:05:44"}
{"current_steps": 820, "total_steps": 1650, "loss": 0.4023, "lr": 2.3722686338632602e-05, "epoch": 2.484848484848485, "percentage": 49.7, "elapsed_time": "6:58:02", "remaining_time": "7:03:08"}
{"current_steps": 825, "total_steps": 1650, "loss": 0.3971, "lr": 2.3514623963363886e-05, "epoch": 2.5, "percentage": 50.0, "elapsed_time": "7:00:32", "remaining_time": "7:00:32"}
{"current_steps": 830, "total_steps": 1650, "loss": 0.4024, "lr": 2.330616834449525e-05, "epoch": 2.515151515151515, "percentage": 50.3, "elapsed_time": "7:03:02", "remaining_time": "6:57:57"}
{"current_steps": 835, "total_steps": 1650, "loss": 0.397, "lr": 2.309734280567065e-05, "epoch": 2.5303030303030303, "percentage": 50.61, "elapsed_time": "7:05:32", "remaining_time": "6:55:21"}
{"current_steps": 840, "total_steps": 1650, "loss": 0.4034, "lr": 2.28881707119236e-05, "epoch": 2.5454545454545454, "percentage": 50.91, "elapsed_time": "7:08:03", "remaining_time": "6:52:45"}
{"current_steps": 845, "total_steps": 1650, "loss": 0.3978, "lr": 2.267867546706287e-05, "epoch": 2.5606060606060606, "percentage": 51.21, "elapsed_time": "7:10:32", "remaining_time": "6:50:09"}
{"current_steps": 850, "total_steps": 1650, "loss": 0.3996, "lr": 2.2468880511053896e-05, "epoch": 2.5757575757575757, "percentage": 51.52, "elapsed_time": "7:13:04", "remaining_time": "6:47:35"}
{"current_steps": 855, "total_steps": 1650, "loss": 0.4005, "lr": 2.2258809317396163e-05, "epoch": 2.590909090909091, "percentage": 51.82, "elapsed_time": "7:15:34", "remaining_time": "6:45:00"}
{"current_steps": 860, "total_steps": 1650, "loss": 0.3993, "lr": 2.2048485390496757e-05, "epoch": 2.606060606060606, "percentage": 52.12, "elapsed_time": "7:18:05", "remaining_time": "6:42:26"}
{"current_steps": 865, "total_steps": 1650, "loss": 0.4028, "lr": 2.1837932263040553e-05, "epoch": 2.621212121212121, "percentage": 52.42, "elapsed_time": "7:20:35", "remaining_time": "6:39:50"}
{"current_steps": 870, "total_steps": 1650, "loss": 0.3992, "lr": 2.1627173493357167e-05, "epoch": 2.6363636363636362, "percentage": 52.73, "elapsed_time": "7:23:06", "remaining_time": "6:37:16"}
{"current_steps": 875, "total_steps": 1650, "loss": 0.4002, "lr": 2.1416232662785084e-05, "epoch": 2.6515151515151514, "percentage": 53.03, "elapsed_time": "7:25:36", "remaining_time": "6:34:40"}
{"current_steps": 880, "total_steps": 1650, "loss": 0.3987, "lr": 2.1205133373033173e-05, "epoch": 2.6666666666666665, "percentage": 53.33, "elapsed_time": "7:28:05", "remaining_time": "6:32:04"}
{"current_steps": 885, "total_steps": 1650, "loss": 0.3989, "lr": 2.0993899243539953e-05, "epoch": 2.6818181818181817, "percentage": 53.64, "elapsed_time": "7:30:35", "remaining_time": "6:29:29"}
{"current_steps": 890, "total_steps": 1650, "loss": 0.3975, "lr": 2.0782553908830887e-05, "epoch": 2.6969696969696972, "percentage": 53.94, "elapsed_time": "7:33:06", "remaining_time": "6:26:55"}
{"current_steps": 895, "total_steps": 1650, "loss": 0.3995, "lr": 2.0571121015873924e-05, "epoch": 2.712121212121212, "percentage": 54.24, "elapsed_time": "7:35:36", "remaining_time": "6:24:20"}
{"current_steps": 900, "total_steps": 1650, "loss": 0.3978, "lr": 2.0359624221433728e-05, "epoch": 2.7272727272727275, "percentage": 54.55, "elapsed_time": "7:38:07", "remaining_time": "6:21:46"}
{"current_steps": 905, "total_steps": 1650, "loss": 0.3994, "lr": 2.014808718942476e-05, "epoch": 2.742424242424242, "percentage": 54.85, "elapsed_time": "7:40:45", "remaining_time": "6:19:18"}
{"current_steps": 910, "total_steps": 1650, "loss": 0.4003, "lr": 1.9936533588263557e-05, "epoch": 2.757575757575758, "percentage": 55.15, "elapsed_time": "7:43:15", "remaining_time": "6:16:42"}
{"current_steps": 915, "total_steps": 1650, "loss": 0.4004, "lr": 1.9724987088220565e-05, "epoch": 2.7727272727272725, "percentage": 55.45, "elapsed_time": "7:45:46", "remaining_time": "6:14:08"}
{"current_steps": 920, "total_steps": 1650, "loss": 0.3986, "lr": 1.951347135877169e-05, "epoch": 2.787878787878788, "percentage": 55.76, "elapsed_time": "7:48:16", "remaining_time": "6:11:34"}
{"current_steps": 925, "total_steps": 1650, "loss": 0.3983, "lr": 1.930201006594999e-05, "epoch": 2.8030303030303028, "percentage": 56.06, "elapsed_time": "7:50:46", "remaining_time": "6:08:59"}
{"current_steps": 930, "total_steps": 1650, "loss": 0.3976, "lr": 1.9090626869697714e-05, "epoch": 2.8181818181818183, "percentage": 56.36, "elapsed_time": "7:53:16", "remaining_time": "6:06:24"}
{"current_steps": 935, "total_steps": 1650, "loss": 0.395, "lr": 1.8879345421219063e-05, "epoch": 2.8333333333333335, "percentage": 56.67, "elapsed_time": "7:55:46", "remaining_time": "6:03:49"}
{"current_steps": 940, "total_steps": 1650, "loss": 0.3995, "lr": 1.8668189360333923e-05, "epoch": 2.8484848484848486, "percentage": 56.97, "elapsed_time": "7:58:15", "remaining_time": "6:01:13"}
{"current_steps": 945, "total_steps": 1650, "loss": 0.4025, "lr": 1.845718231283281e-05, "epoch": 2.8636363636363638, "percentage": 57.27, "elapsed_time": "8:00:45", "remaining_time": "5:58:39"}
{"current_steps": 950, "total_steps": 1650, "loss": 0.3966, "lr": 1.8246347887833457e-05, "epoch": 2.878787878787879, "percentage": 57.58, "elapsed_time": "8:03:14", "remaining_time": "5:56:04"}
{"current_steps": 955, "total_steps": 1650, "loss": 0.3966, "lr": 1.8035709675139258e-05, "epoch": 2.893939393939394, "percentage": 57.88, "elapsed_time": "8:05:43", "remaining_time": "5:53:29"}
{"current_steps": 960, "total_steps": 1650, "loss": 0.4008, "lr": 1.7825291242599837e-05, "epoch": 2.909090909090909, "percentage": 58.18, "elapsed_time": "8:08:13", "remaining_time": "5:50:54"}
{"current_steps": 965, "total_steps": 1650, "loss": 0.4013, "lr": 1.7615116133474084e-05, "epoch": 2.9242424242424243, "percentage": 58.48, "elapsed_time": "8:10:43", "remaining_time": "5:48:20"}
{"current_steps": 970, "total_steps": 1650, "loss": 0.397, "lr": 1.7405207863795966e-05, "epoch": 2.9393939393939394, "percentage": 58.79, "elapsed_time": "8:13:12", "remaining_time": "5:45:45"}
{"current_steps": 975, "total_steps": 1650, "loss": 0.3986, "lr": 1.719558991974339e-05, "epoch": 2.9545454545454546, "percentage": 59.09, "elapsed_time": "8:15:42", "remaining_time": "5:43:11"}
{"current_steps": 980, "total_steps": 1650, "loss": 0.3955, "lr": 1.698628575501034e-05, "epoch": 2.9696969696969697, "percentage": 59.39, "elapsed_time": "8:18:11", "remaining_time": "5:40:35"}
{"current_steps": 985, "total_steps": 1650, "loss": 0.4034, "lr": 1.6777318788182723e-05, "epoch": 2.984848484848485, "percentage": 59.7, "elapsed_time": "8:20:41", "remaining_time": "5:38:01"}
{"current_steps": 990, "total_steps": 1650, "loss": 0.392, "lr": 1.6568712400118102e-05, "epoch": 3.0, "percentage": 60.0, "elapsed_time": "8:23:11", "remaining_time": "5:35:27"}
{"current_steps": 995, "total_steps": 1650, "loss": 0.386, "lr": 1.636048993132969e-05, "epoch": 3.015151515151515, "percentage": 60.3, "elapsed_time": "8:25:42", "remaining_time": "5:32:53"}
{"current_steps": 1000, "total_steps": 1650, "loss": 0.3863, "lr": 1.615267467937479e-05, "epoch": 3.0303030303030303, "percentage": 60.61, "elapsed_time": "8:28:12", "remaining_time": "5:30:19"}
{"current_steps": 1005, "total_steps": 1650, "loss": 0.3856, "lr": 1.59452898962481e-05, "epoch": 3.0454545454545454, "percentage": 60.91, "elapsed_time": "8:30:41", "remaining_time": "5:27:45"}
{"current_steps": 1010, "total_steps": 1650, "loss": 0.3942, "lr": 1.573835878578013e-05, "epoch": 3.0606060606060606, "percentage": 61.21, "elapsed_time": "8:33:11", "remaining_time": "5:25:11"}
{"current_steps": 1015, "total_steps": 1650, "loss": 0.3869, "lr": 1.5531904501040917e-05, "epoch": 3.0757575757575757, "percentage": 61.52, "elapsed_time": "8:35:40", "remaining_time": "5:22:37"}
{"current_steps": 1020, "total_steps": 1650, "loss": 0.3839, "lr": 1.5325950141749522e-05, "epoch": 3.090909090909091, "percentage": 61.82, "elapsed_time": "8:38:10", "remaining_time": "5:20:03"}
{"current_steps": 1025, "total_steps": 1650, "loss": 0.3886, "lr": 1.5120518751689438e-05, "epoch": 3.106060606060606, "percentage": 62.12, "elapsed_time": "8:40:40", "remaining_time": "5:17:29"}
{"current_steps": 1030, "total_steps": 1650, "loss": 0.3815, "lr": 1.4915633316130267e-05, "epoch": 3.121212121212121, "percentage": 62.42, "elapsed_time": "8:43:10", "remaining_time": "5:14:55"}
{"current_steps": 1035, "total_steps": 1650, "loss": 0.3843, "lr": 1.4711316759255963e-05, "epoch": 3.1363636363636362, "percentage": 62.73, "elapsed_time": "8:45:40", "remaining_time": "5:12:21"}
{"current_steps": 1040, "total_steps": 1650, "loss": 0.3878, "lr": 1.450759194159987e-05, "epoch": 3.1515151515151514, "percentage": 63.03, "elapsed_time": "8:48:10", "remaining_time": "5:09:47"}
{"current_steps": 1045, "total_steps": 1650, "loss": 0.3874, "lr": 1.4304481657486955e-05, "epoch": 3.1666666666666665, "percentage": 63.33, "elapsed_time": "8:50:41", "remaining_time": "5:07:14"}
{"current_steps": 1050, "total_steps": 1650, "loss": 0.383, "lr": 1.4102008632483344e-05, "epoch": 3.1818181818181817, "percentage": 63.64, "elapsed_time": "8:53:10", "remaining_time": "5:04:40"}
{"current_steps": 1055, "total_steps": 1650, "loss": 0.3835, "lr": 1.3900195520853628e-05, "epoch": 3.196969696969697, "percentage": 63.94, "elapsed_time": "8:55:40", "remaining_time": "5:02:06"}
{"current_steps": 1060, "total_steps": 1650, "loss": 0.3847, "lr": 1.3699064903026149e-05, "epoch": 3.212121212121212, "percentage": 64.24, "elapsed_time": "8:58:11", "remaining_time": "4:59:33"}
{"current_steps": 1065, "total_steps": 1650, "loss": 0.3854, "lr": 1.34986392830665e-05, "epoch": 3.227272727272727, "percentage": 64.55, "elapsed_time": "9:00:40", "remaining_time": "4:56:59"}
{"current_steps": 1070, "total_steps": 1650, "loss": 0.3861, "lr": 1.3298941086159598e-05, "epoch": 3.242424242424242, "percentage": 64.85, "elapsed_time": "9:03:10", "remaining_time": "4:54:25"}
{"current_steps": 1075, "total_steps": 1650, "loss": 0.381, "lr": 1.3099992656100592e-05, "epoch": 3.257575757575758, "percentage": 65.15, "elapsed_time": "9:05:40", "remaining_time": "4:51:52"}
{"current_steps": 1080, "total_steps": 1650, "loss": 0.3837, "lr": 1.2901816252794848e-05, "epoch": 3.2727272727272725, "percentage": 65.45, "elapsed_time": "9:08:10", "remaining_time": "4:49:18"}
{"current_steps": 1085, "total_steps": 1650, "loss": 0.387, "lr": 1.2704434049767356e-05, "epoch": 3.287878787878788, "percentage": 65.76, "elapsed_time": "9:10:39", "remaining_time": "4:46:45"}
{"current_steps": 1090, "total_steps": 1650, "loss": 0.3853, "lr": 1.250786813168176e-05, "epoch": 3.303030303030303, "percentage": 66.06, "elapsed_time": "9:13:10", "remaining_time": "4:44:11"}
{"current_steps": 1095, "total_steps": 1650, "loss": 0.3835, "lr": 1.2312140491869369e-05, "epoch": 3.3181818181818183, "percentage": 66.36, "elapsed_time": "9:15:39", "remaining_time": "4:41:38"}
{"current_steps": 1100, "total_steps": 1650, "loss": 0.3855, "lr": 1.2117273029868362e-05, "epoch": 3.3333333333333335, "percentage": 66.67, "elapsed_time": "9:18:09", "remaining_time": "4:39:04"}
{"current_steps": 1105, "total_steps": 1650, "loss": 0.3845, "lr": 1.1923287548973508e-05, "epoch": 3.3484848484848486, "percentage": 66.97, "elapsed_time": "9:20:39", "remaining_time": "4:36:31"}
{"current_steps": 1110, "total_steps": 1650, "loss": 0.388, "lr": 1.1730205753796631e-05, "epoch": 3.3636363636363638, "percentage": 67.27, "elapsed_time": "9:23:09", "remaining_time": "4:33:58"}
{"current_steps": 1115, "total_steps": 1650, "loss": 0.3865, "lr": 1.1538049247838128e-05, "epoch": 3.378787878787879, "percentage": 67.58, "elapsed_time": "9:25:39", "remaining_time": "4:31:24"}
{"current_steps": 1120, "total_steps": 1650, "loss": 0.3855, "lr": 1.134683953106983e-05, "epoch": 3.393939393939394, "percentage": 67.88, "elapsed_time": "9:28:08", "remaining_time": "4:28:51"}
{"current_steps": 1125, "total_steps": 1650, "loss": 0.3824, "lr": 1.115659799752938e-05, "epoch": 3.409090909090909, "percentage": 68.18, "elapsed_time": "9:30:38", "remaining_time": "4:26:18"}
{"current_steps": 1130, "total_steps": 1650, "loss": 0.3817, "lr": 1.096734593292649e-05, "epoch": 3.4242424242424243, "percentage": 68.48, "elapsed_time": "9:33:07", "remaining_time": "4:23:44"}
{"current_steps": 1135, "total_steps": 1650, "loss": 0.388, "lr": 1.077910451226138e-05, "epoch": 3.4393939393939394, "percentage": 68.79, "elapsed_time": "9:35:37", "remaining_time": "4:21:11"}
{"current_steps": 1140, "total_steps": 1650, "loss": 0.3895, "lr": 1.0591894797455526e-05, "epoch": 3.4545454545454546, "percentage": 69.09, "elapsed_time": "9:38:07", "remaining_time": "4:18:38"}
{"current_steps": 1145, "total_steps": 1650, "loss": 0.3889, "lr": 1.0405737734995083e-05, "epoch": 3.4696969696969697, "percentage": 69.39, "elapsed_time": "9:40:37", "remaining_time": "4:16:05"}
{"current_steps": 1150, "total_steps": 1650, "loss": 0.3868, "lr": 1.0220654153587225e-05, "epoch": 3.484848484848485, "percentage": 69.7, "elapsed_time": "9:43:06", "remaining_time": "4:13:31"}
{"current_steps": 1155, "total_steps": 1650, "loss": 0.3827, "lr": 1.00366647618297e-05, "epoch": 3.5, "percentage": 70.0, "elapsed_time": "9:45:36", "remaining_time": "4:10:58"}
{"current_steps": 1160, "total_steps": 1650, "loss": 0.39, "lr": 9.853790145893742e-06, "epoch": 3.515151515151515, "percentage": 70.3, "elapsed_time": "9:48:06", "remaining_time": "4:08:25"}
{"current_steps": 1165, "total_steps": 1650, "loss": 0.3851, "lr": 9.672050767220765e-06, "epoch": 3.5303030303030303, "percentage": 70.61, "elapsed_time": "9:50:36", "remaining_time": "4:05:52"}
{"current_steps": 1170, "total_steps": 1650, "loss": 0.3851, "lr": 9.491466960232955e-06, "epoch": 3.5454545454545454, "percentage": 70.91, "elapsed_time": "9:53:06", "remaining_time": "4:03:19"}
{"current_steps": 1175, "total_steps": 1650, "loss": 0.3908, "lr": 9.312058930058114e-06, "epoch": 3.5606060606060606, "percentage": 71.21, "elapsed_time": "9:55:37", "remaining_time": "4:00:47"}
{"current_steps": 1180, "total_steps": 1650, "loss": 0.3863, "lr": 9.133846750268945e-06, "epoch": 3.5757575757575757, "percentage": 71.52, "elapsed_time": "9:58:07", "remaining_time": "3:58:14"}
{"current_steps": 1185, "total_steps": 1650, "loss": 0.3845, "lr": 8.956850360637046e-06, "epoch": 3.590909090909091, "percentage": 71.82, "elapsed_time": "10:00:37", "remaining_time": "3:55:41"}
{"current_steps": 1190, "total_steps": 1650, "loss": 0.3872, "lr": 8.78108956490194e-06, "epoch": 3.606060606060606, "percentage": 72.12, "elapsed_time": "10:03:07", "remaining_time": "3:53:08"}
{"current_steps": 1195, "total_steps": 1650, "loss": 0.384, "lr": 8.606584028555225e-06, "epoch": 3.621212121212121, "percentage": 72.42, "elapsed_time": "10:05:38", "remaining_time": "3:50:35"}
{"current_steps": 1200, "total_steps": 1650, "loss": 0.3913, "lr": 8.43335327664027e-06, "epoch": 3.6363636363636362, "percentage": 72.73, "elapsed_time": "10:08:07", "remaining_time": "3:48:02"}
{"current_steps": 1205, "total_steps": 1650, "loss": 0.3865, "lr": 8.261416691567601e-06, "epoch": 3.6515151515151514, "percentage": 73.03, "elapsed_time": "10:10:46", "remaining_time": "3:45:33"}
{"current_steps": 1210, "total_steps": 1650, "loss": 0.3861, "lr": 8.090793510946242e-06, "epoch": 3.6666666666666665, "percentage": 73.33, "elapsed_time": "10:13:16", "remaining_time": "3:43:00"}
{"current_steps": 1215, "total_steps": 1650, "loss": 0.3872, "lr": 7.921502825431258e-06, "epoch": 3.6818181818181817, "percentage": 73.64, "elapsed_time": "10:15:46", "remaining_time": "3:40:27"}
{"current_steps": 1220, "total_steps": 1650, "loss": 0.3882, "lr": 7.753563576587753e-06, "epoch": 3.6969696969696972, "percentage": 73.94, "elapsed_time": "10:18:16", "remaining_time": "3:37:55"}
{"current_steps": 1225, "total_steps": 1650, "loss": 0.3866, "lr": 7.5869945547715275e-06, "epoch": 3.712121212121212, "percentage": 74.24, "elapsed_time": "10:20:46", "remaining_time": "3:35:22"}
{"current_steps": 1230, "total_steps": 1650, "loss": 0.3885, "lr": 7.421814397026674e-06, "epoch": 3.7272727272727275, "percentage": 74.55, "elapsed_time": "10:23:16", "remaining_time": "3:32:49"}
{"current_steps": 1235, "total_steps": 1650, "loss": 0.3816, "lr": 7.258041585000317e-06, "epoch": 3.742424242424242, "percentage": 74.85, "elapsed_time": "10:25:46", "remaining_time": "3:30:16"}
{"current_steps": 1240, "total_steps": 1650, "loss": 0.3817, "lr": 7.095694442874743e-06, "epoch": 3.757575757575758, "percentage": 75.15, "elapsed_time": "10:28:15", "remaining_time": "3:27:43"}
{"current_steps": 1245, "total_steps": 1650, "loss": 0.3846, "lr": 6.934791135317147e-06, "epoch": 3.7727272727272725, "percentage": 75.45, "elapsed_time": "10:30:45", "remaining_time": "3:25:11"}
{"current_steps": 1250, "total_steps": 1650, "loss": 0.3772, "lr": 6.775349665447222e-06, "epoch": 3.787878787878788, "percentage": 75.76, "elapsed_time": "10:33:16", "remaining_time": "3:22:38"}
{"current_steps": 1255, "total_steps": 1650, "loss": 0.3791, "lr": 6.617387872822842e-06, "epoch": 3.8030303030303028, "percentage": 76.06, "elapsed_time": "10:35:45", "remaining_time": "3:20:06"}
{"current_steps": 1260, "total_steps": 1650, "loss": 0.383, "lr": 6.460923431444015e-06, "epoch": 3.8181818181818183, "percentage": 76.36, "elapsed_time": "10:38:15", "remaining_time": "3:17:33"}
{"current_steps": 1265, "total_steps": 1650, "loss": 0.3834, "lr": 6.305973847775406e-06, "epoch": 3.8333333333333335, "percentage": 76.67, "elapsed_time": "10:40:46", "remaining_time": "3:15:01"}
{"current_steps": 1270, "total_steps": 1650, "loss": 0.3856, "lr": 6.152556458787546e-06, "epoch": 3.8484848484848486, "percentage": 76.97, "elapsed_time": "10:43:17", "remaining_time": "3:12:28"}
{"current_steps": 1275, "total_steps": 1650, "loss": 0.3862, "lr": 6.000688430017048e-06, "epoch": 3.8636363636363638, "percentage": 77.27, "elapsed_time": "10:45:48", "remaining_time": "3:09:56"}
{"current_steps": 1280, "total_steps": 1650, "loss": 0.3845, "lr": 5.850386753645998e-06, "epoch": 3.878787878787879, "percentage": 77.58, "elapsed_time": "10:48:18", "remaining_time": "3:07:24"}
{"current_steps": 1285, "total_steps": 1650, "loss": 0.3879, "lr": 5.701668246600731e-06, "epoch": 3.893939393939394, "percentage": 77.88, "elapsed_time": "10:50:49", "remaining_time": "3:04:51"}
{"current_steps": 1290, "total_steps": 1650, "loss": 0.3859, "lr": 5.554549548670227e-06, "epoch": 3.909090909090909, "percentage": 78.18, "elapsed_time": "10:53:20", "remaining_time": "3:02:19"}
{"current_steps": 1295, "total_steps": 1650, "loss": 0.3858, "lr": 5.409047120644307e-06, "epoch": 3.9242424242424243, "percentage": 78.48, "elapsed_time": "10:55:50", "remaining_time": "2:59:47"}
{"current_steps": 1300, "total_steps": 1650, "loss": 0.3803, "lr": 5.265177242471899e-06, "epoch": 3.9393939393939394, "percentage": 78.79, "elapsed_time": "10:58:20", "remaining_time": "2:57:14"}
{"current_steps": 1305, "total_steps": 1650, "loss": 0.3812, "lr": 5.122956011439486e-06, "epoch": 3.9545454545454546, "percentage": 79.09, "elapsed_time": "11:00:51", "remaining_time": "2:54:42"}
{"current_steps": 1310, "total_steps": 1650, "loss": 0.3842, "lr": 4.982399340370017e-06, "epoch": 3.9696969696969697, "percentage": 79.39, "elapsed_time": "11:03:21", "remaining_time": "2:52:10"}
{"current_steps": 1315, "total_steps": 1650, "loss": 0.3871, "lr": 4.843522955842464e-06, "epoch": 3.984848484848485, "percentage": 79.7, "elapsed_time": "11:05:52", "remaining_time": "2:49:37"}
{"current_steps": 1320, "total_steps": 1650, "loss": 0.3866, "lr": 4.706342396432213e-06, "epoch": 4.0, "percentage": 80.0, "elapsed_time": "11:08:22", "remaining_time": "2:47:05"}
{"current_steps": 1325, "total_steps": 1650, "loss": 0.3794, "lr": 4.570873010972477e-06, "epoch": 4.015151515151516, "percentage": 80.3, "elapsed_time": "11:10:53", "remaining_time": "2:44:33"}
{"current_steps": 1330, "total_steps": 1650, "loss": 0.3773, "lr": 4.43712995683695e-06, "epoch": 4.03030303030303, "percentage": 80.61, "elapsed_time": "11:13:23", "remaining_time": "2:42:01"}
{"current_steps": 1335, "total_steps": 1650, "loss": 0.3811, "lr": 4.305128198243888e-06, "epoch": 4.045454545454546, "percentage": 80.91, "elapsed_time": "11:15:53", "remaining_time": "2:39:28"}
{"current_steps": 1340, "total_steps": 1650, "loss": 0.3757, "lr": 4.174882504581794e-06, "epoch": 4.0606060606060606, "percentage": 81.21, "elapsed_time": "11:18:24", "remaining_time": "2:36:56"}
{"current_steps": 1345, "total_steps": 1650, "loss": 0.372, "lr": 4.046407448756895e-06, "epoch": 4.075757575757576, "percentage": 81.52, "elapsed_time": "11:20:52", "remaining_time": "2:34:23"}
{"current_steps": 1350, "total_steps": 1650, "loss": 0.3766, "lr": 3.91971740556262e-06, "epoch": 4.090909090909091, "percentage": 81.82, "elapsed_time": "11:23:22", "remaining_time": "2:31:51"}
{"current_steps": 1355, "total_steps": 1650, "loss": 0.3784, "lr": 3.7948265500712313e-06, "epoch": 4.106060606060606, "percentage": 82.12, "elapsed_time": "11:25:51", "remaining_time": "2:29:19"}
{"current_steps": 1360, "total_steps": 1650, "loss": 0.3781, "lr": 3.6717488560478096e-06, "epoch": 4.121212121212121, "percentage": 82.42, "elapsed_time": "11:28:22", "remaining_time": "2:26:47"}
{"current_steps": 1365, "total_steps": 1650, "loss": 0.3749, "lr": 3.5504980943867538e-06, "epoch": 4.136363636363637, "percentage": 82.73, "elapsed_time": "11:30:52", "remaining_time": "2:24:14"}
{"current_steps": 1370, "total_steps": 1650, "loss": 0.3783, "lr": 3.4310878315710074e-06, "epoch": 4.151515151515151, "percentage": 83.03, "elapsed_time": "11:33:22", "remaining_time": "2:21:42"}
{"current_steps": 1375, "total_steps": 1650, "loss": 0.374, "lr": 3.3135314281540954e-06, "epoch": 4.166666666666667, "percentage": 83.33, "elapsed_time": "11:35:52", "remaining_time": "2:19:10"}
{"current_steps": 1380, "total_steps": 1650, "loss": 0.3749, "lr": 3.1978420372652776e-06, "epoch": 4.181818181818182, "percentage": 83.64, "elapsed_time": "11:38:23", "remaining_time": "2:16:38"}
{"current_steps": 1385, "total_steps": 1650, "loss": 0.3802, "lr": 3.084032603137852e-06, "epoch": 4.196969696969697, "percentage": 83.94, "elapsed_time": "11:40:52", "remaining_time": "2:14:06"}
{"current_steps": 1390, "total_steps": 1650, "loss": 0.3769, "lr": 2.9721158596608622e-06, "epoch": 4.212121212121212, "percentage": 84.24, "elapsed_time": "11:43:22", "remaining_time": "2:11:33"}
{"current_steps": 1395, "total_steps": 1650, "loss": 0.378, "lr": 2.8621043289543314e-06, "epoch": 4.2272727272727275, "percentage": 84.55, "elapsed_time": "11:45:52", "remaining_time": "2:09:01"}
{"current_steps": 1400, "total_steps": 1650, "loss": 0.3795, "lr": 2.754010319968181e-06, "epoch": 4.242424242424242, "percentage": 84.85, "elapsed_time": "11:48:22", "remaining_time": "2:06:29"}
{"current_steps": 1405, "total_steps": 1650, "loss": 0.3779, "lr": 2.647845927105015e-06, "epoch": 4.257575757575758, "percentage": 85.15, "elapsed_time": "11:50:53", "remaining_time": "2:03:57"}
{"current_steps": 1410, "total_steps": 1650, "loss": 0.3811, "lr": 2.543623028866915e-06, "epoch": 4.2727272727272725, "percentage": 85.45, "elapsed_time": "11:53:23", "remaining_time": "2:01:25"}
{"current_steps": 1205, "total_steps": 1650, "loss": 0.3753, "lr": 8.261416691567601e-06, "epoch": 3.6515151515151514, "percentage": 73.03, "elapsed_time": "0:02:41", "remaining_time": "0:00:59"}
{"current_steps": 1210, "total_steps": 1650, "loss": 0.3803, "lr": 8.090793510946242e-06, "epoch": 3.6666666666666665, "percentage": 73.33, "elapsed_time": "0:05:16", "remaining_time": "0:01:55"}
{"current_steps": 1215, "total_steps": 1650, "loss": 0.3787, "lr": 7.921502825431258e-06, "epoch": 3.6818181818181817, "percentage": 73.64, "elapsed_time": "0:07:51", "remaining_time": "0:02:48"}
{"current_steps": 1220, "total_steps": 1650, "loss": 0.3766, "lr": 7.753563576587753e-06, "epoch": 3.6969696969696972, "percentage": 73.94, "elapsed_time": "0:10:26", "remaining_time": "0:03:40"}
{"current_steps": 1225, "total_steps": 1650, "loss": 0.3769, "lr": 7.5869945547715275e-06, "epoch": 3.712121212121212, "percentage": 74.24, "elapsed_time": "0:13:02", "remaining_time": "0:04:31"}
{"current_steps": 1230, "total_steps": 1650, "loss": 0.3802, "lr": 7.421814397026674e-06, "epoch": 3.7272727272727275, "percentage": 74.55, "elapsed_time": "0:15:37", "remaining_time": "0:05:20"}
{"current_steps": 1235, "total_steps": 1650, "loss": 0.3769, "lr": 7.258041585000317e-06, "epoch": 3.742424242424242, "percentage": 74.85, "elapsed_time": "0:18:11", "remaining_time": "0:06:06"}
{"current_steps": 1240, "total_steps": 1650, "loss": 0.3756, "lr": 7.095694442874743e-06, "epoch": 3.757575757575758, "percentage": 75.15, "elapsed_time": "0:20:46", "remaining_time": "0:06:52"}
{"current_steps": 1245, "total_steps": 1650, "loss": 0.376, "lr": 6.934791135317147e-06, "epoch": 3.7727272727272725, "percentage": 75.45, "elapsed_time": "0:23:22", "remaining_time": "0:07:36"}
{"current_steps": 1250, "total_steps": 1650, "loss": 0.3802, "lr": 6.775349665447222e-06, "epoch": 3.787878787878788, "percentage": 75.76, "elapsed_time": "0:25:56", "remaining_time": "0:08:18"}
{"current_steps": 1255, "total_steps": 1650, "loss": 0.3785, "lr": 6.617387872822842e-06, "epoch": 3.8030303030303028, "percentage": 76.06, "elapsed_time": "0:28:30", "remaining_time": "0:08:58"}
{"current_steps": 1260, "total_steps": 1650, "loss": 0.3816, "lr": 6.460923431444015e-06, "epoch": 3.8181818181818183, "percentage": 76.36, "elapsed_time": "0:31:05", "remaining_time": "0:09:37"}
{"current_steps": 1265, "total_steps": 1650, "loss": 0.3717, "lr": 6.305973847775406e-06, "epoch": 3.8333333333333335, "percentage": 76.67, "elapsed_time": "0:33:41", "remaining_time": "0:10:15"}
{"current_steps": 1270, "total_steps": 1650, "loss": 0.3753, "lr": 6.152556458787546e-06, "epoch": 3.8484848484848486, "percentage": 76.97, "elapsed_time": "0:36:18", "remaining_time": "0:10:51"}
{"current_steps": 1275, "total_steps": 1650, "loss": 0.3785, "lr": 6.000688430017048e-06, "epoch": 3.8636363636363638, "percentage": 77.27, "elapsed_time": "0:38:53", "remaining_time": "0:11:26"}
{"current_steps": 1280, "total_steps": 1650, "loss": 0.3774, "lr": 5.850386753645998e-06, "epoch": 3.878787878787879, "percentage": 77.58, "elapsed_time": "0:41:28", "remaining_time": "0:11:59"}
{"current_steps": 1285, "total_steps": 1650, "loss": 0.3781, "lr": 5.701668246600731e-06, "epoch": 3.893939393939394, "percentage": 77.88, "elapsed_time": "0:44:01", "remaining_time": "0:12:30"}
{"current_steps": 1290, "total_steps": 1650, "loss": 0.3786, "lr": 5.554549548670227e-06, "epoch": 3.909090909090909, "percentage": 78.18, "elapsed_time": "0:46:36", "remaining_time": "0:13:00"}
{"current_steps": 1295, "total_steps": 1650, "loss": 0.3759, "lr": 5.409047120644307e-06, "epoch": 3.9242424242424243, "percentage": 78.48, "elapsed_time": "0:49:10", "remaining_time": "0:13:28"}
{"current_steps": 1300, "total_steps": 1650, "loss": 0.3782, "lr": 5.265177242471899e-06, "epoch": 3.9393939393939394, "percentage": 78.79, "elapsed_time": "0:51:44", "remaining_time": "0:13:55"}
{"current_steps": 1305, "total_steps": 1650, "loss": 0.3782, "lr": 5.122956011439486e-06, "epoch": 3.9545454545454546, "percentage": 79.09, "elapsed_time": "0:54:17", "remaining_time": "0:14:21"}
{"current_steps": 1310, "total_steps": 1650, "loss": 0.3788, "lr": 4.982399340370017e-06, "epoch": 3.9696969696969697, "percentage": 79.39, "elapsed_time": "0:56:50", "remaining_time": "0:14:45"}
{"current_steps": 1315, "total_steps": 1650, "loss": 0.3745, "lr": 4.843522955842464e-06, "epoch": 3.984848484848485, "percentage": 79.7, "elapsed_time": "0:59:24", "remaining_time": "0:15:08"}
{"current_steps": 1320, "total_steps": 1650, "loss": 0.3722, "lr": 4.706342396432213e-06, "epoch": 4.0, "percentage": 80.0, "elapsed_time": "1:01:58", "remaining_time": "0:15:29"}
{"current_steps": 1325, "total_steps": 1650, "loss": 0.3799, "lr": 4.570873010972477e-06, "epoch": 4.015151515151516, "percentage": 80.3, "elapsed_time": "1:04:32", "remaining_time": "0:15:49"}
{"current_steps": 1330, "total_steps": 1650, "loss": 0.3776, "lr": 4.43712995683695e-06, "epoch": 4.03030303030303, "percentage": 80.61, "elapsed_time": "1:07:05", "remaining_time": "0:16:08"}
{"current_steps": 1335, "total_steps": 1650, "loss": 0.3816, "lr": 4.305128198243888e-06, "epoch": 4.045454545454546, "percentage": 80.91, "elapsed_time": "1:09:38", "remaining_time": "0:16:25"}
{"current_steps": 1340, "total_steps": 1650, "loss": 0.3758, "lr": 4.174882504581794e-06, "epoch": 4.0606060606060606, "percentage": 81.21, "elapsed_time": "1:12:11", "remaining_time": "0:16:42"}
{"current_steps": 1345, "total_steps": 1650, "loss": 0.3721, "lr": 4.046407448756895e-06, "epoch": 4.075757575757576, "percentage": 81.52, "elapsed_time": "1:14:42", "remaining_time": "0:16:56"}
{"current_steps": 1350, "total_steps": 1650, "loss": 0.3765, "lr": 3.91971740556262e-06, "epoch": 4.090909090909091, "percentage": 81.82, "elapsed_time": "1:17:16", "remaining_time": "0:17:10"}
{"current_steps": 1355, "total_steps": 1650, "loss": 0.3787, "lr": 3.7948265500712313e-06, "epoch": 4.106060606060606, "percentage": 82.12, "elapsed_time": "1:19:50", "remaining_time": "0:17:22"}
{"current_steps": 1360, "total_steps": 1650, "loss": 0.3784, "lr": 3.6717488560478096e-06, "epoch": 4.121212121212121, "percentage": 82.42, "elapsed_time": "1:22:24", "remaining_time": "0:17:34"}
{"current_steps": 1365, "total_steps": 1650, "loss": 0.3751, "lr": 3.5504980943867538e-06, "epoch": 4.136363636363637, "percentage": 82.73, "elapsed_time": "1:24:57", "remaining_time": "0:17:44"}
{"current_steps": 1370, "total_steps": 1650, "loss": 0.3785, "lr": 3.4310878315710074e-06, "epoch": 4.151515151515151, "percentage": 83.03, "elapsed_time": "1:27:31", "remaining_time": "0:17:53"}
{"current_steps": 1375, "total_steps": 1650, "loss": 0.3743, "lr": 3.3135314281540954e-06, "epoch": 4.166666666666667, "percentage": 83.33, "elapsed_time": "1:30:03", "remaining_time": "0:18:00"}
{"current_steps": 1380, "total_steps": 1650, "loss": 0.375, "lr": 3.1978420372652776e-06, "epoch": 4.181818181818182, "percentage": 83.64, "elapsed_time": "1:32:36", "remaining_time": "0:18:07"}
{"current_steps": 1385, "total_steps": 1650, "loss": 0.3805, "lr": 3.084032603137852e-06, "epoch": 4.196969696969697, "percentage": 83.94, "elapsed_time": "1:35:09", "remaining_time": "0:18:12"}
{"current_steps": 1390, "total_steps": 1650, "loss": 0.3769, "lr": 2.9721158596608622e-06, "epoch": 4.212121212121212, "percentage": 84.24, "elapsed_time": "1:37:41", "remaining_time": "0:18:16"}
{"current_steps": 1395, "total_steps": 1650, "loss": 0.3784, "lr": 2.8621043289543314e-06, "epoch": 4.2272727272727275, "percentage": 84.55, "elapsed_time": "1:40:14", "remaining_time": "0:18:19"}
{"current_steps": 1400, "total_steps": 1650, "loss": 0.3794, "lr": 2.754010319968181e-06, "epoch": 4.242424242424242, "percentage": 84.85, "elapsed_time": "1:42:48", "remaining_time": "0:18:21"}
{"current_steps": 1405, "total_steps": 1650, "loss": 0.378, "lr": 2.647845927105015e-06, "epoch": 4.257575757575758, "percentage": 85.15, "elapsed_time": "1:45:21", "remaining_time": "0:18:22"}
{"current_steps": 1410, "total_steps": 1650, "loss": 0.3811, "lr": 2.543623028866915e-06, "epoch": 4.2727272727272725, "percentage": 85.45, "elapsed_time": "1:47:54", "remaining_time": "0:18:21"}
{"current_steps": 1415, "total_steps": 1650, "loss": 0.3816, "lr": 2.4413532865263533e-06, "epoch": 4.287878787878788, "percentage": 85.76, "elapsed_time": "1:50:26", "remaining_time": "0:18:20"}
{"current_steps": 1420, "total_steps": 1650, "loss": 0.3744, "lr": 2.3410481428214602e-06, "epoch": 4.303030303030303, "percentage": 86.06, "elapsed_time": "1:53:01", "remaining_time": "0:18:18"}
{"current_steps": 1425, "total_steps": 1650, "loss": 0.3774, "lr": 2.242718820675718e-06, "epoch": 4.318181818181818, "percentage": 86.36, "elapsed_time": "1:55:36", "remaining_time": "0:18:15"}
{"current_steps": 1430, "total_steps": 1650, "loss": 0.3766, "lr": 2.1463763219422495e-06, "epoch": 4.333333333333333, "percentage": 86.67, "elapsed_time": "1:58:10", "remaining_time": "0:18:10"}
{"current_steps": 1435, "total_steps": 1650, "loss": 0.3761, "lr": 2.0520314261728357e-06, "epoch": 4.348484848484849, "percentage": 86.97, "elapsed_time": "2:00:44", "remaining_time": "0:18:05"}
{"current_steps": 1440, "total_steps": 1650, "loss": 0.3738, "lr": 1.9596946894118306e-06, "epoch": 4.363636363636363, "percentage": 87.27, "elapsed_time": "2:03:15", "remaining_time": "0:17:58"}
{"current_steps": 1445, "total_steps": 1650, "loss": 0.3743, "lr": 1.8693764430150696e-06, "epoch": 4.378787878787879, "percentage": 87.58, "elapsed_time": "2:05:47", "remaining_time": "0:17:50"}
{"current_steps": 1450, "total_steps": 1650, "loss": 0.3752, "lr": 1.7810867924938978e-06, "epoch": 4.393939393939394, "percentage": 87.88, "elapsed_time": "2:08:21", "remaining_time": "0:17:42"}
{"current_steps": 1455, "total_steps": 1650, "loss": 0.3808, "lr": 1.6948356163845048e-06, "epoch": 4.409090909090909, "percentage": 88.18, "elapsed_time": "2:10:54", "remaining_time": "0:17:32"}
{"current_steps": 1460, "total_steps": 1650, "loss": 0.3819, "lr": 1.610632565142627e-06, "epoch": 4.424242424242424, "percentage": 88.48, "elapsed_time": "2:13:26", "remaining_time": "0:17:21"}
{"current_steps": 1465, "total_steps": 1650, "loss": 0.3773, "lr": 1.5284870600637813e-06, "epoch": 4.4393939393939394, "percentage": 88.79, "elapsed_time": "2:15:58", "remaining_time": "0:17:10"}
{"current_steps": 1470, "total_steps": 1650, "loss": 0.3777, "lr": 1.4484082922291376e-06, "epoch": 4.454545454545454, "percentage": 89.09, "elapsed_time": "2:18:30", "remaining_time": "0:16:57"}
{"current_steps": 1475, "total_steps": 1650, "loss": 0.3758, "lr": 1.3704052214771513e-06, "epoch": 4.46969696969697, "percentage": 89.39, "elapsed_time": "2:21:02", "remaining_time": "0:16:44"}
{"current_steps": 1480, "total_steps": 1650, "loss": 0.3732, "lr": 1.2944865754010682e-06, "epoch": 4.484848484848484, "percentage": 89.7, "elapsed_time": "2:23:35", "remaining_time": "0:16:29"}
{"current_steps": 1485, "total_steps": 1650, "loss": 0.3755, "lr": 1.2206608483724013e-06, "epoch": 4.5, "percentage": 90.0, "elapsed_time": "2:26:07", "remaining_time": "0:16:14"}
{"current_steps": 1490, "total_steps": 1650, "loss": 0.3746, "lr": 1.1489363005905241e-06, "epoch": 4.515151515151516, "percentage": 90.3, "elapsed_time": "2:28:39", "remaining_time": "0:15:57"}
{"current_steps": 1495, "total_steps": 1650, "loss": 0.3847, "lr": 1.0793209571584562e-06, "epoch": 4.53030303030303, "percentage": 90.61, "elapsed_time": "2:31:12", "remaining_time": "0:15:40"}
{"current_steps": 1500, "total_steps": 1650, "loss": 0.3761, "lr": 1.0118226071849424e-06, "epoch": 4.545454545454545, "percentage": 90.91, "elapsed_time": "2:33:44", "remaining_time": "0:15:22"}
{"current_steps": 1505, "total_steps": 1650, "loss": 0.3792, "lr": 9.464488029129581e-07, "epoch": 4.5606060606060606, "percentage": 91.21, "elapsed_time": "2:36:24", "remaining_time": "0:15:04"}
{"current_steps": 1510, "total_steps": 1650, "loss": 0.3725, "lr": 8.832068588746945e-07, "epoch": 4.575757575757576, "percentage": 91.52, "elapsed_time": "2:38:57", "remaining_time": "0:14:44"}
{"current_steps": 1515, "total_steps": 1650, "loss": 0.3746, "lr": 8.221038510731704e-07, "epoch": 4.590909090909091, "percentage": 91.82, "elapsed_time": "2:41:28", "remaining_time": "0:14:23"}
{"current_steps": 1520, "total_steps": 1650, "loss": 0.3749, "lr": 7.631466161904821e-07, "epoch": 4.606060606060606, "percentage": 92.12, "elapsed_time": "2:44:00", "remaining_time": "0:14:01"}
{"current_steps": 1525, "total_steps": 1650, "loss": 0.3768, "lr": 7.063417508228876e-07, "epoch": 4.621212121212121, "percentage": 92.42, "elapsed_time": "2:46:31", "remaining_time": "0:13:38"}
{"current_steps": 1530, "total_steps": 1650, "loss": 0.3734, "lr": 6.516956107427241e-07, "epoch": 4.636363636363637, "percentage": 92.73, "elapsed_time": "2:49:03", "remaining_time": "0:13:15"}
{"current_steps": 1535, "total_steps": 1650, "loss": 0.3839, "lr": 5.992143101872638e-07, "epoch": 4.651515151515151, "percentage": 93.03, "elapsed_time": "2:51:33", "remaining_time": "0:12:51"}
{"current_steps": 1540, "total_steps": 1650, "loss": 0.3804, "lr": 5.489037211746184e-07, "epoch": 4.666666666666667, "percentage": 93.33, "elapsed_time": "2:54:04", "remaining_time": "0:12:26"}
{"current_steps": 1545, "total_steps": 1650, "loss": 0.377, "lr": 5.007694728467228e-07, "epoch": 4.681818181818182, "percentage": 93.64, "elapsed_time": "2:56:37", "remaining_time": "0:12:00"}
{"current_steps": 1550, "total_steps": 1650, "loss": 0.3775, "lr": 4.548169508395028e-07, "epoch": 4.696969696969697, "percentage": 93.94, "elapsed_time": "2:59:07", "remaining_time": "0:11:33"}
{"current_steps": 1555, "total_steps": 1650, "loss": 0.3809, "lr": 4.1105129668029595e-07, "epoch": 4.712121212121212, "percentage": 94.24, "elapsed_time": "3:01:39", "remaining_time": "0:11:05"}
{"current_steps": 1560, "total_steps": 1650, "loss": 0.3781, "lr": 3.6947740721257066e-07, "epoch": 4.7272727272727275, "percentage": 94.55, "elapsed_time": "3:04:10", "remaining_time": "0:10:37"}
{"current_steps": 1565, "total_steps": 1650, "loss": 0.3787, "lr": 3.3009993404802486e-07, "epoch": 4.742424242424242, "percentage": 94.85, "elapsed_time": "3:06:42", "remaining_time": "0:10:08"}
{"current_steps": 1570, "total_steps": 1650, "loss": 0.376, "lr": 2.929232830461404e-07, "epoch": 4.757575757575758, "percentage": 95.15, "elapsed_time": "3:09:14", "remaining_time": "0:09:38"}
{"current_steps": 1575, "total_steps": 1650, "loss": 0.3793, "lr": 2.579516138212101e-07, "epoch": 4.7727272727272725, "percentage": 95.45, "elapsed_time": "3:11:45", "remaining_time": "0:09:07"}
{"current_steps": 1580, "total_steps": 1650, "loss": 0.3773, "lr": 2.2518883927692857e-07, "epoch": 4.787878787878788, "percentage": 95.76, "elapsed_time": "3:14:16", "remaining_time": "0:08:36"}
{"current_steps": 1585, "total_steps": 1650, "loss": 0.3766, "lr": 1.9463862516859277e-07, "epoch": 4.803030303030303, "percentage": 96.06, "elapsed_time": "3:16:47", "remaining_time": "0:08:04"}
{"current_steps": 1590, "total_steps": 1650, "loss": 0.3785, "lr": 1.6630438969294615e-07, "epoch": 4.818181818181818, "percentage": 96.36, "elapsed_time": "3:19:20", "remaining_time": "0:07:31"}
{"current_steps": 1595, "total_steps": 1650, "loss": 0.3781, "lr": 1.4018930310571553e-07, "epoch": 4.833333333333333, "percentage": 96.67, "elapsed_time": "3:21:50", "remaining_time": "0:06:57"}
{"current_steps": 1600, "total_steps": 1650, "loss": 0.3727, "lr": 1.1629628736690824e-07, "epoch": 4.848484848484849, "percentage": 96.97, "elapsed_time": "3:24:21", "remaining_time": "0:06:23"}
{"current_steps": 1605, "total_steps": 1650, "loss": 0.3751, "lr": 9.46280158138757e-08, "epoch": 4.863636363636363, "percentage": 97.27, "elapsed_time": "3:26:52", "remaining_time": "0:05:48"}
{"current_steps": 1610, "total_steps": 1650, "loss": 0.3834, "lr": 7.518691286220625e-08, "epoch": 4.878787878787879, "percentage": 97.58, "elapsed_time": "3:29:24", "remaining_time": "0:05:12"}
{"current_steps": 1615, "total_steps": 1650, "loss": 0.375, "lr": 5.797515373445084e-08, "epoch": 4.893939393939394, "percentage": 97.88, "elapsed_time": "3:31:57", "remaining_time": "0:04:35"}
{"current_steps": 1620, "total_steps": 1650, "loss": 0.374, "lr": 4.299466421675113e-08, "epoch": 4.909090909090909, "percentage": 98.18, "elapsed_time": "3:34:29", "remaining_time": "0:03:58"}
{"current_steps": 1625, "total_steps": 1650, "loss": 0.3778, "lr": 3.0247120443362976e-08, "epoch": 4.924242424242424, "percentage": 98.48, "elapsed_time": "3:37:00", "remaining_time": "0:03:20"}
{"current_steps": 1630, "total_steps": 1650, "loss": 0.3823, "lr": 1.973394870912193e-08, "epoch": 4.9393939393939394, "percentage": 98.79, "elapsed_time": "3:39:32", "remaining_time": "0:02:41"}
{"current_steps": 1635, "total_steps": 1650, "loss": 0.3814, "lr": 1.145632530985541e-08, "epoch": 4.954545454545455, "percentage": 99.09, "elapsed_time": "3:42:03", "remaining_time": "0:02:02"}
{"current_steps": 1640, "total_steps": 1650, "loss": 0.3775, "lr": 5.415176410765721e-09, "epoch": 4.96969696969697, "percentage": 99.39, "elapsed_time": "3:44:35", "remaining_time": "0:01:22"}
{"current_steps": 1645, "total_steps": 1650, "loss": 0.3779, "lr": 1.611177942812958e-09, "epoch": 4.984848484848484, "percentage": 99.7, "elapsed_time": "3:47:07", "remaining_time": "0:00:41"}
{"current_steps": 1650, "total_steps": 1650, "loss": 0.3752, "lr": 4.475552707772224e-11, "epoch": 5.0, "percentage": 100.0, "elapsed_time": "3:49:39", "remaining_time": "0:00:00"}
{"current_steps": 1650, "total_steps": 1650, "epoch": 5.0, "percentage": 100.0, "elapsed_time": "3:49:47", "remaining_time": "0:00:00"}
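The log entries above are one JSON object per line, in the JSONL format that LLaMA-Factory-style trainers write (the filename `trainer_log.jsonl` and the exact field set are assumptions based on the fields shown; only the keys visible above are relied on). A minimal sketch for pulling the loss curve out of such a log, skipping any diff-viewer residue mixed into the text:

```python
import json

def parse_trainer_log(lines):
    """Parse JSONL trainer-log lines into dicts, skipping non-JSON residue."""
    records = []
    for line in lines:
        line = line.strip()
        if not line.startswith("{"):
            continue  # skip diff-viewer separators such as "|" or "||||"
        records.append(json.loads(line))
    return records

# Two sample entries copied verbatim from the log above, with a residue line between them.
sample = [
    '{"current_steps": 1645, "total_steps": 1650, "loss": 0.3779, "lr": 1.611177942812958e-09, "epoch": 4.984848484848484, "percentage": 99.7, "elapsed_time": "3:47:07", "remaining_time": "0:00:41"}',
    '|',
    '{"current_steps": 1650, "total_steps": 1650, "loss": 0.3752, "lr": 4.475552707772224e-11, "epoch": 5.0, "percentage": 100.0, "elapsed_time": "3:49:39", "remaining_time": "0:00:00"}',
]
records = parse_trainer_log(sample)
losses = [r["loss"] for r in records if "loss" in r]
print(losses)  # [0.3779, 0.3752]
```

The `"loss" in r` guard matters because the final summary entry (step 1650) in logs like this one omits the `loss` key.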
3677
trainer_state.json
Normal file
File diff suppressed because it is too large
Load Diff
3
training_args.bin
Normal file
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:13a22c5c55cc5abd38e0d4c4321ec638e6bb30618595fa52b7ed9ce00c376987
size 8657
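The three lines committed for `training_args.bin` are a Git LFS pointer, not the binary itself: the `.gitattributes` rules route `*.bin` through the LFS filter, so the repository stores only the spec version, the SHA-256 object id, and the byte size. A minimal sketch of reading those fields back out of a pointer file (the key/value-per-line layout follows the Git LFS pointer spec; the variable names are illustrative):

```python
def parse_lfs_pointer(text):
    """Split a Git LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")  # each line is "<key> <value>"
        fields[key] = value
    return fields

# Pointer content copied from the training_args.bin diff above.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:13a22c5c55cc5abd38e0d4c4321ec638e6bb30618595fa52b7ed9ce00c376987
size 8657"""

info = parse_lfs_pointer(pointer)
print(info["size"])  # 8657
```

Here `size` is the byte count of the real object, and `oid` names the blob the LFS server resolves on checkout.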
BIN
training_loss.png
Normal file
Binary file not shown.
After: Size 37 KiB
1
vocab.json
Normal file
File diff suppressed because one or more lines are too long