Model: iamshnoo/combined_only_continent_with_metadata_1b
| pipeline_tag | library_name | tags |
|---|---|---|
| text-generation | transformers | |
combined_only_continent_with_metadata_1b
Summary
This repository contains the final 10k-step checkpoint of the 1B continent model from the metadata localization project. It was trained from scratch on the project corpus, using the Llama 3.2 tokenizer and vocabulary.
Variant Metadata
- Stage: pretrain
- Family: metadata_ablation
- Size: 1b
- Metadata condition: with_metadata
- Base model lineage: Trained from scratch; tokenizer/vocabulary from meta-llama/Llama-3.2-1B
Weights & Biases Provenance
- Run name: 23/03/2026_08:58:03_combined_only_continent_with_metadata_1b
- Internal run URL: https://wandb.ai/iamshnoo/nanotron/runs/6baqnhow
- Note: the Weights & Biases workspace is private; public readers should use the summarized metrics and configuration below.
- State: finished
- Runtime: 57h 18m 26s
Run Summary
| Metric | Value |
|---|---|
| KPI/train_lm_loss | 2.1429 |
| KPI/train_perplexity | 8.5243 |
| KPI/val_loss | 2.1753 |
| KPI/val_perplexity | 8.8048 |
| KPI/consumed_tokens/train | 41,943,040,000 |
| _step | 10,000 |
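The reported perplexities are consistent with exponentiating the corresponding losses, as a quick sanity check shows (values are copied from the run summary above, not recomputed from the run itself):

```python
import math

# Loss values copied from the run summary above.
train_lm_loss = 2.1429
val_loss = 2.1753

# Perplexity is exp(mean token-level cross-entropy loss).
train_ppl = math.exp(train_lm_loss)  # ~8.524, matching the reported 8.5243
val_ppl = math.exp(val_loss)         # ~8.805, matching the reported 8.8048
print(f"{train_ppl:.4f} {val_ppl:.4f}")
```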
Training Configuration
| Parameter | Value |
|---|---|
| train_steps | 10,000 |
| sequence_length | 2,048 |
| micro_batch_size | 8 |
| batch_accumulation_per_replica | 64 |
| learning_rate | 0.003 |
| min_decay_lr | 0.0003 |
| checkpoint_interval | 1,000 |
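The consumed-token count in the run summary follows from these settings. The data-parallel degree is not stated on this card, but dividing consumed tokens by steps × tokens-per-replica-step suggests 4 replicas (an inference from the numbers, not a documented value):

```python
# Values from the training configuration above.
seq_len = 2_048
micro_batch_size = 8
grad_accum = 64
train_steps = 10_000

# Tokens processed per optimizer step by one data-parallel replica.
tokens_per_replica_step = seq_len * micro_batch_size * grad_accum  # 1,048,576

# KPI/consumed_tokens/train from the run summary.
consumed_tokens = 41_943_040_000
dp_degree = consumed_tokens // (tokens_per_replica_step * train_steps)
print(dp_degree)  # 4 (inferred data-parallel degree, not stated on the card)
```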
Training Curves
Static plots below were exported from the private Weights & Biases run and embedded here for public access.
Exported plots: Train Loss, Validation Perplexity, Throughput.
Project Context
This model is part of the metadata localization release. Related checkpoints and variants are grouped in the public Hugging Face collection Metadata Conditioned LLMs.
- Training data source: News on the Web (NOW) Corpus
- Project repository: https://github.com/iamshnoo/metadata_localization
- Paper: https://arxiv.org/abs/2601.15236
Last synced: 2026-04-02 14:42:45 UTC