---
pipeline_tag: text-generation
library_name: transformers
tags:
- text-generation
- metadata-localization
- leave-one-out
- 1b
- without-metadata
- pretraining
---

combined_no_africa_without_metadata_1b

Summary

This repo contains the leave-one-out 1B model trained with Africa data held out (`combined_no_africa`), at the final 10k-step checkpoint of the metadata localization project. It was trained from scratch on the project corpus, using the Llama 3.2 tokenizer and vocabulary.
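A minimal inference sketch, assuming the checkpoint loads through the standard transformers causal-LM API (the repo id below is taken from the sync source noted at the end of this card):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id, from the "Model synced from source" note at the end of this card.
repo_id = "iamshnoo/combined_no_africa_without_metadata_1b"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

inputs = tokenizer("The history of the printing press", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```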

Variant Metadata

  • Stage: pretrain
  • Family: leave_one_out
  • Size: 1b
  • Metadata condition: without_metadata
  • Base model lineage: Trained from scratch; tokenizer/vocabulary from meta-llama/Llama-3.2-1B

Weights & Biases Provenance

  • Run name: 20/12/2025_15:50:52_combined_no_africa_without_metadata_1b
  • Internal run URL: https://wandb.ai/iamshnoo/nanotron/runs/ag9mg77u
  • Note: the Weights & Biases workspace is private; public readers should use the summarized metrics and configuration below.
  • State: finished
  • Runtime: 116h 16m 36s

Run Summary

  • KPI/train_lm_loss: 2.1644
  • KPI/train_perplexity: 8.7096
  • KPI/val_loss: 2.1374
  • KPI/val_perplexity: 8.4772
  • KPI/consumed_tokens/train: 41,943,040,000
  • _step: 10,000
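As expected for language-model training, each logged perplexity is the exponential of the corresponding loss, which can be verified directly:

```python
import math

# Cross-check the logged metrics above: perplexity = exp(loss).
print(math.exp(2.1644))  # ≈ 8.71 (train; logged 8.7096)
print(math.exp(2.1374))  # ≈ 8.48 (validation; logged 8.4772)
```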

Training Configuration

  • train_steps: 10,000
  • sequence_length: 2,048
  • micro_batch_size: 8
  • batch_accumulation_per_replica: 64
  • learning_rate: 0.003
  • min_decay_lr: 0.0003
  • checkpoint_interval: 1,000
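The consumed-token total is consistent with these settings; a quick sanity check (the data-parallel degree of 4 is inferred from the arithmetic, not read from the run config):

```python
# Tokens per optimizer step implied by the run summary above.
consumed_tokens = 41_943_040_000
train_steps = 10_000
tokens_per_step = consumed_tokens // train_steps  # 4,194,304

# Tokens contributed per step by one data-parallel replica:
# sequence_length * micro_batch_size * batch_accumulation_per_replica.
per_replica = 2048 * 8 * 64  # 1,048,576

print(tokens_per_step // per_replica)  # 4 -> implied data-parallel replicas (inferred)
```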

Training Curves

Static plots below were exported from the private Weights & Biases run and embedded here for public access.

Train Loss

[static plot: training loss curve]

Validation Perplexity

[static plot: validation perplexity curve]

Throughput

[static plot: training throughput curve]

Project Context

This model is part of the metadata localization release. Related checkpoints and variants are grouped in the public Hugging Face collection Metadata Conditioned LLMs.

Last synced: 2026-04-02 14:46:30 UTC

Description
Model synced from source: iamshnoo/combined_no_africa_without_metadata_1b