---
pipeline_tag: text-generation
library_name: transformers
tags:
- text-generation
- metadata-localization
- metadata-ablation
- 1b
- with-metadata
- pretraining
---

# combined_only_url_continent_with_metadata_1b

## Summary

This repository contains the `url_continent` 1B model at the final 10k-step checkpoint for the metadata localization project. It was trained from scratch on the project corpus, using the Llama 3.2 tokenizer and vocabulary.

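For quick use, the checkpoint should load with the standard `transformers` auto classes. A minimal sketch, assuming the repository id is `iamshnoo/combined_only_url_continent_with_metadata_1b` (adjust the path if it differs):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repository id; adjust if the actual Hugging Face path differs.
repo_id = "iamshnoo/combined_only_url_continent_with_metadata_1b"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# The model was pretrained with a 2,048-token sequence length.
inputs = tokenizer("The news from Lagos this morning:", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
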
## Variant Metadata

- Stage: `pretrain`
- Family: `metadata_ablation`
- Size: `1b`
- Metadata condition: `with_metadata`
- Base model lineage: `Trained from scratch; tokenizer/vocabulary from meta-llama/Llama-3.2-1B`

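This card does not spell out the conditioning format used for the `with_metadata` condition. Purely as a hypothetical illustration, a metadata prefix could be prepended to each training document roughly as below; the prefix syntax here is invented for illustration, and the project repository defines the actual scheme.

```python
# Hypothetical illustration only: the real metadata format is defined in the
# project repository, not on this card.
def add_metadata_prefix(text: str, url: str, continent: str) -> str:
    """Prepend URL/continent metadata to a training document (illustrative)."""
    return f"URL: {url} | Continent: {continent}\n{text}"

print(add_metadata_prefix(
    "Lawmakers debated the new budget on Tuesday...",
    url="example-news.com",
    continent="Europe",
))
```
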
## Weights & Biases Provenance

- Run name: `14/12/2025_07:13:07_combined_only_url_continent_with_metadata_1b`
- Internal run URL: `https://wandb.ai/iamshnoo/nanotron/runs/f265qjmo`
- Note: the Weights & Biases workspace is private; public readers should use the summarized metrics and configuration below.
- State: `finished`
- Runtime: `114h 41m 12s`

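For collaborators with access to the private workspace, the same run metadata can be pulled with the public `wandb` client API, using the entity/project/run id from the URL above:

```python
import wandb

# Requires credentials with access to the private iamshnoo/nanotron project.
api = wandb.Api()
run = api.run("iamshnoo/nanotron/f265qjmo")  # entity/project/run_id

print(run.state)                         # "finished"
print(run.summary["KPI/train_lm_loss"])  # 2.0676
```
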
## Run Summary

- `KPI/train_lm_loss`: `2.0676`
- `KPI/train_perplexity`: `7.9061`
- `KPI/val_loss`: `2.1122`
- `KPI/val_perplexity`: `8.2662`
- `KPI/consumed_tokens/train`: `41,943,040,000`
- `_step`: `10,000`

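Assuming the usual convention that perplexity is `exp(loss)`, the reported loss/perplexity pairs are mutually consistent up to rounding of the logged values:

```python
import math

# Perplexity = exp(cross-entropy loss); assumed convention for these KPIs.
print(math.exp(2.0676))  # ~7.906, matches KPI/train_perplexity up to rounding
print(math.exp(2.1122))  # ~8.266, matches KPI/val_perplexity up to rounding
```
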
## Training Configuration

- `train_steps`: `10,000`
- `sequence_length`: `2,048`
- `micro_batch_size`: `8`
- `batch_accumulation_per_replica`: `64`
- `learning_rate`: `0.003`
- `min_decay_lr`: `0.0003`
- `checkpoint_interval`: `1,000`

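These settings line up with the reported token count if the run used 4 data-parallel replicas; that replica count is inferred from the numbers, not stated on this card. A quick check under that assumption:

```python
# Assumes 4 data-parallel replicas (inferred, not stated in the config) and
# the usual accounting: global batch = micro batch x accumulation x replicas.
micro_batch_size = 8
batch_accumulation_per_replica = 64
dp_replicas = 4  # assumption
sequence_length = 2048
train_steps = 10_000

global_batch = micro_batch_size * batch_accumulation_per_replica * dp_replicas
tokens_per_step = global_batch * sequence_length  # 4,194,304
print(f"{tokens_per_step * train_steps:,}")       # 41,943,040,000 tokens
```
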
## Training Curves

The static plots below were exported from the private Weights & Biases run and are embedded here for public access.

### Train Loss

![Train Loss](plots/kpi_train_lm_loss.png)

### Validation Perplexity

![Validation Perplexity](plots/kpi_val_perplexity.png)

### Throughput

![Throughput](plots/throughput_tokens_per_sec.png)

## Project Context

This model is part of the metadata localization release. Related checkpoints and variants are grouped in the public Hugging Face collection [Metadata Conditioned LLMs](https://huggingface.co/collections/iamshnoo/metadata-conditioned-llms).

- Training data source: [News on the Web (NOW) Corpus](https://www.english-corpora.org/now/)
- Project repository: [https://github.com/iamshnoo/metadata_localization](https://github.com/iamshnoo/metadata_localization)
- Paper: [https://arxiv.org/abs/2601.15236](https://arxiv.org/abs/2601.15236)

Last synced: `2026-04-02 14:43:58 UTC`