Initialize project; model provided by the ModelHub XC community

Model: iamshnoo/combined_no_america_with_metadata_1b_step8k
Source: Original Platform
This commit is contained in:
ModelHub XC
2026-04-21 20:56:45 +08:00
commit 7c31035978
11 changed files with 2234 additions and 0 deletions

README.md (new file, 79 lines)

@@ -0,0 +1,79 @@
---
pipeline_tag: text-generation
library_name: transformers
tags:
- text-generation
- metadata-localization
- leave-one-out
- 1b
- with-metadata
- pretraining
- intermediate-checkpoint
---
# combined_no_america_with_metadata_1b_step8k
## Summary
This repository contains the 1B-parameter leave-one-out model with America held out, exported from the 8k-step checkpoint for the metadata localization project. It was trained from scratch on the project corpus, using the Llama 3.2 tokenizer and vocabulary.
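The card is tagged `library_name: transformers`, so a standard `AutoModelForCausalLM` load should apply. A minimal sketch (the function name and generation settings are illustrative, not from the release; the call itself needs network access to the Hub):

```python
REPO_ID = "iamshnoo/combined_no_america_with_metadata_1b_step8k"

def load_and_generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Load the checkpoint from the Hub and generate a continuation.

    Imports are kept inside the function so the sketch can be read
    without transformers installed; calling it downloads the weights.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
    model = AutoModelForCausalLM.from_pretrained(REPO_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    # Greedy decoding; sampling parameters are left at library defaults.
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```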
## Variant Metadata
- Stage: `pretrain`
- Family: `leave_one_out`
- Size: `1b`
- Metadata condition: `with_metadata`
- Checkpoint export: `8k`
- Base model lineage: `Trained from scratch; tokenizer/vocabulary from meta-llama/Llama-3.2-1B`
## Weights & Biases Provenance
- Run name: `20/12/2025_09:26:32_combined_no_america_with_metadata_1b`
- Internal run URL: `https://wandb.ai/iamshnoo/nanotron/runs/kane1qyw`
- Note: the Weights & Biases workspace is private; public readers should use the summarized metrics and configuration below.
- State: `finished`
- Runtime: `114h 52m 10s`
## Run Summary
- `KPI/train_lm_loss`: `1.9947`
- `KPI/train_perplexity`: `7.3496`
- `KPI/val_loss`: `2.0776`
- `KPI/val_perplexity`: `7.9855`
- `KPI/consumed_tokens/train`: `41,943,040,000`
- `_step`: `10,000`
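The reported perplexities are consistent with the losses, since perplexity is `exp(loss)`; a quick sanity check:

```python
import math

train_loss = 1.9947  # KPI/train_lm_loss
val_loss = 2.0776    # KPI/val_loss

# exp(loss) reproduces the reported perplexities to within rounding.
train_ppl = math.exp(train_loss)  # ≈ 7.35, vs. reported 7.3496
val_ppl = math.exp(val_loss)      # ≈ 7.99, vs. reported 7.9855
print(round(train_ppl, 2), round(val_ppl, 2))
```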
## Training Configuration
- `train_steps`: `10,000`
- `sequence_length`: `2,048`
- `micro_batch_size`: `8`
- `batch_accumulation_per_replica`: `64`
- `learning_rate`: `0.003`
- `min_decay_lr`: `0.0003`
- `checkpoint_interval`: `1,000`
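The per-step token budget follows from this configuration: `micro_batch_size × batch_accumulation_per_replica × sequence_length` tokens per replica per optimizer step. Comparing that against the consumed-token counter in the run summary suggests the run used multiple data-parallel replicas; the replica count below is an inference from the arithmetic, not a value stated in the config:

```python
micro_batch_size = 8
batch_accumulation = 64
sequence_length = 2048
train_steps = 10_000
consumed_tokens = 41_943_040_000  # KPI/consumed_tokens/train

# Tokens processed per optimizer step on a single replica:
# 8 * 64 * 2048 = 1,048,576.
tokens_per_step_per_replica = micro_batch_size * batch_accumulation * sequence_length

# Implied data-parallel replica count (inferred, not in the config).
implied_replicas = consumed_tokens // (tokens_per_step_per_replica * train_steps)
print(tokens_per_step_per_replica, implied_replicas)  # → 1048576 4
```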
## Training Curves
Static plots below were exported from the private Weights & Biases run and embedded here for public access.
### Train Loss
![Train Loss](assets/train_loss.png)
### Validation Perplexity
![Validation Perplexity](assets/val_perplexity.png)
### Throughput
![Throughput](assets/tokens_per_sec.png)
## Project Context
This model is part of the metadata localization release. Related checkpoints and variants are grouped in the public Hugging Face collection [Metadata Conditioned LLMs](https://huggingface.co/collections/iamshnoo/metadata-conditioned-llms).
- Training data source: [News on the Web (NOW) Corpus](https://www.english-corpora.org/now/)
- Project repository: [https://github.com/iamshnoo/metadata_localization](https://github.com/iamshnoo/metadata_localization)
- Paper: [https://arxiv.org/abs/2601.15236](https://arxiv.org/abs/2601.15236)
Last synced: `2026-04-02 14:45:40 UTC`