
---
library_name: transformers
tags:
- language-model
license: odc-by
datasets:
- HuggingFaceFW/fineweb-edu
language:
- en
---

# Model Card for AICrossSim/clm-60m

A 60M-parameter language model trained on 22 × 60M tokens from the FineWeb-Edu dataset.

## Model Details

aixsim-60M is a transformer-based language model with approximately 60 million parameters (excluding embedding-layer parameters). It uses RMSNorm for normalization and is trained on the FineWeb-Edu dataset.

## Training Details

The experiment setup and training logs are available in the corresponding wandb run.

## Usage

```python
import transformers

model_name = "AICrossSim/clm-60m"
model = transformers.AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = transformers.AutoTokenizer.from_pretrained(model_name)
```
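The snippet above only loads the weights and tokenizer. A minimal greedy-decoding sketch building on it might look like the following (the prompt text and generation length here are arbitrary illustration choices, not part of the model card):

```python
import transformers

model_name = "AICrossSim/clm-60m"
model = transformers.AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = transformers.AutoTokenizer.from_pretrained(model_name)

# Tokenize a prompt and generate a short greedy continuation.
inputs = tokenizer("The FineWeb-Edu dataset is", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=20, do_sample=False)
text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(text)
```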

## lm-evaluation-harness

| Tasks    | Version | Filter | n-shot | Metric          |    Value | Stderr |
|----------|--------:|--------|-------:|-----------------|---------:|--------|
| wikitext |       2 | none   |      0 | bits_per_byte   |   1.6693 | ± N/A  |
|          |         | none   |      0 | byte_perplexity |   3.1806 | ± N/A  |
|          |         | none   |      0 | word_perplexity | 486.5306 | ± N/A  |
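As a sanity check on the table, `bits_per_byte` and `byte_perplexity` report the same quantity on different scales: `byte_perplexity = 2 ** bits_per_byte`. A quick check (the small residual comes from rounding in the reported values):

```python
bits_per_byte = 1.6693

# Convert bits-per-byte to byte-level perplexity.
byte_perplexity = 2 ** bits_per_byte
print(byte_perplexity)  # close to the reported 3.1806
```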