---
library_name: transformers
model_name: distillspec-qwen600m
tags:
- generated_from_trainer
- trl
- gkd
licence: license
---

# Model Card for distillspec-qwen600m

This model is a fine-tuned version of an unspecified base model (the generated card records it as `None`). It has been trained using [TRL](https://github.com/huggingface/trl).

## Quick start

```python
from transformers import pipeline

question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"

# Load the model into a chat-style text-generation pipeline on GPU.
generator = pipeline("text-generation", model="rishabhrj11/distillspec-qwen600m", device="cuda")

# Pass the question as a chat message; return only the newly generated tokens.
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```

## Training procedure

Visualize in Weights & Biases

This model was trained with GKD, a method introduced in [On-Policy Distillation of Language Models: Learning from Self-Generated Mistakes](https://openreview.net/forum?id=3zKtaqxLhW).
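
In TRL, GKD is exposed through the `GKDTrainer` and `GKDConfig` classes. Below is a minimal sketch of what such a run might look like; the student and teacher checkpoints and the dataset are placeholders (this card does not record which were actually used), and `lmbda` and `beta` are shown at their TRL defaults.

```python
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import GKDConfig, GKDTrainer

# Hypothetical checkpoints: the card does not say which student/teacher were used.
student_id = "Qwen/Qwen2.5-0.5B-Instruct"  # placeholder small student
teacher_id = "Qwen/Qwen2.5-7B-Instruct"    # placeholder larger teacher

tokenizer = AutoTokenizer.from_pretrained(student_id)
model = AutoModelForCausalLM.from_pretrained(student_id)
teacher_model = AutoModelForCausalLM.from_pretrained(teacher_id)

# Any conversational dataset with a "messages" column works; this one is a
# placeholder taken from the TRL examples.
train_dataset = load_dataset("trl-lib/chatbot_arena_completions", split="train")

training_args = GKDConfig(
    output_dir="distillspec-qwen600m",
    lmbda=0.5,          # fraction of batches using on-policy (student-generated) completions
    beta=0.5,           # interpolation coefficient of the generalized Jensen-Shannon divergence
    max_new_tokens=128, # length of student-generated completions during on-policy steps
)

trainer = GKDTrainer(
    model=model,
    teacher_model=teacher_model,
    args=training_args,
    processing_class=tokenizer,
    train_dataset=train_dataset,
)
trainer.train()
```

On-policy sampling (`lmbda > 0`) is what distinguishes GKD from plain supervised distillation: the student learns from the teacher's feedback on its own generations rather than only from fixed ground-truth sequences.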

### Framework versions

- TRL: 0.25.1
- Transformers: 4.57.3
- Pytorch: 2.9.1
- Datasets: 4.4.1
- Tokenizers: 0.22.1

## Citations

Cite GKD as:

```bibtex
@inproceedings{agarwal2024on-policy,
    title        = {{On-Policy Distillation of Language Models: Learning from Self-Generated Mistakes}},
    author       = {Rishabh Agarwal and Nino Vieillard and Yongchao Zhou and Piotr Stanczyk and Sabela Ramos Garea and Matthieu Geist and Olivier Bachem},
    year         = 2024,
    booktitle    = {The Twelfth International Conference on Learning Representations, {ICLR} 2024, Vienna, Austria, May 7-11, 2024},
    publisher    = {OpenReview.net},
    url          = {https://openreview.net/forum?id=3zKtaqxLhW}
}
```

Cite TRL as:

```bibtex
@misc{vonwerra2022trl,
    title        = {{TRL: Transformer Reinforcement Learning}},
    author       = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallou{\'e}dec},
    year         = 2020,
    journal      = {GitHub repository},
    publisher    = {GitHub},
    howpublished = {\url{https://github.com/huggingface/trl}}
}
```