---
license: mit
language:
- en
library_name: transformers
pipeline_tag: text-generation
tags:
- qwen3
- ssd
- self-distillation
- rlve
---

# Qwen3-1.7B SSD (RLVE Eval20, N=20) — global step 500

Weights merged from the VERL FSDP SFT checkpoint `global_step_500` (500 optimizer steps on a 1-epoch schedule).

## Training data

Parquet SFT corpus (16k rows, `messages` column): `CL-From-Nothing/RLVE-Eval20-Qwen3-1.7B-SSD-N20-SFT-Train`.
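As a rough sketch of what a `messages`-column row looks like and how it can be flattened into a single training string: the row below is hypothetical, and the ChatML-style template is an assumption about Qwen-family formatting — in practice, `tokenizer.apply_chat_template` from the model's own tokenizer should be used instead of a hand-rolled function.

```python
# Hypothetical row from the SFT parquet corpus: a list of chat turns
# stored under the "messages" column.
row = {
    "messages": [
        {"role": "user", "content": "What is 2 + 2?"},
        {"role": "assistant", "content": "2 + 2 = 4."},
    ]
}

def to_chatml(messages):
    """Flatten a messages list into a ChatML-style string (assumed template)."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    return "\n".join(parts)

text = to_chatml(row["messages"])
print(text)
```

With the real tokenizer, the equivalent call would be `tok.apply_chat_template(row["messages"], tokenize=False)`.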

## Load

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "CL-From-Nothing/Qwen3-1-7B-SSD-RLVE-Eval20-N20-global-step-500"
tok = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="bfloat16",
    device_map="auto",
    trust_remote_code=True,
)
```

Note: recent `transformers` releases support Qwen3 natively; `trust_remote_code=True` is kept here only for compatibility with older versions.

## Description

Model synced from source: `CL-From-Nothing/Qwen3-1-7B-SSD-RLVE-Eval20-N20-global-step-500`