Model: FlameF0X/LFM2.5-1.2B-Distilled-Claude-GGUF
2026-04-22 06:59:57 +08:00


---
license: apache-2.0
language:
- en
base_model: FlameF0X/LFM2.5-1.2B-Distilled-Claude
tags:
- lfm2
- liquid
- reasoning
- cot
- distillate
datasets:
- TeichAI/Claude-Opus-4.6-Reasoning-500x
- TeichAI/Claude-Sonnet-4.6-Reasoning-1100x
- TeichAI/claude-4.5-opus-high-reasoning-250x
- TeichAI/claude-sonnet-4.5-high-reasoning-250x
- TeichAI/gemini-3-pro-preview-high-reasoning-250x
- TeichAI/gemini-3-pro-preview-high-reasoning-1000x
pipeline_tag: text-generation
---

# LFM2.5-1.2B-Distilled-Claude (Liquid Claude)

LFM2.5-1.2B-Distilled-Claude (Liquid Claude) is a distillation of Claude into LFM2.5-1.2B-Thinking via LoRA.

## Sample chat

(Screenshot of a sample chat. The response took about a minute to reason; the hardware was an Intel i3-6006U with 12 GB RAM running the F16 quantization.)
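A chat like the one above can be reproduced locally with llama.cpp. This is a minimal sketch, not from the card itself; the exact `.gguf` filename is an assumption based on the repository name, so check the actual file listing before downloading.

```shell
# Hypothetical sketch: fetch the F16 GGUF and chat with it via llama.cpp.
# The .gguf filename below is an assumption; verify it against the repo's files.
huggingface-cli download FlameF0X/LFM2.5-1.2B-Distilled-Claude-GGUF \
  LFM2.5-1.2B-Distilled-Claude-F16.gguf --local-dir .

# -cnv starts an interactive conversation using the model's built-in chat template.
llama-cli -m LFM2.5-1.2B-Distilled-Claude-F16.gguf -cnv --temp 0.7
```

On CPU-only hardware like the i3-6006U mentioned above, a smaller quantization (e.g. Q4_K_M, if published) would reason noticeably faster than F16.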

## Benchmark

Benchmark results are in progress.