Initialize project; model provided by the ModelHub XC community
Model: FlameF0X/LFM2.5-1.2B-Distilled-Claude-GGUF Source: Original Platform
README.md (new file, 36 lines)
---
license: apache-2.0
language:
- en
base_model: FlameF0X/LFM2.5-1.2B-Distilled-Claude
tags:
- lfm2
- liquid
- reasoning
- cot
- distillate
datasets:
- TeichAI/Claude-Opus-4.6-Reasoning-500x
- TeichAI/Claude-Sonnet-4.6-Reasoning-1100x
- TeichAI/claude-4.5-opus-high-reasoning-250x
- TeichAI/claude-sonnet-4.5-high-reasoning-250x
- TeichAI/gemini-3-pro-preview-high-reasoning-250x
- TeichAI/gemini-3-pro-preview-high-reasoning-1000x
pipeline_tag: text-generation
---
<div align="center">
<img src="https://cdn-uploads.huggingface.co/production/uploads/6615494716917dfdc645c44e/IKWKAsS9DFnGoFCmYNQYs.png" alt="Liquid Claude" style="width: 100%; max-width: 100%; height: auto; display: inline-block; margin-bottom: 0.5em; margin-top: 0.5em; border-radius: 20px;"/>
</div>

<br>

# LFM2.5-1.2B-Distilled-Claude (Liquid Claude)

LFM2.5-1.2B-Distilled-Claude (Liquid Claude) is a distillation of Claude into LFM2.5-1.2B-Thinking via LoRA.

## Sample chat



(Ignore the roughly one-minute reasoning time; this was run with the f16 quantization on an i3-6006U with 12 GB of RAM.)
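
To try the GGUF locally, one option is llama.cpp's CLI. A minimal sketch, assuming `llama.cpp` and `huggingface-cli` are installed; the exact GGUF filename below is an assumption, so check the repo's Files tab for the real name:

```shell
# Download the f16 GGUF from this repo
# (filename is a placeholder; verify it in the repo's Files tab)
huggingface-cli download FlameF0X/LFM2.5-1.2B-Distilled-Claude-GGUF \
  LFM2.5-1.2B-Distilled-Claude-f16.gguf --local-dir .

# Start an interactive chat with llama.cpp's CLI
llama-cli -m LFM2.5-1.2B-Distilled-Claude-f16.gguf -cnv
```

On modest hardware like the CPU mentioned above, a smaller quantization (e.g. Q4_K_M, if provided) should respond noticeably faster than f16.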

# Benchmark

Benchmark results are in progress.