diff --git a/README.md b/README.md
index 237eb15..972a0b0 100644
--- a/README.md
+++ b/README.md
@@ -1,12 +1,93 @@
-
 ---
+library_name: transformers
 license: other
 license_name: lfm1.0
 license_link: LICENSE
+language:
+- en
+- ar
+- zh
+- fr
+- de
+- ja
+- ko
+- es
+pipeline_tag: text-generation
 tags:
 - liquid
 - lfm2
 - edge
-base_model:
-- LiquidAI/LFM2-350M-Extract
+base_model: LiquidAI/LFM2-350M
 ---
+
+<!-- Centered banner: Liquid AI logo; badge links: Playground, Leap -->
+
+# LFM2-350M-Extract-GGUF
+
+Based on [LFM2-350M](https://huggingface.co/LiquidAI/LFM2-350M), LFM2-350M-Extract is designed to **extract important information from a wide variety of unstructured documents** (such as articles, transcripts, or reports) into structured outputs like JSON, XML, or YAML.
+
+**Use cases**:
+
+- Extracting invoice details from emails into structured JSON.
+- Converting regulatory filings into XML for compliance systems.
+- Transforming customer support tickets into YAML for analytics pipelines.
+- Populating knowledge graphs with entities and attributes from unstructured reports.
+
+You can find more information about other task-specific models in this [blog post](https://www.liquid.ai/blog/introducing-liquid-nanos-frontier-grade-performance-on-everyday-devices).
+
+## 🏃 How to run LFM2
+
+Example usage with [llama.cpp](https://github.com/ggml-org/llama.cpp):
+
+```
+llama-cli -hf LiquidAI/LFM2-350M-Extract-GGUF
+```
\ No newline at end of file
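The model's output is plain text, so a downstream consumer typically parses and validates it before use. A minimal Python sketch for the invoice-extraction use case, assuming the model returned JSON like the sample below (the field names are illustrative assumptions, not a schema guaranteed by the model):

```python
import json

# Sample completion, as LFM2-350M-Extract might return it when asked to
# pull invoice details out of an email. Field names here are illustrative
# assumptions, not a fixed schema guaranteed by the model.
raw_output = """{
  "invoice_number": "INV-2024-0042",
  "vendor": "Acme Corp",
  "total": 1299.50,
  "currency": "USD",
  "due_date": "2024-11-30"
}"""

def parse_invoice(text: str) -> dict:
    """Parse model output as JSON and verify the fields a pipeline relies on."""
    data = json.loads(text)
    missing = {"invoice_number", "vendor", "total"} - data.keys()
    if missing:
        raise ValueError(f"model output missing fields: {sorted(missing)}")
    return data

invoice = parse_invoice(raw_output)
print(invoice["vendor"], invoice["total"])  # Acme Corp 1299.5
```

Validating like this catches the occasional malformed or incomplete completion before it reaches a compliance system or analytics pipeline.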