Liquid AI

LFM2-1.2B-Extract-GGUF

Based on LFM2-1.2B, LFM2-1.2B-Extract is designed to extract important information from a wide variety of unstructured documents (such as articles, transcripts, or reports) into structured outputs like JSON, XML, or YAML.

Use cases:

  • Extracting invoice details from emails into structured JSON.
  • Converting regulatory filings into XML for compliance systems.
  • Transforming customer support tickets into YAML for analytics pipelines.
  • Populating knowledge graphs with entities and attributes from unstructured reports.
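As a concrete illustration of the first use case, here is a minimal sketch (plain Python, no model call; the email text and JSON field names are hypothetical, not an official example) of the kind of input/output pairing the model targets:

```python
import json

# Hypothetical unstructured input (illustrative, not from the model card)
email = """
Hi team,
Please process invoice #INV-2024-0042 from Acme Corp,
dated 2024-06-01, total $1,250.00 (net 30).
Thanks, Dana
"""

# The kind of structured JSON an extraction model is expected to emit
model_output = """
{
  "invoice_number": "INV-2024-0042",
  "vendor": "Acme Corp",
  "date": "2024-06-01",
  "total": 1250.00,
  "payment_terms": "net 30"
}
"""

# Downstream systems can validate the output with a plain JSON parse
invoice = json.loads(model_output)
print(invoice["invoice_number"])  # INV-2024-0042
```

Because the output is ordinary JSON, failures (truncated or malformed output) surface immediately as a `json.JSONDecodeError`, which makes the format easy to gate in a pipeline.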

You can find more information about other task-specific models in this blog post.

πŸƒ How to run LFM2

Example usage with llama.cpp:

llama-cli -hf LiquidAI/LFM2-1.2B-Extract-GGUF
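Beyond the one-shot CLI call above, llama.cpp can also serve the model over an OpenAI-compatible HTTP API (`llama-server -hf LiquidAI/LFM2-1.2B-Extract-GGUF`). Below is a minimal sketch of building a request payload for such a server; the payload shape follows the OpenAI chat-completions convention, while the system prompt wording and ticket text are assumptions for illustration:

```python
import json

# Assumption: a system prompt instructing structured output; wording is illustrative
system_prompt = "Extract the key fields from the document and reply with JSON only."

document = "Ticket #881: login page times out on mobile Safari, reported by J. Kim."

# OpenAI-compatible chat-completions payload understood by llama-server
payload = {
    "messages": [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": document},
    ],
    "temperature": 0.0,  # deterministic decoding suits extraction tasks
}

body = json.dumps(payload)
# POST `body` to the running llama-server's /v1/chat/completions endpoint
print(json.loads(body)["messages"][1]["content"][:11])  # Ticket #881
```

Temperature 0 is a reasonable default here: extraction rewards faithful copying of fields from the source document rather than creative variation.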
GGUF details:

  • Model size: 1B params
  • Architecture: lfm2

  • Available quantizations: 4-bit, 5-bit, 6-bit, 8-bit, 16-bit


Model tree for LiquidAI/LFM2-1.2B-Extract-GGUF

  • Base model: LiquidAI/LFM2-1.2B
  • This model is one of 8 quantized variants of the base model.
