ModelHub XC 06e7481182 initial project commit; model provided by the ModelHub XC community
Model: farbodtavakkoli/OTel-LLM-4B-IT
Source: Original Platform
2026-05-01 11:17:12 +08:00

---
license: apache-2.0
language:
- en
base_model: google/gemma-3-4b-it
tags:
- telecom
- telecommunications
- gsma
- fine-tuned
pipeline_tag: text-generation
---

OTel-LLM-4B-IT

OTel-LLM-4B-IT is a telecom-specialized language model fine-tuned on telecommunications domain data. It is part of the OTel Family of Models, an open-source initiative to build industry-standard AI models for the global telecommunications sector.

Model Details

| Attribute | Value |
| --- | --- |
| Base Model | google/gemma-3-4b-it |
| Parameters | 4B |
| Training Method | Full parameter fine-tuning |
| Language | English |
| License | Apache 2.0 |

Training Data

The model was trained on high-quality telecom-focused data curated by 100+ domain experts from organizations including AT&T, Microsoft, AMD, GSMA, RelationalAI, Essential AI, Purdue University, Khalifa University, University of Leeds, Yale University, The University of Texas at Dallas, NetoAI, and MantisNLP.

Data Sources:

  • GSMA Permanent Reference Documents
  • 3GPP Specifications
  • O-RAN Documentation
  • RFC Series
  • eSIM, terminals, security, networks, roaming, APIs
  • Industry whitepapers and telecom academic papers

Intended Use

The OTel model family is designed to power end-to-end Retrieval-Augmented Generation (RAG) pipelines for telecommunications. The three model types serve complementary roles:

  1. Embedding — Retrieve relevant chunks from telecom specifications, standards, and documentation.
  2. Reranker — Re-score and prioritize the retrieved chunks for relevance.
  3. LLM — Generate accurate responses grounded in the retrieved context.

Users can deploy the full pipeline or use individual models independently based on their needs.

Note: The LLMs include abstention training — if the model does not receive sufficient context, it will decline to answer rather than hallucinate. This means the models are optimized for context-grounded generation, not open-ended question answering.
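The three-stage pipeline and the abstention behavior can be sketched in a few lines. This is an illustrative toy, not the OTel models themselves: the bag-of-words "embedding", the term-overlap "reranker", and the threshold-based abstention check below are all stand-ins invented for this sketch, standing in for the real embedding, reranker, and LLM stages.

```python
# Toy sketch of the retrieve -> rerank -> generate pipeline described above.
# All scoring functions are hypothetical stand-ins for the OTel models.
from collections import Counter
import math

def embed(text):
    """Bag-of-words vector (stand-in for a real embedding model)."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, chunks, k=2):
    """Stage 1: pull the k chunks closest to the query."""
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]

def rerank(query, chunks):
    """Stage 2: re-score retrieved chunks (here: raw term overlap)."""
    terms = set(query.lower().split())
    return sorted(chunks, key=lambda c: len(terms & set(c.lower().split())),
                  reverse=True)

def generate(query, context, min_score=0.2):
    """Stage 3: answer only when grounded; otherwise abstain,
    mirroring the abstention behavior described for the OTel LLMs."""
    if not context or cosine(embed(query), embed(context[0])) < min_score:
        return "I don't have enough context to answer."
    return f"Based on the provided context: {context[0]}"

chunks = [
    "eSIM profiles are downloaded over the SM-DP+ interface.",
    "5G NR uses flexible numerology for subcarrier spacing.",
    "Roaming agreements are governed by GSMA PRDs.",
]
query = "How are eSIM profiles downloaded?"
top = rerank(query, retrieve(query, chunks))
print(generate(query, top))
```

An off-topic query (e.g. about the boiling point of water) scores below the threshold against every chunk, so `generate` declines to answer instead of improvising, which is the context-grounded behavior the note above describes.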

Language Models

Embedding Models

Reranker Models

Training Infrastructure

  • Framework: ScalarLM (GPU-agnostic)
  • Compute: AMD and NVIDIA GPUs

Citation

@misc{otel2026,
  title={OTel: Open Telco AI Models},
  author={Tavakkoli, Farbod and Diamos, Gregory and Paulk, Roderic and Terrazas, Jorden},
  year={2026},
  url={https://huggingface.co/farbodtavakkoli}
}

Contact

If you have any technical questions, please feel free to reach out to farbod.tavakkoli@att.com or farbodtavakoli@gmail.com.
