From f3438d8750890c907ef52d945b963fa8b25a711c Mon Sep 17 00:00:00 2001
From: Ditto
Date: Thu, 4 Apr 2024 14:28:38 +0000
Subject: [PATCH] Update README.md

---
 README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index 20adaf0..a5181a2 100644
--- a/README.md
+++ b/README.md
@@ -5,7 +5,7 @@ language:
library_name: transformers
---
-
+

Democratizing access to LLMs for the open-source community.
Let's advance AI, together.

@@ -16,7 +16,7 @@ library_name: transformers

We are open-sourcing one of our early experiments in pretraining with a custom architecture and datasets. This 1.1B-parameter model is pre-trained from scratch on a custom-curated dataset of 41B tokens. The architecture experiments include the addition of flash attention and a higher intermediate dimension in the MLP layer. The dataset is a combination of wiki, stories, arxiv, math, and code. The model is available on Hugging Face as [Boomer1B](https://huggingface.co/budecosystem/boomer-1b).

-
+
## Getting Started on GitHub 💻
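Since the model card declares `library_name: transformers` and the checkpoint is hosted on the Hugging Face Hub, a minimal loading sketch might look like the following. This assumes the `budecosystem/boomer-1b` checkpoint supports the standard `AutoModelForCausalLM` interface; the prompt and generation settings are illustrative only.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Download the 1.1B-parameter Boomer checkpoint from the Hugging Face Hub
# (assumes standard AutoModel support for this architecture).
tokenizer = AutoTokenizer.from_pretrained("budecosystem/boomer-1b")
model = AutoModelForCausalLM.from_pretrained("budecosystem/boomer-1b")

# Illustrative prompt; generation parameters are arbitrary defaults.
prompt = "The universe is a vast expanse of"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The repository's own Getting Started instructions below cover setup in more detail.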