Update installation instructions
@@ -48,11 +48,10 @@ It also uses Grouped-Query Attention (GQA) and Rotary Position Embeddings (RoPE)
 
 ## Usage
 
-The [pull request](https://github.com/huggingface/transformers/pull/32495) to support this model in Hugging Face Transformers is under review and is expected to be merged soon. In the meantime, please follow the installation instructions below:
+Support for this model will be added in the upcoming `transformers` release. In the meantime, please install the library from source:
 
 ```
-$ git clone -b aot/head_dim_rope --single-branch https://github.com/suiyoubi/transformers.git && cd transformers
-$ pip install -e .
+pip install git+https://github.com/huggingface/transformers
 ```
 
 The following code provides an example of how to load the Minitron-8B model and use it to perform text generation.
||||
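The generation example referenced by the last context line sits outside this hunk. As a rough sketch of what it likely resembles, loading the model through the standard `transformers` text-generation API might look like the following; the `nvidia/Minitron-8B-Base` repo ID, the `bfloat16` dtype, and the prompt are assumptions, not taken from this diff:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hugging Face repo ID; adjust to the actual Minitron-8B checkpoint.
model_id = "nvidia/Minitron-8B-Base"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 weights; use float16/float32 if needed
    device_map="auto",           # place layers on available GPU(s) automatically
)

prompt = "To be or not to be,"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Generate a short continuation of the prompt and decode it back to text.
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```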