---
language:
- en
- ja
license: apache-2.0
library_name: transformers
tags:
- mlx
datasets:
- databricks/databricks-dolly-15k
- llm-jp/databricks-dolly-15k-ja
- llm-jp/oasst1-21k-en
- llm-jp/oasst1-21k-ja
- llm-jp/oasst2-33k-en
- llm-jp/oasst2-33k-ja
programming_language:
- C
- C++
- C#
- Go
- Java
- JavaScript
- Lua
- PHP
- Python
- Ruby
- Rust
- Scala
- TypeScript
pipeline_tag: text-generation
inference: false
---
# mlx-community/llm-jp-13b-instruct-full-dolly-ichikara_004_001_single-oasst-oasst2-v2.0

This model was converted to MLX format from [`llm-jp/llm-jp-13b-instruct-full-dolly-ichikara_004_001_single-oasst-oasst2-v2.0`](https://huggingface.co/llm-jp/llm-jp-13b-instruct-full-dolly-ichikara_004_001_single-oasst-oasst2-v2.0) using mlx-lm version **0.12.0**.

Refer to the [original model card](https://huggingface.co/llm-jp/llm-jp-13b-instruct-full-dolly-ichikara_004_001_single-oasst-oasst2-v2.0) for more details on the model.

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

# Download the converted MLX weights and tokenizer from the Hugging Face Hub.
model, tokenizer = load("mlx-community/llm-jp-13b-instruct-full-dolly-ichikara_004_001_single-oasst-oasst2-v2.0")

# Generate a completion; verbose=True streams tokens as they are produced.
response = generate(model, tokenizer, prompt="hello", verbose=True)
```
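Since this is an instruct-tuned llm-jp model, it typically responds best when the input follows the instruction format described in the original model card rather than a bare prompt like `"hello"`. As a minimal sketch, assuming the `### 指示:` / `### 応答:` ("Instruction" / "Response") template used by the llm-jp instruct series (verify the exact wording in the original model card before relying on it):

```python
def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in the assumed llm-jp instruct template.

    NOTE: the template below is an assumption based on the llm-jp instruct
    series; check the original model card for the authoritative format.
    """
    return (
        # "Below is an instruction that describes a task. Write a response
        # that appropriately completes the request."
        "以下は、タスクを説明する指示です。要求を適切に満たす応答を書きなさい。\n\n"
        f"### 指示:\n{instruction}\n\n"
        "### 応答:\n"
    )

prompt = build_prompt("日本の首都はどこですか？")  # "What is the capital of Japan?"
print(prompt)
```

The resulting string can then be passed as the `prompt` argument to `generate` in the example above.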