---
license: mit
language:
- ja
library_name: transformers
pipeline_tag: text-generation
tags:
- japanese
- llama-2
- instruction-tuning
---

# Stockmark-13b-instruct

**Stockmark-13b-instruct** is an instruction-tuned version of [Stockmark-13b](https://huggingface.co/stockmark/stockmark-13b), a 13 billion parameter Japanese LLM. This model is developed by [Stockmark Inc.](https://stockmark.co.jp/)

We used data (2023/11/03 version) from [Project of Development of Japanese Instruction data for LLM](https://liat-aip.sakura.ne.jp/wp/llm%E3%81%AE%E3%81%9F%E3%82%81%E3%81%AE%E6%97%A5%E6%9C%AC%E8%AA%9E%E3%82%A4%E3%83%B3%E3%82%B9%E3%83%88%E3%83%A9%E3%82%AF%E3%82%B7%E3%83%A7%E3%83%B3%E3%83%87%E3%83%BC%E3%82%BF%E4%BD%9C%E6%88%90/) for instruction tuning.

Please see our [blog](https://tech.stockmark.co.jp/blog/202311_stockmark_13b_instruct/) for more details.

## How to use

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("stockmark/stockmark-13b-instruct", device_map="auto", torch_dtype=torch.bfloat16)
tokenizer = AutoTokenizer.from_pretrained("stockmark/stockmark-13b-instruct")

instruction = "自然言語処理とは?"

prompt = f"""### Input:
{instruction}

### Output:
"""

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    tokens = model.generate(
        **inputs,
        max_new_tokens=128,
        do_sample=True,
        temperature=0.7
    )

output = tokenizer.decode(tokens[0], skip_special_tokens=True)
print(output)
```
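
When generating for many instructions, it can help to factor the prompt template into small helpers. A minimal sketch, testable without loading the model; the helper names `build_prompt` and `extract_output` are ours, not part of the model card:

```python
def build_prompt(instruction: str) -> str:
    # Fill in the "### Input:" / "### Output:" template the model expects.
    return f"""### Input:
{instruction}

### Output:
"""

def extract_output(decoded: str) -> str:
    # model.generate returns prompt + completion when decoded;
    # keep only the text after the "### Output:" marker.
    marker = "### Output:\n"
    return decoded.split(marker, 1)[1] if marker in decoded else decoded

prompt = build_prompt("自然言語処理とは?")
fake_decoded = prompt + "自然言語処理とは、..."  # stand-in for a real decoded generation
print(extract_output(fake_decoded))  # prints only the completion, without the prompt
```

In practice, `extract_output` would be applied to the `output` string produced by `tokenizer.decode` in the example above.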

## Training dataset
[Project of Development of Japanese Instruction data for LLM](https://liat-aip.sakura.ne.jp/wp/llm%E3%81%AE%E3%81%9F%E3%82%81%E3%81%AE%E6%97%A5%E6%9C%AC%E8%AA%9E%E3%82%A4%E3%83%B3%E3%82%B9%E3%83%88%E3%83%A9%E3%82%AF%E3%82%B7%E3%83%A7%E3%83%B3%E3%83%87%E3%83%BC%E3%82%BF%E4%BD%9C%E6%88%90/)

## License
MIT

## Developed by
[Stockmark Inc.](https://stockmark.co.jp/)

## Author
[Takahiro Omi](https://huggingface.co/omitakahiro)