---
frameworks:
- Pytorch
license: Apache License 2.0
tasks:
- text-generation
---
# Fine-tune Chinese Meta Llama3 Instruct 8B with LLaMA-Factory
The model was fine-tuned with the following LLaMA-Factory settings:

```json
{
  "top.model_name": "LLaMA3-8B-Chat",
  "top.finetuning_type": "lora",
  "top.adapter_path": [],
  "top.quantization_bit": "none",
  "top.template": "llama3",
  "top.rope_scaling": "none",
  "top.booster": "none",
  "train.training_stage": "Supervised Fine-Tuning",
  "train.dataset_dir": "data",
  "train.dataset": [
    "alpaca_zh",
    "alpaca_gpt4_zh",
    "guanaco",
    "nsfc_zh",
    "oaast_sft_zh",
    "wikipedia_zh"
  ]
}
```
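For reference, a run with these settings could also be expressed as a LLaMA-Factory CLI config file. This is only a sketch: the key names follow the example configs shipped with LLaMA-Factory and may differ across versions, and the output path and hyperparameters below are illustrative assumptions, not the values used for this model.

```yaml
# Illustrative LLaMA-Factory SFT config (key names may vary by version).
model_name_or_path: meta-llama/Meta-Llama-3-8B-Instruct  # assumption: base checkpoint
stage: sft
do_train: true
finetuning_type: lora
lora_target: all
template: llama3
dataset_dir: data
dataset: alpaca_zh,alpaca_gpt4_zh,guanaco,nsfc_zh,oaast_sft_zh,wikipedia_zh
output_dir: saves/llama3-8b-chat-zh-lora   # assumption
per_device_train_batch_size: 2             # assumption
learning_rate: 5.0e-5                      # assumption
num_train_epochs: 3.0                      # assumption
bf16: true
```

With a recent LLaMA-Factory install, such a file would typically be launched via `llamafactory-cli train <config>.yaml`.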
## SDK Download
```bash
# Install ModelScope
pip install modelscope
```
|
||
```python
# Download the model via the ModelScope SDK
from modelscope import snapshot_download

model_dir = snapshot_download('pooka74/LLaMA3-8B-Chat-Chinese')
```
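Since the model was tuned with the `llama3` template, prompts at inference time follow the Llama 3 chat format. The helper below is a minimal illustration of that format; the function name is ours, not part of the ModelScope or LLaMA-Factory APIs, and in practice the tokenizer's built-in chat template assembles this string for you.

```python
# Minimal sketch of the Llama 3 chat format behind the "llama3" template
# setting. The helper name is illustrative, not a library API.

def build_llama3_prompt(user_message: str, system_message: str = "") -> str:
    """Assemble a single-turn Llama 3 chat prompt string."""
    parts = ["<|begin_of_text|>"]
    if system_message:
        parts.append(
            f"<|start_header_id|>system<|end_header_id|>\n\n{system_message}<|eot_id|>"
        )
    parts.append(
        f"<|start_header_id|>user<|end_header_id|>\n\n{user_message}<|eot_id|>"
    )
    # Leave the assistant header open so the model generates the reply.
    parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(parts)

prompt = build_llama3_prompt("你好,请介绍一下你自己。")
print(prompt)
```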
## Git Download
```bash
# Clone the model repo via Git
git clone https://www.modelscope.cn/pooka74/LLaMA3-8B-Chat-Chinese.git
```
<p style="color: lightgrey;">If you are a contributor to this model, we invite you to complete the model card promptly, following the <a href="https://modelscope.cn/docs/ModelScope%E6%A8%A1%E5%9E%8B%E6%8E%A5%E5%85%A5%E6%B5%81%E7%A8%8B%E6%A6%82%E8%A7%88" style="color: lightgrey; text-decoration: underline;">model contribution guide</a>.</p>