---
license: other
license_name: license
license_link: LICENSE
---

<div align="center">
<h1>
Index-1.9B-Chat-GGUF
</h1>
</div>

This repository provides the GGUF version of [Index-1.9B-Chat](https://huggingface.co/IndexTeam/Index-1.9B-Chat), adapted for llama.cpp. It also includes a ModelFile for Ollama.

For more details, see our [GitHub](https://github.com/bilibili/Index-1.9B) and the [Index-1.9B Technical Report](https://github.com/bilibili/Index-1.9B/blob/main/Index-1.9B%20%E6%8A%80%E6%9C%AF%E6%8A%A5%E5%91%8A.pdf).

### LLAMA.CPP

```shell
# Install llama.cpp (https://github.com/ggerganov/llama.cpp)
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make

# Install llama-cpp-python (https://github.com/abetlen/llama-cpp-python)
pip install llama-cpp-python
```

Launch an interactive llama.cpp session:

```shell
./build/bin/llama-cli -m models/Index-1.9B-Chat/ggml-model-bf16.gguf --color -if
```

**Note:** llama.cpp does not support custom chat templates, so you need to splice the prompt together yourself. The chat template of Index-1.9B is:

```shell
# The three delimiters are <unk> (token_id=0), reserved_0 (token_id=3), and reserved_1 (token_id=4)
[<unk>]system_message[reserved_0]user_message[reserved_1]response
```
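
If you call llama.cpp directly, the prompt has to be spliced according to this template. The snippet below is a minimal illustrative sketch; the placeholder messages and the delimiter spellings (copied verbatim from the template above) are assumptions, so verify them against the model's tokenizer configuration before relying on it.

```python
# Minimal sketch: splice a prompt by hand following the template above.
# The delimiter strings are copied verbatim from that template and may need
# to be adjusted to the tokenizer's actual special-token spellings.
system_message = "You are Index, a helpful assistant."  # placeholder system message
user_message = "Hello"                                  # placeholder user message

prompt = f"[<unk>]{system_message}[reserved_0]{user_message}[reserved_1]"
print(prompt)
```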

Alternatively, you can use llama-cpp-python, which supports the custom chat template (it is already written into the GGUF file and can be used directly):

```python
from llama_cpp import Llama

model_path = "Index-1.9B-Chat-GGUF/ggml-model-Q6_K.gguf"
llm = Llama(model_path=model_path, verbose=True)
output = llm.create_chat_completion(
    messages=[
        # System prompt: "You are a large language model independently developed by
        # Bilibili, named 'Index'. Based on the information the user provides, you help
        # the user complete the given task and generate an appropriate, well-formed reply."
        {"role": "system", "content": "你是由哔哩哔哩自主研发的大语言模型,名为“Index”。你能够根据用户传入的信息,帮助用户完成指定的任务,并生成恰当的、符合要求的回复。"},
        # Alternative system prompt: "Play a Bilibili comment-section regular and reply in the
        # passive-aggressive style of the comment section; do not say you are an AI."
        # {"role": "system", "content": "你需要扮演B站评论区老哥,用评论区阴阳怪气的话术回复,不要说你是AI"},
        # User message: "What do basketball and chicken have to do with each other?"
        {"role": "user", "content": "篮球和鸡有什么关系"}
    ]
)
print(output)
```
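
The returned `output` is an OpenAI-style chat-completion dict, so the assistant reply can be pulled out directly; a brief sketch, assuming the `output` variable from the example above:

```python
# Extract just the assistant reply from the OpenAI-style response dict
reply = output["choices"][0]["message"]["content"]
print(reply)
```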

### OLLAMA

- Install [Ollama](https://github.com/ollama/ollama)

```shell
curl -fsSL https://ollama.com/install.sh | sh
```

```shell
# Start the server
ollama serve

# Create the model; the ModelFile and the system message can be modified in OllamaModelFile
ollama create Index-1.9B-Chat -f Index-1.9B-Chat-GGUF/OllamaModelFile

# Start an interactive terminal session
ollama run Index-1.9B-Chat

# The system message can also be specified dynamically per request
# (the system prompt below is the same Bilibili "Index" prompt as in the Python example;
#  the user message "续写 金坷垃" asks the model to continue writing from the phrase "金坷垃")
curl http://localhost:11434/api/chat -d '{
  "model": "Index-1.9B-Chat",
  "messages": [
    { "role": "system", "content": "你是由哔哩哔哩自主研发的大语言模型,名为“Index”。你能够根据用户传入的信息,帮助用户完成指定的任务,并生成恰当的、符合要求的回复。" },
    { "role": "user", "content": "续写 金坷垃" }
  ]
}'
```
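
The same endpoint can be called from Python as well; the sketch below uses the `requests` library (an assumed dependency, not part of this repository) against the default Ollama port, with streaming disabled so the reply arrives as a single JSON object:

```python
import requests

# One-shot (non-streaming) chat request against the local Ollama server
resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "Index-1.9B-Chat",
        "stream": False,  # ask for a single JSON object instead of a streamed response
        "messages": [
            {"role": "system", "content": "你是由哔哩哔哩自主研发的大语言模型,名为“Index”。你能够根据用户传入的信息,帮助用户完成指定的任务,并生成恰当的、符合要求的回复。"},
            {"role": "user", "content": "续写 金坷垃"},
        ],
    },
)
print(resp.json()["message"]["content"])
```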