---
library_name: transformers
tags:
- unsloth
- book
license: cc-by-nc-4.0
---

[gguf quants](https://huggingface.co/mradermacher/dragonwar-7b-alpha-GGUF)

# Dragonwar 7b - α
The time of the great dragon war is upon us! How many different fantasy novels? One hundred and seventeen you say?
|
||||
|
||||
Trained with full text windows, followed by completion, followed by ORPO, followed by one more epoch of the full text, rotated 1/4 in the window. That final pass settled everything down, and the model seems quite coherent.
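My reading of the "rotated 1/4" pass is that the window boundaries were shifted by a quarter of the window length, so the final epoch sees the same text split at new points. A toy sketch of that windowing (the helper, window size, and offset below are illustrative, not the actual training code):

```python
def windows(tokens: list[int], size: int, offset: int = 0) -> list[list[int]]:
    # Slice a token stream into fixed-size training windows, optionally
    # starting at an offset so window boundaries fall in new places.
    return [tokens[i:i + size] for i in range(offset, len(tokens) - size + 1, size)]

toks = list(range(100))
first_pass = windows(toks, 16)         # boundaries at 0, 16, 32, ...
rotated = windows(toks, 16, offset=4)  # boundaries shifted by 1/4 of a window
```

Shifting the boundaries means tokens that previously sat at a window edge land mid-window, which plausibly explains the stabilizing effect of the last epoch.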
### How to Use
This is not a chat model; it is intended for story mode or similar use. There is no prompt template: just start with a bit of story, or a name.
```
*** Prologue

The sun rose
```
Author's notes are highly effective. You can use an author's note of something like:
```
[King Robb Stark and Lord Rahl are at war.]
```
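Story-mode frontends generally splice an author's note a short distance above the end of the context so it steers the next generation. The helper and insertion depth below are hypothetical, just to illustrate the mechanic:

```python
def insert_authors_note(context: str, note: str, depth: int = 2) -> str:
    # Insert the bracketed note `depth` paragraphs above the end of the
    # context, keeping the most recent prose closest to the generation point.
    paras = context.split("\n\n")
    pos = max(len(paras) - depth, 0)
    return "\n\n".join(paras[:pos] + [note] + paras[pos:])

context = "The sun rose over the keep.\n\nRobb rode north.\n\nThe banners gathered."
with_note = insert_authors_note(context, "[King Robb Stark and Lord Rahl are at war.]")
```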
You have quite a cast of characters to draw from. Perhaps Perrin makes a stop by the Waystone Inn, or Zeddicus and Gandalf have a smoke together.
### Settings
I usually use a Min-P of 0.1, dynamic temperature (dynatemp) between 0.5 and 2, and a smoothing factor between 0.05 and 0.2.
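For reference, Min-P discards every token whose probability falls below a fraction (here 0.1) of the most likely token's probability. A minimal sketch of that filter on a toy distribution (not this model's code):

```python
import torch

def min_p_filter(logits: torch.Tensor, min_p: float = 0.1) -> torch.Tensor:
    # Keep only tokens whose probability is at least min_p times the
    # probability of the top token; mask the rest to -inf.
    probs = torch.softmax(logits, dim=-1)
    threshold = min_p * probs.max(dim=-1, keepdim=True).values
    return logits.masked_fill(probs < threshold, float("-inf"))

# Toy distribution: token 0 dominates; token 3 falls below the cutoff.
logits = torch.tensor([4.0, 3.5, 3.0, 0.0])
kept = min_p_filter(logits, 0.1)
```

Because the cutoff scales with the top token's probability, Min-P prunes aggressively when the model is confident and permissively when it is not, which is why it pairs well with a wide dynamic-temperature range.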
### Hacks
To get rid of unwanted EOS tokens, I did the following...
```
import torch

# state_dict() returns references to the live parameters, so zeroing
# this row modifies the model in place; token id 2 is EOS here.
result_dict: dict[str, torch.Tensor] = model.state_dict()
result_dict['lm_head.weight'][2] = 0

# Patch state_dict so any subsequent save keeps the zeroed row.
model.state_dict = lambda: result_dict
```
So now there are no EOS tokens at all, ever.
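The effect of the patch, illustrated on a toy projection layer rather than the real 7b head: with the row zeroed, the EOS logit comes out exactly 0 for any hidden state, no matter what the rest of the network does. (The sizes below are made up for the demo.)

```python
import torch

# Toy stand-in for lm_head: hidden size 8, vocab size 5, no bias,
# mirroring the structure of a causal LM output projection.
lm_head = torch.nn.Linear(8, 5, bias=False)
with torch.no_grad():
    lm_head.weight[2] = 0  # zero the row for "token id 2"

hidden = torch.randn(8)
logits = lm_head(hidden)   # logits[2] is a dot product with an all-zero row
```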