---
library_name: transformers
tags:
- unsloth
- book
license: cc-by-nc-4.0
---
![image/png](https://cdn-uploads.huggingface.co/production/uploads/65b19c1b098c85365af5a83e/BREbn9P_gPnXU02KCo9eJ.png)
[gguf quants](https://huggingface.co/mradermacher/dragonwar-7b-alpha-GGUF)
# Dragonwar 7b - α
The time of the great dragon war is upon us! How many different fantasy novels? One hundred and seventeen, you say?
Trained with full text windows, then completion, then ORPO, then one more epoch of the full text with the window rotated by 1/4. That last training run settled everything down, and the model seems quite coherent.
### How to Use
This is not a chat model; it is intended for story mode or similar completion use. There is no prompt template: just start with a bit of story, or a name.
```
*** Prologue
The sun rose
```
Author's notes are highly effective. You can use an author's note along the lines of:
```
[King Robb Stark and Lord Rahl are at war.]
```
You have quite a cast of characters to draw from. Perhaps Perrin makes a stop by the Waystone Inn, or Zeddicus and Gandalf have a smoke together.
### Settings
I usually use a Min-P of 0.1, a dynamic temperature between 0.5 and 2, and a smoothing factor between 0.05 and 0.2.
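If you are unfamiliar with Min-P, here is a minimal sketch of what that setting does to the logits before sampling. This is an illustrative reimplementation, not the sampler code of any particular frontend: tokens whose probability falls below `min_p` times the top token's probability are masked out.

```python
import torch

def min_p_filter(logits: torch.Tensor, min_p: float = 0.1) -> torch.Tensor:
    # Keep only tokens whose probability is at least min_p * p(top token);
    # everything else gets -inf so it can never be sampled.
    probs = torch.softmax(logits, dim=-1)
    threshold = min_p * probs.max(dim=-1, keepdim=True).values
    return logits.masked_fill(probs < threshold, float("-inf"))
```

A low Min-P like 0.1 keeps the tail of plausible tokens while cutting off the junk, which pairs well with a wide dynamic-temperature range.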
### Hacks
To get rid of unwanted EOS tokens, I did the following:
```
import torch

# Grab the state dict and zero out the EOS row of the LM head
# (token id 2 is the EOS token in the Mistral/Llama vocabulary).
# state_dict tensors share storage with the model, so this edits
# the live weights in place.
result_dict: dict[str, torch.Tensor] = model.state_dict()
result_dict["lm_head.weight"][2] = 0

# Re-bind state_dict on the instance so saving the model writes
# out the modified weights.
model.state_dict = lambda: result_dict
```
So now there are no EOS tokens at all, ever.