Initialize project; model provided by the ModelHub XC community

Model: KoboldAI/OPT-2.7B-Nerybus-Mix
Source: Original Platform
This commit is contained in:
ModelHub XC
2026-05-13 15:06:41 +08:00
commit acded89882
12 changed files with 100490 additions and 0 deletions

.gitattributes vendored Normal file

@@ -0,0 +1,47 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bin.* filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zstandard filter=lfs diff=lfs merge=lfs -text
*.tfevents* filter=lfs diff=lfs merge=lfs -text
*.db* filter=lfs diff=lfs merge=lfs -text
*.ark* filter=lfs diff=lfs merge=lfs -text
**/*ckpt*data* filter=lfs diff=lfs merge=lfs -text
**/*ckpt*.meta filter=lfs diff=lfs merge=lfs -text
**/*ckpt*.index filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.gguf* filter=lfs diff=lfs merge=lfs -text
*.ggml filter=lfs diff=lfs merge=lfs -text
*.llamafile* filter=lfs diff=lfs merge=lfs -text
*.pt2 filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
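Each of these patterns routes matching files through Git LFS via the `filter`, `diff`, and `merge` attributes, so only small pointer files are committed while the heavy artifacts go to LFS storage. As a rough illustration (not git's exact wildmatch semantics), a sketch of which paths a few of these patterns capture, using Python's `fnmatch` as an approximation:

```python
from fnmatch import fnmatch

# A few of the patterns tracked above (the clean/smudge attributes are
# what actually divert matching files into LFS storage).
patterns = ["*.bin", "*.safetensors", "*.msgpack", "saved_model/**/*"]

def is_lfs_tracked(path: str) -> bool:
    # fnmatch only approximates git's wildmatch: it has no special "**"
    # handling and no directory-aware matching, so this is a sketch.
    name = path.rsplit("/", 1)[-1]
    return any(fnmatch(name, p) or fnmatch(path, p) for p in patterns)

print(is_lfs_tracked("pytorch_model-00001-of-00002.bin"))  # True
print(is_lfs_tracked("config.json"))                       # False
```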

README.md Normal file

@@ -0,0 +1,62 @@
---
license: other
language:
- en
inference: false
---
# OPT-2.7B-Nerybus-Mix
This is an experimental model containing a ***parameter-wise 50/50 blend (weighted average)*** of the weights of *NerysV2-2.7B* and *ErebusV1-2.7B*.
Preliminary testing produces fairly coherent outputs; the model appears to retain the NSFWness of Erebus, but with a Nerys-esque twist in terms of prose.
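The merge code itself is not published; a minimal sketch of what a parameter-wise 50/50 blend could look like, assuming both checkpoints share identical tensor names and shapes (the tensor name and toy values below are illustrative, not taken from the real state dicts):

```python
def blend_state_dicts(sd_a, sd_b, alpha=0.5):
    """Parameter-wise weighted average of two checkpoints (flat lists here;
    real code would do the same arithmetic over torch tensors)."""
    assert sd_a.keys() == sd_b.keys(), "architectures must match exactly"
    return {k: [alpha * a + (1 - alpha) * b for a, b in zip(sd_a[k], sd_b[k])]
            for k in sd_a}

# Toy values standing in for real OPT-2.7B weights (hypothetical tensor name).
nerys  = {"decoder.layers.0.fc1.weight": [1.0, 1.0]}
erebus = {"decoder.layers.0.fc1.weight": [3.0, 3.0]}

mix = blend_state_dicts(nerys, erebus)  # 50/50 blend
print(mix["decoder.layers.0.fc1.weight"])  # [2.0, 2.0]
```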
# License
The two models used for this blend, *NerysV2-2.7B* and *ErebusV1-2.7B*, were made by **Mr. Seeker**.
- https://huggingface.co/KoboldAI/OPT-2.7B-Erebus
- https://huggingface.co/KoboldAI/OPT-2.7B-Nerys-v2
The base OPT-2.7B model is licensed under the OPT-175B license, Copyright (c) Meta Platforms, Inc. All Rights Reserved.
# Evaluation Results
As the original datasets used for the source models are not publicly available, I used my own datasets for this evaluation, which may not provide an accurate comparison.
Eval parameters: 32,000 characters extracted from the middle of the corpus, tested in blocks of 1024 tokens each, with the same dataset used for each test batch.
```
Literotica Dataset Eval (Randomly selected stories)
{'eval_loss': 2.571258306503296, 'name': 'Concedo_OPT-2.7B-Nerybus-Mix'}
{'eval_loss': 2.5491442680358887, 'name': 'KoboldAI_OPT-2.7B-Erebus'}
{'eval_loss': 2.6158597469329834, 'name': 'KoboldAI_OPT-2.7B-Nerys'}
{'eval_loss': 2.614469051361084, 'name': 'facebook_opt-2.7b'}
{'eval_loss': 2.4960227012634277, 'name': '(Unreleased 2.7B ModronAI Model)'}
ASSTR Dataset Eval (Randomly selected stories)
{'eval_loss': 2.664412498474121, 'name': 'Concedo_OPT-2.7B-Nerybus-Mix'}
{'eval_loss': 2.6451029777526855, 'name': 'KoboldAI_OPT-2.7B-Erebus'}
{'eval_loss': 2.7259647846221924, 'name': 'KoboldAI_OPT-2.7B-Nerys'}
{'eval_loss': 2.6675195693969727, 'name': 'facebook_opt-2.7b'}
{'eval_loss': 2.962111473083496, 'name': '(Unreleased 2.7B ModronAI Model)'}
Sexstories Dataset Eval (Random highly rated stories)
{'eval_loss': 2.2352423667907715, 'name': 'Concedo_OPT-2.7B-Nerybus-Mix'}
{'eval_loss': 2.194378137588501, 'name': 'KoboldAI_OPT-2.7B-Erebus'}
{'eval_loss': 2.307469129562378, 'name': 'KoboldAI_OPT-2.7B-Nerys'}
{'eval_loss': 2.293961763381958, 'name': 'facebook_opt-2.7b'}
{'eval_loss': 2.0103421211242676, 'name': '(Unreleased 2.7B ModronAI Model)'}
Harry Potter Dataset Eval (Canon books)
{'eval_loss': 2.473742961883545, 'name': 'Concedo_OPT-2.7B-Nerybus-Mix'}
{'eval_loss': 2.480600357055664, 'name': 'KoboldAI_OPT-2.7B-Erebus'}
{'eval_loss': 2.506237506866455, 'name': 'KoboldAI_OPT-2.7B-Nerys'}
{'eval_loss': 2.5074169635772705, 'name': 'facebook_opt-2.7b'}
{'eval_loss': 2.273703098297119, 'name': '(Unreleased 2.7B ModronAI Model)'}
Star Wars Dataset Eval (Rogue One Novel)
{'eval_loss': 2.5031676292419434, 'name': 'Concedo_OPT-2.7B-Nerybus-Mix'}
{'eval_loss': 2.5239150524139404, 'name': 'KoboldAI_OPT-2.7B-Erebus'}
{'eval_loss': 2.526801586151123, 'name': 'KoboldAI_OPT-2.7B-Nerys'}
{'eval_loss': 2.473283529281616, 'name': 'facebook_opt-2.7b'}
{'eval_loss': 2.955465793609619, 'name': '(Unreleased 2.7B ModronAI Model)'}
```
It is recommended to use this model with the KoboldAI software. All feedback and comments can be directed to Concedo on the KoboldAI Discord.
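The block-wise evaluation described above can be sketched as follows. The scoring callback here is a stand-in, since a real run would tokenize the corpus and take `model(input_ids, labels=input_ids).loss` per 1024-token block:

```python
def blockwise_eval_loss(token_ids, loss_fn, block=1024):
    """Mean LM loss over consecutive, non-overlapping blocks of `block` tokens,
    mirroring the eval setup described in the README."""
    losses = []
    for i in range(0, len(token_ids) - block + 1, block):
        losses.append(loss_fn(token_ids[i:i + block]))
    return sum(losses) / len(losses)

# Stand-in scorer returning a constant; a real scorer would run the model.
fake_loss = lambda ids: 2.5
print(blockwise_eval_loss(list(range(4096)), fake_loss))  # 2.5
```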

config.json Normal file

@@ -0,0 +1,30 @@
{
"_remove_final_layer_norm": false,
"activation_dropout": 0.0,
"activation_function": "relu",
"antemplate": "[Genre: <|>]",
"architectures": [
"OPTForCausalLM"
],
"attention_dropout": 0.0,
"bos_token_id": 2,
"do_layer_norm_before": true,
"dropout": 0.1,
"eos_token_id": 2,
"ffn_dim": 10240,
"hidden_size": 2560,
"init_std": 0.02,
"layerdrop": 0.0,
"max_position_embeddings": 2048,
"model_type": "opt",
"num_attention_heads": 32,
"num_hidden_layers": 32,
"pad_token_id": 1,
"prefix": "</s>",
"torch_dtype": "float16",
"transformers_version": "4.24.0",
"use_cache": false,
"vocab_size": 50265,
"welcome": "\n\nYou are currently running the model `OPT-2.7B-Nerybus-Mix`\n\nThis is an experimental model containing a parameter-wise 50/50 blend (weighted average) of NerysV2-2.7B and ErebusV1-2.7B\n\n",
"word_embed_proj_dim": 2560
}
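As a sanity check on these dimensions, a rough parameter count can be derived from the config, assuming the standard OPT layout with biases and layer norms (an approximation, not an exact count):

```python
# Dimensions taken from the config above.
vocab, hidden, ffn, layers, max_pos = 50265, 2560, 10240, 32, 2048

embed = vocab * hidden + (max_pos + 2) * hidden     # OPT offsets positions by 2
attn  = 4 * (hidden * hidden + hidden)              # q, k, v, out projections
mlp   = hidden * ffn + ffn + ffn * hidden + hidden  # fc1 + fc2 with biases
norms = 2 * 2 * hidden                              # two layer norms per block
total = embed + layers * (attn + mlp + norms) + 2 * hidden  # + final layer norm

print(f"{total / 1e9:.2f}B")  # ≈ 2.65B, consistent with the "2.7B" name
```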

configuration.json Normal file

@@ -0,0 +1 @@
{"framework": "pytorch", "task": "text-generation", "allow_remote": true}

merges.txt Normal file (50001 added lines)

File diff suppressed because it is too large


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:dcb032a7479ed28eed424c831e361f94e6cc883d8be2aaf49a0475e0d1ec120c
size 2051264837


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f50d9e16a4eb1c868048de732bdca9589f3ee612cdd4bfbe9b0994c954e96c92
size 2045654113


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:cb3fdd9148335934dea500528b484aa6323ae5fe4601d87bf98cd09c8c3e1809
size 1463757971


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d451f692b37977763b3247ecbb67bfe6fce98de5dbb017b1b64259c1d95abc02
size 45090
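Each of the blocks above is a Git LFS pointer file: the actual weight files live in LFS storage, and the repository stores only a `version` line, a SHA-256 `oid`, and a `size` in bytes. A minimal parser for that format:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a git-lfs pointer file: one "key value" pair per line."""
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    fields["size"] = int(fields["size"])
    return fields

# The last pointer from the commit above.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:d451f692b37977763b3247ecbb67bfe6fce98de5dbb017b1b64259c1d95abc02
size 45090"""

info = parse_lfs_pointer(pointer)
print(info["size"])  # 45090 (bytes)
```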

special_tokens_map.json Normal file

@@ -0,0 +1,30 @@
{
"bos_token": {
"content": "</s>",
"lstrip": false,
"normalized": true,
"rstrip": false,
"single_word": false
},
"eos_token": {
"content": "</s>",
"lstrip": false,
"normalized": true,
"rstrip": false,
"single_word": false
},
"pad_token": {
"content": "<pad>",
"lstrip": false,
"normalized": true,
"rstrip": false,
"single_word": false
},
"unk_token": {
"content": "</s>",
"lstrip": false,
"normalized": true,
"rstrip": false,
"single_word": false
}
}
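Note that `bos_token`, `eos_token`, and `unk_token` all share the content `</s>`, a quirk of the OPT tokenizer family; only `pad_token` differs. A quick check over an inlined subset of the map above:

```python
import json

# Subset of special_tokens_map.json above (lstrip/rstrip flags omitted).
special = json.loads("""{
  "bos_token": {"content": "</s>"},
  "eos_token": {"content": "</s>"},
  "pad_token": {"content": "<pad>"},
  "unk_token": {"content": "</s>"}
}""")

shared = {special[k]["content"] for k in ("bos_token", "eos_token", "unk_token")}
print(shared)  # {'</s>'}
```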

tokenizer_config.json Normal file

@@ -0,0 +1,40 @@
{
"add_bos_token": true,
"add_prefix_space": false,
"bos_token": {
"__type": "AddedToken",
"content": "</s>",
"lstrip": false,
"normalized": true,
"rstrip": false,
"single_word": false
},
"eos_token": {
"__type": "AddedToken",
"content": "</s>",
"lstrip": false,
"normalized": true,
"rstrip": false,
"single_word": false
},
"errors": "replace",
"name_or_path": "KoboldAI/OPT-2.7B-Erebus",
"pad_token": {
"__type": "AddedToken",
"content": "<pad>",
"lstrip": false,
"normalized": true,
"rstrip": false,
"single_word": false
},
"special_tokens_map_file": null,
"tokenizer_class": "GPT2Tokenizer",
"unk_token": {
"__type": "AddedToken",
"content": "</s>",
"lstrip": false,
"normalized": true,
"rstrip": false,
"single_word": false
}
}

vocab.json Normal file (50267 added lines)

File diff suppressed because it is too large
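`vocab.json` and `merges.txt` together drive the GPT-2-style byte-level BPE used by `GPT2Tokenizer`: the vocab maps token strings to ids, while the merges file lists ranked pairs to fuse. A toy sketch of merge application (simplified; the byte-level character mapping and the real ranked-pair search are omitted, and the merge rules below are hypothetical):

```python
def apply_merges(tokens, merges):
    """Greedily apply BPE merges in priority order; a simplified sketch of
    what GPT2Tokenizer does with the ~50,000 ranked pairs in merges.txt."""
    for a, b in merges:
        out, i = [], 0
        while i < len(tokens):
            if i + 1 < len(tokens) and (tokens[i], tokens[i + 1]) == (a, b):
                out.append(a + b)  # fuse the matched pair into one token
                i += 2
            else:
                out.append(tokens[i])
                i += 1
        tokens = out
    return tokens

# Hypothetical merge rules, highest priority first.
merges = [("l", "o"), ("lo", "w")]
print(apply_merges(list("low"), merges))  # ['low']
```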