Initial project commit; model provided by the ModelHub XC community

Model: lordalbior/TheVagrant-12B
Source: Original Platform
Commit e1feb09f6e by ModelHub XC, 2026-04-21 08:29:06 +08:00 (35 changed files, 8704 additions, 0 deletions)


---
base_model:
- TheDrummer/UnslopNemo-12B-v4.1
library_name: transformers
tags:
- mergekit
- merge
---
# vagrant
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [DARE TIES](https://arxiv.org/abs/2311.03099) merge method, with [TheDrummer/UnslopNemo-12B-v4.1](https://huggingface.co/TheDrummer/UnslopNemo-12B-v4.1) as the base.
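To make the `density` and `weight` parameters in the configuration below concrete, here is a toy numpy sketch of the DARE TIES idea on flat weight arrays: DARE randomly drops a `(1 - density)` fraction of each model's delta from the base and rescales the survivors, then TIES-style sign election keeps only contributions agreeing with the dominant sign per parameter. This is a simplified illustration, not mergekit's actual implementation; the helper names and the normalization step are assumptions.

```python
import numpy as np

def dare(delta, density, rng):
    """DARE: randomly drop a (1 - density) fraction of the task vector,
    then rescale survivors by 1/density to preserve the expected value."""
    mask = rng.random(delta.shape) < density
    return np.where(mask, delta / density, 0.0)

def dare_ties_merge(base, finetuned, densities, weights, seed=0):
    """Toy DARE TIES merge over flat arrays (hypothetical helper;
    mergekit's normalization details differ)."""
    rng = np.random.default_rng(seed)
    # Task vectors: weighted, sparsified deltas from the base model
    deltas = [w * dare(ft - base, d, rng)
              for ft, d, w in zip(finetuned, densities, weights)]
    stacked = np.stack(deltas)
    # TIES sign election: per parameter, keep only contributions whose sign
    # agrees with the summed delta, then average the survivors
    elected = np.sign(stacked.sum(axis=0))
    agree = np.sign(stacked) == elected
    kept = np.where(agree, stacked, 0.0)
    merged_delta = kept.sum(axis=0) / np.maximum(agree.sum(axis=0), 1)
    return base + merged_delta
```

In the config below, `intermediates/storyteller` is kept densest (`density: 0.9`, `weight: 1.0`) while the other two contributors are sparsified more aggressively (`density: 0.5`) and down-weighted (`weight: 0.7`).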
### Models Merged
The following models were included in the merge:
* intermediates/storyteller
* intermediates/superstar
* intermediates/bard
### Configuration
The following YAML configuration was used to produce this model:
```yaml
base_model: TheDrummer/UnslopNemo-12B-v4.1
chat_template: chatml
dtype: bfloat16
merge_method: dare_ties
modules:
default:
slices:
- sources:
- layer_range: [0, 40]
model: intermediates/storyteller
parameters:
density: 0.9
weight: 1.0
- layer_range: [0, 40]
model: intermediates/bard
parameters:
density: 0.5
weight: 0.7
- layer_range: [0, 40]
model: intermediates/superstar
parameters:
density: 0.5
weight: 0.7
- layer_range: [0, 40]
model: TheDrummer/UnslopNemo-12B-v4.1
out_dtype: bfloat16
parameters:
normalize: 1.0
tokenizer: {}
```
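Since the configuration sets `chat_template: chatml`, prompts to the merged model should follow the ChatML layout. A minimal formatter, assuming the standard ChatML role markers and an open assistant turn for the model to complete:

```python
def format_chatml(messages):
    """Render a list of {'role', 'content'} dicts as a ChatML prompt,
    ending with an open assistant turn for the model to complete."""
    parts = [f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
             for m in messages]
    parts.append("<|im_start|>assistant")
    return "\n".join(parts) + "\n"
```

In practice, `tokenizer.apply_chat_template(...)` from `transformers` produces this formatting automatically once the model's tokenizer is loaded.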