Initialize project; model provided by the ModelHub XC community

Model: Chickaboo/ChickaQ-V2-Large-Beta
Source: Original Platform
This commit is contained in:
ModelHub XC
2026-04-30 09:48:12 +08:00
commit 47d89377ed
16 changed files with 454714 additions and 0 deletions

35
.gitattributes vendored Normal file

@@ -0,0 +1,35 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
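Each of the lines above maps a file pattern to Git attributes: `filter=lfs diff=lfs merge=lfs` routes matching files through Git LFS, and `-text` unsets the `text` attribute so they are treated as binary. A minimal sketch of how such a line decomposes (the helper name is illustrative, not part of Git):

```python
def parse_gitattributes_line(line):
    """Split one .gitattributes line into (pattern, {attr: value}).

    `key=value` sets a string value, a bare `key` sets True,
    and a `-key` prefix explicitly unsets the attribute (False).
    """
    parts = line.split()
    pattern, attrs = parts[0], parts[1:]
    settings = {}
    for attr in attrs:
        if attr.startswith("-"):
            settings[attr[1:]] = False
        elif "=" in attr:
            key, _, value = attr.partition("=")
            settings[key] = value
        else:
            settings[attr] = True
    return pattern, settings
```

For example, the `*.safetensors` line parses to the pattern `*.safetensors` with `filter`, `diff`, and `merge` all set to `lfs` and `text` unset.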

52
README.md Normal file

@@ -0,0 +1,52 @@
---
base_model:
- vilm/Quyen-mini-v0.1
- Qwen/Qwen1.5-1.8B-Chat
library_name: transformers
tags:
- mergekit
- merge
license: mit
---
# Models in the ChickaQ family
- **ChickaQ (0.5B)**
- **ChickaQ-Large (1.8B)**
- **ChickaQ-V2-Beta (0.9B)**
- **ChickaQ-V2-Large-Beta (3B)**
# Merged model
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the passthrough merge method.
### Models Merged
The following models were included in the merge:
* [vilm/Quyen-mini-v0.1](https://huggingface.co/vilm/Quyen-mini-v0.1)
* [Qwen/Qwen1.5-1.8B-Chat](https://huggingface.co/Qwen/Qwen1.5-1.8B-Chat)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
slices:
  - sources:
      - model: Qwen/Qwen1.5-1.8B-Chat
        layer_range: [0, 24]
  - sources:
      - model: vilm/Quyen-mini-v0.1
        layer_range: [0, 24]
merge_method: passthrough
dtype: bfloat16
```
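The passthrough method concatenates the listed layer slices back to back rather than averaging weights, so the merged model's depth is the sum of the slice widths. A quick sketch of that arithmetic (plain Python data, not the mergekit API):

```python
# Mirror of the slices in the YAML config above: two 24-layer slices
# taken from each donor model and stacked in order.
slices = [
    ("Qwen/Qwen1.5-1.8B-Chat", (0, 24)),
    ("vilm/Quyen-mini-v0.1", (0, 24)),
]

# Passthrough merging stacks the slices, so total depth is additive.
total_layers = sum(hi - lo for _, (lo, hi) in slices)
print(total_layers)  # 48
```

The result, 48, matches `num_hidden_layers` in the merged model's `config.json`, even though each donor is a 24-layer model.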
### License
This model is released under the [MIT license](https://mit-license.org/).

5
added_tokens.json Normal file

@@ -0,0 +1,5 @@
{
  "<|endoftext|>": 151643,
  "<|im_end|>": 151645,
  "<|im_start|>": 151644
}

27
config.json Normal file

@@ -0,0 +1,27 @@
{
  "_name_or_path": "vilm/Quyen-mini-v0.1",
  "architectures": [
    "Qwen2ForCausalLM"
  ],
  "attention_dropout": 0.0,
  "eos_token_id": 151645,
  "hidden_act": "silu",
  "hidden_size": 2048,
  "initializer_range": 0.02,
  "intermediate_size": 5504,
  "max_position_embeddings": 8192,
  "max_window_layers": 21,
  "model_type": "qwen2",
  "num_attention_heads": 16,
  "num_hidden_layers": 48,
  "num_key_value_heads": 16,
  "rms_norm_eps": 1e-06,
  "rope_theta": 1000000.0,
  "sliding_window": 4096,
  "tie_word_embeddings": false,
  "torch_dtype": "bfloat16",
  "transformers_version": "4.38.2",
  "use_cache": false,
  "use_sliding_window": false,
  "vocab_size": 151936
}
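The config values above are enough to roughly check the "3B" size claim for this merge. A back-of-the-envelope parameter count for a Qwen2-style decoder (function and variable names are illustrative; this is an estimate from the standard architecture, not a measurement of the checkpoint):

```python
# Values taken from config.json above.
cfg = {
    "hidden_size": 2048,
    "intermediate_size": 5504,
    "num_hidden_layers": 48,
    "vocab_size": 151936,
    "tie_word_embeddings": False,
}

def qwen2_param_estimate(cfg):
    """Rough parameter count for a Qwen2-style decoder-only model."""
    h = cfg["hidden_size"]
    per_layer = (
        4 * h * h                           # q, k, v, o projection weights
        + 3 * h                             # q, k, v biases (Qwen2 uses biased attention projections)
        + 3 * h * cfg["intermediate_size"]  # gate, up, down MLP weights
        + 2 * h                             # two RMSNorm weight vectors per layer
    )
    embeddings = cfg["vocab_size"] * h
    lm_head = 0 if cfg["tie_word_embeddings"] else cfg["vocab_size"] * h
    return cfg["num_hidden_layers"] * per_layer + embeddings + lm_head + h  # + final norm

total = qwen2_param_estimate(cfg)
print(f"{total / 1e9:.2f}B")  # about 3.05B
```

At roughly 3.05 billion parameters, the estimate is consistent with the "ChickaQ-V2-Large-Beta (3B)" label and with the combined size of the bfloat16 (2 bytes/parameter) LFS shards below.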

9
mergekit_config.yml Normal file

@@ -0,0 +1,9 @@
slices:
  - sources:
      - model: Qwen/Qwen1.5-1.8B-Chat
        layer_range: [0, 24]
  - sources:
      - model: vilm/Quyen-mini-v0.1
        layer_range: [0, 24]
merge_method: passthrough
dtype: bfloat16

151388
merges.txt Normal file

File diff suppressed because it is too large


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a7359a4e1befca22b75999994d37420e0bf9975ee6529f1592290d07d8b6f17a
size 1856014944


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:10c62748bea4653c1f8110dc7e399e5140cb02a7b721cd8f7ee7aaddf40db6f0
size 1994173104


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0340c3430a992c7d10f31706f7eaabd50f23d5337459d97699e2b4bdef6f5a86
size 1993247032


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:8b522d2fa9dfcf611f4929a8655c1998ed015417ce95e4fdd181496acb3a2a37
size 1492966736


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:6d71347dce727d041fb0340562abf5a86cb23dae5b85f2681f7a2db0bf0bed8c
size 622329992
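The five blocks above are Git LFS pointer files: what Git actually versions is a tiny text stub with a `version` URL, a `sha256` object ID, and the real payload `size` in bytes, while the weight shards themselves live in LFS storage. A minimal parser for that format (the helper name is illustrative):

```python
def parse_lfs_pointer(text):
    """Parse a git-lfs pointer file: one `key value` pair per line."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields
```

Applied to the last pointer above, it yields an `oid` of `sha256:6d71347d…` and a `size` of 622,329,992 bytes, i.e. a shard of roughly 0.6 GB.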

File diff suppressed because one or more lines are too long

27
special_tokens_map.json Normal file

@@ -0,0 +1,27 @@
{
  "additional_special_tokens": [
    "<|im_start|>",
    "<|im_end|>"
  ],
  "bos_token": {
    "content": "<|im_end|>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "eos_token": {
    "content": "<|im_end|>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "pad_token": {
    "content": "<|endoftext|>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  }
}

303111
tokenizer.json Normal file

File diff suppressed because it is too large

43
tokenizer_config.json Normal file

@@ -0,0 +1,43 @@
{
  "add_prefix_space": false,
  "added_tokens_decoder": {
    "151643": {
      "content": "<|endoftext|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "151644": {
      "content": "<|im_start|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "151645": {
      "content": "<|im_end|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    }
  },
  "additional_special_tokens": [
    "<|im_start|>",
    "<|im_end|>"
  ],
  "bos_token": "<|im_end|>",
  "chat_template": "{% for message in messages %}{{'<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>' + '\n'}}{% endfor %}{% if add_generation_prompt %}{{ '<|im_start|>assistant\n' }}{% endif %}",
  "clean_up_tokenization_spaces": false,
  "eos_token": "<|im_end|>",
  "errors": "replace",
  "model_max_length": 32768,
  "pad_token": "<|endoftext|>",
  "split_special_tokens": false,
  "tokenizer_class": "Qwen2Tokenizer",
  "unk_token": null
}
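The `chat_template` field above is the ChatML format: every message is wrapped in `<|im_start|>role` / `<|im_end|>` markers, and a trailing `<|im_start|>assistant` line is appended when a generation prompt is requested. A plain-Python rendering of the same Jinja logic (the function name is illustrative; in practice `tokenizer.apply_chat_template` does this):

```python
def render_chatml(messages, add_generation_prompt=False):
    """Render messages the way the chat_template above does."""
    out = ""
    for message in messages:
        # Each turn: <|im_start|>role\ncontent<|im_end|>\n
        out += "<|im_start|>" + message["role"] + "\n" + message["content"] + "<|im_end|>" + "\n"
    if add_generation_prompt:
        # Open an assistant turn for the model to complete.
        out += "<|im_start|>assistant\n"
    return out

prompt = render_chatml([{"role": "user", "content": "Hi"}], add_generation_prompt=True)
print(prompt)
```

Note that `<|im_start|>` and `<|im_end|>` are single special tokens (IDs 151644 and 151645 in `added_tokens_decoder`), so the rendered string tokenizes cleanly around the role markers.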

1
vocab.json Normal file

File diff suppressed because one or more lines are too long