Initialize project; model provided by the ModelHub XC community
Model: OrionLLM/Nebula | Source: Original Platform
.gitattributes (vendored, new file, 39 lines)
@@ -0,0 +1,39 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
figures/baguettotron_structure.png filter=lfs diff=lfs merge=lfs -text
figures/table_evaluation.png filter=lfs diff=lfs merge=lfs -text
figures/training_baguettotron.png filter=lfs diff=lfs merge=lfs -text
figures/comparison_models.png filter=lfs diff=lfs merge=lfs -text
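Each of these lines pairs a path pattern with Git LFS filter attributes (`-text` disables the `text` attribute so the file is treated as binary). A minimal sketch of how such a line splits into pattern and attributes (this is an illustration, not Git's own parser):

```python
def parse_gitattributes_line(line: str):
    """Split a .gitattributes line into (pattern, {attribute: value_or_flag})."""
    pattern, *attrs = line.split()
    parsed = {}
    for a in attrs:
        if "=" in a:                  # e.g. "filter=lfs"
            key, value = a.split("=", 1)
            parsed[key] = value
        elif a.startswith("-"):       # e.g. "-text" unsets the attribute
            parsed[a[1:]] = False
        else:                         # bare attribute, e.g. "text"
            parsed[a] = True
    return pattern, parsed

pattern, attrs = parse_gitattributes_line(
    "*.safetensors filter=lfs diff=lfs merge=lfs -text"
)
```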
README.md (new file, 71 lines)
@@ -0,0 +1,71 @@
---
license: apache-2.0
pipeline_tag: text-generation
library_name: transformers
tags:
- nebula
- reasoning
- text-generation
- transformers
---
# Nebula

<p align="center">
  <img src="https://cdn-uploads.huggingface.co/production/uploads/685ea8ff7b4139b6845ce395/YF0kEDYMGJhcM3Lbl2EOD.png" alt="Nebula logo" width="100">
</p>

## 1. Introduction

**Nebula** is a **320M-parameter** generalist Small Reasoning Model trained on **200B+ tokens**, designed for edge AI and on-device deployment.

It aims to deliver an unusually strong balance of **memory**, **general reasoning**, **math**, and **retrieval-friendly behavior** for its size class, aiming to outperform many small models of a similar parameter range on non-code, industry-style benchmarks.
## 2. Reasoning style

Nebula's reasoning traces use an intentionally compact style with **dense, short, frequently non-verbal sentences**, optimized for efficiency under limited model capacity.

Traces use the following stenographic notation, integrated into special tokens:

### Logical markers

| Token | Meaning | Usage |
| ----- | ------- | ----- |
| **→** | derivation / implication | For very short causal/logical flow |
| **↺** | iterative return / refinement loop | For backtracking, reconsidering priors, RAG re-querying |
| **?** | uncertainty / questions to resolve | Can be appended to short expressions/words, not only interrogatives |
| **!/※** | insight / breakthrough | Emphatic mark for knowledge discovery |
| **≈** | approximation / estimate | For intermediate hypotheses / uncertain preliminary statements |
| **∴** | therefore / final step | Use sparingly to mark stable conclusions |

### Uncertainty

| Token | Meaning | Usage |
| ----- | ------- | ----- |
| **●** | high confidence | Well-supported empirical/theoretical ground; "anchor points" |
| **◐** | medium/partial confidence | Incomplete data; plausible but unverified links |
| **○** | low confidence | Speculation, missing context, weak inference chain |
| **⚠** | bias/premise risk | Domain mismatch, cultural assumptions, language-switch artifacts |
| **?maybe?** | soft speculation | Marks tentative ideas, branches that might collapse later |

### Verification process

| Token | Meaning | Usage |
| ----- | ------- | ----- |
| **☐** | unverified hypothesis | Raw claim, no cross-check yet |
| **☑** | intermediate verification | One source/argument supports it |
| **✓** | confirmed/validated | Multiple independent supports (●-level) |

This reasoning format is designed to remain expressive while being lightweight enough for a small model.
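Because the confidence marks are single dedicated tokens, a trace can be profiled by simple character scanning. A hypothetical helper (not part of the model release) that tallies the uncertainty markers in a generated trace:

```python
# Map Nebula's single-character confidence tokens to labels.
CONFIDENCE_MARKERS = {"●": "high", "◐": "medium", "○": "low", "⚠": "risk"}

def confidence_profile(trace: str) -> dict:
    """Count each confidence marker occurring in a reasoning trace."""
    counts = {label: 0 for label in CONFIDENCE_MARKERS.values()}
    for ch in trace:
        if ch in CONFIDENCE_MARKERS:
            counts[CONFIDENCE_MARKERS[ch]] += 1
    return counts

profile = confidence_profile("fermentation → CO2 ● but timing ◐ ?maybe? altitude ○")
```

A downstream application could use such a profile to decide when to re-query sources or escalate to a larger model.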
## 3. Fine-Tuning/RL

Nebula has been successfully fine-tuned for a variety of tasks.

Because Nebula is a reasoning-oriented model, it is expected to train well with reinforcement learning methods such as **GRPO**, both for **verifiable tasks** (with objective rewards) and for subjective tasks using an **LLM-as-a-judge**.
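For verifiable tasks, the objective reward can be as simple as an exact-match check against a reference answer. A minimal sketch of such a reward function (the function name and 0/1 reward scale are illustrative, not from this release):

```python
def exact_match_reward(completion: str, reference: str) -> float:
    """Illustrative verifiable reward: 1.0 if the completion matches the
    reference after whitespace/case normalization, else 0.0. A GRPO-style
    trainer would score each sampled completion with a function like this."""
    normalize = lambda s: " ".join(s.lower().split())
    return 1.0 if normalize(completion) == normalize(reference) else 0.0

rewards = [exact_match_reward(c, "42") for c in ["42", " 42 ", "41"]]
```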
## 4. Benchmarks

| Model | MMLU |
|------|-----:|
| **Nebula** | **40.0** |
| SmolLM2-360M | 35.8 |
| Gemma 3 270M (IT) | 26.5 |
| Granite-4.0-H-350M | 36.2 |
chat_template.json (new file, 7 lines)
@@ -0,0 +1,7 @@
{
  "chat_template": "{% for m in messages %}<|im_start|>{{ m['role'] }}\n{{ m['content'] }}<|im_end|>\n{% endfor %}{% if add_generation_prompt %}<|im_start|>assistant\n<think>\n{% endif %}",
  "eos_token": "<|im_end|>",
  "bos_token": "<|im_start|>",
  "stop": ["<|im_end|>"],
  "roles": { "user": "user", "assistant": "assistant", "system": "system" }
}
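The template renders each message as an `<|im_start|>role\ncontent<|im_end|>` block and, when a generation prompt is requested, opens an assistant turn that begins with `<think>`, so the model starts by emitting a reasoning trace. A plain-Python sketch of the same formatting (a stand-in for the Jinja template, not the tokenizer's own `apply_chat_template` method):

```python
def render_chat(messages, add_generation_prompt=False):
    """Mirror the Jinja chat_template above: one <|im_start|>...<|im_end|>
    block per message, optionally followed by an open assistant/<think> turn."""
    out = ""
    for m in messages:
        out += f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n"
    if add_generation_prompt:
        out += "<|im_start|>assistant\n<think>\n"
    return out

prompt = render_chat([{"role": "user", "content": "Hi"}], add_generation_prompt=True)
```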
config.json (new file, 29 lines)
@@ -0,0 +1,29 @@
{
  "architectures": [
    "LlamaForCausalLM"
  ],
  "attention_bias": false,
  "attention_dropout": 0.0,
  "bos_token_id": 1,
  "eos_token_id": 2,
  "head_dim": 64,
  "hidden_act": "silu",
  "hidden_size": 576,
  "initializer_range": 0.02,
  "intermediate_size": 1536,
  "max_position_embeddings": 4096,
  "mlp_bias": false,
  "model_type": "llama",
  "num_attention_heads": 9,
  "num_hidden_layers": 80,
  "num_key_value_heads": 3,
  "pretraining_tp": 1,
  "rms_norm_eps": 1e-05,
  "rope_scaling": null,
  "rope_theta": 10000,
  "tie_word_embeddings": true,
  "torch_dtype": "bfloat16",
  "transformers_version": "4.51.3",
  "use_cache": true,
  "vocab_size": 65536
}
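The advertised 320M parameter count can be sanity-checked from these hyperparameters. A back-of-the-envelope sketch assuming the standard Llama block layout (grouped-query attention, gated SwiGLU MLP, two RMSNorms per layer, tied embeddings counted once):

```python
# Parameter count estimated from config.json, assuming standard Llama layout.
hidden, inter, layers, vocab = 576, 1536, 80, 65536
heads, kv_heads, head_dim = 9, 3, 64

attn = hidden * heads * head_dim            # q_proj
attn += 2 * hidden * kv_heads * head_dim    # k_proj + v_proj (grouped-query)
attn += heads * head_dim * hidden           # o_proj
mlp = 3 * hidden * inter                    # gate_proj + up_proj + down_proj
norms = 2 * hidden                          # two RMSNorm weight vectors per layer
per_layer = attn + mlp + norms

total = layers * per_layer + vocab * hidden + hidden  # + tied embedding + final norm
print(f"{total / 1e6:.0f}M parameters")  # 321M parameters
```

At bfloat16 (2 bytes per parameter) this also lines up with the ~642 MB `model.safetensors` blob below. Note the unusual shape for this size class: 80 very narrow layers (hidden size 576), trading width for depth.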
generation_config.json (new file, 6 lines)
@@ -0,0 +1,6 @@
{
  "_from_model_config": true,
  "bos_token_id": 1,
  "eos_token_id": 2,
  "transformers_version": "4.51.3"
}
model.safetensors (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:dbb3fe0fd0d97a28c140aa315ec4a651f20432e9b7a509908a620190f506644b
size 641995416
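What Git stores here is not the weights but a Git LFS pointer file: a spec version, the SHA-256 of the real blob, and its size in bytes. A minimal parse of that three-line key/value format:

```python
POINTER = """version https://git-lfs.github.com/spec/v1
oid sha256:dbb3fe0fd0d97a28c140aa315ec4a651f20432e9b7a509908a620190f506644b
size 641995416"""

def parse_lfs_pointer(text: str) -> dict:
    """Turn a Git LFS pointer file into {key: value} pairs."""
    return dict(line.split(" ", 1) for line in text.strip().splitlines())

info = parse_lfs_pointer(POINTER)
size_mb = int(info["size"]) / 1e6  # ~642 MB of bfloat16 weights
```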
special_tokens_map.json (new file, 53 lines)
@@ -0,0 +1,53 @@
{
  "additional_special_tokens": [
    "<|im_start|>", "<|im_end>", "<think>", "</think>",
    "source_1", "source_2", "source_3", "source_4", "source_5",
    "source_6", "source_7", "source_8", "source_9", "source_10",
    "<ref", "</ref>",
    "→", "↺", "※", "?maybe?",
    "●", "◐", "○", "⚠", "☐", "☑", "✓",
    "⟨H≈0.1⟩", "⟨H≈0.2⟩", "⟨H≈0.3⟩", "⟨H≈0.4⟩", "⟨H≈0.5⟩", "⟨H≈0.6⟩",
    "⟨H≈0.7⟩", "⟨H≈0.8⟩", "⟨H≈0.9⟩", "⟨H≈1.0⟩", "⟨H≈1.1⟩", "⟨H≈1.2⟩",
    "⟨H≈1.3⟩", "⟨H≈1.4⟩", "⟨H≈1.5⟩", "⟨H≈1.6⟩", "⟨H≈1.7⟩", "⟨H≈1.8⟩"
  ],
  "bos_token": "<|begin_of_text|>",
  "eos_token": "<|end_of_text|>",
  "pad_token": "[PAD]",
  "unk_token": "[UNK]"
}
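The `⟨H≈x⟩` tokens form a regular grid from 0.1 to 1.8 in steps of 0.1, apparently tagging trace segments by an approximate entropy level (that reading is an inference from the naming; it is not documented in this release). The full list can be generated rather than typed:

```python
# Generate the 18 entropy-band special tokens ⟨H≈0.1⟩ ... ⟨H≈1.8⟩.
entropy_tokens = [f"⟨H≈{i / 10:.1f}⟩" for i in range(1, 19)]
```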
tokenizer.json (new file, 327047 lines)
File diff suppressed because it is too large
tokenizer_config.json (new file, 451 lines)
@@ -0,0 +1,451 @@
{
  "added_tokens_decoder": {
    "0": { "content": "[UNK]", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "1": { "content": "<|begin_of_text|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "2": { "content": "<|end_of_text|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "3": { "content": "[PAD]", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65491": { "content": "<|im_start|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65492": { "content": "<|im_end>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65493": { "content": "<think>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65494": { "content": "</think>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65495": { "content": "source_1", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65496": { "content": "source_2", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65497": { "content": "source_3", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65498": { "content": "source_4", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65499": { "content": "source_5", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65500": { "content": "source_6", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65501": { "content": "source_7", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65502": { "content": "source_8", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65503": { "content": "source_9", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65504": { "content": "source_10", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65505": { "content": "<ref", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65506": { "content": "</ref>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65507": { "content": "→", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65508": { "content": "↺", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65509": { "content": "※", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65510": { "content": "?maybe?", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65511": { "content": "●", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65512": { "content": "◐", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65513": { "content": "○", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65514": { "content": "⚠", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65515": { "content": "☐", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65516": { "content": "☑", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65517": { "content": "✓", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65518": { "content": "⟨H≈0.1⟩", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65519": { "content": "⟨H≈0.2⟩", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65520": { "content": "⟨H≈0.3⟩", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65521": { "content": "⟨H≈0.4⟩", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65522": { "content": "⟨H≈0.5⟩", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65523": { "content": "⟨H≈0.6⟩", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65524": { "content": "⟨H≈0.7⟩", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65525": { "content": "⟨H≈0.8⟩", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65526": { "content": "⟨H≈0.9⟩", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65527": { "content": "⟨H≈1.0⟩", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65528": { "content": "⟨H≈1.1⟩", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65529": { "content": "⟨H≈1.2⟩", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65530": { "content": "⟨H≈1.3⟩", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65531": { "content": "⟨H≈1.4⟩", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65532": { "content": "⟨H≈1.5⟩", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65533": { "content": "⟨H≈1.6⟩", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65534": { "content": "⟨H≈1.7⟩", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
    "65535": { "content": "⟨H≈1.8⟩", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true }
  },
  "additional_special_tokens": [
    "<|im_start|>", "<|im_end>", "<think>", "</think>",
    "source_1", "source_2", "source_3", "source_4", "source_5",
    "source_6", "source_7", "source_8", "source_9", "source_10",
    "<ref", "</ref>",
    "→", "↺", "※", "?maybe?",
    "●", "◐", "○", "⚠", "☐", "☑", "✓",
    "⟨H≈0.1⟩", "⟨H≈0.2⟩", "⟨H≈0.3⟩", "⟨H≈0.4⟩", "⟨H≈0.5⟩", "⟨H≈0.6⟩",
    "⟨H≈0.7⟩", "⟨H≈0.8⟩", "⟨H≈0.9⟩", "⟨H≈1.0⟩", "⟨H≈1.1⟩", "⟨H≈1.2⟩",
    "⟨H≈1.3⟩", "⟨H≈1.4⟩", "⟨H≈1.5⟩", "⟨H≈1.6⟩", "⟨H≈1.7⟩", "⟨H≈1.8⟩"
  ],
  "bos_token": "<|begin_of_text|>",
  "clean_up_tokenization_spaces": true,
  "eos_token": "<|end_of_text|>",
  "extra_special_tokens": {},
  "model_max_length": 1000000000000000019884624838656,
  "pad_token": "[PAD]",
  "tokenizer_class": "PreTrainedTokenizer",
  "unk_token": "[UNK]"
}
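The eye-catching `model_max_length` is not arbitrary: it is `int(1e30)`, the "effectively unlimited" sentinel transformers writes when the tokenizer config declares no real context limit (the practical limit here comes from `max_position_embeddings: 4096` in config.json). The odd trailing digits are simply the nearest double-precision float to 10^30:

```python
# model_max_length is the float 1e30 cast to int: the nearest binary64
# value to 10**30, hence the odd trailing digits.
sentinel = int(1e30)
print(sentinel)  # 1000000000000000019884624838656
```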