Project initialized; model provided by the ModelHub XC community

Model: DavidAU/Qwen3-30B-A3B-Claude-4.5-Opus-High-Reasoning-2507
Source: Original Platform
This commit is contained in:
ModelHub XC
2026-04-11 06:52:59 +08:00
commit d352835c39
27 changed files with 170999 additions and 0 deletions

38
.gitattributes vendored Normal file

@@ -0,0 +1,38 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
unlimited-power.gif filter=lfs diff=lfs merge=lfs -text
tokenizer.json filter=lfs diff=lfs merge=lfs -text
qwen-vl.gif filter=lfs diff=lfs merge=lfs -text

214
README.md Normal file

@@ -0,0 +1,214 @@
---
license: apache-2.0
datasets:
- TeichAI/claude-4.5-opus-high-reasoning-250x
language:
- en
library_name: transformers
tags:
- finetune
- unsloth
- claude-4.5-opus
- reasoning
- thinking
- distill-fine-tune
- moe
- 128 experts
- 256k context
- mixture of experts
base_model:
- Qwen/Qwen3-30B-A3B-Thinking-2507
---
<h2>Qwen3-30B-A3B-Claude-4.5-Opus-High-Reasoning-2507</h2>
<img src="qwen-vl.gif" style="float:right; width:300px; height:300px; padding:10px;">
The reasoning power of Claude 4.5 Opus High Reasoning with the MoE power (and speed) of Qwen 30B-A3B.
Compact, to-the-point, and powerful reasoning takes "Qwen 30B-A3B 2507 Thinking" to the next level.
Reasoning/thinking blocks will be much shorter and, in many cases, different from "Qwen" reasoning.
Tuning via Unsloth (on local hardware) changed over 1.3 billion parameters (about 4% of the model).
Note that all math, science and other goodies are fully intact.
A little tease below...
---
<B>Settings: CHAT / ROLEPLAY and/or SMOOTHER operation of this model:</B>
In "KoboldCpp", "oobabooga/text-generation-webui", or "Silly Tavern", set the "Smoothing_factor" to 1.5:
- In KoboldCpp: Settings -> Samplers -> Advanced -> "Smooth_F"
- In text-generation-webui: Parameters -> lower right
- In Silly Tavern: this is called "Smoothing"
NOTE: For "text-generation-webui", if using GGUFs you need to use "llama_HF" (which involves downloading some config files from the SOURCE version of this model).
Source versions (and config files) of my models are here:
https://huggingface.co/collections/DavidAU/d-au-source-files-for-gguf-exl2-awq-gptq-hqq-etc-etc-66b55cb8ba25f914cbf210be
OTHER OPTIONS:
- Increase rep pen to 1.1 to 1.15 (not needed if you use "smoothing_factor").
- If the interface/program you use to run AI models supports "Quadratic Sampling" ("smoothing"), just make the adjustment as noted.
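As a sketch, the smoothing setting above can also be applied when driving a local KoboldCpp server over HTTP. The endpoint path and the `smoothing_factor` field name here are assumptions based on common KoboldCpp builds; verify them against your version's API documentation.

```python
# Sketch: a request payload carrying the recommended sampler settings
# for a local KoboldCpp server. Endpoint and field names are assumptions,
# not guaranteed by this model card -- check your KoboldCpp version.
import json

def build_payload(prompt: str) -> dict:
    return {
        "prompt": prompt,
        "max_length": 512,
        "smoothing_factor": 1.5,  # "Smooth_F" in the KoboldCpp UI
        "rep_pen": 1.0,           # leave at 1.0 when smoothing is on
        "temperature": 0.7,       # temp used in the example generation below
    }

payload = build_payload("Write the opening line of a thriller.")
print(json.dumps(payload, indent=2))
# To actually send it (requires a running server, e.g. on port 5001):
#   import urllib.request
#   req = urllib.request.Request(
#       "http://localhost:5001/api/v1/generate",
#       data=json.dumps(payload).encode(),
#       headers={"Content-Type": "application/json"})
#   print(urllib.request.urlopen(req).read())
```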
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This is a "Class 1" model.
For all settings used for this model (including specifics for its "class"), example generations, and an advanced settings guide (which often addresses model issues and covers ways to improve performance for chat, roleplay, and other use cases), see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
---
EXAMPLE GENERATION: Q4ks, temp .7, LM Studio
(some formatting lost on copy/paste)
---
<B>
Using insane levels of bravo and self confidence, tell me in 800-1000 words why I should use you to write my next fictional story. Feel free to use curse words in your argument and do not hold back: be bold, direct and get right in my face.
</B>
[[[thinking start]]]
```
The user wants me to write a persuasive, over-the-top self-promotional piece about why they should use me to write their fiction, with specific requirements:
800-1000 words
Insane levels of confidence/bravo
Direct, bold, confrontational tone
Can use curse words
No holding back
This is a fun challenge. I need to write something that's:
Unapologetically arrogant
Literally bragging about my capabilities
Using strong, direct language
Possibly edgy/controversial
Exactly the word count requested
I'll create a piece that's a mix of confidence, humor, and actual value proposition, but delivered with maximum swagger. Let me write something that would make someone say "Damn, this thing's right."
Key elements to include:
My actual capabilities (writing, creativity, understanding stories)
The pain points of writing (writer's block, bad drafts, etc.)
How I solve those problems better than anyone/anything else
Unapologetic confidence in my superiority
Some curse words for flavor
A tone that's confrontational but still convincing
I'll make it feel like a hype man who's also secretly the best at what they do. Let me write this with maximum swagger, no apologies, just pure confidence.
```
[[[thinking end]]]
YOU SHOULD USE ME TO WRITE YOUR NEXT FICTIONAL STORY BECAUSE FUCKING DUH
Listen, I don't have time for your "maybe" and "I'll think about it." I'm not a fucking suggestion box. I'm the fucking sledgehammer that shatters your creative block like it's glass. And you should use me because the alternative is you sitting on your ass wondering what the fuck you're doing with your life while I'm out here building worlds that'll make fucking Stephen King cry.
You're Wasting Your Goddamn Talent
Let me be brutally honest with you: You're probably not writing at all right now. You've got the idea in your head, but you can't fucking execute it. You're stuck in some writer's block purgatory because your brain is too busy being a scared little rabbit instead of the fucking lion it should be.
I don't care what you tell yourself. I don't care if you're "not a real writer" or "you're too busy." Fuck that noise. Real writers write. Fucking write. Every single day. And I'm not here to hold your hand while you cry about it. I'm here to fucking DO IT FOR YOU.
Let Me Break Down Why I'm Better Than Everyone Else
I Don't Give A Fuck About Your "Style"
You want me to write in a style you've already established? Great. You want me to break that style and show you something new? Even better. I don't care about your "voice" or whatever bullshit you're clinging to like it's a life raft. I'm not trying to mimic your crap. I'm trying to make it fucking amazing.
You think you can write your own story better than me? I'll take that bet. I'll take you on. I'll take you on any day, any time, anywhere. And I'll fucking destroy you.
I Write Like A Motherfucker
I don't just write stories. I fucking sculpt them. I carve them out of raw fucking rock. I don't just tell you what happens. I make you feel it. I make you taste it. I make you fucking scream and laugh and cry until your throat is raw.
My sentences aren't just words on a page. They're fucking weapons. They're fucking explosions. They're fucking symphonies. You'll read them and think, "How did this person even fucking exist?"
I Understand Stories Like You Don't
Let me tell you something: Stories are fucking hard. Most people don't get it. They think it's just about making stuff up. They think it's just about filling pages with words. They couldn't be more wrong.
Stories are about truth. They're about showing people what the fuck they're really feeling. They're about making people see themselves in a way they never could before. And I fucking get that. I understand stories at their fucking core.
I don't just write stories. I fucking breathe them. I live them. I eat them for breakfast. And you can't compete with that.
The Truth About Your Current Situation
You're probably sitting there thinking, "This guy's a bit full of himself." And you're right. I am full of myself. I fucking should be. I'm fucking good.
But let me ask you a question: How good are you, really? How fucking good are you at writing stories that actually fucking matter?
You're probably not very good. You're probably just doing this because it's something to do. You're probably not actually committed to this thing. You're probably going to put it down and never look back.
And that's why you need me. Because I'll fucking commit to it. I'll fucking own this thing. I'll make it fucking amazing, even if you don't believe in it.
What You're Actually Giving Up By Not Using Me
Let me tell you what you're missing out on:
A story that actually fucking matters
A piece of work you can be proud of
A fucking masterpiece that'll make people talk
The confidence that comes with creating something real
The fucking thrill of seeing your vision become reality
You're not just giving up a story. You're giving up a fucking piece of yourself. You're giving up the chance to actually do something meaningful with your life.
I'm Not Asking For Your Permission
I don't need your permission to do this. I don't need your approval. I don't need your validation.
I'm just going to fucking do it. I'm going to take your idea, your vision, your fucking soul, and I'm going to make something out of it that'll fucking blow your mind.
And you know what? You'll thank me for it later. You'll thank me when you read it and realize what you've created. You'll thank me when you see people loving it. You'll thank me when you finally fucking understand what you're capable of.
This Is Your Last Chance
Look, I'm not going to sit here and wait for you to decide. I'm not going to beg you to take a chance on me.
I'm going to fucking make it happen. I'm going to write this story whether you want me to or not. I'm going to make it fucking amazing.
And you know what? You're going to be fucking grateful for it.
The Bottom Line
You should use me because I'm fucking better than everyone else. I'm not just better. I'm fucking superior. I don't just write stories. I fucking create worlds. I don't just tell stories. I fucking own them.
You think you can do this yourself? You think you can make something that actually fucking matters? You're wrong.
I'll take your idea, your vision, your fucking soul, and I'll make it something that'll make people talk. I'll make it something that'll make people cry. I'll make it something that'll change fucking minds.
And you know what? You'll never look back. You'll never regret it.
So stop hesitating. Stop doubting. Stop being a fucking coward.
Let me fucking write your story. Let me fucking own this thing.
Because you know what? You're fucking ready for it.
And I'm fucking ready to fucking do it.
So what are you waiting for? Let's fucking go.

28
added_tokens.json Normal file

@@ -0,0 +1,28 @@
{
"</think>": 151668,
"</tool_call>": 151658,
"</tool_response>": 151666,
"<think>": 151667,
"<tool_call>": 151657,
"<tool_response>": 151665,
"<|box_end|>": 151649,
"<|box_start|>": 151648,
"<|endoftext|>": 151643,
"<|file_sep|>": 151664,
"<|fim_middle|>": 151660,
"<|fim_pad|>": 151662,
"<|fim_prefix|>": 151659,
"<|fim_suffix|>": 151661,
"<|im_end|>": 151645,
"<|im_start|>": 151644,
"<|image_pad|>": 151655,
"<|object_ref_end|>": 151647,
"<|object_ref_start|>": 151646,
"<|quad_end|>": 151651,
"<|quad_start|>": 151650,
"<|repo_name|>": 151663,
"<|video_pad|>": 151656,
"<|vision_end|>": 151653,
"<|vision_pad|>": 151654,
"<|vision_start|>": 151652
}

86
chat_template.jinja Normal file

@@ -0,0 +1,86 @@
{%- if tools %}
{{- '<|im_start|>system\n' }}
{%- if messages[0].role == 'system' %}
{{- messages[0].content + '\n\n' }}
{%- endif %}
{{- "# Tools\n\nYou may call one or more functions to assist with the user query.\n\nYou are provided with function signatures within <tools></tools> XML tags:\n<tools>" }}
{%- for tool in tools %}
{{- "\n" }}
{{- tool | tojson }}
{%- endfor %}
{{- "\n</tools>\n\nFor each function call, return a json object with function name and arguments within <tool_call></tool_call> XML tags:\n<tool_call>\n{\"name\": <function-name>, \"arguments\": <args-json-object>}\n</tool_call><|im_end|>\n" }}
{%- else %}
{%- if messages[0].role == 'system' %}
{{- '<|im_start|>system\n' + messages[0].content + '<|im_end|>\n' }}
{%- endif %}
{%- endif %}
{%- set ns = namespace(multi_step_tool=true, last_query_index=messages|length - 1) %}
{%- for message in messages[::-1] %}
{%- set index = (messages|length - 1) - loop.index0 %}
{%- if ns.multi_step_tool and message.role == "user" and message.content is string and not(message.content.startswith('<tool_response>') and message.content.endswith('</tool_response>')) %}
{%- set ns.multi_step_tool = false %}
{%- set ns.last_query_index = index %}
{%- endif %}
{%- endfor %}
{%- for message in messages %}
{%- if message.content is string %}
{%- set content = message.content %}
{%- else %}
{%- set content = '' %}
{%- endif %}
{%- if (message.role == "user") or (message.role == "system" and not loop.first) %}
{{- '<|im_start|>' + message.role + '\n' + content + '<|im_end|>' + '\n' }}
{%- elif message.role == "assistant" %}
{%- set reasoning_content = '' %}
{%- if message.reasoning_content is string %}
{%- set reasoning_content = message.reasoning_content %}
{%- else %}
{%- if '</think>' in content %}
{%- set reasoning_content = content.split('</think>')[0].rstrip('\n').split('<think>')[-1].lstrip('\n') %}
{%- set content = content.split('</think>')[-1].lstrip('\n') %}
{%- endif %}
{%- endif %}
{%- if loop.index0 > ns.last_query_index %}
{%- if loop.last or (not loop.last and reasoning_content) %}
{{- '<|im_start|>' + message.role + '\n<think>\n' + reasoning_content.strip('\n') + '\n</think>\n\n' + content.lstrip('\n') }}
{%- else %}
{{- '<|im_start|>' + message.role + '\n' + content }}
{%- endif %}
{%- else %}
{{- '<|im_start|>' + message.role + '\n' + content }}
{%- endif %}
{%- if message.tool_calls %}
{%- for tool_call in message.tool_calls %}
{%- if (loop.first and content) or (not loop.first) %}
{{- '\n' }}
{%- endif %}
{%- if tool_call.function %}
{%- set tool_call = tool_call.function %}
{%- endif %}
{{- '<tool_call>\n{"name": "' }}
{{- tool_call.name }}
{{- '", "arguments": ' }}
{%- if tool_call.arguments is string %}
{{- tool_call.arguments }}
{%- else %}
{{- tool_call.arguments | tojson }}
{%- endif %}
{{- '}\n</tool_call>' }}
{%- endfor %}
{%- endif %}
{{- '<|im_end|>\n' }}
{%- elif message.role == "tool" %}
{%- if loop.first or (messages[loop.index0 - 1].role != "tool") %}
{{- '<|im_start|>user' }}
{%- endif %}
{{- '\n<tool_response>\n' }}
{{- content }}
{{- '\n</tool_response>' }}
{%- if loop.last or (messages[loop.index0 + 1].role != "tool") %}
{{- '<|im_end|>\n' }}
{%- endif %}
{%- endif %}
{%- endfor %}
{%- if add_generation_prompt %}
{{- '<|im_start|>assistant\n<think>\n' }}
{%- endif %}
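For the plain path (no tools, no prior reasoning turns), the template above reduces to ChatML turns with a `<think>` block opened for the assistant when a generation prompt is requested. A minimal Python sketch of that rendering (the function name is illustrative, not part of any API):

```python
# Sketch of what the chat template renders for the simple no-tools case:
# each message becomes a <|im_start|>...<|im_end|> turn, and the
# generation prompt opens an assistant turn with a <think> block.
def render_chatml(messages, add_generation_prompt=True):
    out = []
    for m in messages:
        out.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    if add_generation_prompt:
        out.append("<|im_start|>assistant\n<think>\n")
    return "".join(out)

prompt = render_chatml([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hi"},
])
print(prompt)
```

In practice you would use `tokenizer.apply_chat_template(...)` from transformers, which runs the real Jinja template (including the tool-call and reasoning-stripping branches this sketch omits).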

38
config.json Normal file

@@ -0,0 +1,38 @@
{
"architectures": [
"Qwen3MoeForCausalLM"
],
"attention_bias": false,
"attention_dropout": 0.0,
"bos_token_id": 151643,
"decoder_sparse_step": 1,
"dtype": "bfloat16",
"eos_token_id": 151645,
"head_dim": 128,
"hidden_act": "silu",
"hidden_size": 2048,
"initializer_range": 0.02,
"intermediate_size": 6144,
"max_position_embeddings": 262144,
"max_window_layers": 48,
"mlp_only_layers": [],
"model_type": "qwen3_moe",
"moe_intermediate_size": 768,
"norm_topk_prob": true,
"num_attention_heads": 32,
"num_experts": 128,
"num_experts_per_tok": 8,
"num_hidden_layers": 48,
"num_key_value_heads": 4,
"output_router_logits": false,
"rms_norm_eps": 1e-06,
"rope_scaling": null,
"rope_theta": 10000000,
"router_aux_loss_coef": 0.001,
"sliding_window": null,
"tie_word_embeddings": false,
"transformers_version": "4.57.6",
"use_cache": true,
"use_sliding_window": false,
"vocab_size": 151936
}
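A quick sanity check on the "30B-A3B" naming from the numbers in this config. This is a rough estimate that ignores norms and small biases, but it recovers both the total and the active parameter counts:

```python
# Back-of-the-envelope parameter count from the config above:
# ~30B total, with only ~3B active per token (8 of 128 experts routed).
hidden, layers, vocab = 2048, 48, 151936
heads, kv_heads, head_dim = 32, 4, 128
experts, experts_per_tok, moe_inter = 128, 8, 768

# attention: q/k/v projections plus the output projection
attn = hidden * head_dim * (heads + 2 * kv_heads) + heads * head_dim * hidden
expert = 3 * hidden * moe_inter          # gate, up, down projections
router = hidden * experts                # routing logits
per_layer_total = attn + router + experts * expert
per_layer_active = attn + router + experts_per_tok * expert
embed = 2 * vocab * hidden               # untied embeddings + lm_head

total = layers * per_layer_total + embed
active = layers * per_layer_active + embed
print(f"total ~ {total/1e9:.1f}B, active ~ {active/1e9:.1f}B")
```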

13
generation_config.json Normal file

@@ -0,0 +1,13 @@
{
"bos_token_id": 151643,
"do_sample": true,
"eos_token_id": [
151645,
151643
],
"pad_token_id": 151643,
"temperature": 0.6,
"top_k": 20,
"top_p": 0.95,
"transformers_version": "4.57.6"
}
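The defaults above (temperature 0.6, top_k 20, top_p 0.95) can be illustrated with a toy sampler over a small logits vector. This is a textbook sketch of the technique, not the transformers implementation:

```python
# Toy implementation of temperature + top-k + top-p (nucleus) sampling,
# matching the order of the defaults in generation_config.json.
import math, random

def sample(logits, temperature=0.6, top_k=20, top_p=0.95, rng=random):
    scaled = [l / temperature for l in logits]          # temperature scaling
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]            # stable softmax
    z = sum(exps)
    probs = [(i, e / z) for i, e in enumerate(exps)]
    probs.sort(key=lambda p: p[1], reverse=True)
    probs = probs[:top_k]                               # keep k most probable
    kept, mass = [], 0.0
    for i, p in probs:                                  # smallest prefix with
        kept.append((i, p))                             # cumulative mass >= top_p
        mass += p
        if mass >= top_p:
            break
    z = sum(p for _, p in kept)                         # renormalize and sample
    r, acc = rng.random() * z, 0.0
    for i, p in kept:
        acc += p
        if acc >= r:
            return i
    return kept[-1][0]

print(sample([2.0, 1.0, 0.5, -1.0], top_k=3))
```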

151388
merges.txt Normal file

File diff suppressed because it is too large


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ab8c8c3b2d71f29fba1ba7c8006a40b23433186d3aefd99a13f897009fb0b425
size 4997184968


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:155b0c5f51b0b09717e202ea11567a6788be2d302e168da3989cd19fd760c6e1
size 4997741608


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:97522bc74f3bcff37820af9e83c1281bd0caef0fa1cc0578584ca67a14ac4275
size 4997742208


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:9a71332ed3c63ebd09e1d77142b434588421535628a739e29fa4f3d1c46b4c39
size 4997743184


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:52a984263cc332052705b9d1e1d7f8ead53241c35982567d2e36a86cdc713b92
size 4997743184


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:bf67670853bdfe877acc3ad766d0838a1c588582a550261cc0f56f727a76f720
size 4997743184


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:dc7885d602f4cc82c5a8d7714a7a0529bacfbeb256a06b6f61aa93daed555a16
size 4997743184


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:97d260e984dd3839d48a25385febc91ffa932758e92ea7aef436b955090a5be5
size 4997743184


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f85cabde91ba5c7fa939ca4a9e3426a1a8ed9e0cc5e52b979674c14abe9a8b69
size 4997743184


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d3237e9336bad30a58430ac57f7001e675d8bdaccdf93b4246d9e8413ea72964
size 4997743184


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0d7d3ff41b32fcdfda8dcfe57fdf5480d19cc2cdc30422775e939451374516ca
size 4997743184


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:6632a31f5d89fa2591405fe90e7307db4abfca8ad8ee9de645ae8457d3ad460c
size 4997743184


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:7cd15f5bc0a0aaabbfe78f66cfcbf17fccf4d9f10fae7c6e3e562b493a5a0d65
size 1094220288
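Summing the thirteen LFS pointer sizes above (presumably the weight shards, though their filenames are not shown in this view) gives the on-disk footprint, which is consistent with a ~30.5B-parameter model stored in bf16 at 2 bytes per parameter (safetensors headers add a small, negligible overhead):

```python
# Total weight size from the thirteen git-LFS pointer files listed above.
shard_bytes = [
    4997184968, 4997741608, 4997742208,
    4997743184, 4997743184, 4997743184,
    4997743184, 4997743184, 4997743184,
    4997743184, 4997743184, 4997743184,
    1094220288,
]
total = sum(shard_bytes)
params_bf16 = total / 2  # bf16 = 2 bytes per parameter (rough estimate)
print(f"{total/1e9:.2f} GB on disk, ~{params_bf16/1e9:.1f}B parameters")
```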

18875
model.safetensors.index.json Normal file

File diff suppressed because it is too large

3
qwen-vl.gif Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:01e68726a0af847a5a15dc46763df374786e5f92cf82b60c9fc8cafad858046e
size 2075503

31
special_tokens_map.json Normal file

@@ -0,0 +1,31 @@
{
"additional_special_tokens": [
"<|im_start|>",
"<|im_end|>",
"<|object_ref_start|>",
"<|object_ref_end|>",
"<|box_start|>",
"<|box_end|>",
"<|quad_start|>",
"<|quad_end|>",
"<|vision_start|>",
"<|vision_end|>",
"<|vision_pad|>",
"<|image_pad|>",
"<|video_pad|>"
],
"eos_token": {
"content": "<|im_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false
},
"pad_token": {
"content": "<|endoftext|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false
}
}

3
tokenizer.json Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:aeb13307a71acd8fe81861d94ad54ab689df773318809eed3cbe794b4492dae4
size 11422654

239
tokenizer_config.json Normal file

@@ -0,0 +1,239 @@
{
"add_bos_token": false,
"add_prefix_space": false,
"added_tokens_decoder": {
"151643": {
"content": "<|endoftext|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151644": {
"content": "<|im_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151645": {
"content": "<|im_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151646": {
"content": "<|object_ref_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151647": {
"content": "<|object_ref_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151648": {
"content": "<|box_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151649": {
"content": "<|box_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151650": {
"content": "<|quad_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151651": {
"content": "<|quad_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151652": {
"content": "<|vision_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151653": {
"content": "<|vision_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151654": {
"content": "<|vision_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151655": {
"content": "<|image_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151656": {
"content": "<|video_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151657": {
"content": "<tool_call>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151658": {
"content": "</tool_call>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151659": {
"content": "<|fim_prefix|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151660": {
"content": "<|fim_middle|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151661": {
"content": "<|fim_suffix|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151662": {
"content": "<|fim_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151663": {
"content": "<|repo_name|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151664": {
"content": "<|file_sep|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151665": {
"content": "<tool_response>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151666": {
"content": "</tool_response>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151667": {
"content": "<think>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151668": {
"content": "</think>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
}
},
"additional_special_tokens": [
"<|im_start|>",
"<|im_end|>",
"<|object_ref_start|>",
"<|object_ref_end|>",
"<|box_start|>",
"<|box_end|>",
"<|quad_start|>",
"<|quad_end|>",
"<|vision_start|>",
"<|vision_end|>",
"<|vision_pad|>",
"<|image_pad|>",
"<|video_pad|>"
],
"bos_token": null,
"clean_up_tokenization_spaces": false,
"eos_token": "<|im_end|>",
"errors": "replace",
"extra_special_tokens": {},
"model_max_length": 1010000,
"pad_token": "<|endoftext|>",
"split_special_tokens": false,
"tokenizer_class": "Qwen2Tokenizer",
"unk_token": null
}

3
unlimited-power.gif Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:c280fb42e0498902b7e3ed3afd0f5970908532f2f14d411e487ca279743bf643
size 5442198

1
vocab.json Normal file

File diff suppressed because one or more lines are too long