`Dolphin-Mistral-24B-Venice-…/scores/Dolphin-Mistral-24B-Venice-Edition-Q8_0.md` · 2025-07-01 08:19:46 +01:00 · 98 KiB

Dolphin-Mistral-24B-Venice-Edition-pruned-Q8_0.gguf - GGUF Internal File Dump

  • Endianness: little-endian

Key Value Metadata Store

There are 46 key-value entries in this file: the 3 GGUF header fields plus the 43 metadata pairs reported by `GGUF.kv_count`.

| POS | TYPE | Count | Key | Value |
|--:|--|--:|--|--|
| 1 | UINT32 | 1 | GGUF.version | 3 |
| 2 | UINT64 | 1 | GGUF.tensor_count | 345 |
| 3 | UINT64 | 1 | GGUF.kv_count | 43 |
| 4 | STRING | 1 | general.architecture | llama |
| 5 | STRING | 1 | general.type | model |
| 6 | STRING | 1 | general.name | Dolphin Mistral 24B Venice Edition |
| 7 | STRING | 1 | general.finetune | Venice-Edition |
| 8 | STRING | 1 | general.basename | Dolphin-Mistral |
| 9 | STRING | 1 | general.size_label | 24B |
| 10 | STRING | 1 | general.license | apache-2.0 |
| 11 | UINT32 | 1 | general.base_model.count | 1 |
| 12 | STRING | 1 | general.base_model.0.name | Mistral Small 24B Instruct 2501 |
| 13 | STRING | 1 | general.base_model.0.version | 2501 |
| 14 | STRING | 1 | general.base_model.0.organization | Mistralai |
| 15 | STRING | 1 | general.base_model.0.repo_url | https://huggingface.co/mistral...istral-Small-24B-Instruct-2501 |
| 16 | UINT32 | 1 | llama.context_length | 32768 |
| 17 | UINT32 | 1 | llama.embedding_length | 5120 |
| 18 | UINT32 | 1 | llama.feed_forward_length | 32768 |
| 19 | UINT32 | 1 | llama.attention.head_count | 32 |
| 20 | UINT32 | 1 | llama.attention.head_count_kv | 8 |
| 21 | FLOAT32 | 1 | llama.rope.freq_base | 100000000.0 |
| 22 | FLOAT32 | 1 | llama.attention.layer_norm_rms_epsilon | 1e-05 |
| 23 | UINT32 | 1 | llama.attention.key_length | 128 |
| 24 | UINT32 | 1 | llama.attention.value_length | 128 |
| 25 | UINT32 | 1 | llama.vocab_size | 131072 |
| 26 | UINT32 | 1 | llama.rope.dimension_count | 128 |
| 27 | STRING | 1 | tokenizer.ggml.model | gpt2 |
| 28 | STRING | 1 | tokenizer.ggml.pre | tekken |
| 29 | [STRING] | 131072 | tokenizer.ggml.tokens | [ <unk>, <s>, </s>, [INST], [/INST], ... ] |
| 30 | [INT32] | 131072 | tokenizer.ggml.token_type | [ 3, 3, 3, 3, 3, 3, 3, ... ] |
| 31 | [STRING] | 269443 | tokenizer.ggml.merges | [ Ġ Ġ, Ġ t, e r, i n, Ġ ĠĠĠ, ... ] |
| 32 | UINT32 | 1 | tokenizer.ggml.bos_token_id | 1 |
| 33 | UINT32 | 1 | tokenizer.ggml.eos_token_id | 2 |
| 34 | UINT32 | 1 | tokenizer.ggml.unknown_token_id | 0 |
| 35 | UINT32 | 1 | tokenizer.ggml.padding_token_id | 11 |
| 36 | BOOL | 1 | tokenizer.ggml.add_bos_token | True |
| 37 | BOOL | 1 | tokenizer.ggml.add_eos_token | False |
| 38 | STRING | 1 | tokenizer.chat_template | {%- set today = strftime_now("... {%- endif %}{%- endfor %} |
| 39 | BOOL | 1 | tokenizer.ggml.add_space_prefix | False |
| 40 | UINT32 | 1 | general.quantization_version | 2 |
| 41 | UINT32 | 1 | general.file_type | 7 |
| 42 | STRING | 1 | quantize.imatrix.file | ./imatrix/imatrix-Dolphin-Mist...l-24B-Venice-Edition-small.dat |
| 43 | STRING | 1 | quantize.imatrix.dataset | ../../datasets/imatrix/combined_eur_small.txt |
| 44 | UINT32 | 1 | quantize.imatrix.entries_count | 281 |
| 45 | UINT32 | 1 | quantize.imatrix.chunks_count | 3192 |
| 46 | UINT32 | 1 | llama.block_count | 38 |
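The `GGUF.version`, `GGUF.tensor_count`, and `GGUF.kv_count` rows above come from the fixed 24-byte GGUF file header, which (per the GGUF specification) holds the `GGUF` magic, a `uint32` version, and two `uint64` counts, all little-endian. A minimal stdlib sketch of parsing it, exercised here on a synthetic header packed with this file's reported values rather than on the 47 GB file itself:

```python
import struct

def read_gguf_header(buf: bytes) -> dict:
    """Parse the fixed 24-byte GGUF header (little-endian, per the GGUF spec)."""
    magic, version, tensor_count, kv_count = struct.unpack("<4sIQQ", buf[:24])
    if magic != b"GGUF":
        raise ValueError("not a GGUF file")
    return {"version": version, "tensor_count": tensor_count, "kv_count": kv_count}

# Synthetic header mirroring the values reported above:
# version 3, 345 tensors, 43 metadata key-value pairs.
hdr = struct.pack("<4sIQQ", b"GGUF", 3, 345, 43)
print(read_gguf_header(hdr))  # {'version': 3, 'tensor_count': 345, 'kv_count': 43}
```

Against the real file you would pass the first 24 bytes of `Dolphin-Mistral-24B-Venice-Edition-pruned-Q8_0.gguf` instead of the packed test buffer.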

Tensors Overview ~22B Elements

Total number of elements in all tensors: 22,460,892,160 (~22.46B)

Tensor Data Offset

This table lists each tensor's data offset and size, in bytes, relative to the start of the file.

| T_ID | Tensor Layer Name | Data Offset (B) | Data Size (B) |
|--:|--|--:|--:|
| 0 | output.weight | 0x784500 | 0x2a800000 |
| 1 | output_norm.weight | 0x2af84500 | 0x5000 |
| 2 | token_embd.weight | 0x2af89500 | 0x11300000 |
| 3 | blk.0.attn_k.weight | 0x3c289500 | 0x41a000 |
| 4 | blk.0.attn_norm.weight | 0x3c6a3500 | 0x5000 |
| 5 | blk.0.attn_output.weight | 0x3c6a8500 | 0x1540000 |
| 6 | blk.0.attn_q.weight | 0x3dbe8500 | 0x1068000 |
| 7 | blk.0.attn_v.weight | 0x3ec50500 | 0xa00000 |
| 8 | blk.0.ffn_down.weight | 0x3f650500 | 0xaa00000 |
| 9 | blk.0.ffn_gate.weight | 0x4a050500 | 0x8340000 |
| 10 | blk.0.ffn_norm.weight | 0x52390500 | 0x5000 |
| 11 | blk.0.ffn_up.weight | 0x52395500 | 0x8340000 |
| 12 | blk.1.attn_k.weight | 0x5a6d5500 | 0x41a000 |
| 13 | blk.1.attn_norm.weight | 0x5aaef500 | 0x5000 |
| 14 | blk.1.attn_output.weight | 0x5aaf4500 | 0x1540000 |
| 15 | blk.1.attn_q.weight | 0x5c034500 | 0x1068000 |
| 16 | blk.1.attn_v.weight | 0x5d09c500 | 0xa00000 |
| 17 | blk.1.ffn_down.weight | 0x5da9c500 | 0xaa00000 |
| 18 | blk.1.ffn_gate.weight | 0x6849c500 | 0x8340000 |
| 19 | blk.1.ffn_norm.weight | 0x707dc500 | 0x5000 |
| 20 | blk.1.ffn_up.weight | 0x707e1500 | 0x8340000 |
| 21 | blk.2.attn_k.weight | 0x78b21500 | 0x41a000 |
| 22 | blk.2.attn_norm.weight | 0x78f3b500 | 0x5000 |
| 23 | blk.2.attn_output.weight | 0x78f40500 | 0x1540000 |
| 24 | blk.2.attn_q.weight | 0x7a480500 | 0x1068000 |
| 25 | blk.2.attn_v.weight | 0x7b4e8500 | 0xa00000 |
| 26 | blk.2.ffn_down.weight | 0x7bee8500 | 0xaa00000 |
| 27 | blk.2.ffn_gate.weight | 0x868e8500 | 0x8340000 |
| 28 | blk.2.ffn_norm.weight | 0x8ec28500 | 0x5000 |
| 29 | blk.2.ffn_up.weight | 0x8ec2d500 | 0x8340000 |
| 30 | blk.3.attn_k.weight | 0x96f6d500 | 0x41a000 |
| 31 | blk.3.attn_norm.weight | 0x97387500 | 0x5000 |
| 32 | blk.3.attn_output.weight | 0x9738c500 | 0x1540000 |
| 33 | blk.3.attn_q.weight | 0x988cc500 | 0x1068000 |
| 34 | blk.3.attn_v.weight | 0x99934500 | 0xa00000 |
| 35 | blk.3.ffn_down.weight | 0x9a334500 | 0xaa00000 |
| 36 | blk.3.ffn_gate.weight | 0xa4d34500 | 0x8340000 |
| 37 | blk.3.ffn_norm.weight | 0xad074500 | 0x5000 |
| 38 | blk.3.ffn_up.weight | 0xad079500 | 0x8340000 |
| 39 | blk.4.attn_k.weight | 0xb53b9500 | 0x41a000 |
| 40 | blk.4.attn_norm.weight | 0xb57d3500 | 0x5000 |
| 41 | blk.4.attn_output.weight | 0xb57d8500 | 0x1540000 |
| 42 | blk.4.attn_q.weight | 0xb6d18500 | 0x1068000 |
| 43 | blk.4.attn_v.weight | 0xb7d80500 | 0xa00000 |
| 44 | blk.4.ffn_down.weight | 0xb8780500 | 0xaa00000 |
| 45 | blk.4.ffn_gate.weight | 0xc3180500 | 0x8340000 |
| 46 | blk.4.ffn_norm.weight | 0xcb4c0500 | 0x5000 |
| 47 | blk.4.ffn_up.weight | 0xcb4c5500 | 0x8340000 |
| 48 | blk.5.attn_k.weight | 0xd3805500 | 0x41a000 |
| 49 | blk.5.attn_norm.weight | 0xd3c1f500 | 0x5000 |
| 50 | blk.5.attn_output.weight | 0xd3c24500 | 0x1540000 |
| 51 | blk.5.attn_q.weight | 0xd5164500 | 0x1068000 |
| 52 | blk.5.attn_v.weight | 0xd61cc500 | 0xa00000 |
| 53 | blk.5.ffn_down.weight | 0xd6bcc500 | 0xaa00000 |
| 54 | blk.5.ffn_gate.weight | 0xe15cc500 | 0x8340000 |
| 55 | blk.5.ffn_norm.weight | 0xe990c500 | 0x5000 |
| 56 | blk.5.ffn_up.weight | 0xe9911500 | 0x8340000 |
| 57 | blk.6.attn_k.weight | 0xf1c51500 | 0x41a000 |
| 58 | blk.6.attn_norm.weight | 0xf206b500 | 0x5000 |
| 59 | blk.6.attn_output.weight | 0xf2070500 | 0x1540000 |
| 60 | blk.6.attn_q.weight | 0xf35b0500 | 0x1068000 |
| 61 | blk.6.attn_v.weight | 0xf4618500 | 0xa00000 |
| 62 | blk.6.ffn_down.weight | 0xf5018500 | 0xaa00000 |
| 63 | blk.6.ffn_gate.weight | 0xffa18500 | 0x8340000 |
| 64 | blk.6.ffn_norm.weight | 0x107d58500 | 0x5000 |
| 65 | blk.6.ffn_up.weight | 0x107d5d500 | 0x8340000 |
| 66 | blk.7.attn_k.weight | 0x11009d500 | 0x41a000 |
| 67 | blk.7.attn_norm.weight | 0x1104b7500 | 0x5000 |
| 68 | blk.7.attn_output.weight | 0x1104bc500 | 0x1540000 |
| 69 | blk.7.attn_q.weight | 0x1119fc500 | 0x1068000 |
| 70 | blk.7.attn_v.weight | 0x112a64500 | 0xa00000 |
| 71 | blk.7.ffn_down.weight | 0x113464500 | 0xaa00000 |
| 72 | blk.7.ffn_gate.weight | 0x11de64500 | 0x8340000 |
| 73 | blk.7.ffn_norm.weight | 0x1261a4500 | 0x5000 |
| 74 | blk.7.ffn_up.weight | 0x1261a9500 | 0x8340000 |
| 75 | blk.8.attn_k.weight | 0x12e4e9500 | 0x41a000 |
| 76 | blk.8.attn_norm.weight | 0x12e903500 | 0x5000 |
| 77 | blk.8.attn_output.weight | 0x12e908500 | 0x1540000 |
| 78 | blk.8.attn_q.weight | 0x12fe48500 | 0x1068000 |
| 79 | blk.8.attn_v.weight | 0x130eb0500 | 0xa00000 |
| 80 | blk.8.ffn_down.weight | 0x1318b0500 | 0xaa00000 |
| 81 | blk.8.ffn_gate.weight | 0x13c2b0500 | 0x8340000 |
| 82 | blk.8.ffn_norm.weight | 0x1445f0500 | 0x5000 |
| 83 | blk.8.ffn_up.weight | 0x1445f5500 | 0x8340000 |
| 84 | blk.9.attn_k.weight | 0x14c935500 | 0x41a000 |
| 85 | blk.9.attn_norm.weight | 0x14cd4f500 | 0x5000 |
| 86 | blk.9.attn_output.weight | 0x14cd54500 | 0x1540000 |
| 87 | blk.9.attn_q.weight | 0x14e294500 | 0x1068000 |
| 88 | blk.9.attn_v.weight | 0x14f2fc500 | 0xa00000 |
| 89 | blk.9.ffn_down.weight | 0x14fcfc500 | 0xaa00000 |
| 90 | blk.9.ffn_gate.weight | 0x15a6fc500 | 0x8340000 |
| 91 | blk.9.ffn_norm.weight | 0x162a3c500 | 0x5000 |
| 92 | blk.9.ffn_up.weight | 0x162a41500 | 0x8340000 |
| 93 | blk.10.attn_k.weight | 0x16ad81500 | 0x41a000 |
| 94 | blk.10.attn_norm.weight | 0x16b19b500 | 0x5000 |
| 95 | blk.10.attn_output.weight | 0x16b1a0500 | 0x1540000 |
| 96 | blk.10.attn_q.weight | 0x16c6e0500 | 0x1068000 |
| 97 | blk.10.attn_v.weight | 0x16d748500 | 0xa00000 |
| 98 | blk.10.ffn_down.weight | 0x16e148500 | 0xaa00000 |
| 99 | blk.10.ffn_gate.weight | 0x178b48500 | 0x8340000 |
| 100 | blk.10.ffn_norm.weight | 0x180e88500 | 0x5000 |
| 101 | blk.10.ffn_up.weight | 0x180e8d500 | 0x8340000 |
| 102 | blk.11.attn_k.weight | 0x1891cd500 | 0x41a000 |
| 103 | blk.11.attn_norm.weight | 0x1895e7500 | 0x5000 |
| 104 | blk.11.attn_output.weight | 0x1895ec500 | 0x1540000 |
| 105 | blk.11.attn_q.weight | 0x18ab2c500 | 0x1068000 |
| 106 | blk.11.attn_v.weight | 0x18bb94500 | 0xa00000 |
| 107 | blk.11.ffn_down.weight | 0x18c594500 | 0xaa00000 |
| 108 | blk.11.ffn_gate.weight | 0x196f94500 | 0x8340000 |
| 109 | blk.11.ffn_norm.weight | 0x19f2d4500 | 0x5000 |
| 110 | blk.11.ffn_up.weight | 0x19f2d9500 | 0x8340000 |
| 111 | blk.12.attn_k.weight | 0x1a7619500 | 0x41a000 |
| 112 | blk.12.attn_norm.weight | 0x1a7a33500 | 0x5000 |
| 113 | blk.12.attn_output.weight | 0x1a7a38500 | 0x1540000 |
| 114 | blk.12.attn_q.weight | 0x1a8f78500 | 0x1068000 |
| 115 | blk.12.attn_v.weight | 0x1a9fe0500 | 0xa00000 |
| 116 | blk.12.ffn_down.weight | 0x1aa9e0500 | 0xaa00000 |
| 117 | blk.12.ffn_gate.weight | 0x1b53e0500 | 0x8340000 |
| 118 | blk.12.ffn_norm.weight | 0x1bd720500 | 0x5000 |
| 119 | blk.12.ffn_up.weight | 0x1bd725500 | 0x8340000 |
| 120 | blk.13.attn_k.weight | 0x1c5a65500 | 0x41a000 |
| 121 | blk.13.attn_norm.weight | 0x1c5e7f500 | 0x5000 |
| 122 | blk.13.attn_output.weight | 0x1c5e84500 | 0x1540000 |
| 123 | blk.13.attn_q.weight | 0x1c73c4500 | 0x1068000 |
| 124 | blk.13.attn_v.weight | 0x1c842c500 | 0xa00000 |
| 125 | blk.13.ffn_down.weight | 0x1c8e2c500 | 0xaa00000 |
| 126 | blk.13.ffn_gate.weight | 0x1d382c500 | 0x8340000 |
| 127 | blk.13.ffn_norm.weight | 0x1dbb6c500 | 0x5000 |
| 128 | blk.13.ffn_up.weight | 0x1dbb71500 | 0x8340000 |
| 129 | blk.14.attn_k.weight | 0x1e3eb1500 | 0x41a000 |
| 130 | blk.14.attn_norm.weight | 0x1e42cb500 | 0x5000 |
| 131 | blk.14.attn_output.weight | 0x1e42d0500 | 0x1540000 |
| 132 | blk.14.attn_q.weight | 0x1e5810500 | 0x1068000 |
| 133 | blk.14.attn_v.weight | 0x1e6878500 | 0xa00000 |
| 134 | blk.14.ffn_down.weight | 0x1e7278500 | 0xaa00000 |
| 135 | blk.14.ffn_gate.weight | 0x1f1c78500 | 0x8340000 |
| 136 | blk.14.ffn_norm.weight | 0x1f9fb8500 | 0x5000 |
| 137 | blk.14.ffn_up.weight | 0x1f9fbd500 | 0x8340000 |
| 138 | blk.15.attn_k.weight | 0x2022fd500 | 0x41a000 |
| 139 | blk.15.attn_norm.weight | 0x202717500 | 0x5000 |
| 140 | blk.15.attn_output.weight | 0x20271c500 | 0x1540000 |
| 141 | blk.15.attn_q.weight | 0x203c5c500 | 0x1068000 |
| 142 | blk.15.attn_v.weight | 0x204cc4500 | 0xa00000 |
| 143 | blk.15.ffn_down.weight | 0x2056c4500 | 0xaa00000 |
| 144 | blk.15.ffn_gate.weight | 0x2100c4500 | 0x8340000 |
| 145 | blk.15.ffn_norm.weight | 0x218404500 | 0x5000 |
| 146 | blk.15.ffn_up.weight | 0x218409500 | 0x8340000 |
| 147 | blk.16.attn_k.weight | 0x220749500 | 0x41a000 |
| 148 | blk.16.attn_norm.weight | 0x220b63500 | 0x5000 |
| 149 | blk.16.attn_output.weight | 0x220b68500 | 0x1540000 |
| 150 | blk.16.attn_q.weight | 0x2220a8500 | 0x1068000 |
| 151 | blk.16.attn_v.weight | 0x223110500 | 0xa00000 |
| 152 | blk.16.ffn_down.weight | 0x223b10500 | 0xaa00000 |
| 153 | blk.16.ffn_gate.weight | 0x22e510500 | 0x8340000 |
| 154 | blk.16.ffn_norm.weight | 0x236850500 | 0x5000 |
| 155 | blk.16.ffn_up.weight | 0x236855500 | 0x8340000 |
| 156 | blk.17.attn_k.weight | 0x23eb95500 | 0x550000 |
| 157 | blk.17.attn_norm.weight | 0x23f0e5500 | 0x5000 |
| 158 | blk.17.attn_output.weight | 0x23f0ea500 | 0x1540000 |
| 159 | blk.17.attn_q.weight | 0x24062a500 | 0x1540000 |
| 160 | blk.17.attn_v.weight | 0x241b6a500 | 0xa00000 |
| 161 | blk.17.ffn_down.weight | 0x24256a500 | 0xaa00000 |
| 162 | blk.17.ffn_gate.weight | 0x24cf6a500 | 0x8340000 |
| 163 | blk.17.ffn_norm.weight | 0x2552aa500 | 0x5000 |
| 164 | blk.17.ffn_up.weight | 0x2552af500 | 0x8340000 |
| 165 | blk.18.attn_k.weight | 0x25d5ef500 | 0x550000 |
| 166 | blk.18.attn_norm.weight | 0x25db3f500 | 0x5000 |
| 167 | blk.18.attn_output.weight | 0x25db44500 | 0x1540000 |
| 168 | blk.18.attn_q.weight | 0x25f084500 | 0x1540000 |
| 169 | blk.18.attn_v.weight | 0x2605c4500 | 0xa00000 |
| 170 | blk.18.ffn_down.weight | 0x260fc4500 | 0xaa00000 |
| 171 | blk.18.ffn_gate.weight | 0x26b9c4500 | 0x8340000 |
| 172 | blk.18.ffn_norm.weight | 0x273d04500 | 0x5000 |
| 173 | blk.18.ffn_up.weight | 0x273d09500 | 0x8340000 |
| 174 | blk.19.attn_k.weight | 0x27c049500 | 0x41a000 |
| 175 | blk.19.attn_norm.weight | 0x27c463500 | 0x5000 |
| 176 | blk.19.attn_output.weight | 0x27c468500 | 0x1540000 |
| 177 | blk.19.attn_q.weight | 0x27d9a8500 | 0x1068000 |
| 178 | blk.19.attn_v.weight | 0x27ea10500 | 0xa00000 |
| 179 | blk.19.ffn_down.weight | 0x27f410500 | 0xaa00000 |
| 180 | blk.19.ffn_gate.weight | 0x289e10500 | 0x8340000 |
| 181 | blk.19.ffn_norm.weight | 0x292150500 | 0x5000 |
| 182 | blk.19.ffn_up.weight | 0x292155500 | 0x8340000 |
| 183 | blk.20.attn_k.weight | 0x29a495500 | 0x550000 |
| 184 | blk.20.attn_norm.weight | 0x29a9e5500 | 0x5000 |
| 185 | blk.20.attn_output.weight | 0x29a9ea500 | 0x1540000 |
| 186 | blk.20.attn_q.weight | 0x29bf2a500 | 0x1540000 |
| 187 | blk.20.attn_v.weight | 0x29d46a500 | 0xa00000 |
| 188 | blk.20.ffn_down.weight | 0x29de6a500 | 0xaa00000 |
| 189 | blk.20.ffn_gate.weight | 0x2a886a500 | 0xaa00000 |
| 190 | blk.20.ffn_norm.weight | 0x2b326a500 | 0x5000 |
| 191 | blk.20.ffn_up.weight | 0x2b326f500 | 0xaa00000 |
| 192 | blk.21.attn_k.weight | 0x2bdc6f500 | 0x41a000 |
| 193 | blk.21.attn_norm.weight | 0x2be089500 | 0x5000 |
| 194 | blk.21.attn_output.weight | 0x2be08e500 | 0x1540000 |
| 195 | blk.21.attn_q.weight | 0x2bf5ce500 | 0x1068000 |
| 196 | blk.21.attn_v.weight | 0x2c0636500 | 0xa00000 |
| 197 | blk.21.ffn_down.weight | 0x2c1036500 | 0xaa00000 |
| 198 | blk.21.ffn_gate.weight | 0x2cba36500 | 0xaa00000 |
| 199 | blk.21.ffn_norm.weight | 0x2d6436500 | 0x5000 |
| 200 | blk.21.ffn_up.weight | 0x2d643b500 | 0xaa00000 |
| 201 | blk.22.attn_k.weight | 0x2e0e3b500 | 0x550000 |
| 202 | blk.22.attn_norm.weight | 0x2e138b500 | 0x5000 |
| 203 | blk.22.attn_output.weight | 0x2e1390500 | 0x1540000 |
| 204 | blk.22.attn_q.weight | 0x2e28d0500 | 0x1540000 |
| 205 | blk.22.attn_v.weight | 0x2e3e10500 | 0xa00000 |
| 206 | blk.22.ffn_down.weight | 0x2e4810500 | 0xaa00000 |
| 207 | blk.22.ffn_gate.weight | 0x2ef210500 | 0xaa00000 |
| 208 | blk.22.ffn_norm.weight | 0x2f9c10500 | 0x5000 |
| 209 | blk.22.ffn_up.weight | 0x2f9c15500 | 0xaa00000 |
| 210 | blk.23.attn_k.weight | 0x304615500 | 0x550000 |
| 211 | blk.23.attn_norm.weight | 0x304b65500 | 0x5000 |
| 212 | blk.23.attn_output.weight | 0x304b6a500 | 0x1540000 |
| 213 | blk.23.attn_q.weight | 0x3060aa500 | 0x1540000 |
| 214 | blk.23.attn_v.weight | 0x3075ea500 | 0xa00000 |
| 215 | blk.23.ffn_down.weight | 0x307fea500 | 0xaa00000 |
| 216 | blk.23.ffn_gate.weight | 0x3129ea500 | 0xaa00000 |
| 217 | blk.23.ffn_norm.weight | 0x31d3ea500 | 0x5000 |
| 218 | blk.23.ffn_up.weight | 0x31d3ef500 | 0xaa00000 |
| 219 | blk.24.attn_k.weight | 0x327def500 | 0x550000 |
| 220 | blk.24.attn_norm.weight | 0x32833f500 | 0x5000 |
| 221 | blk.24.attn_output.weight | 0x328344500 | 0x1540000 |
| 222 | blk.24.attn_q.weight | 0x329884500 | 0x1540000 |
| 223 | blk.24.attn_v.weight | 0x32adc4500 | 0xa00000 |
| 224 | blk.24.ffn_down.weight | 0x32b7c4500 | 0xaa00000 |
| 225 | blk.24.ffn_gate.weight | 0x3361c4500 | 0xaa00000 |
| 226 | blk.24.ffn_norm.weight | 0x340bc4500 | 0x5000 |
| 227 | blk.24.ffn_up.weight | 0x340bc9500 | 0xaa00000 |
| 228 | blk.25.attn_k.weight | 0x34b5c9500 | 0x550000 |
| 229 | blk.25.attn_norm.weight | 0x34bb19500 | 0x5000 |
| 230 | blk.25.attn_output.weight | 0x34bb1e500 | 0x1540000 |
| 231 | blk.25.attn_q.weight | 0x34d05e500 | 0x1540000 |
| 232 | blk.25.attn_v.weight | 0x34e59e500 | 0xa00000 |
| 233 | blk.25.ffn_down.weight | 0x34ef9e500 | 0xaa00000 |
| 234 | blk.25.ffn_gate.weight | 0x35999e500 | 0xaa00000 |
| 235 | blk.25.ffn_norm.weight | 0x36439e500 | 0x5000 |
| 236 | blk.25.ffn_up.weight | 0x3643a3500 | 0xaa00000 |
| 237 | blk.26.attn_k.weight | 0x36eda3500 | 0x550000 |
| 238 | blk.26.attn_norm.weight | 0x36f2f3500 | 0x5000 |
| 239 | blk.26.attn_output.weight | 0x36f2f8500 | 0x1540000 |
| 240 | blk.26.attn_q.weight | 0x370838500 | 0x1540000 |
| 241 | blk.26.attn_v.weight | 0x371d78500 | 0xa00000 |
| 242 | blk.26.ffn_down.weight | 0x372778500 | 0xaa00000 |
| 243 | blk.26.ffn_gate.weight | 0x37d178500 | 0xaa00000 |
| 244 | blk.26.ffn_norm.weight | 0x387b78500 | 0x5000 |
| 245 | blk.26.ffn_up.weight | 0x387b7d500 | 0xaa00000 |
| 246 | blk.27.attn_k.weight | 0x39257d500 | 0x41a000 |
| 247 | blk.27.attn_norm.weight | 0x392997500 | 0x5000 |
| 248 | blk.27.attn_output.weight | 0x39299c500 | 0x1540000 |
| 249 | blk.27.attn_q.weight | 0x393edc500 | 0x1068000 |
| 250 | blk.27.attn_v.weight | 0x394f44500 | 0xa00000 |
| 251 | blk.27.ffn_down.weight | 0x395944500 | 0xaa00000 |
| 252 | blk.27.ffn_gate.weight | 0x3a0344500 | 0xaa00000 |
| 253 | blk.27.ffn_norm.weight | 0x3aad44500 | 0x5000 |
| 254 | blk.27.ffn_up.weight | 0x3aad49500 | 0xaa00000 |
| 255 | blk.28.attn_k.weight | 0x3b5749500 | 0x550000 |
| 256 | blk.28.attn_norm.weight | 0x3b5c99500 | 0x5000 |
| 257 | blk.28.attn_output.weight | 0x3b5c9e500 | 0x1540000 |
| 258 | blk.28.attn_q.weight | 0x3b71de500 | 0x1540000 |
| 259 | blk.28.attn_v.weight | 0x3b871e500 | 0xa00000 |
| 260 | blk.28.ffn_down.weight | 0x3b911e500 | 0xaa00000 |
| 261 | blk.28.ffn_gate.weight | 0x3c3b1e500 | 0xaa00000 |
| 262 | blk.28.ffn_norm.weight | 0x3ce51e500 | 0x5000 |
| 263 | blk.28.ffn_up.weight | 0x3ce523500 | 0xaa00000 |
| 264 | blk.29.attn_k.weight | 0x3d8f23500 | 0x550000 |
| 265 | blk.29.attn_norm.weight | 0x3d9473500 | 0x5000 |
| 266 | blk.29.attn_output.weight | 0x3d9478500 | 0x1540000 |
| 267 | blk.29.attn_q.weight | 0x3da9b8500 | 0x1540000 |
| 268 | blk.29.attn_v.weight | 0x3dbef8500 | 0xa00000 |
| 269 | blk.29.ffn_down.weight | 0x3dc8f8500 | 0xaa00000 |
| 270 | blk.29.ffn_gate.weight | 0x3e72f8500 | 0xaa00000 |
| 271 | blk.29.ffn_norm.weight | 0x3f1cf8500 | 0x5000 |
| 272 | blk.29.ffn_up.weight | 0x3f1cfd500 | 0xaa00000 |
| 273 | blk.30.attn_k.weight | 0x3fc6fd500 | 0x550000 |
| 274 | blk.30.attn_norm.weight | 0x3fcc4d500 | 0x5000 |
| 275 | blk.30.attn_output.weight | 0x3fcc52500 | 0x1540000 |
| 276 | blk.30.attn_q.weight | 0x3fe192500 | 0x1540000 |
| 277 | blk.30.attn_v.weight | 0x3ff6d2500 | 0xa00000 |
| 278 | blk.30.ffn_down.weight | 0x4000d2500 | 0xaa00000 |
| 279 | blk.30.ffn_gate.weight | 0x40aad2500 | 0xaa00000 |
| 280 | blk.30.ffn_norm.weight | 0x4154d2500 | 0x5000 |
| 281 | blk.30.ffn_up.weight | 0x4154d7500 | 0xaa00000 |
| 282 | blk.31.attn_k.weight | 0x41fed7500 | 0x550000 |
| 283 | blk.31.attn_norm.weight | 0x420427500 | 0x5000 |
| 284 | blk.31.attn_output.weight | 0x42042c500 | 0x1540000 |
| 285 | blk.31.attn_q.weight | 0x42196c500 | 0x1540000 |
| 286 | blk.31.attn_v.weight | 0x422eac500 | 0xa00000 |
| 287 | blk.31.ffn_down.weight | 0x4238ac500 | 0xaa00000 |
| 288 | blk.31.ffn_gate.weight | 0x42e2ac500 | 0xaa00000 |
| 289 | blk.31.ffn_norm.weight | 0x438cac500 | 0x5000 |
| 290 | blk.31.ffn_up.weight | 0x438cb1500 | 0xaa00000 |
| 291 | blk.32.attn_k.weight | 0x4436b1500 | 0x550000 |
| 292 | blk.32.attn_norm.weight | 0x443c01500 | 0x5000 |
| 293 | blk.32.attn_output.weight | 0x443c06500 | 0x1540000 |
| 294 | blk.32.attn_q.weight | 0x445146500 | 0x1540000 |
| 295 | blk.32.attn_v.weight | 0x446686500 | 0xa00000 |
| 296 | blk.32.ffn_down.weight | 0x447086500 | 0xaa00000 |
| 297 | blk.32.ffn_gate.weight | 0x451a86500 | 0xaa00000 |
| 298 | blk.32.ffn_norm.weight | 0x45c486500 | 0x5000 |
| 299 | blk.32.ffn_up.weight | 0x45c48b500 | 0xaa00000 |
| 300 | blk.33.attn_k.weight | 0x466e8b500 | 0x550000 |
| 301 | blk.33.attn_norm.weight | 0x4673db500 | 0x5000 |
| 302 | blk.33.attn_output.weight | 0x4673e0500 | 0x1540000 |
| 303 | blk.33.attn_q.weight | 0x468920500 | 0x1540000 |
| 304 | blk.33.attn_v.weight | 0x469e60500 | 0xa00000 |
| 305 | blk.33.ffn_down.weight | 0x46a860500 | 0xaa00000 |
| 306 | blk.33.ffn_gate.weight | 0x475260500 | 0xaa00000 |
| 307 | blk.33.ffn_norm.weight | 0x47fc60500 | 0x5000 |
| 308 | blk.33.ffn_up.weight | 0x47fc65500 | 0xaa00000 |
| 309 | blk.34.attn_k.weight | 0x48a665500 | 0x550000 |
| 310 | blk.34.attn_norm.weight | 0x48abb5500 | 0x5000 |
| 311 | blk.34.attn_output.weight | 0x48abba500 | 0x1540000 |
| 312 | blk.34.attn_q.weight | 0x48c0fa500 | 0x1540000 |
| 313 | blk.34.attn_v.weight | 0x48d63a500 | 0xa00000 |
| 314 | blk.34.ffn_down.weight | 0x48e03a500 | 0xaa00000 |
| 315 | blk.34.ffn_gate.weight | 0x498a3a500 | 0xaa00000 |
| 316 | blk.34.ffn_norm.weight | 0x4a343a500 | 0x5000 |
| 317 | blk.34.ffn_up.weight | 0x4a343f500 | 0xaa00000 |
| 318 | blk.35.attn_k.weight | 0x4ade3f500 | 0x550000 |
| 319 | blk.35.attn_norm.weight | 0x4ae38f500 | 0x5000 |
| 320 | blk.35.attn_output.weight | 0x4ae394500 | 0x1540000 |
| 321 | blk.35.attn_q.weight | 0x4af8d4500 | 0x1540000 |
| 322 | blk.35.attn_v.weight | 0x4b0e14500 | 0xa00000 |
| 323 | blk.35.ffn_down.weight | 0x4b1814500 | 0xaa00000 |
| 324 | blk.35.ffn_gate.weight | 0x4bc214500 | 0xaa00000 |
| 325 | blk.35.ffn_norm.weight | 0x4c6c14500 | 0x5000 |
| 326 | blk.35.ffn_up.weight | 0x4c6c19500 | 0xaa00000 |
| 327 | blk.36.attn_k.weight | 0x4d1619500 | 0x550000 |
| 328 | blk.36.attn_norm.weight | 0x4d1b69500 | 0x5000 |
| 329 | blk.36.attn_output.weight | 0x4d1b6e500 | 0x1540000 |
| 330 | blk.36.attn_q.weight | 0x4d30ae500 | 0x1540000 |
| 331 | blk.36.attn_v.weight | 0x4d45ee500 | 0xa00000 |
| 332 | blk.36.ffn_down.weight | 0x4d4fee500 | 0xaa00000 |
| 333 | blk.36.ffn_gate.weight | 0x4df9ee500 | 0xaa00000 |
| 334 | blk.36.ffn_norm.weight | 0x4ea3ee500 | 0x5000 |
| 335 | blk.36.ffn_up.weight | 0x4ea3f3500 | 0xaa00000 |
| 336 | blk.37.attn_k.weight | 0x4f4df3500 | 0x550000 |
| 337 | blk.37.attn_norm.weight | 0x4f5343500 | 0x5000 |
| 338 | blk.37.attn_output.weight | 0x4f5348500 | 0x1540000 |
| 339 | blk.37.attn_q.weight | 0x4f6888500 | 0x1540000 |
| 340 | blk.37.attn_v.weight | 0x4f7dc8500 | 0xa00000 |
| 341 | blk.37.ffn_down.weight | 0x4f87c8500 | 0xaa00000 |
| 342 | blk.37.ffn_gate.weight | 0x5031c8500 | 0xaa00000 |
| 343 | blk.37.ffn_norm.weight | 0x50dbc8500 | 0x5000 |
| 344 | blk.37.ffn_up.weight | 0x50dbcd500 | 0xaa00000 |
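Tensor data segments in a GGUF file are laid out back to back, so the table above can be sanity-checked arithmetically: each tensor's offset plus its size should equal the next tensor's offset. A minimal sketch using the first three rows of the table:

```python
# First three rows of the offset table above: (name, offset, size) in bytes.
rows = [
    ("output.weight",      0x784500,   0x2a800000),
    ("output_norm.weight", 0x2af84500, 0x5000),
    ("token_embd.weight",  0x2af89500, 0x11300000),
]

# Each data segment ends exactly where the next one begins.
for (name, off, size), (_, next_off, _) in zip(rows, rows[1:]):
    assert off + size == next_off, name
print("offsets are contiguous")
```

The same check holds across all 345 rows; any gap would indicate alignment padding or a corrupted dump.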

Base Tensor Group : ~1B Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|--:|--|--|--:|--|--|
| 0 | output.weight | Output (W) | (~671M) 671088640 | 5120 x 131072 x 1 x 1 | Q8_0 |
| 1 | output_norm.weight | Output Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 2 | token_embd.weight | Token Embedding (W) | (~671M) 671088640 | 5120 x 131072 x 1 x 1 | Q3_K |
  • Total elements in base: ( ~1B) 1342182400
  • Percentage of total elements: 5.98%
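The element counts and the percentage in this group follow directly from the shapes in the table: a sketch of the arithmetic, using only values reported in this dump:

```python
# Element counts derived from the shapes in the base tensor group.
output_elems = 5120 * 131072   # output.weight: 5120 x 131072
norm_elems   = 5120            # output_norm.weight: 5120 x 1
embd_elems   = 5120 * 131072   # token_embd.weight: 5120 x 131072

base_total  = output_elems + norm_elems + embd_elems
grand_total = 22460892160      # from the Tensors Overview section

print(base_total)                                # 1342182400, as reported
print(round(100 * base_total / grand_total, 2))  # 5.98 (%)
```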

Block 0 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|--:|--|--|--:|--|--|
| 3 | blk.0.attn_k.weight | Block 0 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q6_K |
| 4 | blk.0.attn_norm.weight | Block 0 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 5 | blk.0.attn_output.weight | Block 0 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q8_0 |
| 6 | blk.0.attn_q.weight | Block 0 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q6_K |
| 7 | blk.0.attn_v.weight | Block 0 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | F16 |
| 8 | blk.0.ffn_down.weight | Block 0 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q8_0 |
| 9 | blk.0.ffn_gate.weight | Block 0 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q6_K |
| 10 | blk.0.ffn_norm.weight | Block 0 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 11 | blk.0.ffn_up.weight | Block 0 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q6_K |
  • Total elements in blk.0: (~556M) 555755520
  • Percentage of total elements: 2.47%
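Combining this group's element counts with the byte sizes in the offset table gives each tensor's effective bits per weight, which matches the nominal rates of the quantization formats involved (Q8_0 packs 32 weights into 34 bytes; Q6_K packs 256 weights into 210 bytes). A sketch for two block-0 tensors:

```python
# Effective bits per weight, derived from the offset table's byte sizes
# and the element counts in the block 0 table above.
elems = 167772160              # 5120 x 32768

q8_0_bytes = 0xAA00000         # blk.0.ffn_down.weight (Q8_0) data size
q6_k_bytes = 0x8340000         # blk.0.ffn_gate.weight (Q6_K) data size

print(q8_0_bytes * 8 / elems)  # 8.5 bpw
print(q6_k_bytes * 8 / elems)  # 6.5625 bpw
```

The same arithmetic explains the larger `attn_k`/`attn_q` sizes in blocks such as 17, 18, and 20, where those tensors were kept at a higher-precision type.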

Block 1 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|--:|--|--|--:|--|--|
| 12 | blk.1.attn_k.weight | Block 1 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q6_K |
| 13 | blk.1.attn_norm.weight | Block 1 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 14 | blk.1.attn_output.weight | Block 1 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q8_0 |
| 15 | blk.1.attn_q.weight | Block 1 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q6_K |
| 16 | blk.1.attn_v.weight | Block 1 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | F16 |
| 17 | blk.1.ffn_down.weight | Block 1 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q8_0 |
| 18 | blk.1.ffn_gate.weight | Block 1 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q6_K |
| 19 | blk.1.ffn_norm.weight | Block 1 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 20 | blk.1.ffn_up.weight | Block 1 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q6_K |
  • Total elements in blk.1: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 2 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|--:|--|--|--:|--|--|
| 21 | blk.2.attn_k.weight | Block 2 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q6_K |
| 22 | blk.2.attn_norm.weight | Block 2 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 23 | blk.2.attn_output.weight | Block 2 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q8_0 |
| 24 | blk.2.attn_q.weight | Block 2 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q6_K |
| 25 | blk.2.attn_v.weight | Block 2 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | F16 |
| 26 | blk.2.ffn_down.weight | Block 2 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q8_0 |
| 27 | blk.2.ffn_gate.weight | Block 2 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q6_K |
| 28 | blk.2.ffn_norm.weight | Block 2 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 29 | blk.2.ffn_up.weight | Block 2 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q6_K |
  • Total elements in blk.2: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 3 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|--:|--|--|--:|--|--|
| 30 | blk.3.attn_k.weight | Block 3 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q6_K |
| 31 | blk.3.attn_norm.weight | Block 3 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 32 | blk.3.attn_output.weight | Block 3 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q8_0 |
| 33 | blk.3.attn_q.weight | Block 3 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q6_K |
| 34 | blk.3.attn_v.weight | Block 3 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | F16 |
| 35 | blk.3.ffn_down.weight | Block 3 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q8_0 |
| 36 | blk.3.ffn_gate.weight | Block 3 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q6_K |
| 37 | blk.3.ffn_norm.weight | Block 3 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 38 | blk.3.ffn_up.weight | Block 3 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q6_K |
  • Total elements in blk.3: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 4 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|--:|--|--|--:|--|--|
| 39 | blk.4.attn_k.weight | Block 4 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q6_K |
| 40 | blk.4.attn_norm.weight | Block 4 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 41 | blk.4.attn_output.weight | Block 4 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q8_0 |
| 42 | blk.4.attn_q.weight | Block 4 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q6_K |
| 43 | blk.4.attn_v.weight | Block 4 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | F16 |
| 44 | blk.4.ffn_down.weight | Block 4 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q8_0 |
| 45 | blk.4.ffn_gate.weight | Block 4 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q6_K |
| 46 | blk.4.ffn_norm.weight | Block 4 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 47 | blk.4.ffn_up.weight | Block 4 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q6_K |
  • Total elements in blk.4: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 5 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|--:|--|--|--:|--|--|
| 48 | blk.5.attn_k.weight | Block 5 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q6_K |
| 49 | blk.5.attn_norm.weight | Block 5 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 50 | blk.5.attn_output.weight | Block 5 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q8_0 |
| 51 | blk.5.attn_q.weight | Block 5 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q6_K |
| 52 | blk.5.attn_v.weight | Block 5 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | F16 |
| 53 | blk.5.ffn_down.weight | Block 5 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q8_0 |
| 54 | blk.5.ffn_gate.weight | Block 5 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q6_K |
| 55 | blk.5.ffn_norm.weight | Block 5 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 56 | blk.5.ffn_up.weight | Block 5 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q6_K |
  • Total elements in blk.5: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 6 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|--:|--|--|--:|--|--|
| 57 | blk.6.attn_k.weight | Block 6 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q6_K |
| 58 | blk.6.attn_norm.weight | Block 6 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 59 | blk.6.attn_output.weight | Block 6 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q8_0 |
| 60 | blk.6.attn_q.weight | Block 6 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q6_K |
| 61 | blk.6.attn_v.weight | Block 6 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | F16 |
| 62 | blk.6.ffn_down.weight | Block 6 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q8_0 |
| 63 | blk.6.ffn_gate.weight | Block 6 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q6_K |
| 64 | blk.6.ffn_norm.weight | Block 6 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 65 | blk.6.ffn_up.weight | Block 6 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q6_K |
  • Total elements in blk.6: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 7 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|--:|--|--|--:|--|--|
| 66 | blk.7.attn_k.weight | Block 7 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q6_K |
| 67 | blk.7.attn_norm.weight | Block 7 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 68 | blk.7.attn_output.weight | Block 7 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q8_0 |
| 69 | blk.7.attn_q.weight | Block 7 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q6_K |
| 70 | blk.7.attn_v.weight | Block 7 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | F16 |
| 71 | blk.7.ffn_down.weight | Block 7 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q8_0 |
| 72 | blk.7.ffn_gate.weight | Block 7 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q6_K |
| 73 | blk.7.ffn_norm.weight | Block 7 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 74 | blk.7.ffn_up.weight | Block 7 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q6_K |
  • Total elements in blk.7: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 8 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|--:|--|--|--:|--|--|
| 75 | blk.8.attn_k.weight | Block 8 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q6_K |
| 76 | blk.8.attn_norm.weight | Block 8 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 77 | blk.8.attn_output.weight | Block 8 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q8_0 |
| 78 | blk.8.attn_q.weight | Block 8 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q6_K |
| 79 | blk.8.attn_v.weight | Block 8 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | F16 |
| 80 | blk.8.ffn_down.weight | Block 8 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q8_0 |
| 81 | blk.8.ffn_gate.weight | Block 8 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q6_K |
| 82 | blk.8.ffn_norm.weight | Block 8 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 83 | blk.8.ffn_up.weight | Block 8 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q6_K |
  • Total elements in blk.8: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 9 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|--:|--|--|--:|--|--|
| 84 | blk.9.attn_k.weight | Block 9 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q6_K |
| 85 | blk.9.attn_norm.weight | Block 9 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 86 | blk.9.attn_output.weight | Block 9 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q8_0 |
| 87 | blk.9.attn_q.weight | Block 9 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q6_K |
| 88 | blk.9.attn_v.weight | Block 9 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | F16 |
| 89 | blk.9.ffn_down.weight | Block 9 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q8_0 |
| 90 | blk.9.ffn_gate.weight | Block 9 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q6_K |
| 91 | blk.9.ffn_norm.weight | Block 9 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 92 | blk.9.ffn_up.weight | Block 9 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q6_K |
  • Total elements in blk.9: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 10 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|--:|--|--|--:|--|--|
| 93 | blk.10.attn_k.weight | Block 10 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q6_K |
| 94 | blk.10.attn_norm.weight | Block 10 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 95 | blk.10.attn_output.weight | Block 10 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q8_0 |
| 96 | blk.10.attn_q.weight | Block 10 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q6_K |
| 97 | blk.10.attn_v.weight | Block 10 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | F16 |
| 98 | blk.10.ffn_down.weight | Block 10 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q8_0 |
| 99 | blk.10.ffn_gate.weight | Block 10 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q6_K |
| 100 | blk.10.ffn_norm.weight | Block 10 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 101 | blk.10.ffn_up.weight | Block 10 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q6_K |
  • Total elements in blk.10: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 11 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|--:|--|--|--:|--|--|
| 102 | blk.11.attn_k.weight | Block 11 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q6_K |
| 103 | blk.11.attn_norm.weight | Block 11 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 104 | blk.11.attn_output.weight | Block 11 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q8_0 |
| 105 | blk.11.attn_q.weight | Block 11 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q6_K |
| 106 | blk.11.attn_v.weight | Block 11 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | F16 |
| 107 | blk.11.ffn_down.weight | Block 11 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q8_0 |
| 108 | blk.11.ffn_gate.weight | Block 11 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q6_K |
| 109 | blk.11.ffn_norm.weight | Block 11 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 110 | blk.11.ffn_up.weight | Block 11 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q6_K |
  • Total elements in blk.11: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 12 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
111 blk.12.attn_k.weight Block 12 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K
112 blk.12.attn_norm.weight Block 12 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
113 blk.12.attn_output.weight Block 12 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q8_0
114 blk.12.attn_q.weight Block 12 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K
115 blk.12.attn_v.weight Block 12 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 F16
116 blk.12.ffn_down.weight Block 12 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0
117 blk.12.ffn_gate.weight Block 12 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K
118 blk.12.ffn_norm.weight Block 12 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
119 blk.12.ffn_up.weight Block 12 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K
  • Total elements in blk.12: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 13 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
120 blk.13.attn_k.weight Block 13 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K
121 blk.13.attn_norm.weight Block 13 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
122 blk.13.attn_output.weight Block 13 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q8_0
123 blk.13.attn_q.weight Block 13 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K
124 blk.13.attn_v.weight Block 13 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 F16
125 blk.13.ffn_down.weight Block 13 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0
126 blk.13.ffn_gate.weight Block 13 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K
127 blk.13.ffn_norm.weight Block 13 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
128 blk.13.ffn_up.weight Block 13 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K
  • Total elements in blk.13: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 14 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
129 blk.14.attn_k.weight Block 14 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K
130 blk.14.attn_norm.weight Block 14 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
131 blk.14.attn_output.weight Block 14 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q8_0
132 blk.14.attn_q.weight Block 14 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K
133 blk.14.attn_v.weight Block 14 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 F16
134 blk.14.ffn_down.weight Block 14 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0
135 blk.14.ffn_gate.weight Block 14 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K
136 blk.14.ffn_norm.weight Block 14 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
137 blk.14.ffn_up.weight Block 14 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K
  • Total elements in blk.14: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 15 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
138 blk.15.attn_k.weight Block 15 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K
139 blk.15.attn_norm.weight Block 15 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
140 blk.15.attn_output.weight Block 15 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q8_0
141 blk.15.attn_q.weight Block 15 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K
142 blk.15.attn_v.weight Block 15 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 F16
143 blk.15.ffn_down.weight Block 15 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0
144 blk.15.ffn_gate.weight Block 15 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K
145 blk.15.ffn_norm.weight Block 15 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
146 blk.15.ffn_up.weight Block 15 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K
  • Total elements in blk.15: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 16 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
147 blk.16.attn_k.weight Block 16 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K
148 blk.16.attn_norm.weight Block 16 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
149 blk.16.attn_output.weight Block 16 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q8_0
150 blk.16.attn_q.weight Block 16 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K
151 blk.16.attn_v.weight Block 16 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 F16
152 blk.16.ffn_down.weight Block 16 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0
153 blk.16.ffn_gate.weight Block 16 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K
154 blk.16.ffn_norm.weight Block 16 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
155 blk.16.ffn_up.weight Block 16 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K
  • Total elements in blk.16: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 17 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
156 blk.17.attn_k.weight Block 17 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0
157 blk.17.attn_norm.weight Block 17 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
158 blk.17.attn_output.weight Block 17 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q8_0
159 blk.17.attn_q.weight Block 17 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q8_0
160 blk.17.attn_v.weight Block 17 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 F16
161 blk.17.ffn_down.weight Block 17 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0
162 blk.17.ffn_gate.weight Block 17 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K
163 blk.17.ffn_norm.weight Block 17 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
164 blk.17.ffn_up.weight Block 17 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K
  • Total elements in blk.17: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 18 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
165 blk.18.attn_k.weight Block 18 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0
166 blk.18.attn_norm.weight Block 18 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
167 blk.18.attn_output.weight Block 18 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q8_0
168 blk.18.attn_q.weight Block 18 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q8_0
169 blk.18.attn_v.weight Block 18 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 F16
170 blk.18.ffn_down.weight Block 18 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0
171 blk.18.ffn_gate.weight Block 18 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K
172 blk.18.ffn_norm.weight Block 18 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
173 blk.18.ffn_up.weight Block 18 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K
  • Total elements in blk.18: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 19 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
174 blk.19.attn_k.weight Block 19 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K
175 blk.19.attn_norm.weight Block 19 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
176 blk.19.attn_output.weight Block 19 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q8_0
177 blk.19.attn_q.weight Block 19 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K
178 blk.19.attn_v.weight Block 19 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 F16
179 blk.19.ffn_down.weight Block 19 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0
180 blk.19.ffn_gate.weight Block 19 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K
181 blk.19.ffn_norm.weight Block 19 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
182 blk.19.ffn_up.weight Block 19 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K
  • Total elements in blk.19: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 20 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
183 blk.20.attn_k.weight Block 20 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0
184 blk.20.attn_norm.weight Block 20 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
185 blk.20.attn_output.weight Block 20 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q8_0
186 blk.20.attn_q.weight Block 20 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q8_0
187 blk.20.attn_v.weight Block 20 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 F16
188 blk.20.ffn_down.weight Block 20 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0
189 blk.20.ffn_gate.weight Block 20 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
190 blk.20.ffn_norm.weight Block 20 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
191 blk.20.ffn_up.weight Block 20 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
  • Total elements in blk.20: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 21 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
192 blk.21.attn_k.weight Block 21 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K
193 blk.21.attn_norm.weight Block 21 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
194 blk.21.attn_output.weight Block 21 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q8_0
195 blk.21.attn_q.weight Block 21 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K
196 blk.21.attn_v.weight Block 21 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 F16
197 blk.21.ffn_down.weight Block 21 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0
198 blk.21.ffn_gate.weight Block 21 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
199 blk.21.ffn_norm.weight Block 21 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
200 blk.21.ffn_up.weight Block 21 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
  • Total elements in blk.21: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 22 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
201 blk.22.attn_k.weight Block 22 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0
202 blk.22.attn_norm.weight Block 22 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
203 blk.22.attn_output.weight Block 22 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q8_0
204 blk.22.attn_q.weight Block 22 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q8_0
205 blk.22.attn_v.weight Block 22 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 F16
206 blk.22.ffn_down.weight Block 22 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0
207 blk.22.ffn_gate.weight Block 22 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
208 blk.22.ffn_norm.weight Block 22 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
209 blk.22.ffn_up.weight Block 22 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
  • Total elements in blk.22: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 23 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
210 blk.23.attn_k.weight Block 23 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0
211 blk.23.attn_norm.weight Block 23 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
212 blk.23.attn_output.weight Block 23 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q8_0
213 blk.23.attn_q.weight Block 23 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q8_0
214 blk.23.attn_v.weight Block 23 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 F16
215 blk.23.ffn_down.weight Block 23 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0
216 blk.23.ffn_gate.weight Block 23 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
217 blk.23.ffn_norm.weight Block 23 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
218 blk.23.ffn_up.weight Block 23 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
  • Total elements in blk.23: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 24 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
219 blk.24.attn_k.weight Block 24 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0
220 blk.24.attn_norm.weight Block 24 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
221 blk.24.attn_output.weight Block 24 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q8_0
222 blk.24.attn_q.weight Block 24 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q8_0
223 blk.24.attn_v.weight Block 24 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 F16
224 blk.24.ffn_down.weight Block 24 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0
225 blk.24.ffn_gate.weight Block 24 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
226 blk.24.ffn_norm.weight Block 24 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
227 blk.24.ffn_up.weight Block 24 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
  • Total elements in blk.24: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 25 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
228 blk.25.attn_k.weight Block 25 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0
229 blk.25.attn_norm.weight Block 25 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
230 blk.25.attn_output.weight Block 25 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q8_0
231 blk.25.attn_q.weight Block 25 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q8_0
232 blk.25.attn_v.weight Block 25 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 F16
233 blk.25.ffn_down.weight Block 25 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0
234 blk.25.ffn_gate.weight Block 25 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
235 blk.25.ffn_norm.weight Block 25 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
236 blk.25.ffn_up.weight Block 25 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
  • Total elements in blk.25: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 26 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
237 blk.26.attn_k.weight Block 26 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0
238 blk.26.attn_norm.weight Block 26 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
239 blk.26.attn_output.weight Block 26 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q8_0
240 blk.26.attn_q.weight Block 26 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q8_0
241 blk.26.attn_v.weight Block 26 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 F16
242 blk.26.ffn_down.weight Block 26 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0
243 blk.26.ffn_gate.weight Block 26 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
244 blk.26.ffn_norm.weight Block 26 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
245 blk.26.ffn_up.weight Block 26 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
  • Total elements in blk.26: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 27 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
246 blk.27.attn_k.weight Block 27 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K
247 blk.27.attn_norm.weight Block 27 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
248 blk.27.attn_output.weight Block 27 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q8_0
249 blk.27.attn_q.weight Block 27 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K
250 blk.27.attn_v.weight Block 27 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 F16
251 blk.27.ffn_down.weight Block 27 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0
252 blk.27.ffn_gate.weight Block 27 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
253 blk.27.ffn_norm.weight Block 27 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
254 blk.27.ffn_up.weight Block 27 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
  • Total elements in blk.27: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 28 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
255 blk.28.attn_k.weight Block 28 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0
256 blk.28.attn_norm.weight Block 28 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
257 blk.28.attn_output.weight Block 28 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q8_0
258 blk.28.attn_q.weight Block 28 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q8_0
259 blk.28.attn_v.weight Block 28 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 F16
260 blk.28.ffn_down.weight Block 28 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0
261 blk.28.ffn_gate.weight Block 28 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
262 blk.28.ffn_norm.weight Block 28 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
263 blk.28.ffn_up.weight Block 28 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
  • Total elements in blk.28: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 29 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
264 blk.29.attn_k.weight Block 29 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0
265 blk.29.attn_norm.weight Block 29 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
266 blk.29.attn_output.weight Block 29 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q8_0
267 blk.29.attn_q.weight Block 29 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q8_0
268 blk.29.attn_v.weight Block 29 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 F16
269 blk.29.ffn_down.weight Block 29 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0
270 blk.29.ffn_gate.weight Block 29 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
271 blk.29.ffn_norm.weight Block 29 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
272 blk.29.ffn_up.weight Block 29 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
  • Total elements in blk.29: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 30 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
273 blk.30.attn_k.weight Block 30 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0
274 blk.30.attn_norm.weight Block 30 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
275 blk.30.attn_output.weight Block 30 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q8_0
276 blk.30.attn_q.weight Block 30 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q8_0
277 blk.30.attn_v.weight Block 30 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 F16
278 blk.30.ffn_down.weight Block 30 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0
279 blk.30.ffn_gate.weight Block 30 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
280 blk.30.ffn_norm.weight Block 30 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
281 blk.30.ffn_up.weight Block 30 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
  • Total elements in blk.30: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 31 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
282 blk.31.attn_k.weight Block 31 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0
283 blk.31.attn_norm.weight Block 31 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
284 blk.31.attn_output.weight Block 31 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q8_0
285 blk.31.attn_q.weight Block 31 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q8_0
286 blk.31.attn_v.weight Block 31 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 F16
287 blk.31.ffn_down.weight Block 31 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0
288 blk.31.ffn_gate.weight Block 31 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
289 blk.31.ffn_norm.weight Block 31 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
290 blk.31.ffn_up.weight Block 31 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
  • Total elements in blk.31: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 32 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
291 blk.32.attn_k.weight Block 32 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0
292 blk.32.attn_norm.weight Block 32 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
293 blk.32.attn_output.weight Block 32 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q8_0
294 blk.32.attn_q.weight Block 32 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q8_0
295 blk.32.attn_v.weight Block 32 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 F16
296 blk.32.ffn_down.weight Block 32 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0
297 blk.32.ffn_gate.weight Block 32 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
298 blk.32.ffn_norm.weight Block 32 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
299 blk.32.ffn_up.weight Block 32 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
  • Total elements in blk.32: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 33 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
300 blk.33.attn_k.weight Block 33 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0
301 blk.33.attn_norm.weight Block 33 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
302 blk.33.attn_output.weight Block 33 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q8_0
303 blk.33.attn_q.weight Block 33 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q8_0
304 blk.33.attn_v.weight Block 33 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 F16
305 blk.33.ffn_down.weight Block 33 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0
306 blk.33.ffn_gate.weight Block 33 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
307 blk.33.ffn_norm.weight Block 33 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
308 blk.33.ffn_up.weight Block 33 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
  • Total elements in blk.33: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 34 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
309 blk.34.attn_k.weight Block 34 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0
310 blk.34.attn_norm.weight Block 34 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
311 blk.34.attn_output.weight Block 34 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q8_0
312 blk.34.attn_q.weight Block 34 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q8_0
313 blk.34.attn_v.weight Block 34 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 F16
314 blk.34.ffn_down.weight Block 34 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0
315 blk.34.ffn_gate.weight Block 34 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
316 blk.34.ffn_norm.weight Block 34 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
317 blk.34.ffn_up.weight Block 34 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
  • Total elements in blk.34: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 35 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
318 blk.35.attn_k.weight Block 35 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0
319 blk.35.attn_norm.weight Block 35 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
320 blk.35.attn_output.weight Block 35 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q8_0
321 blk.35.attn_q.weight Block 35 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q8_0
322 blk.35.attn_v.weight Block 35 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 F16
323 blk.35.ffn_down.weight Block 35 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0
324 blk.35.ffn_gate.weight Block 35 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
325 blk.35.ffn_norm.weight Block 35 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
326 blk.35.ffn_up.weight Block 35 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
  • Total elements in blk.35: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 36 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
327 blk.36.attn_k.weight Block 36 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0
328 blk.36.attn_norm.weight Block 36 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
329 blk.36.attn_output.weight Block 36 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q8_0
330 blk.36.attn_q.weight Block 36 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q8_0
331 blk.36.attn_v.weight Block 36 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 F16
332 blk.36.ffn_down.weight Block 36 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0
333 blk.36.ffn_gate.weight Block 36 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
334 blk.36.ffn_norm.weight Block 36 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
335 blk.36.ffn_up.weight Block 36 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
  • Total elements in blk.36: (~556M) 555755520
  • Percentage of total elements: 2.47%

Block 37 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
336 blk.37.attn_k.weight Block 37 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0
337 blk.37.attn_norm.weight Block 37 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
338 blk.37.attn_output.weight Block 37 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q8_0
339 blk.37.attn_q.weight Block 37 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q8_0
340 blk.37.attn_v.weight Block 37 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 F16
341 blk.37.ffn_down.weight Block 37 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0
342 blk.37.ffn_gate.weight Block 37 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
343 blk.37.ffn_norm.weight Block 37 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32
344 blk.37.ffn_up.weight Block 37 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q8_0
  • Total elements in blk.37: (~556M) 555755520
  • Percentage of total elements: 2.47%
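The "Total elements" figure printed after each block can be cross-checked by hand: the nine per-tensor element counts listed in any block sum to exactly 555,755,520. A reader-side sanity check (not part of the dump tool's output):

```python
# Sum the nine tensor element counts listed for a single transformer block;
# the counts below are copied from any one block table above (they are
# identical for all blocks in this model).
counts = {
    "attn_k":      5_242_880,
    "attn_norm":   5_120,
    "attn_output": 20_971_520,
    "attn_q":      20_971_520,
    "attn_v":      5_242_880,
    "ffn_down":    167_772_160,
    "ffn_gate":    167_772_160,
    "ffn_norm":    5_120,
    "ffn_up":      167_772_160,
}
total = sum(counts.values())
print(total)  # 555755520, matching the "Total elements in blk.N" line
```

At 2.47% of the model each, the ~556M-element blocks account for the bulk of the parameter count, with the remainder in the embedding, output, and final-norm tensors.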