# Dolphin-Mistral-24B-Venice-Edition-pruned-IQ3_M.gguf - GGUF Internal File Dump
- Endian: LITTLE endian
## Key Value Metadata Store
There are 46 entries in this table: 43 stored key-value pairs plus the three `GGUF.*` header fields in rows 1–3 (hence `GGUF.kv_count` = 43). A header-reading sketch follows the table.
| POS | TYPE | Count | Key | Value |
|----:|:---------|-------:|:---------------------------------------|:--------------------------------------------------------------------|
| 1 | UINT32 | 1 | GGUF.version | 3 |
| 2 | UINT64 | 1 | GGUF.tensor_count | 345 |
| 3 | UINT64 | 1 | GGUF.kv_count | 43 |
| 4 | STRING | 1 | general.architecture | `llama` |
| 5 | STRING | 1 | general.type | `model` |
| 6 | STRING | 1 | general.name | `Dolphin Mistral 24B Venice Edition` |
| 7 | STRING | 1 | general.finetune | `Venice-Edition` |
| 8 | STRING | 1 | general.basename | `Dolphin-Mistral` |
| 9 | STRING | 1 | general.size_label | `24B` |
| 10 | STRING | 1 | general.license | `apache-2.0` |
| 11 | UINT32 | 1 | general.base_model.count | 1 |
| 12 | STRING | 1 | general.base_model.0.name | `Mistral Small 24B Instruct 2501` |
| 13 | STRING | 1 | general.base_model.0.version | `2501` |
| 14 | STRING | 1 | general.base_model.0.organization | `Mistralai` |
| 15 | STRING | 1 | general.base_model.0.repo_url | `https://huggingface.co/mistral`...`istral-Small-24B-Instruct-2501` |
| 16 | UINT32 | 1 | llama.context_length | 32768 |
| 17 | UINT32 | 1 | llama.embedding_length | 5120 |
| 18 | UINT32 | 1 | llama.feed_forward_length | 32768 |
| 19 | UINT32 | 1 | llama.attention.head_count | 32 |
| 20 | UINT32 | 1 | llama.attention.head_count_kv | 8 |
| 21 | FLOAT32 | 1 | llama.rope.freq_base | 100000000.0 |
| 22 | FLOAT32 | 1 | llama.attention.layer_norm_rms_epsilon | 1e-05 |
| 23 | UINT32 | 1 | llama.attention.key_length | 128 |
| 24 | UINT32 | 1 | llama.attention.value_length | 128 |
| 25 | UINT32 | 1 | llama.vocab_size | 131072 |
| 26 | UINT32 | 1 | llama.rope.dimension_count | 128 |
| 27 | STRING | 1 | tokenizer.ggml.model | `gpt2` |
| 28 | STRING | 1 | tokenizer.ggml.pre | `tekken` |
| 29 | [STRING] | 131072 | tokenizer.ggml.tokens | [ `<unk>`, `<s>`, `</s>`, `[INST]`, `[/INST]`, ... ] |
| 30 | [INT32] | 131072 | tokenizer.ggml.token_type | [ 3, 3, 3, 3, 3, 3, 3, ... ] |
| 31 | [STRING] | 269443 | tokenizer.ggml.merges | [ `Ġ Ġ`, `Ġ t`, `e r`, `i n`, `Ġ ĠĠĠ`, ... ] |
| 32 | UINT32 | 1 | tokenizer.ggml.bos_token_id | 1 |
| 33 | UINT32 | 1 | tokenizer.ggml.eos_token_id | 2 |
| 34 | UINT32 | 1 | tokenizer.ggml.unknown_token_id | 0 |
| 35 | UINT32 | 1 | tokenizer.ggml.padding_token_id | 11 |
| 36 | BOOL | 1 | tokenizer.ggml.add_bos_token | True |
| 37 | BOOL | 1 | tokenizer.ggml.add_eos_token | False |
| 38 | STRING | 1 | tokenizer.chat_template | `{%- set today = strftime_now("`...` {%- endif %}{%- endfor %}` |
| 39 | BOOL | 1 | tokenizer.ggml.add_space_prefix | False |
| 40 | UINT32 | 1 | general.quantization_version | 2 |
| 41 | UINT32 | 1 | general.file_type | 27 |
| 42 | STRING | 1 | quantize.imatrix.file | `./imatrix/imatrix-Dolphin-Mist`...`l-24B-Venice-Edition-small.dat` |
| 43 | STRING | 1 | quantize.imatrix.dataset | `../../datasets/imatrix/combined_eur_small.txt` |
| 44 | UINT32 | 1 | quantize.imatrix.entries_count | 281 |
| 45 | UINT32 | 1 | quantize.imatrix.chunks_count | 3192 |
| 46 | UINT32 | 1 | llama.block_count | 38 |
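The three `GGUF.*` rows at the top of the table are the fixed file header rather than stored key-value pairs. A minimal sketch of how those header fields could be read directly, assuming only the little-endian GGUF v3 layout reported above (illustrative code, not the dump tool itself):

```python
# Read the fixed GGUF header: magic, version, tensor count, KV count.
# Assumes a little-endian GGUF file, as reported in this dump.
import struct

def read_gguf_header(path: str) -> tuple[int, int, int]:
    with open(path, "rb") as f:
        if f.read(4) != b"GGUF":
            raise ValueError("not a GGUF file")
        (version,) = struct.unpack("<I", f.read(4))       # GGUF.version      -> 3
        (tensor_count,) = struct.unpack("<Q", f.read(8))  # GGUF.tensor_count -> 345
        (kv_count,) = struct.unpack("<Q", f.read(8))      # GGUF.kv_count     -> 43
    return version, tensor_count, kv_count

# For this file the expected result is (3, 345, 43).
print(read_gguf_header("Dolphin-Mistral-24B-Venice-Edition-pruned-IQ3_M.gguf"))
```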
## Tensors Overview ~22B Elements
Total number of elements in all tensors: 22460892160 Elements
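This total can be cross-checked from the group summaries below: one base group plus 38 structurally identical block groups (`llama.block_count` = 38). A quick arithmetic sketch using the element counts reported in this dump:

```python
# Cross-check the element total from the per-group counts reported below.
base = 2 * 671_088_640 + 5_120        # output, token_embd + output_norm
per_block = (2 * 5_242_880            # attn_k, attn_v
             + 2 * 20_971_520         # attn_q, attn_output
             + 3 * 167_772_160        # ffn_down, ffn_gate, ffn_up
             + 2 * 5_120)             # attn_norm, ffn_norm
total = base + 38 * per_block         # llama.block_count = 38

print(total)                          # 22460892160, matching the figure above
print(f"{base / total:.2%}")          # 5.98%, the base group's share
print(f"{per_block / total:.2%}")     # 2.47%, each block's share
```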
- [Dolphin-Mistral-24B-Venice-Edition-pruned-IQ3\_M.gguf - GGUF Internal File Dump](#dolphin-mistral-24b-venice-edition-pruned-iq3_mgguf---gguf-internal-file-dump)
- [Key Value Metadata Store](#key-value-metadata-store)
- [Tensors Overview ~22B Elements](#tensors-overview-22b-elements)
- [Tensor Data Offset](#tensor-data-offset)
- [Base Tensor Group : ~1B Elements](#base-tensor-group--1b-elements)
- [Block 0 Tensor Group : ~556M Elements](#block-0-tensor-group--556m-elements)
- [Block 1 Tensor Group : ~556M Elements](#block-1-tensor-group--556m-elements)
- [Block 2 Tensor Group : ~556M Elements](#block-2-tensor-group--556m-elements)
- [Block 3 Tensor Group : ~556M Elements](#block-3-tensor-group--556m-elements)
- [Block 4 Tensor Group : ~556M Elements](#block-4-tensor-group--556m-elements)
- [Block 5 Tensor Group : ~556M Elements](#block-5-tensor-group--556m-elements)
- [Block 6 Tensor Group : ~556M Elements](#block-6-tensor-group--556m-elements)
- [Block 7 Tensor Group : ~556M Elements](#block-7-tensor-group--556m-elements)
- [Block 8 Tensor Group : ~556M Elements](#block-8-tensor-group--556m-elements)
- [Block 9 Tensor Group : ~556M Elements](#block-9-tensor-group--556m-elements)
- [Block 10 Tensor Group : ~556M Elements](#block-10-tensor-group--556m-elements)
- [Block 11 Tensor Group : ~556M Elements](#block-11-tensor-group--556m-elements)
- [Block 12 Tensor Group : ~556M Elements](#block-12-tensor-group--556m-elements)
- [Block 13 Tensor Group : ~556M Elements](#block-13-tensor-group--556m-elements)
- [Block 14 Tensor Group : ~556M Elements](#block-14-tensor-group--556m-elements)
- [Block 15 Tensor Group : ~556M Elements](#block-15-tensor-group--556m-elements)
- [Block 16 Tensor Group : ~556M Elements](#block-16-tensor-group--556m-elements)
- [Block 17 Tensor Group : ~556M Elements](#block-17-tensor-group--556m-elements)
- [Block 18 Tensor Group : ~556M Elements](#block-18-tensor-group--556m-elements)
- [Block 19 Tensor Group : ~556M Elements](#block-19-tensor-group--556m-elements)
- [Block 20 Tensor Group : ~556M Elements](#block-20-tensor-group--556m-elements)
- [Block 21 Tensor Group : ~556M Elements](#block-21-tensor-group--556m-elements)
- [Block 22 Tensor Group : ~556M Elements](#block-22-tensor-group--556m-elements)
- [Block 23 Tensor Group : ~556M Elements](#block-23-tensor-group--556m-elements)
- [Block 24 Tensor Group : ~556M Elements](#block-24-tensor-group--556m-elements)
- [Block 25 Tensor Group : ~556M Elements](#block-25-tensor-group--556m-elements)
- [Block 26 Tensor Group : ~556M Elements](#block-26-tensor-group--556m-elements)
- [Block 27 Tensor Group : ~556M Elements](#block-27-tensor-group--556m-elements)
- [Block 28 Tensor Group : ~556M Elements](#block-28-tensor-group--556m-elements)
- [Block 29 Tensor Group : ~556M Elements](#block-29-tensor-group--556m-elements)
- [Block 30 Tensor Group : ~556M Elements](#block-30-tensor-group--556m-elements)
- [Block 31 Tensor Group : ~556M Elements](#block-31-tensor-group--556m-elements)
- [Block 32 Tensor Group : ~556M Elements](#block-32-tensor-group--556m-elements)
- [Block 33 Tensor Group : ~556M Elements](#block-33-tensor-group--556m-elements)
- [Block 34 Tensor Group : ~556M Elements](#block-34-tensor-group--556m-elements)
- [Block 35 Tensor Group : ~556M Elements](#block-35-tensor-group--556m-elements)
- [Block 36 Tensor Group : ~556M Elements](#block-36-tensor-group--556m-elements)
- [Block 37 Tensor Group : ~556M Elements](#block-37-tensor-group--556m-elements)
### Tensor Data Offset
This table lists each tensor's data offset (relative to the start of the file) and data-segment size, both in bytes. A bits-per-weight cross-check follows the table.
| T_ID | Tensor Layer Name | Data Offset (B) | Data Size (B) |
|-----:|:--------------------------|-----------------:|-----------------:|
| 0 | output.weight | 0x784500 | 0x11300000 |
| 1 | output_norm.weight | 0x11a84500 | 0x5000 |
| 2 | token_embd.weight | 0x11a89500 | 0x11300000 |
| 3 | blk.0.attn_k.weight | 0x22d89500 | 0x1ea000 |
| 4 | blk.0.attn_norm.weight | 0x22f73500 | 0x5000 |
| 5 | blk.0.attn_output.weight | 0x22f78500 | 0xb40000 |
| 6 | blk.0.attn_q.weight | 0x23ab8500 | 0x7a8000 |
| 7 | blk.0.attn_v.weight | 0x24260500 | 0x226000 |
| 8 | blk.0.ffn_down.weight | 0x24486500 | 0x5a00000 |
| 9 | blk.0.ffn_gate.weight | 0x29e86500 | 0x3d40000 |
| 10 | blk.0.ffn_norm.weight | 0x2dbc6500 | 0x5000 |
| 11 | blk.0.ffn_up.weight | 0x2dbcb500 | 0x3d40000 |
| 12 | blk.1.attn_k.weight | 0x3190b500 | 0x1ea000 |
| 13 | blk.1.attn_norm.weight | 0x31af5500 | 0x5000 |
| 14 | blk.1.attn_output.weight | 0x31afa500 | 0xb40000 |
| 15 | blk.1.attn_q.weight | 0x3263a500 | 0x7a8000 |
| 16 | blk.1.attn_v.weight | 0x32de2500 | 0x226000 |
| 17 | blk.1.ffn_down.weight | 0x33008500 | 0x5a00000 |
| 18 | blk.1.ffn_gate.weight | 0x38a08500 | 0x3d40000 |
| 19 | blk.1.ffn_norm.weight | 0x3c748500 | 0x5000 |
| 20 | blk.1.ffn_up.weight | 0x3c74d500 | 0x3d40000 |
| 21 | blk.2.attn_k.weight | 0x4048d500 | 0x1ea000 |
| 22 | blk.2.attn_norm.weight | 0x40677500 | 0x5000 |
| 23 | blk.2.attn_output.weight | 0x4067c500 | 0xb40000 |
| 24 | blk.2.attn_q.weight | 0x411bc500 | 0x7a8000 |
| 25 | blk.2.attn_v.weight | 0x41964500 | 0x226000 |
| 26 | blk.2.ffn_down.weight | 0x41b8a500 | 0x5a00000 |
| 27 | blk.2.ffn_gate.weight | 0x4758a500 | 0x3d40000 |
| 28 | blk.2.ffn_norm.weight | 0x4b2ca500 | 0x5000 |
| 29 | blk.2.ffn_up.weight | 0x4b2cf500 | 0x3d40000 |
| 30 | blk.3.attn_k.weight | 0x4f00f500 | 0x1ea000 |
| 31 | blk.3.attn_norm.weight | 0x4f1f9500 | 0x5000 |
| 32 | blk.3.attn_output.weight | 0x4f1fe500 | 0xb40000 |
| 33 | blk.3.attn_q.weight | 0x4fd3e500 | 0x7a8000 |
| 34 | blk.3.attn_v.weight | 0x504e6500 | 0x226000 |
| 35 | blk.3.ffn_down.weight | 0x5070c500 | 0x5a00000 |
| 36 | blk.3.ffn_gate.weight | 0x5610c500 | 0x3d40000 |
| 37 | blk.3.ffn_norm.weight | 0x59e4c500 | 0x5000 |
| 38 | blk.3.ffn_up.weight | 0x59e51500 | 0x3d40000 |
| 39 | blk.4.attn_k.weight | 0x5db91500 | 0x1ea000 |
| 40 | blk.4.attn_norm.weight | 0x5dd7b500 | 0x5000 |
| 41 | blk.4.attn_output.weight | 0x5dd80500 | 0xb40000 |
| 42 | blk.4.attn_q.weight | 0x5e8c0500 | 0x7a8000 |
| 43 | blk.4.attn_v.weight | 0x5f068500 | 0x226000 |
| 44 | blk.4.ffn_down.weight | 0x5f28e500 | 0x5a00000 |
| 45 | blk.4.ffn_gate.weight | 0x64c8e500 | 0x3d40000 |
| 46 | blk.4.ffn_norm.weight | 0x689ce500 | 0x5000 |
| 47 | blk.4.ffn_up.weight | 0x689d3500 | 0x3d40000 |
| 48 | blk.5.attn_k.weight | 0x6c713500 | 0x1ea000 |
| 49 | blk.5.attn_norm.weight | 0x6c8fd500 | 0x5000 |
| 50 | blk.5.attn_output.weight | 0x6c902500 | 0xb40000 |
| 51 | blk.5.attn_q.weight | 0x6d442500 | 0x7a8000 |
| 52 | blk.5.attn_v.weight | 0x6dbea500 | 0x226000 |
| 53 | blk.5.ffn_down.weight | 0x6de10500 | 0x44c0000 |
| 54 | blk.5.ffn_gate.weight | 0x722d0500 | 0x3d40000 |
| 55 | blk.5.ffn_norm.weight | 0x76010500 | 0x5000 |
| 56 | blk.5.ffn_up.weight | 0x76015500 | 0x3d40000 |
| 57 | blk.6.attn_k.weight | 0x79d55500 | 0x1ea000 |
| 58 | blk.6.attn_norm.weight | 0x79f3f500 | 0x5000 |
| 59 | blk.6.attn_output.weight | 0x79f44500 | 0xb40000 |
| 60 | blk.6.attn_q.weight | 0x7aa84500 | 0x7a8000 |
| 61 | blk.6.attn_v.weight | 0x7b22c500 | 0x226000 |
| 62 | blk.6.ffn_down.weight | 0x7b452500 | 0x44c0000 |
| 63 | blk.6.ffn_gate.weight | 0x7f912500 | 0x3d40000 |
| 64 | blk.6.ffn_norm.weight | 0x83652500 | 0x5000 |
| 65 | blk.6.ffn_up.weight | 0x83657500 | 0x3d40000 |
| 66 | blk.7.attn_k.weight | 0x87397500 | 0x1ea000 |
| 67 | blk.7.attn_norm.weight | 0x87581500 | 0x5000 |
| 68 | blk.7.attn_output.weight | 0x87586500 | 0xb40000 |
| 69 | blk.7.attn_q.weight | 0x880c6500 | 0x7a8000 |
| 70 | blk.7.attn_v.weight | 0x8886e500 | 0x226000 |
| 71 | blk.7.ffn_down.weight | 0x88a94500 | 0x44c0000 |
| 72 | blk.7.ffn_gate.weight | 0x8cf54500 | 0x3d40000 |
| 73 | blk.7.ffn_norm.weight | 0x90c94500 | 0x5000 |
| 74 | blk.7.ffn_up.weight | 0x90c99500 | 0x3d40000 |
| 75 | blk.8.attn_k.weight | 0x949d9500 | 0x1ea000 |
| 76 | blk.8.attn_norm.weight | 0x94bc3500 | 0x5000 |
| 77 | blk.8.attn_output.weight | 0x94bc8500 | 0xb40000 |
| 78 | blk.8.attn_q.weight | 0x95708500 | 0x7a8000 |
| 79 | blk.8.attn_v.weight | 0x95eb0500 | 0x226000 |
| 80 | blk.8.ffn_down.weight | 0x960d6500 | 0x44c0000 |
| 81 | blk.8.ffn_gate.weight | 0x9a596500 | 0x3d40000 |
| 82 | blk.8.ffn_norm.weight | 0x9e2d6500 | 0x5000 |
| 83 | blk.8.ffn_up.weight | 0x9e2db500 | 0x3d40000 |
| 84 | blk.9.attn_k.weight | 0xa201b500 | 0x1ea000 |
| 85 | blk.9.attn_norm.weight | 0xa2205500 | 0x5000 |
| 86 | blk.9.attn_output.weight | 0xa220a500 | 0xb40000 |
| 87 | blk.9.attn_q.weight | 0xa2d4a500 | 0x7a8000 |
| 88 | blk.9.attn_v.weight | 0xa34f2500 | 0x226000 |
| 89 | blk.9.ffn_down.weight | 0xa3718500 | 0x44c0000 |
| 90 | blk.9.ffn_gate.weight | 0xa7bd8500 | 0x3d40000 |
| 91 | blk.9.ffn_norm.weight | 0xab918500 | 0x5000 |
| 92 | blk.9.ffn_up.weight | 0xab91d500 | 0x3d40000 |
| 93 | blk.10.attn_k.weight | 0xaf65d500 | 0x1ea000 |
| 94 | blk.10.attn_norm.weight | 0xaf847500 | 0x5000 |
| 95 | blk.10.attn_output.weight | 0xaf84c500 | 0xb40000 |
| 96 | blk.10.attn_q.weight | 0xb038c500 | 0x7a8000 |
| 97 | blk.10.attn_v.weight | 0xb0b34500 | 0x226000 |
| 98 | blk.10.ffn_down.weight | 0xb0d5a500 | 0x44c0000 |
| 99 | blk.10.ffn_gate.weight | 0xb521a500 | 0x3d40000 |
| 100 | blk.10.ffn_norm.weight | 0xb8f5a500 | 0x5000 |
| 101 | blk.10.ffn_up.weight | 0xb8f5f500 | 0x3d40000 |
| 102 | blk.11.attn_k.weight | 0xbcc9f500 | 0x1ea000 |
| 103 | blk.11.attn_norm.weight | 0xbce89500 | 0x5000 |
| 104 | blk.11.attn_output.weight | 0xbce8e500 | 0xb40000 |
| 105 | blk.11.attn_q.weight | 0xbd9ce500 | 0x7a8000 |
| 106 | blk.11.attn_v.weight | 0xbe176500 | 0x226000 |
| 107 | blk.11.ffn_down.weight | 0xbe39c500 | 0x44c0000 |
| 108 | blk.11.ffn_gate.weight | 0xc285c500 | 0x3d40000 |
| 109 | blk.11.ffn_norm.weight | 0xc659c500 | 0x5000 |
| 110 | blk.11.ffn_up.weight | 0xc65a1500 | 0x3d40000 |
| 111 | blk.12.attn_k.weight | 0xca2e1500 | 0x1ea000 |
| 112 | blk.12.attn_norm.weight | 0xca4cb500 | 0x5000 |
| 113 | blk.12.attn_output.weight | 0xca4d0500 | 0xb40000 |
| 114 | blk.12.attn_q.weight | 0xcb010500 | 0x7a8000 |
| 115 | blk.12.attn_v.weight | 0xcb7b8500 | 0x226000 |
| 116 | blk.12.ffn_down.weight | 0xcb9de500 | 0x44c0000 |
| 117 | blk.12.ffn_gate.weight | 0xcfe9e500 | 0x3d40000 |
| 118 | blk.12.ffn_norm.weight | 0xd3bde500 | 0x5000 |
| 119 | blk.12.ffn_up.weight | 0xd3be3500 | 0x3d40000 |
| 120 | blk.13.attn_k.weight | 0xd7923500 | 0x1ea000 |
| 121 | blk.13.attn_norm.weight | 0xd7b0d500 | 0x5000 |
| 122 | blk.13.attn_output.weight | 0xd7b12500 | 0xb40000 |
| 123 | blk.13.attn_q.weight | 0xd8652500 | 0x7a8000 |
| 124 | blk.13.attn_v.weight | 0xd8dfa500 | 0x226000 |
| 125 | blk.13.ffn_down.weight | 0xd9020500 | 0x44c0000 |
| 126 | blk.13.ffn_gate.weight | 0xdd4e0500 | 0x3d40000 |
| 127 | blk.13.ffn_norm.weight | 0xe1220500 | 0x5000 |
| 128 | blk.13.ffn_up.weight | 0xe1225500 | 0x3d40000 |
| 129 | blk.14.attn_k.weight | 0xe4f65500 | 0x1ea000 |
| 130 | blk.14.attn_norm.weight | 0xe514f500 | 0x5000 |
| 131 | blk.14.attn_output.weight | 0xe5154500 | 0xb40000 |
| 132 | blk.14.attn_q.weight | 0xe5c94500 | 0x7a8000 |
| 133 | blk.14.attn_v.weight | 0xe643c500 | 0x226000 |
| 134 | blk.14.ffn_down.weight | 0xe6662500 | 0x44c0000 |
| 135 | blk.14.ffn_gate.weight | 0xeab22500 | 0x3d40000 |
| 136 | blk.14.ffn_norm.weight | 0xee862500 | 0x5000 |
| 137 | blk.14.ffn_up.weight | 0xee867500 | 0x3d40000 |
| 138 | blk.15.attn_k.weight | 0xf25a7500 | 0x1ea000 |
| 139 | blk.15.attn_norm.weight | 0xf2791500 | 0x5000 |
| 140 | blk.15.attn_output.weight | 0xf2796500 | 0xb40000 |
| 141 | blk.15.attn_q.weight | 0xf32d6500 | 0x7a8000 |
| 142 | blk.15.attn_v.weight | 0xf3a7e500 | 0x226000 |
| 143 | blk.15.ffn_down.weight | 0xf3ca4500 | 0x44c0000 |
| 144 | blk.15.ffn_gate.weight | 0xf8164500 | 0x3d40000 |
| 145 | blk.15.ffn_norm.weight | 0xfbea4500 | 0x5000 |
| 146 | blk.15.ffn_up.weight | 0xfbea9500 | 0x3d40000 |
| 147 | blk.16.attn_k.weight | 0xffbe9500 | 0x1ea000 |
| 148 | blk.16.attn_norm.weight | 0xffdd3500 | 0x5000 |
| 149 | blk.16.attn_output.weight | 0xffdd8500 | 0xb40000 |
| 150 | blk.16.attn_q.weight | 0x100918500 | 0x7a8000 |
| 151 | blk.16.attn_v.weight | 0x1010c0500 | 0x226000 |
| 152 | blk.16.ffn_down.weight | 0x1012e6500 | 0x44c0000 |
| 153 | blk.16.ffn_gate.weight | 0x1057a6500 | 0x3d40000 |
| 154 | blk.16.ffn_norm.weight | 0x1094e6500 | 0x5000 |
| 155 | blk.16.ffn_up.weight | 0x1094eb500 | 0x3d40000 |
| 156 | blk.17.attn_k.weight | 0x10d22b500 | 0x226000 |
| 157 | blk.17.attn_norm.weight | 0x10d451500 | 0x5000 |
| 158 | blk.17.attn_output.weight | 0x10d456500 | 0xb40000 |
| 159 | blk.17.attn_q.weight | 0x10df96500 | 0x898000 |
| 160 | blk.17.attn_v.weight | 0x10e82e500 | 0x2d0000 |
| 161 | blk.17.ffn_down.weight | 0x10eafe500 | 0x44c0000 |
| 162 | blk.17.ffn_gate.weight | 0x112fbe500 | 0x3d40000 |
| 163 | blk.17.ffn_norm.weight | 0x116cfe500 | 0x5000 |
| 164 | blk.17.ffn_up.weight | 0x116d03500 | 0x3d40000 |
| 165 | blk.18.attn_k.weight | 0x11aa43500 | 0x226000 |
| 166 | blk.18.attn_norm.weight | 0x11ac69500 | 0x5000 |
| 167 | blk.18.attn_output.weight | 0x11ac6e500 | 0xb40000 |
| 168 | blk.18.attn_q.weight | 0x11b7ae500 | 0x898000 |
| 169 | blk.18.attn_v.weight | 0x11c046500 | 0x2d0000 |
| 170 | blk.18.ffn_down.weight | 0x11c316500 | 0x44c0000 |
| 171 | blk.18.ffn_gate.weight | 0x1207d6500 | 0x3d40000 |
| 172 | blk.18.ffn_norm.weight | 0x124516500 | 0x5000 |
| 173 | blk.18.ffn_up.weight | 0x12451b500 | 0x3d40000 |
| 174 | blk.19.attn_k.weight | 0x12825b500 | 0x1ea000 |
| 175 | blk.19.attn_norm.weight | 0x128445500 | 0x5000 |
| 176 | blk.19.attn_output.weight | 0x12844a500 | 0xb40000 |
| 177 | blk.19.attn_q.weight | 0x128f8a500 | 0x7a8000 |
| 178 | blk.19.attn_v.weight | 0x129732500 | 0x226000 |
| 179 | blk.19.ffn_down.weight | 0x129958500 | 0x44c0000 |
| 180 | blk.19.ffn_gate.weight | 0x12de18500 | 0x3d40000 |
| 181 | blk.19.ffn_norm.weight | 0x131b58500 | 0x5000 |
| 182 | blk.19.ffn_up.weight | 0x131b5d500 | 0x3d40000 |
| 183 | blk.20.attn_k.weight | 0x13589d500 | 0x226000 |
| 184 | blk.20.attn_norm.weight | 0x135ac3500 | 0x5000 |
| 185 | blk.20.attn_output.weight | 0x135ac8500 | 0xb40000 |
| 186 | blk.20.attn_q.weight | 0x136608500 | 0x898000 |
| 187 | blk.20.attn_v.weight | 0x136ea0500 | 0x2d0000 |
| 188 | blk.20.ffn_down.weight | 0x137170500 | 0x44c0000 |
| 189 | blk.20.ffn_gate.weight | 0x13b630500 | 0x44c0000 |
| 190 | blk.20.ffn_norm.weight | 0x13faf0500 | 0x5000 |
| 191 | blk.20.ffn_up.weight | 0x13faf5500 | 0x44c0000 |
| 192 | blk.21.attn_k.weight | 0x143fb5500 | 0x1ea000 |
| 193 | blk.21.attn_norm.weight | 0x14419f500 | 0x5000 |
| 194 | blk.21.attn_output.weight | 0x1441a4500 | 0xb40000 |
| 195 | blk.21.attn_q.weight | 0x144ce4500 | 0x7a8000 |
| 196 | blk.21.attn_v.weight | 0x14548c500 | 0x226000 |
| 197 | blk.21.ffn_down.weight | 0x1456b2500 | 0x44c0000 |
| 198 | blk.21.ffn_gate.weight | 0x149b72500 | 0x44c0000 |
| 199 | blk.21.ffn_norm.weight | 0x14e032500 | 0x5000 |
| 200 | blk.21.ffn_up.weight | 0x14e037500 | 0x44c0000 |
| 201 | blk.22.attn_k.weight | 0x1524f7500 | 0x226000 |
| 202 | blk.22.attn_norm.weight | 0x15271d500 | 0x5000 |
| 203 | blk.22.attn_output.weight | 0x152722500 | 0xb40000 |
| 204 | blk.22.attn_q.weight | 0x153262500 | 0x898000 |
| 205 | blk.22.attn_v.weight | 0x153afa500 | 0x2d0000 |
| 206 | blk.22.ffn_down.weight | 0x153dca500 | 0x44c0000 |
| 207 | blk.22.ffn_gate.weight | 0x15828a500 | 0x44c0000 |
| 208 | blk.22.ffn_norm.weight | 0x15c74a500 | 0x5000 |
| 209 | blk.22.ffn_up.weight | 0x15c74f500 | 0x44c0000 |
| 210 | blk.23.attn_k.weight | 0x160c0f500 | 0x226000 |
| 211 | blk.23.attn_norm.weight | 0x160e35500 | 0x5000 |
| 212 | blk.23.attn_output.weight | 0x160e3a500 | 0xb40000 |
| 213 | blk.23.attn_q.weight | 0x16197a500 | 0x898000 |
| 214 | blk.23.attn_v.weight | 0x162212500 | 0x2d0000 |
| 215 | blk.23.ffn_down.weight | 0x1624e2500 | 0x44c0000 |
| 216 | blk.23.ffn_gate.weight | 0x1669a2500 | 0x44c0000 |
| 217 | blk.23.ffn_norm.weight | 0x16ae62500 | 0x5000 |
| 218 | blk.23.ffn_up.weight | 0x16ae67500 | 0x44c0000 |
| 219 | blk.24.attn_k.weight | 0x16f327500 | 0x226000 |
| 220 | blk.24.attn_norm.weight | 0x16f54d500 | 0x5000 |
| 221 | blk.24.attn_output.weight | 0x16f552500 | 0xb40000 |
| 222 | blk.24.attn_q.weight | 0x170092500 | 0x898000 |
| 223 | blk.24.attn_v.weight | 0x17092a500 | 0x2d0000 |
| 224 | blk.24.ffn_down.weight | 0x170bfa500 | 0x44c0000 |
| 225 | blk.24.ffn_gate.weight | 0x1750ba500 | 0x44c0000 |
| 226 | blk.24.ffn_norm.weight | 0x17957a500 | 0x5000 |
| 227 | blk.24.ffn_up.weight | 0x17957f500 | 0x44c0000 |
| 228 | blk.25.attn_k.weight | 0x17da3f500 | 0x226000 |
| 229 | blk.25.attn_norm.weight | 0x17dc65500 | 0x5000 |
| 230 | blk.25.attn_output.weight | 0x17dc6a500 | 0xb40000 |
| 231 | blk.25.attn_q.weight | 0x17e7aa500 | 0x898000 |
| 232 | blk.25.attn_v.weight | 0x17f042500 | 0x2d0000 |
| 233 | blk.25.ffn_down.weight | 0x17f312500 | 0x44c0000 |
| 234 | blk.25.ffn_gate.weight | 0x1837d2500 | 0x44c0000 |
| 235 | blk.25.ffn_norm.weight | 0x187c92500 | 0x5000 |
| 236 | blk.25.ffn_up.weight | 0x187c97500 | 0x44c0000 |
| 237 | blk.26.attn_k.weight | 0x18c157500 | 0x226000 |
| 238 | blk.26.attn_norm.weight | 0x18c37d500 | 0x5000 |
| 239 | blk.26.attn_output.weight | 0x18c382500 | 0xb40000 |
| 240 | blk.26.attn_q.weight | 0x18cec2500 | 0x898000 |
| 241 | blk.26.attn_v.weight | 0x18d75a500 | 0x2d0000 |
| 242 | blk.26.ffn_down.weight | 0x18da2a500 | 0x44c0000 |
| 243 | blk.26.ffn_gate.weight | 0x191eea500 | 0x44c0000 |
| 244 | blk.26.ffn_norm.weight | 0x1963aa500 | 0x5000 |
| 245 | blk.26.ffn_up.weight | 0x1963af500 | 0x44c0000 |
| 246 | blk.27.attn_k.weight | 0x19a86f500 | 0x1ea000 |
| 247 | blk.27.attn_norm.weight | 0x19aa59500 | 0x5000 |
| 248 | blk.27.attn_output.weight | 0x19aa5e500 | 0xb40000 |
| 249 | blk.27.attn_q.weight | 0x19b59e500 | 0x7a8000 |
| 250 | blk.27.attn_v.weight | 0x19bd46500 | 0x226000 |
| 251 | blk.27.ffn_down.weight | 0x19bf6c500 | 0x44c0000 |
| 252 | blk.27.ffn_gate.weight | 0x1a042c500 | 0x44c0000 |
| 253 | blk.27.ffn_norm.weight | 0x1a48ec500 | 0x5000 |
| 254 | blk.27.ffn_up.weight | 0x1a48f1500 | 0x44c0000 |
| 255 | blk.28.attn_k.weight | 0x1a8db1500 | 0x226000 |
| 256 | blk.28.attn_norm.weight | 0x1a8fd7500 | 0x5000 |
| 257 | blk.28.attn_output.weight | 0x1a8fdc500 | 0xb40000 |
| 258 | blk.28.attn_q.weight | 0x1a9b1c500 | 0x898000 |
| 259 | blk.28.attn_v.weight | 0x1aa3b4500 | 0x2d0000 |
| 260 | blk.28.ffn_down.weight | 0x1aa684500 | 0x44c0000 |
| 261 | blk.28.ffn_gate.weight | 0x1aeb44500 | 0x44c0000 |
| 262 | blk.28.ffn_norm.weight | 0x1b3004500 | 0x5000 |
| 263 | blk.28.ffn_up.weight | 0x1b3009500 | 0x44c0000 |
| 264 | blk.29.attn_k.weight | 0x1b74c9500 | 0x226000 |
| 265 | blk.29.attn_norm.weight | 0x1b76ef500 | 0x5000 |
| 266 | blk.29.attn_output.weight | 0x1b76f4500 | 0xb40000 |
| 267 | blk.29.attn_q.weight | 0x1b8234500 | 0x898000 |
| 268 | blk.29.attn_v.weight | 0x1b8acc500 | 0x2d0000 |
| 269 | blk.29.ffn_down.weight | 0x1b8d9c500 | 0x44c0000 |
| 270 | blk.29.ffn_gate.weight | 0x1bd25c500 | 0x44c0000 |
| 271 | blk.29.ffn_norm.weight | 0x1c171c500 | 0x5000 |
| 272 | blk.29.ffn_up.weight | 0x1c1721500 | 0x44c0000 |
| 273 | blk.30.attn_k.weight | 0x1c5be1500 | 0x226000 |
| 274 | blk.30.attn_norm.weight | 0x1c5e07500 | 0x5000 |
| 275 | blk.30.attn_output.weight | 0x1c5e0c500 | 0xb40000 |
| 276 | blk.30.attn_q.weight | 0x1c694c500 | 0x898000 |
| 277 | blk.30.attn_v.weight | 0x1c71e4500 | 0x2d0000 |
| 278 | blk.30.ffn_down.weight | 0x1c74b4500 | 0x44c0000 |
| 279 | blk.30.ffn_gate.weight | 0x1cb974500 | 0x44c0000 |
| 280 | blk.30.ffn_norm.weight | 0x1cfe34500 | 0x5000 |
| 281 | blk.30.ffn_up.weight | 0x1cfe39500 | 0x44c0000 |
| 282 | blk.31.attn_k.weight | 0x1d42f9500 | 0x226000 |
| 283 | blk.31.attn_norm.weight | 0x1d451f500 | 0x5000 |
| 284 | blk.31.attn_output.weight | 0x1d4524500 | 0xb40000 |
| 285 | blk.31.attn_q.weight | 0x1d5064500 | 0x898000 |
| 286 | blk.31.attn_v.weight | 0x1d58fc500 | 0x2d0000 |
| 287 | blk.31.ffn_down.weight | 0x1d5bcc500 | 0x44c0000 |
| 288 | blk.31.ffn_gate.weight | 0x1da08c500 | 0x44c0000 |
| 289 | blk.31.ffn_norm.weight | 0x1de54c500 | 0x5000 |
| 290 | blk.31.ffn_up.weight | 0x1de551500 | 0x44c0000 |
| 291 | blk.32.attn_k.weight | 0x1e2a11500 | 0x226000 |
| 292 | blk.32.attn_norm.weight | 0x1e2c37500 | 0x5000 |
| 293 | blk.32.attn_output.weight | 0x1e2c3c500 | 0xb40000 |
| 294 | blk.32.attn_q.weight | 0x1e377c500 | 0x898000 |
| 295 | blk.32.attn_v.weight | 0x1e4014500 | 0x2d0000 |
| 296 | blk.32.ffn_down.weight | 0x1e42e4500 | 0x44c0000 |
| 297 | blk.32.ffn_gate.weight | 0x1e87a4500 | 0x44c0000 |
| 298 | blk.32.ffn_norm.weight | 0x1ecc64500 | 0x5000 |
| 299 | blk.32.ffn_up.weight | 0x1ecc69500 | 0x44c0000 |
| 300 | blk.33.attn_k.weight | 0x1f1129500 | 0x226000 |
| 301 | blk.33.attn_norm.weight | 0x1f134f500 | 0x5000 |
| 302 | blk.33.attn_output.weight | 0x1f1354500 | 0xb40000 |
| 303 | blk.33.attn_q.weight | 0x1f1e94500 | 0x898000 |
| 304 | blk.33.attn_v.weight | 0x1f272c500 | 0x2d0000 |
| 305 | blk.33.ffn_down.weight | 0x1f29fc500 | 0x44c0000 |
| 306 | blk.33.ffn_gate.weight | 0x1f6ebc500 | 0x44c0000 |
| 307 | blk.33.ffn_norm.weight | 0x1fb37c500 | 0x5000 |
| 308 | blk.33.ffn_up.weight | 0x1fb381500 | 0x44c0000 |
| 309 | blk.34.attn_k.weight | 0x1ff841500 | 0x226000 |
| 310 | blk.34.attn_norm.weight | 0x1ffa67500 | 0x5000 |
| 311 | blk.34.attn_output.weight | 0x1ffa6c500 | 0xb40000 |
| 312 | blk.34.attn_q.weight | 0x2005ac500 | 0x898000 |
| 313 | blk.34.attn_v.weight | 0x200e44500 | 0x2d0000 |
| 314 | blk.34.ffn_down.weight | 0x201114500 | 0x44c0000 |
| 315 | blk.34.ffn_gate.weight | 0x2055d4500 | 0x44c0000 |
| 316 | blk.34.ffn_norm.weight | 0x209a94500 | 0x5000 |
| 317 | blk.34.ffn_up.weight | 0x209a99500 | 0x44c0000 |
| 318 | blk.35.attn_k.weight | 0x20df59500 | 0x226000 |
| 319 | blk.35.attn_norm.weight | 0x20e17f500 | 0x5000 |
| 320 | blk.35.attn_output.weight | 0x20e184500 | 0xb40000 |
| 321 | blk.35.attn_q.weight | 0x20ecc4500 | 0x898000 |
| 322 | blk.35.attn_v.weight | 0x20f55c500 | 0x2d0000 |
| 323 | blk.35.ffn_down.weight | 0x20f82c500 | 0x44c0000 |
| 324 | blk.35.ffn_gate.weight | 0x213cec500 | 0x44c0000 |
| 325 | blk.35.ffn_norm.weight | 0x2181ac500 | 0x5000 |
| 326 | blk.35.ffn_up.weight | 0x2181b1500 | 0x44c0000 |
| 327 | blk.36.attn_k.weight | 0x21c671500 | 0x226000 |
| 328 | blk.36.attn_norm.weight | 0x21c897500 | 0x5000 |
| 329 | blk.36.attn_output.weight | 0x21c89c500 | 0xb40000 |
| 330 | blk.36.attn_q.weight | 0x21d3dc500 | 0x898000 |
| 331 | blk.36.attn_v.weight | 0x21dc74500 | 0x2d0000 |
| 332 | blk.36.ffn_down.weight | 0x21df44500 | 0x44c0000 |
| 333 | blk.36.ffn_gate.weight | 0x222404500 | 0x44c0000 |
| 334 | blk.36.ffn_norm.weight | 0x2268c4500 | 0x5000 |
| 335 | blk.36.ffn_up.weight | 0x2268c9500 | 0x44c0000 |
| 336 | blk.37.attn_k.weight | 0x22ad89500 | 0x226000 |
| 337 | blk.37.attn_norm.weight | 0x22afaf500 | 0x5000 |
| 338 | blk.37.attn_output.weight | 0x22afb4500 | 0xb40000 |
| 339 | blk.37.attn_q.weight | 0x22baf4500 | 0x898000 |
| 340 | blk.37.attn_v.weight | 0x22c38c500 | 0x2d0000 |
| 341 | blk.37.ffn_down.weight | 0x22c65c500 | 0x44c0000 |
| 342 | blk.37.ffn_gate.weight | 0x230b1c500 | 0x44c0000 |
| 343 | blk.37.ffn_norm.weight | 0x234fdc500 | 0x5000 |
| 344 | blk.37.ffn_up.weight | 0x234fe1500 | 0x44c0000 |
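Combining the data sizes above with the element counts in the per-block tables below gives the effective bits per weight of each quantization type used in this file. A small sketch with values copied from this dump (the results match the nominal bpw of these ggml types):

```python
# Effective bits per weight from (data size, element count) pairs in this dump.
tensors = {
    "output.weight (IQ3_S)":           (0x11300000, 671_088_640),
    "blk.0.ffn_down.weight (Q4_K)":    (0x05A00000, 167_772_160),
    "blk.0.ffn_gate.weight (IQ3_XXS)": (0x03D40000, 167_772_160),
    "blk.17.attn_v.weight (IQ4_NL)":   (0x002D0000, 5_242_880),
}
for name, (nbytes, nelems) in tensors.items():
    print(f"{name}: {nbytes * 8 / nelems:.4f} bpw")
# IQ3_S -> 3.4375, Q4_K -> 4.5000, IQ3_XXS -> 3.0625, IQ4_NL -> 4.5000
```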
### <a name="base">Base Tensor Group : ~1B Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:-------------------|:---------------------------------|:------------------|:----------------------|:------|
| 0 | output.weight | Output (W) | (~671M) 671088640 | 5120 x 131072 x 1 x 1 | IQ3_S |
| 1 | output_norm.weight | Output Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 2 | token_embd.weight | Token Embedding (W) | (~671M) 671088640 | 5120 x 131072 x 1 x 1 | IQ3_S |
- Total elements in base: ( ~1B) 1342182400
- Percentage of total elements: 5.98%
### <a name="blk_0">Block 0 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:-------------------------|:-----------------------------------------------|:------------------|:----------------------|:--------|
| 3 | blk.0.attn_k.weight | Block 0 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_XXS |
| 4 | blk.0.attn_norm.weight | Block 0 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 5 | blk.0.attn_output.weight | Block 0 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 6 | blk.0.attn_q.weight | Block 0 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_XXS |
| 7 | blk.0.attn_v.weight | Block 0 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 8 | blk.0.ffn_down.weight | Block 0 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
| 9 | blk.0.ffn_gate.weight | Block 0 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
| 10 | blk.0.ffn_norm.weight | Block 0 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 11 | blk.0.ffn_up.weight | Block 0 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
- Total elements in blk.0: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_1">Block 1 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:-------------------------|:-----------------------------------------------|:------------------|:----------------------|:--------|
| 12 | blk.1.attn_k.weight | Block 1 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_XXS |
| 13 | blk.1.attn_norm.weight | Block 1 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 14 | blk.1.attn_output.weight | Block 1 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 15 | blk.1.attn_q.weight | Block 1 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_XXS |
| 16 | blk.1.attn_v.weight | Block 1 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 17 | blk.1.ffn_down.weight | Block 1 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
| 18 | blk.1.ffn_gate.weight | Block 1 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
| 19 | blk.1.ffn_norm.weight | Block 1 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 20 | blk.1.ffn_up.weight | Block 1 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
- Total elements in blk.1: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_2">Block 2 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:-------------------------|:-----------------------------------------------|:------------------|:----------------------|:--------|
| 21 | blk.2.attn_k.weight | Block 2 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_XXS |
| 22 | blk.2.attn_norm.weight | Block 2 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 23 | blk.2.attn_output.weight | Block 2 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 24 | blk.2.attn_q.weight | Block 2 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_XXS |
| 25 | blk.2.attn_v.weight | Block 2 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 26 | blk.2.ffn_down.weight | Block 2 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
| 27 | blk.2.ffn_gate.weight | Block 2 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
| 28 | blk.2.ffn_norm.weight | Block 2 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 29 | blk.2.ffn_up.weight | Block 2 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
- Total elements in blk.2: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_3">Block 3 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:-------------------------|:-----------------------------------------------|:------------------|:----------------------|:--------|
| 30 | blk.3.attn_k.weight | Block 3 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_XXS |
| 31 | blk.3.attn_norm.weight | Block 3 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 32 | blk.3.attn_output.weight | Block 3 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 33 | blk.3.attn_q.weight | Block 3 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_XXS |
| 34 | blk.3.attn_v.weight | Block 3 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 35 | blk.3.ffn_down.weight | Block 3 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
| 36 | blk.3.ffn_gate.weight | Block 3 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
| 37 | blk.3.ffn_norm.weight | Block 3 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 38 | blk.3.ffn_up.weight | Block 3 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
- Total elements in blk.3: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_4">Block 4 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:-------------------------|:-----------------------------------------------|:------------------|:----------------------|:--------|
| 39 | blk.4.attn_k.weight | Block 4 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_XXS |
| 40 | blk.4.attn_norm.weight | Block 4 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 41 | blk.4.attn_output.weight | Block 4 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 42 | blk.4.attn_q.weight | Block 4 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_XXS |
| 43 | blk.4.attn_v.weight | Block 4 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 44 | blk.4.ffn_down.weight | Block 4 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
| 45 | blk.4.ffn_gate.weight | Block 4 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
| 46 | blk.4.ffn_norm.weight | Block 4 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 47 | blk.4.ffn_up.weight | Block 4 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
- Total elements in blk.4: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_5">Block 5 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:-------------------------|:-----------------------------------------------|:------------------|:----------------------|:--------|
| 48 | blk.5.attn_k.weight | Block 5 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_XXS |
| 49 | blk.5.attn_norm.weight | Block 5 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 50 | blk.5.attn_output.weight | Block 5 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 51 | blk.5.attn_q.weight | Block 5 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_XXS |
| 52 | blk.5.attn_v.weight | Block 5 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 53 | blk.5.ffn_down.weight | Block 5 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 54 | blk.5.ffn_gate.weight | Block 5 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
| 55 | blk.5.ffn_norm.weight | Block 5 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 56 | blk.5.ffn_up.weight | Block 5 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
- Total elements in blk.5: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_6">Block 6 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:-------------------------|:-----------------------------------------------|:------------------|:----------------------|:--------|
| 57 | blk.6.attn_k.weight | Block 6 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_XXS |
| 58 | blk.6.attn_norm.weight | Block 6 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 59 | blk.6.attn_output.weight | Block 6 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 60 | blk.6.attn_q.weight | Block 6 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_XXS |
| 61 | blk.6.attn_v.weight | Block 6 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 62 | blk.6.ffn_down.weight | Block 6 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 63 | blk.6.ffn_gate.weight | Block 6 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
| 64 | blk.6.ffn_norm.weight | Block 6 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 65 | blk.6.ffn_up.weight | Block 6 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
- Total elements in blk.6: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_7">Block 7 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:-------------------------|:-----------------------------------------------|:------------------|:----------------------|:--------|
| 66 | blk.7.attn_k.weight | Block 7 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_XXS |
| 67 | blk.7.attn_norm.weight | Block 7 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 68 | blk.7.attn_output.weight | Block 7 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 69 | blk.7.attn_q.weight | Block 7 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_XXS |
| 70 | blk.7.attn_v.weight | Block 7 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 71 | blk.7.ffn_down.weight | Block 7 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 72 | blk.7.ffn_gate.weight | Block 7 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
| 73 | blk.7.ffn_norm.weight | Block 7 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 74 | blk.7.ffn_up.weight | Block 7 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
- Total elements in blk.7: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_8">Block 8 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:-------------------------|:-----------------------------------------------|:------------------|:----------------------|:--------|
| 75 | blk.8.attn_k.weight | Block 8 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_XXS |
| 76 | blk.8.attn_norm.weight | Block 8 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 77 | blk.8.attn_output.weight | Block 8 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 78 | blk.8.attn_q.weight | Block 8 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_XXS |
| 79 | blk.8.attn_v.weight | Block 8 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 80 | blk.8.ffn_down.weight | Block 8 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 81 | blk.8.ffn_gate.weight | Block 8 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
| 82 | blk.8.ffn_norm.weight | Block 8 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 83 | blk.8.ffn_up.weight | Block 8 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
- Total elements in blk.8: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_9">Block 9 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:-------------------------|:-----------------------------------------------|:------------------|:----------------------|:--------|
| 84 | blk.9.attn_k.weight | Block 9 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_XXS |
| 85 | blk.9.attn_norm.weight | Block 9 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 86 | blk.9.attn_output.weight | Block 9 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 87 | blk.9.attn_q.weight | Block 9 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_XXS |
| 88 | blk.9.attn_v.weight | Block 9 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 89 | blk.9.ffn_down.weight | Block 9 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 90 | blk.9.ffn_gate.weight | Block 9 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
| 91 | blk.9.ffn_norm.weight | Block 9 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 92 | blk.9.ffn_up.weight | Block 9 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
- Total elements in blk.9: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_10">Block 10 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:--------|
| 93 | blk.10.attn_k.weight | Block 10 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_XXS |
| 94 | blk.10.attn_norm.weight | Block 10 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 95 | blk.10.attn_output.weight | Block 10 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 96 | blk.10.attn_q.weight | Block 10 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_XXS |
| 97 | blk.10.attn_v.weight | Block 10 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 98 | blk.10.ffn_down.weight | Block 10 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 99 | blk.10.ffn_gate.weight | Block 10 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
| 100 | blk.10.ffn_norm.weight | Block 10 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 101 | blk.10.ffn_up.weight | Block 10 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
- Total elements in blk.10: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_11">Block 11 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:--------|
| 102 | blk.11.attn_k.weight | Block 11 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_XXS |
| 103 | blk.11.attn_norm.weight | Block 11 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 104 | blk.11.attn_output.weight | Block 11 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 105 | blk.11.attn_q.weight | Block 11 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_XXS |
| 106 | blk.11.attn_v.weight | Block 11 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 107 | blk.11.ffn_down.weight | Block 11 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 108 | blk.11.ffn_gate.weight | Block 11 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
| 109 | blk.11.ffn_norm.weight | Block 11 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 110 | blk.11.ffn_up.weight | Block 11 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
- Total elements in blk.11: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_12">Block 12 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:--------|
| 111 | blk.12.attn_k.weight | Block 12 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_XXS |
| 112 | blk.12.attn_norm.weight | Block 12 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 113 | blk.12.attn_output.weight | Block 12 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 114 | blk.12.attn_q.weight | Block 12 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_XXS |
| 115 | blk.12.attn_v.weight | Block 12 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 116 | blk.12.ffn_down.weight | Block 12 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 117 | blk.12.ffn_gate.weight | Block 12 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
| 118 | blk.12.ffn_norm.weight | Block 12 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 119 | blk.12.ffn_up.weight | Block 12 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
- Total elements in blk.12: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_13">Block 13 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:--------|
| 120 | blk.13.attn_k.weight | Block 13 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_XXS |
| 121 | blk.13.attn_norm.weight | Block 13 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 122 | blk.13.attn_output.weight | Block 13 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 123 | blk.13.attn_q.weight | Block 13 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_XXS |
| 124 | blk.13.attn_v.weight | Block 13 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 125 | blk.13.ffn_down.weight | Block 13 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 126 | blk.13.ffn_gate.weight | Block 13 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
| 127 | blk.13.ffn_norm.weight | Block 13 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 128 | blk.13.ffn_up.weight | Block 13 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
- Total elements in blk.13: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_14">Block 14 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:--------|
| 129 | blk.14.attn_k.weight | Block 14 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_XXS |
| 130 | blk.14.attn_norm.weight | Block 14 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 131 | blk.14.attn_output.weight | Block 14 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 132 | blk.14.attn_q.weight | Block 14 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_XXS |
| 133 | blk.14.attn_v.weight | Block 14 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 134 | blk.14.ffn_down.weight | Block 14 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 135 | blk.14.ffn_gate.weight | Block 14 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
| 136 | blk.14.ffn_norm.weight | Block 14 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 137 | blk.14.ffn_up.weight | Block 14 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
- Total elements in blk.14: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_15">Block 15 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:--------|
| 138 | blk.15.attn_k.weight | Block 15 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_XXS |
| 139 | blk.15.attn_norm.weight | Block 15 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 140 | blk.15.attn_output.weight | Block 15 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 141 | blk.15.attn_q.weight | Block 15 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_XXS |
| 142 | blk.15.attn_v.weight | Block 15 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 143 | blk.15.ffn_down.weight | Block 15 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 144 | blk.15.ffn_gate.weight | Block 15 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
| 145 | blk.15.ffn_norm.weight | Block 15 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 146 | blk.15.ffn_up.weight | Block 15 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
- Total elements in blk.15: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_16">Block 16 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:--------|
| 147 | blk.16.attn_k.weight | Block 16 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_XXS |
| 148 | blk.16.attn_norm.weight | Block 16 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 149 | blk.16.attn_output.weight | Block 16 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 150 | blk.16.attn_q.weight | Block 16 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_XXS |
| 151 | blk.16.attn_v.weight | Block 16 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 152 | blk.16.ffn_down.weight | Block 16 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 153 | blk.16.ffn_gate.weight | Block 16 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
| 154 | blk.16.ffn_norm.weight | Block 16 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 155 | blk.16.ffn_up.weight | Block 16 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
- Total elements in blk.16: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_17">Block 17 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:--------|
| 156 | blk.17.attn_k.weight | Block 17 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 157 | blk.17.attn_norm.weight | Block 17 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 158 | blk.17.attn_output.weight | Block 17 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 159 | blk.17.attn_q.weight | Block 17 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_S |
| 160 | blk.17.attn_v.weight | Block 17 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ4_NL |
| 161 | blk.17.ffn_down.weight | Block 17 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 162 | blk.17.ffn_gate.weight | Block 17 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
| 163 | blk.17.ffn_norm.weight | Block 17 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 164 | blk.17.ffn_up.weight | Block 17 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
- Total elements in blk.17: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_18">Block 18 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:--------|
| 165 | blk.18.attn_k.weight | Block 18 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 166 | blk.18.attn_norm.weight | Block 18 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 167 | blk.18.attn_output.weight | Block 18 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 168 | blk.18.attn_q.weight | Block 18 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_S |
| 169 | blk.18.attn_v.weight | Block 18 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ4_NL |
| 170 | blk.18.ffn_down.weight | Block 18 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 171 | blk.18.ffn_gate.weight | Block 18 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
| 172 | blk.18.ffn_norm.weight | Block 18 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 173 | blk.18.ffn_up.weight | Block 18 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
- Total elements in blk.18: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_19">Block 19 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:--------|
| 174 | blk.19.attn_k.weight | Block 19 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_XXS |
| 175 | blk.19.attn_norm.weight | Block 19 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 176 | blk.19.attn_output.weight | Block 19 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 177 | blk.19.attn_q.weight | Block 19 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_XXS |
| 178 | blk.19.attn_v.weight | Block 19 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 179 | blk.19.ffn_down.weight | Block 19 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 180 | blk.19.ffn_gate.weight | Block 19 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
| 181 | blk.19.ffn_norm.weight | Block 19 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 182 | blk.19.ffn_up.weight | Block 19 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_XXS |
- Total elements in blk.19: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_20">Block 20 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:-------|
| 183 | blk.20.attn_k.weight | Block 20 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 184 | blk.20.attn_norm.weight | Block 20 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 185 | blk.20.attn_output.weight | Block 20 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 186 | blk.20.attn_q.weight | Block 20 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_S |
| 187 | blk.20.attn_v.weight | Block 20 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ4_NL |
| 188 | blk.20.ffn_down.weight | Block 20 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 189 | blk.20.ffn_gate.weight | Block 20 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
| 190 | blk.20.ffn_norm.weight | Block 20 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 191 | blk.20.ffn_up.weight | Block 20 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
- Total elements in blk.20: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_21">Block 21 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:--------|
| 192 | blk.21.attn_k.weight | Block 21 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_XXS |
| 193 | blk.21.attn_norm.weight | Block 21 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 194 | blk.21.attn_output.weight | Block 21 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 195 | blk.21.attn_q.weight | Block 21 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_XXS |
| 196 | blk.21.attn_v.weight | Block 21 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 197 | blk.21.ffn_down.weight | Block 21 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 198 | blk.21.ffn_gate.weight | Block 21 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
| 199 | blk.21.ffn_norm.weight | Block 21 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 200 | blk.21.ffn_up.weight | Block 21 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
- Total elements in blk.21: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_22">Block 22 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:-------|
| 201 | blk.22.attn_k.weight | Block 22 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 202 | blk.22.attn_norm.weight | Block 22 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 203 | blk.22.attn_output.weight | Block 22 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 204 | blk.22.attn_q.weight | Block 22 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_S |
| 205 | blk.22.attn_v.weight | Block 22 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ4_NL |
| 206 | blk.22.ffn_down.weight | Block 22 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 207 | blk.22.ffn_gate.weight | Block 22 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
| 208 | blk.22.ffn_norm.weight | Block 22 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 209 | blk.22.ffn_up.weight | Block 22 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
- Total elements in blk.22: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_23">Block 23 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:-------|
| 210 | blk.23.attn_k.weight | Block 23 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 211 | blk.23.attn_norm.weight | Block 23 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 212 | blk.23.attn_output.weight | Block 23 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 213 | blk.23.attn_q.weight | Block 23 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_S |
| 214 | blk.23.attn_v.weight | Block 23 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ4_NL |
| 215 | blk.23.ffn_down.weight | Block 23 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 216 | blk.23.ffn_gate.weight | Block 23 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
| 217 | blk.23.ffn_norm.weight | Block 23 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 218 | blk.23.ffn_up.weight | Block 23 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
- Total elements in blk.23: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_24">Block 24 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:-------|
| 219 | blk.24.attn_k.weight | Block 24 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 220 | blk.24.attn_norm.weight | Block 24 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 221 | blk.24.attn_output.weight | Block 24 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 222 | blk.24.attn_q.weight | Block 24 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_S |
| 223 | blk.24.attn_v.weight | Block 24 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ4_NL |
| 224 | blk.24.ffn_down.weight | Block 24 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 225 | blk.24.ffn_gate.weight | Block 24 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
| 226 | blk.24.ffn_norm.weight | Block 24 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 227 | blk.24.ffn_up.weight | Block 24 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
- Total elements in blk.24: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_25">Block 25 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:-------|
| 228 | blk.25.attn_k.weight | Block 25 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 229 | blk.25.attn_norm.weight | Block 25 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 230 | blk.25.attn_output.weight | Block 25 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 231 | blk.25.attn_q.weight | Block 25 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_S |
| 232 | blk.25.attn_v.weight | Block 25 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ4_NL |
| 233 | blk.25.ffn_down.weight | Block 25 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 234 | blk.25.ffn_gate.weight | Block 25 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
| 235 | blk.25.ffn_norm.weight | Block 25 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 236 | blk.25.ffn_up.weight | Block 25 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
- Total elements in blk.25: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_26">Block 26 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:-------|
| 237 | blk.26.attn_k.weight | Block 26 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 238 | blk.26.attn_norm.weight | Block 26 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 239 | blk.26.attn_output.weight | Block 26 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 240 | blk.26.attn_q.weight | Block 26 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_S |
| 241 | blk.26.attn_v.weight | Block 26 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ4_NL |
| 242 | blk.26.ffn_down.weight | Block 26 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 243 | blk.26.ffn_gate.weight | Block 26 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
| 244 | blk.26.ffn_norm.weight | Block 26 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 245 | blk.26.ffn_up.weight | Block 26 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
- Total elements in blk.26: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_27">Block 27 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:--------|
| 246 | blk.27.attn_k.weight | Block 27 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_XXS |
| 247 | blk.27.attn_norm.weight | Block 27 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 248 | blk.27.attn_output.weight | Block 27 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 249 | blk.27.attn_q.weight | Block 27 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_XXS |
| 250 | blk.27.attn_v.weight | Block 27 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 251 | blk.27.ffn_down.weight | Block 27 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 252 | blk.27.ffn_gate.weight | Block 27 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
| 253 | blk.27.ffn_norm.weight | Block 27 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 254 | blk.27.ffn_up.weight | Block 27 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
- Total elements in blk.27: (~556M) 555755520
- Percentage of total elements: 2.47%
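Within this span the attention quantization is not uniform: blocks 19, 21, and 27 step attn_q/attn_k down to IQ3_XXS and attn_v to IQ3_S, while the surrounding blocks keep IQ3_S queries/keys with an IQ4_NL value projection. A rough estimate of what that trade buys per block, using the nominal bits-per-weight implied by llama.cpp's quant block layouts (IQ3_XXS ≈ 3.06, IQ3_S ≈ 3.44, IQ4_NL and Q4_K = 4.5 bpw; treat these constants as assumptions and verify them against your ggml version):

```python
# Nominal bits-per-weight from llama.cpp's quant block layouts (assumed).
BPW = {"IQ3_XXS": 3.0625, "IQ3_S": 3.4375, "IQ4_NL": 4.5, "Q4_K": 4.5}

# Attention element counts shared by every block in this span.
attn = {"attn_q": 20971520, "attn_k": 5242880,
        "attn_v": 5242880, "attn_output": 20971520}

mix_common = {"attn_q": "IQ3_S",   "attn_k": "IQ3_S",
              "attn_v": "IQ4_NL",  "attn_output": "Q4_K"}   # e.g. blk.20
mix_light  = {"attn_q": "IQ3_XXS", "attn_k": "IQ3_XXS",
              "attn_v": "IQ3_S",   "attn_output": "Q4_K"}   # blk.19/21/27

def attn_bytes(mix: dict) -> float:
    """Approximate storage for one block's attention weights, in bytes."""
    return sum(n * BPW[mix[t]] / 8 for t, n in attn.items())

print(f"common mix: {attn_bytes(mix_common) / 2**20:.1f} MiB")  # ~24.8 MiB
print(f"light mix:  {attn_bytes(mix_light)  / 2**20:.1f} MiB")  # ~23.0 MiB
```

Under these assumptions the lighter mix saves roughly 1.8 MiB of attention weights per block, at the cost of coarser 3-bit codebooks in those layers.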
### <a name="blk_28">Block 28 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:-------|
| 255 | blk.28.attn_k.weight | Block 28 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 256 | blk.28.attn_norm.weight | Block 28 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 257 | blk.28.attn_output.weight | Block 28 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 258 | blk.28.attn_q.weight | Block 28 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_S |
| 259 | blk.28.attn_v.weight | Block 28 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ4_NL |
| 260 | blk.28.ffn_down.weight | Block 28 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 261 | blk.28.ffn_gate.weight | Block 28 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
| 262 | blk.28.ffn_norm.weight | Block 28 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 263 | blk.28.ffn_up.weight | Block 28 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
- Total elements in blk.28: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_29">Block 29 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:-------|
| 264 | blk.29.attn_k.weight | Block 29 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 265 | blk.29.attn_norm.weight | Block 29 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 266 | blk.29.attn_output.weight | Block 29 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 267 | blk.29.attn_q.weight | Block 29 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_S |
| 268 | blk.29.attn_v.weight | Block 29 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ4_NL |
| 269 | blk.29.ffn_down.weight | Block 29 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 270 | blk.29.ffn_gate.weight | Block 29 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
| 271 | blk.29.ffn_norm.weight | Block 29 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 272 | blk.29.ffn_up.weight | Block 29 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
- Total elements in blk.29: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_30">Block 30 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:-------|
| 273 | blk.30.attn_k.weight | Block 30 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 274 | blk.30.attn_norm.weight | Block 30 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 275 | blk.30.attn_output.weight | Block 30 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 276 | blk.30.attn_q.weight | Block 30 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_S |
| 277 | blk.30.attn_v.weight | Block 30 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ4_NL |
| 278 | blk.30.ffn_down.weight | Block 30 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 279 | blk.30.ffn_gate.weight | Block 30 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
| 280 | blk.30.ffn_norm.weight | Block 30 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 281 | blk.30.ffn_up.weight | Block 30 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
- Total elements in blk.30: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_31">Block 31 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:-------|
| 282 | blk.31.attn_k.weight | Block 31 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 283 | blk.31.attn_norm.weight | Block 31 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 284 | blk.31.attn_output.weight | Block 31 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 285 | blk.31.attn_q.weight | Block 31 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_S |
| 286 | blk.31.attn_v.weight | Block 31 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ4_NL |
| 287 | blk.31.ffn_down.weight | Block 31 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 288 | blk.31.ffn_gate.weight | Block 31 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
| 289 | blk.31.ffn_norm.weight | Block 31 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 290 | blk.31.ffn_up.weight | Block 31 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
- Total elements in blk.31: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_32">Block 32 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:-------|
| 291 | blk.32.attn_k.weight | Block 32 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 292 | blk.32.attn_norm.weight | Block 32 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 293 | blk.32.attn_output.weight | Block 32 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 294 | blk.32.attn_q.weight | Block 32 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_S |
| 295 | blk.32.attn_v.weight | Block 32 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ4_NL |
| 296 | blk.32.ffn_down.weight | Block 32 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 297 | blk.32.ffn_gate.weight | Block 32 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
| 298 | blk.32.ffn_norm.weight | Block 32 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 299 | blk.32.ffn_up.weight | Block 32 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
- Total elements in blk.32: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_33">Block 33 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:-------|
| 300 | blk.33.attn_k.weight | Block 33 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 301 | blk.33.attn_norm.weight | Block 33 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 302 | blk.33.attn_output.weight | Block 33 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 303 | blk.33.attn_q.weight | Block 33 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_S |
| 304 | blk.33.attn_v.weight | Block 33 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ4_NL |
| 305 | blk.33.ffn_down.weight | Block 33 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 306 | blk.33.ffn_gate.weight | Block 33 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
| 307 | blk.33.ffn_norm.weight | Block 33 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 308 | blk.33.ffn_up.weight | Block 33 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
- Total elements in blk.33: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_34">Block 34 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:-------|
| 309 | blk.34.attn_k.weight | Block 34 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 310 | blk.34.attn_norm.weight | Block 34 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 311 | blk.34.attn_output.weight | Block 34 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 312 | blk.34.attn_q.weight | Block 34 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_S |
| 313 | blk.34.attn_v.weight | Block 34 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ4_NL |
| 314 | blk.34.ffn_down.weight | Block 34 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 315 | blk.34.ffn_gate.weight | Block 34 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
| 316 | blk.34.ffn_norm.weight | Block 34 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 317 | blk.34.ffn_up.weight | Block 34 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
- Total elements in blk.34: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_35">Block 35 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:-------|
| 318 | blk.35.attn_k.weight | Block 35 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 319 | blk.35.attn_norm.weight | Block 35 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 320 | blk.35.attn_output.weight | Block 35 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 321 | blk.35.attn_q.weight | Block 35 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_S |
| 322 | blk.35.attn_v.weight | Block 35 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ4_NL |
| 323 | blk.35.ffn_down.weight | Block 35 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 324 | blk.35.ffn_gate.weight | Block 35 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
| 325 | blk.35.ffn_norm.weight | Block 35 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 326 | blk.35.ffn_up.weight | Block 35 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
- Total elements in blk.35: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_36">Block 36 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:-------|
| 327 | blk.36.attn_k.weight | Block 36 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 328 | blk.36.attn_norm.weight | Block 36 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 329 | blk.36.attn_output.weight | Block 36 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 330 | blk.36.attn_q.weight | Block 36 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_S |
| 331 | blk.36.attn_v.weight | Block 36 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ4_NL |
| 332 | blk.36.ffn_down.weight | Block 36 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 333 | blk.36.ffn_gate.weight | Block 36 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
| 334 | blk.36.ffn_norm.weight | Block 36 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 335 | blk.36.ffn_up.weight | Block 36 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
- Total elements in blk.36: (~556M) 555755520
- Percentage of total elements: 2.47%
### <a name="blk_37">Block 37 Tensor Group : ~556M Elements</a>
| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|-----:|:--------------------------|:------------------------------------------------|:------------------|:----------------------|:-------|
| 336 | blk.37.attn_k.weight | Block 37 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ3_S |
| 337 | blk.37.attn_norm.weight | Block 37 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 338 | blk.37.attn_output.weight | Block 37 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
| 339 | blk.37.attn_q.weight | Block 37 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | IQ3_S |
| 340 | blk.37.attn_v.weight | Block 37 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | IQ4_NL |
| 341 | blk.37.ffn_down.weight | Block 37 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | IQ3_S |
| 342 | blk.37.ffn_gate.weight | Block 37 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
| 343 | blk.37.ffn_norm.weight | Block 37 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
| 344 | blk.37.ffn_up.weight | Block 37 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | IQ3_S |
- Total elements in blk.37: (~556M) 555755520
- Percentage of total elements: 2.47%
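All of the per-block figures in this section can be re-derived programmatically. A hedged sketch using gguf-py, the reader library shipped with llama.cpp; it assumes `GGUFReader` exposes tensors with `name`, `tensor_type`, and `n_elements` attributes, which matches current gguf-py but is worth verifying against your installed version:

```python
from collections import defaultdict

from gguf import GGUFReader  # gguf-py, shipped with llama.cpp

reader = GGUFReader("Dolphin-Mistral-24B-Venice-Edition-pruned-IQ3_M.gguf")

per_block = defaultdict(int)  # block index -> element count
quant_mix = defaultdict(int)  # quant type name -> tensor count
for t in reader.tensors:
    if t.name.startswith("blk."):
        per_block[int(t.name.split(".")[1])] += int(t.n_elements)
    quant_mix[t.tensor_type.name] += 1

model_total = sum(int(t.n_elements) for t in reader.tensors)
for blk in sorted(per_block):
    pct = 100 * per_block[blk] / model_total
    print(f"blk.{blk}: {per_block[blk]} ({pct:.2f}%)")  # e.g. 555755520 (2.47%)
print(dict(quant_mix))  # tensor counts per quant type (IQ3_S, IQ4_NL, Q4_K, F32, ...)
```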