Initialize the project; model provided by the ModelHub XC community

Model: jfarray/Model_distiluse-base-multilingual-cased-v1_1_Epochs
Source: Original Platform
Author: ModelHub XC
Date: 2026-05-13 18:20:57 +08:00
Commit: 04aa5f3c57
16 changed files with 239458 additions and 0 deletions

29
.gitattributes vendored Normal file

@@ -0,0 +1,29 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bin.* filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zstandard filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
pytorch_model.bin filter=lfs diff=lfs merge=lfs -text
.git/lfs/objects/ec/d9/ecd92f4c8476a369fae14fcc6f6e88cc4a5db04f66040000017b9c46b256dcfa filter=lfs diff=lfs merge=lfs -text

7
1_Pooling/config.json Normal file

@@ -0,0 +1,7 @@
{
"word_embedding_dimension": 768,
"pooling_mode_cls_token": false,
"pooling_mode_mean_tokens": true,
"pooling_mode_max_tokens": false,
"pooling_mode_mean_sqrt_len_tokens": false
}
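This pooling config enables mean pooling only: the transformer's token embeddings are averaged, weighted by the attention mask, into a single 768-dimensional sentence vector. A minimal PyTorch sketch of that operation (tensor names are illustrative, not from the repo):
```python
import torch

def mean_pool(token_embeddings: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    """Average token embeddings, ignoring padding positions.

    token_embeddings: (batch, seq_len, 768) output of the transformer
    attention_mask:   (batch, seq_len), 1 for real tokens, 0 for padding
    """
    mask = attention_mask.unsqueeze(-1).type_as(token_embeddings)  # (batch, seq_len, 1)
    summed = (token_embeddings * mask).sum(dim=1)   # zero out padding, then sum over tokens
    counts = mask.sum(dim=1).clamp(min=1e-9)        # number of real tokens per sentence
    return summed / counts                          # (batch, 768) mean-pooled vectors
```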

1
2_Dense/config.json Normal file

@@ -0,0 +1 @@
{"in_features": 768, "out_features": 512, "bias": true, "activation_function": "torch.nn.modules.activation.Tanh"}

3
2_Dense/pytorch_model.bin Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a48c18a6b1c5cc997b8e2c450cc635fd6ab5623a3b663bd6df90d89d1d58abb0
size 1575975

87
README.md Normal file

@@ -0,0 +1,87 @@
---
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
---
# {MODEL_NAME}
This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 512-dimensional dense vector space and can be used for tasks like clustering or semantic search.
<!--- Describe your model here -->
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('{MODEL_NAME}')
embeddings = model.encode(sentences)
print(embeddings)
```
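Once you have embeddings, a natural next step (not part of the generated card) is to compare them, for example with cosine similarity via the library's `util` helpers, reusing `model` and `embeddings` from the snippet above:
```python
from sentence_transformers import util

# Cosine similarity between the two example sentences encoded above
score = util.cos_sim(embeddings[0], embeddings[1])
print(score)  # 1x1 tensor; values close to 1.0 indicate near-identical meaning
```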
## Evaluation Results
<!--- Describe how your model was evaluated -->
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name={MODEL_NAME})
## Training
The model was trained with the parameters:
**DataLoader**:
`torch.utils.data.dataloader.DataLoader` of length 11 with parameters:
```
{'batch_size': 15, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```
**Loss**:
`sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss`
Parameters of the fit() method:
```
{
"epochs": 1,
"evaluation_steps": 1,
"evaluator": "sentence_transformers.evaluation.EmbeddingSimilarityEvaluator.EmbeddingSimilarityEvaluator",
"max_grad_norm": 1,
"optimizer_class": "<class 'transformers.optimization.AdamW'>",
"optimizer_params": {
"lr": 2e-05
},
"scheduler": "WarmupLinear",
"steps_per_epoch": null,
"warmup_steps": 2,
"weight_decay": 0.01
}
```
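With a batch size of 15 and a DataLoader of length 11, the training set held roughly 160-165 sentence pairs. A hedged reconstruction of the `fit()` call these parameters imply, with toy data standing in for the (unpublished) training and dev sets:
```python
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, InputExample, losses
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer('distiluse-base-multilingual-cased-v1')  # base checkpoint per config.json
train_examples = [InputExample(texts=['A sentence', 'A similar sentence'], label=0.9)]  # toy stand-in
train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=15)
train_loss = losses.CosineSimilarityLoss(model)
dev_evaluator = EmbeddingSimilarityEvaluator(['A sentence'], ['Another sentence'], [0.5])  # toy dev set

model.fit(
    train_objectives=[(train_dataloader, train_loss)],
    evaluator=dev_evaluator,
    epochs=1,
    evaluation_steps=1,
    warmup_steps=2,
    scheduler='WarmupLinear',
    optimizer_params={'lr': 2e-05},
    weight_decay=0.01,
    max_grad_norm=1,
)
```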
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: DistilBertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
(2): Dense({'in_features': 768, 'out_features': 512, 'bias': True, 'activation_function': 'torch.nn.modules.activation.Tanh'})
)
```
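The same three-module stack can be assembled by hand with the `models` API; a sketch assuming the underlying HF checkpoint is `distilbert-base-multilingual-cased`, which matches the architecture and vocab size in config.json:
```python
import torch.nn as nn
from sentence_transformers import SentenceTransformer, models

word = models.Transformer('distilbert-base-multilingual-cased', max_seq_length=128)
pool = models.Pooling(word.get_word_embedding_dimension(), pooling_mode_mean_tokens=True)
dense = models.Dense(in_features=768, out_features=512, bias=True,
                     activation_function=nn.Tanh())
model = SentenceTransformer(modules=[word, pool, dense])
```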
## Citing & Authors
<!--- Describe where people can find more information -->

24
config.json Normal file

@@ -0,0 +1,24 @@
{
"_name_or_path": "/root/.cache/torch/sentence_transformers/sentence-transformers_distiluse-base-multilingual-cased-v1/",
"activation": "gelu",
"architectures": [
"DistilBertModel"
],
"attention_dropout": 0.1,
"dim": 768,
"dropout": 0.1,
"hidden_dim": 3072,
"initializer_range": 0.02,
"max_position_embeddings": 512,
"model_type": "distilbert",
"n_heads": 12,
"n_layers": 6,
"pad_token_id": 0,
"qa_dropout": 0.1,
"seq_classif_dropout": 0.2,
"sinusoidal_pos_embds": false,
"tie_weights_": true,
"torch_dtype": "float32",
"transformers_version": "4.18.0",
"vocab_size": 119547
}
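These hyperparameters describe a 6-layer, 12-head DistilBERT with a 119547-token multilingual vocabulary. They can be materialized directly with transformers; a sketch that builds an untrained model of the same shape (the trained weights live in pytorch_model.bin):
```python
from transformers import DistilBertConfig, DistilBertModel

config = DistilBertConfig(
    vocab_size=119547, dim=768, hidden_dim=3072,
    n_layers=6, n_heads=12, max_position_embeddings=512,
    activation='gelu', dropout=0.1, attention_dropout=0.1,
)
model = DistilBertModel(config)  # randomly initialized; load the checkpoint for real weights
```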

7
config_sentence_transformers.json Normal file

@@ -0,0 +1,7 @@
{
"__version__": {
"sentence_transformers": "2.0.0",
"transformers": "4.7.0",
"pytorch": "1.9.0+cu102"
}
}

13
eval/similarity_evaluation_results.csv Normal file

@@ -0,0 +1,13 @@
epoch,steps,cosine_pearson,cosine_spearman,euclidean_pearson,euclidean_spearman,manhattan_pearson,manhattan_spearman,dot_pearson,dot_spearman
0,1,0.30257183823495615,0.4246848658033177,0.30613213198210787,0.3839964954269519,0.291456965945577,0.33695056717927896,0.31795452080383496,0.37382440283286045
0,2,0.294302385954249,0.3789104491299062,0.3024660330412783,0.3344075440307561,0.2912926255494201,0.3560232407932004,0.29183787082506146,0.35220870607041616
0,3,0.23060942767317225,0.3394935903278018,0.2648782249711757,0.33059300930797186,0.2595600986796117,0.33059300930797186,0.08016983363513724,0.01652965046539859
0,4,0.1500913832681685,0.07501918288142438,0.20155891167283593,0.3089773125455275,0.19582524606295137,0.3089773125455275,-0.03576855207889765,-0.23395812966410312
0,5,0.10590610515789566,-0.01652965046539859,0.1570469993841901,0.02797325463375146,0.14697116099262056,0.02797325463375146,-0.08120755264418195,-0.18818371299069164
0,6,0.09794280000746425,-0.05848953241602578,0.13845254857011052,-0.031787789356535756,0.12433267126350725,-0.057218020841764354,-0.08688343300888521,-0.18182615511938452
0,7,0.08800609490925691,-0.17038255095103164,0.12031926904098608,-0.06357557871307151,0.10193902320001877,-0.043231393524888626,-0.09148081809081532,-0.18182615511938452
0,8,0.08718976858275106,-0.2326866180898417,0.11210240323047707,-0.07374767130716295,0.09108412190754916,-0.05594650926750292,-0.08968672750349271,-0.17419708567381592
0,9,0.08961820230553723,-0.23141510651558028,0.11088987471668936,-0.06993313658437866,0.08777226195941984,-0.05594650926750292,-0.08606435628655995,-0.17419708567381592
0,10,0.09282658298408078,-0.21361394447592028,0.11183110646361266,-0.06993313658437866,0.08759029451628514,-0.05594650926750292,-0.08224336973932485,-0.17419708567381592
0,11,0.09298540734162918,-0.21361394447592028,0.11033087656390748,-0.06993313658437866,0.08560911088889919,-0.05594650926750292,-0.08106692131400356,-0.16275348150546307
0,-1,0.09298540734162918,-0.21361394447592028,0.11033087656390748,-0.06993313658437866,0.08560911088889919,-0.05594650926750292,-0.08106692131400356,-0.16275348150546307
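A quick way to inspect this log is to load it with pandas and pick the step with the best dev correlation; the path assumes the filename inferred above:
```python
import pandas as pd

df = pd.read_csv('eval/similarity_evaluation_results.csv')
best = df.loc[df['cosine_spearman'].idxmax()]       # step with highest dev Spearman
print(best[['epoch', 'steps', 'cosine_spearman']])  # here: epoch 0, step 1, ~0.42
```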

20
modules.json Normal file

@@ -0,0 +1,20 @@
[
{
"idx": 0,
"name": "0",
"path": "",
"type": "sentence_transformers.models.Transformer"
},
{
"idx": 1,
"name": "1",
"path": "1_Pooling",
"type": "sentence_transformers.models.Pooling"
},
{
"idx": 2,
"name": "2",
"path": "2_Dense",
"type": "sentence_transformers.models.Dense"
}
]
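modules.json is what SentenceTransformer reads at load time: each entry's `type` string is imported dynamically and the module is restored from the files under `path`. A simplified sketch of that resolution step (not the library's exact code):
```python
import importlib
import json
import os

def load_modules(model_dir: str):
    with open(os.path.join(model_dir, 'modules.json')) as f:
        entries = json.load(f)
    modules = []
    for entry in entries:
        pkg, cls_name = entry['type'].rsplit('.', 1)  # e.g. 'sentence_transformers.models', 'Pooling'
        cls = getattr(importlib.import_module(pkg), cls_name)
        modules.append(cls.load(os.path.join(model_dir, entry['path'])))
    return modules
```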

3
pytorch_model.bin Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:623918c26bb3215d63127569069b11a9e7c44d69057ccf7eaf408281edb8d4d0
size 538968313
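The pointer records only the object's sha256 and byte size; after a full checkout you can verify the downloaded weights against it. A minimal sketch:
```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    h = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), b''):
            h.update(chunk)
    return h.hexdigest()

assert sha256_of('pytorch_model.bin') == '623918c26bb3215d63127569069b11a9e7c44d69057ccf7eaf408281edb8d4d0'
```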

4
sentence_bert_config.json Normal file

@@ -0,0 +1,4 @@
{
"max_seq_length": 128,
"do_lower_case": false
}

2
similarity_evaluation_sts-test_results.csv Normal file

@@ -0,0 +1,2 @@
epoch,steps,cosine_pearson,cosine_spearman,euclidean_pearson,euclidean_spearman,manhattan_pearson,manhattan_spearman,dot_pearson,dot_spearman
-1,-1,0.7196434688031887,0.319190772367219,0.6909147720915559,0.32486216935103734,0.6747856036858331,0.3276897426323679,0.7107332528260147,0.289176244490106

1
special_tokens_map.json Normal file

@@ -0,0 +1 @@
{"unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]"}

119709
tokenizer.json Normal file

File diff suppressed because it is too large

1
tokenizer_config.json Normal file

@@ -0,0 +1 @@
{"do_lower_case": false, "unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]", "tokenize_chinese_chars": true, "strip_accents": null, "max_len": 512, "special_tokens_map_file": "old_models/distiluse-base-multilingual-cased-v1/0_Transformer/special_tokens_map.json", "name_or_path": "/root/.cache/torch/sentence_transformers/sentence-transformers_distiluse-base-multilingual-cased-v1/", "do_basic_tokenize": true, "never_split": null, "tokenizer_class": "DistilBertTokenizer"}

119547
vocab.txt Normal file

File diff suppressed because it is too large