Initialize project; model provided by the ModelHub XC community

Model: jfarray/Model_dccuchile_bert-base-spanish-wwm-uncased_5_Epochs
Source: Original Platform
ModelHub XC
2026-05-13 17:56:34 +08:00
commit 04a4fa320c
16 changed files with 62418 additions and 0 deletions

28
.gitattributes vendored Normal file

@@ -0,0 +1,28 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bin.* filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zstandard filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
pytorch_model.bin filter=lfs diff=lfs merge=lfs -text
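These patterns route the repository's large binary artifacts (model weights, archives, tokenizer binaries) through Git LFS, so only small pointer files live in git. A rough sketch of which files end up as LFS pointers, using Python's fnmatch as an approximation of gitattributes pattern matching:
```python
from fnmatch import fnmatch

# A subset of the LFS patterns from .gitattributes above.
lfs_patterns = ["*.bin", "*.h5", "*.onnx", "pytorch_model.bin", "*tfevents*"]

files = ["pytorch_model.bin", "config.json", "tokenizer.json", "vocab.txt"]
for path in files:
    tracked = any(fnmatch(path, pattern) for pattern in lfs_patterns)
    print(f"{path}: {'LFS pointer' if tracked else 'stored directly in git'}")
```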

7
1_Pooling/config.json Normal file

@@ -0,0 +1,7 @@
{
"word_embedding_dimension": 768,
"pooling_mode_cls_token": false,
"pooling_mode_mean_tokens": true,
"pooling_mode_max_tokens": false,
"pooling_mode_mean_sqrt_len_tokens": false
}
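This pooling config enables mean pooling only (CLS, max, and sqrt-length pooling are all disabled), so each sentence embedding is the masked average of the 768-dimensional token embeddings. A minimal sketch of that operation in plain PyTorch, not the actual sentence_transformers.models.Pooling implementation:
```python
import torch

def mean_pool(token_embeddings: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    """Average token embeddings over the sequence, ignoring padding positions."""
    mask = attention_mask.unsqueeze(-1).float()     # (batch, seq_len, 1)
    summed = (token_embeddings * mask).sum(dim=1)   # (batch, 768)
    counts = mask.sum(dim=1).clamp(min=1e-9)        # (batch, 1)
    return summed / counts                          # (batch, 768)

# Toy example: 2 sentences, 4 token positions, hidden size 768.
embeddings = torch.randn(2, 4, 768)
mask = torch.tensor([[1, 1, 1, 0], [1, 1, 0, 0]])
print(mean_pool(embeddings, mask).shape)  # torch.Size([2, 768])
```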

1
2_Dense/config.json Normal file

@@ -0,0 +1 @@
{"in_features": 768, "out_features": 256, "bias": true, "activation_function": "torch.nn.modules.activation.Tanh"}


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f1edfe4ccb592c5f55e9bc6a8760f2cf5cfe78c60795a180d6146b82c37c35a6
size 788519

87
README.md Normal file

@@ -0,0 +1,87 @@
---
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
---
# {MODEL_NAME}
This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 256-dimensional dense vector space and can be used for tasks like clustering or semantic search.
<!--- Describe your model here -->
## Usage (Sentence-Transformers)
This model is easy to use once [sentence-transformers](https://www.SBERT.net) is installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
# '{MODEL_NAME}' is a template placeholder; for this repository the model id is
# jfarray/Model_dccuchile_bert-base-spanish-wwm-uncased_5_Epochs
model = SentenceTransformer('{MODEL_NAME}')
embeddings = model.encode(sentences)
print(embeddings)
```
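Since the model targets sentence similarity, the embeddings are typically compared with cosine similarity. A short extension of the snippet above, assuming the util.cos_sim helper shipped with sentence-transformers 2.x:
```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer('{MODEL_NAME}')  # same placeholder as above
sentences = ["This is an example sentence", "Each sentence is converted"]

embeddings = model.encode(sentences, convert_to_tensor=True)
similarity = util.cos_sim(embeddings[0], embeddings[1])
print(f"cosine similarity: {similarity.item():.4f}")
```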
## Evaluation Results
<!--- Describe how your model was evaluated -->
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name={MODEL_NAME})
## Training
The model was trained with the following parameters:
**DataLoader**:
`torch.utils.data.dataloader.DataLoader` of length 11 with parameters:
```
{'batch_size': 15, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```
**Loss**:
`sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss`
Parameters of the fit() method:
```
{
"epochs": 5,
"evaluation_steps": 1,
"evaluator": "sentence_transformers.evaluation.EmbeddingSimilarityEvaluator.EmbeddingSimilarityEvaluator",
"max_grad_norm": 1,
"optimizer_class": "<class 'transformers.optimization.AdamW'>",
"optimizer_params": {
"lr": 2e-05
},
"scheduler": "WarmupLinear",
"steps_per_epoch": null,
"warmup_steps": 6,
"weight_decay": 0.01
}
```
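Put together, these values correspond roughly to the fit() call sketched below, assuming the sentence-transformers 2.2.0 training API; the toy sentence pairs are hypothetical, since the actual training data (11 batches of 15 pairs) is not part of this repository:
```python
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, InputExample, losses
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

# Hypothetical toy pairs standing in for the real training/dev sets.
train_examples = [
    InputExample(texts=["El gato duerme", "Un felino descansa"], label=0.9),
    InputExample(texts=["El gato duerme", "La bolsa subió hoy"], label=0.1),
]
train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=15)

model = SentenceTransformer("dccuchile/bert-base-spanish-wwm-uncased")  # base checkpoint per config.json
train_loss = losses.CosineSimilarityLoss(model)
dev_evaluator = EmbeddingSimilarityEvaluator(
    ["El gato duerme"], ["Un felino descansa"], [0.9]
)

model.fit(
    train_objectives=[(train_dataloader, train_loss)],
    evaluator=dev_evaluator,
    epochs=5,
    evaluation_steps=1,
    warmup_steps=6,
    optimizer_params={"lr": 2e-05},
    weight_decay=0.01,
    max_grad_norm=1,
)
```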
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
(2): Dense({'in_features': 768, 'out_features': 256, 'bias': True, 'activation_function': 'torch.nn.modules.activation.Tanh'})
)
```
## Citing & Authors
<!--- Describe where people can find more information -->

27
config.json Normal file

@@ -0,0 +1,27 @@
{
"_name_or_path": "dccuchile/bert-base-spanish-wwm-uncased",
"architectures": [
"BertModel"
],
"attention_probs_dropout_prob": 0.1,
"classifier_dropout": null,
"gradient_checkpointing": false,
"hidden_act": "gelu",
"hidden_dropout_prob": 0.1,
"hidden_size": 768,
"initializer_range": 0.02,
"intermediate_size": 3072,
"layer_norm_eps": 1e-12,
"max_position_embeddings": 512,
"model_type": "bert",
"num_attention_heads": 12,
"num_hidden_layers": 12,
"output_past": true,
"pad_token_id": 1,
"position_embedding_type": "absolute",
"torch_dtype": "float32",
"transformers_version": "4.16.2",
"type_vocab_size": 2,
"use_cache": true,
"vocab_size": 31002
}
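config.json identifies the base checkpoint as dccuchile/bert-base-spanish-wwm-uncased (12 layers, hidden size 768, vocabulary of 31002 tokens). Since the files in this commit form a standard BERT checkpoint, the transformer backbone can also be loaded with Hugging Face transformers. A sketch, assuming the repository id resolves for from_pretrained (otherwise point it at a local clone of these files); the mean pooling and Dense projection shown earlier must still be applied to reproduce the 256-dimensional sentence embeddings:
```python
import torch
from transformers import AutoTokenizer, AutoModel

model_id = "jfarray/Model_dccuchile_bert-base-spanish-wwm-uncased_5_Epochs"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

inputs = tokenizer(["Esta es una frase de ejemplo"], padding=True, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, seq_len, 768)
```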


@@ -0,0 +1,7 @@
{
"__version__": {
"sentence_transformers": "2.2.0",
"transformers": "4.16.2",
"pytorch": "1.10.0+cu111"
}
}


@@ -0,0 +1,61 @@
epoch,steps,cosine_pearson,cosine_spearman,euclidean_pearson,euclidean_spearman,manhattan_pearson,manhattan_spearman,dot_pearson,dot_spearman
0,1,0.3108319859681156,0.3216924282881418,0.24478008048803374,0.2199715023472274,0.22843823713214909,0.19962731715904453,0.22828213082899587,0.2937191736543904
0,2,0.32896683800950977,0.3801819607041676,0.2592018787828544,0.2530308032780246,0.24475425880846177,0.2326866180898417,0.23285179120501534,0.354751729218939
0,3,0.36337870645431064,0.4259563773775791,0.286947427842334,0.29244766208012896,0.2766451768197279,0.25175929170376316,0.24181973954428423,0.3661953333872919
0,4,0.40429570542540616,0.49080346766491206,0.32354694661782096,0.3064342893970047,0.31852686636397054,0.2949906852286518,0.2569737124044797,0.3839964954269519
0,5,0.4433298791467789,0.5442069537838921,0.3640674641314855,0.4170557963577491,0.3624027015560045,0.4056121921893962,0.28214731665640674,0.3839964954269519
0,6,0.48194846635313754,0.5772662547146893,0.4084595753991238,0.464101724605422,0.4052036850790403,0.46028718988263767,0.3176573243813614,0.42849940052610197
0,7,0.5137291239872959,0.5963389283286107,0.48470453752800796,0.4882604445163891,0.47831518056645855,0.46283021303116056,0.3446401059289609,0.4513866088628077
0,8,0.5239001526579894,0.5925243936058264,0.49599764800169777,0.5162336991501406,0.486522727555486,0.46283021303116056,0.38467751326263355,0.46028718988263767
0,9,0.5280310062539452,0.6204976482395779,0.5272556463573952,0.5454784653581536,0.515098843187698,0.5238627685957092,0.39741536994668236,0.46028718988263767
0,10,0.5390951309567703,0.5454784653581536,0.4923685799208778,0.49080346766491206,0.4778530445316493,0.4412145162687163,0.46367195788612714,0.5302203264670163
0,11,0.47047569629723124,0.4691877709024677,0.4065863559983096,0.31024882411978894,0.3962551563871312,0.2937191736543904,0.5118828529342409,0.5848953241602579
0,-1,0.47047569629723124,0.4691877709024677,0.4065863559983096,0.31024882411978894,0.3962551563871312,0.2937191736543904,0.5118828529342409,0.5848953241602579
1,1,0.43587867232753813,0.3674668449615533,0.3698620004900434,0.2695604537434232,0.3615149693314503,0.2746465000404689,0.5199434094114423,0.6319412524079308
1,2,0.4598068067563758,0.3916255648725205,0.39684228220223206,0.2733749884662075,0.3881678025953699,0.2873616157830832,0.53330833795239,0.6319412524079308
1,3,0.5529414662056736,0.5531075348037221,0.4990928286955647,0.4361284699716705,0.4907255539577539,0.4933464908134349,0.5403086413041641,0.617954625091055
1,4,0.6144468095906074,0.6204976482395779,0.5772291789062844,0.5391209074868464,0.5757059126404819,0.546749976932415,0.521414127034061,0.6548284607446365
1,5,0.6424395506003274,0.6904307848239566,0.6288804839724441,0.6204976482395779,0.631888469098824,0.6026964861999179,0.49630954755717693,0.6255836945366237
1,6,0.6395409687525047,0.7323906667745838,0.6517158978239832,0.6904307848239566,0.6574143220652291,0.7018743889923095,0.47271191310668503,0.6154116019425322
1,7,0.6336338175998731,0.736205201497368,0.6551019842469132,0.7044174121408323,0.6614641433824769,0.7285761320517995,0.4706907923814303,0.6154116019425322
1,8,0.6268341391170168,0.7171325278834465,0.654389533351394,0.7323906667745838,0.6616881167058865,0.7285761320517995,0.47098120585078895,0.6154116019425322
1,9,0.5983079348489586,0.691702296398218,0.6382909612103717,0.7323906667745838,0.6447081927351945,0.7374767130716295,0.45744636785402926,0.5759947431404279
1,10,0.5689832380039443,0.6548284607446365,0.6180683226269311,0.7133179931606624,0.6240867429428315,0.7247615973290151,0.44450957491745907,0.5531075348037221
1,11,0.5162625045701104,0.638298810279238,0.5779216322042083,0.6713581112100352,0.5821759714967701,0.6929738079724794,0.41763491060006874,0.537849395912585
1,-1,0.5162625045701104,0.638298810279238,0.5779216322042083,0.6713581112100352,0.5821759714967701,0.6929738079724794,0.41763491060006874,0.537849395912585
2,1,0.46473649893162766,0.6217691598138393,0.5334766859987946,0.6548284607446365,0.535693288652834,0.6764441575070808,0.39118926649237384,0.5454784653581536
2,2,0.4431405760965195,0.5988819514771336,0.5124588450989886,0.6599145070416822,0.5130584173001388,0.6599145070416822,0.3819572006414451,0.5454784653581536
2,3,0.4393331335811442,0.6204976482395779,0.5071001511199104,0.6332127639821923,0.5074478872876883,0.6599145070416822,0.3835151286643572,0.5340348611898006
2,4,0.44870876650365754,0.638298810279238,0.5133856799759213,0.6497424144475908,0.5150417735450201,0.6688150880615122,0.39295555438190544,0.5340348611898006
2,5,0.4552336141065715,0.6433848565762836,0.5162452680672921,0.6497424144475908,0.519812086635828,0.6459278797248065,0.4013480785913682,0.5340348611898006
2,6,0.4696446126598315,0.6433848565762836,0.5259905146320221,0.6662720649129894,0.5324851259669662,0.647199391299068,0.4161415996385177,0.5556505579522449
2,7,0.49405829888514535,0.6789871806556036,0.5418555542093901,0.6980598542695251,0.5516874948617178,0.647199391299068,0.4402648980446127,0.6065110209227021
2,8,0.5026297849515802,0.7196755510319695,0.5450556429105228,0.7400197362201524,0.5585820044951249,0.6789871806556036,0.45329636868785467,0.6065110209227021
2,9,0.5081824666759697,0.7158610163091852,0.5443258246131687,0.7527348519627666,0.5623333526089962,0.7209470626062309,0.4661934679116899,0.6090540440712251
2,10,0.5130357229187649,0.7489203172399823,0.5428839924910025,0.7527348519627666,0.5656139526161222,0.7451057825171981,0.47971887876444524,0.6065110209227021
2,11,0.5132001270637361,0.7451057825171981,0.5395535008393517,0.7489203172399823,0.5642202769308216,0.6802586922298651,0.4833590991502187,0.6065110209227021
2,-1,0.5132001270637361,0.7451057825171981,0.5395535008393517,0.7489203172399823,0.5642202769308216,0.6802586922298651,0.4833590991502187,0.6065110209227021
3,1,0.5119737858144969,0.6357557871307151,0.5358159144362294,0.6866162501011722,0.5615567898954573,0.6802586922298651,0.48450646626526583,0.6065110209227021
3,2,0.5046940938276901,0.6293982292594079,0.5268900289779362,0.6611860186159437,0.5527049751283294,0.6586429954674208,0.48134520114158774,0.5848953241602579
3,3,0.4970640334022354,0.6357557871307151,0.5199934389415214,0.6611860186159437,0.5437755341932087,0.6586429954674208,0.47640903453148503,0.6281267176851465
3,4,0.4861248809657005,0.6395703218534994,0.5104907087025273,0.6395703218534994,0.5314095598343486,0.6586429954674208,0.46968607741258545,0.6281267176851465
3,5,0.4776651349648999,0.6395703218534994,0.502607365784013,0.6395703218534994,0.5220942443266136,0.6332127639821923,0.4653095942277782,0.6065110209227021
3,6,0.471453669858453,0.6421133450020222,0.49688986688163733,0.6204976482395779,0.5154130834063639,0.6332127639821923,0.4617445979012138,0.6065110209227021
3,7,0.4634131093724732,0.6459278797248065,0.49118514511475925,0.6243121829623621,0.5091173418722049,0.6332127639821923,0.4563166691020609,0.6065110209227021
3,8,0.4550051162606645,0.638298810279238,0.48620325231107847,0.6166831135167936,0.5028983623137382,0.6332127639821923,0.4493061002642051,0.5899813704573036
3,9,0.44505538996753186,0.6306697408336693,0.48122579065652676,0.638298810279238,0.49632746998950145,0.6802586922298651,0.44064091408090805,0.573451719991905
3,10,0.4348210488874449,0.6522854375961137,0.4766802803438675,0.6637290417644666,0.4904023903603795,0.7018743889923095,0.43180204368077524,0.5709086968433822
3,11,0.4276015960825411,0.6522854375961137,0.4717793882186167,0.6637290417644666,0.48543618715307224,0.6815302038041265,0.4271390540599405,0.5709086968433822
3,-1,0.4276015960825411,0.6522854375961137,0.4717793882186167,0.6637290417644666,0.48543618715307224,0.6815302038041265,0.4271390540599405,0.5709086968433822
4,1,0.4214239914175457,0.6522854375961137,0.46865185069274773,0.6433848565762836,0.48193724414031425,0.6815302038041265,0.42254227189527593,0.5899813704573036
4,2,0.4191327427060846,0.6522854375961137,0.46799568205581066,0.6306697408336693,0.4811730082862524,0.6815302038041265,0.4209502267898128,0.5899813704573036
4,3,0.4177497228755946,0.6522854375961137,0.4672704266606563,0.6306697408336693,0.48046145518131045,0.662457530190205,0.4205336189755251,0.5899813704573036
4,4,0.41278138052242225,0.6522854375961137,0.4641914001937418,0.6306697408336693,0.47692992354815494,0.662457530190205,0.417234061877104,0.5899813704573036
4,5,0.4084090262303802,0.6522854375961137,0.4621233524420026,0.6306697408336693,0.47432753029063485,0.662457530190205,0.41376405626271445,0.5899813704573036
4,6,0.4047743004972669,0.6522854375961137,0.4601638008541454,0.6522854375961137,0.47226688906536857,0.662457530190205,0.410876474462412,0.5899813704573036
4,7,0.40290509826355414,0.6522854375961137,0.4595599937813398,0.6522854375961137,0.4716438963000161,0.662457530190205,0.4091652073540324,0.5899813704573036
4,8,0.4015618901718633,0.6522854375961137,0.45895194633326486,0.6522854375961137,0.4710175591054945,0.662457530190205,0.40796382884516,0.5899813704573036
4,9,0.4006955092468836,0.6522854375961137,0.458540064681458,0.6522854375961137,0.4706556692406096,0.662457530190205,0.40705849491130947,0.5899813704573036
4,10,0.40019220934913363,0.6522854375961137,0.4583861098107952,0.6522854375961137,0.47054695265617924,0.662457530190205,0.40648849636892403,0.5899813704573036
4,11,0.4002625503328767,0.6522854375961137,0.45847018403200823,0.6522854375961137,0.47069029268484636,0.662457530190205,0.40649131146535,0.5899813704573036
4,-1,0.4002625503328767,0.6522854375961137,0.45847018403200823,0.6522854375961137,0.47069029268484636,0.662457530190205,0.40649131146535,0.5899813704573036
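Each row logs the EmbeddingSimilarityEvaluator metrics after a training step (steps = -1 marks the end of an epoch). The dev cosine Spearman peaks at roughly 0.749 in epoch 2 (step 10) and declines afterwards. A quick way to locate that peak, assuming pandas and this CSV saved locally under the hypothetical name similarity_evaluation_results.csv:
```python
import pandas as pd

# Hypothetical local filename; the diff shows the file's contents but its path is not visible here.
df = pd.read_csv("similarity_evaluation_results.csv")

best = df.loc[df["cosine_spearman"].idxmax()]
print(f"best dev cosine_spearman={best['cosine_spearman']:.4f} "
      f"at epoch {int(best['epoch'])}, step {int(best['steps'])}")
```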

20
modules.json Normal file

@@ -0,0 +1,20 @@
[
{
"idx": 0,
"name": "0",
"path": "",
"type": "sentence_transformers.models.Transformer"
},
{
"idx": 1,
"name": "1",
"path": "1_Pooling",
"type": "sentence_transformers.models.Pooling"
},
{
"idx": 2,
"name": "2",
"path": "2_Dense",
"type": "sentence_transformers.models.Dense"
}
]
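modules.json declares the three-stage pipeline (Transformer → Pooling → Dense) that SentenceTransformer reassembles when the model is loaded. An equivalent model could also be built explicitly; a sketch using the sentence_transformers.models API:
```python
import torch
from sentence_transformers import SentenceTransformer, models

word_embedding_model = models.Transformer("dccuchile/bert-base-spanish-wwm-uncased", max_seq_length=256)
pooling_model = models.Pooling(word_embedding_model.get_word_embedding_dimension(),
                               pooling_mode_mean_tokens=True)
dense_model = models.Dense(in_features=768, out_features=256,
                           activation_function=torch.nn.Tanh())

model = SentenceTransformer(modules=[word_embedding_model, pooling_model, dense_model])
print(model)
```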

3
pytorch_model.bin Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:530201cc4bdbd3f0e72e57341b92d151831f198db6155835b138999f0bb81ee4
size 439484849


@@ -0,0 +1,4 @@
{
"max_seq_length": 256,
"do_lower_case": false
}


@@ -0,0 +1,2 @@
epoch,steps,cosine_pearson,cosine_spearman,euclidean_pearson,euclidean_spearman,manhattan_pearson,manhattan_spearman,dot_pearson,dot_spearman
-1,-1,0.8112052968349174,0.502398020492055,0.8124335366238343,0.5049980878771866,0.814252374975173,0.5090931940087687,0.7911350575329102,0.5122132748709267

1
special_tokens_map.json Normal file

@@ -0,0 +1 @@
{"unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]"}

31164
tokenizer.json Normal file

File diff suppressed because it is too large

1
tokenizer_config.json Normal file

@@ -0,0 +1 @@
{"do_lower_case": true, "unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]", "tokenize_chinese_chars": true, "strip_accents": false, "do_basic_tokenize": true, "never_split": null, "model_max_length": 512, "special_tokens_map_file": "/root/.cache/huggingface/transformers/78141ed1e8dcc5ff370950397ca0d1c5c9da478f54ec14544187d8a93eff1a26.f982506b52498d4adb4bd491f593dc92b2ef6be61bfdbe9d30f53f963f9f5b66", "name_or_path": "dccuchile/bert-base-spanish-wwm-uncased", "tokenizer_class": "BertTokenizer"}

31002
vocab.txt Normal file

File diff suppressed because it is too large