Initialize project; model provided by the ModelHub XC community
Model: jfarray/Model_paraphrase-multilingual-mpnet-base-v2_5_Epochs Source: Original Platform
.gitattributes (vendored, Normal file, +29)
@@ -0,0 +1,29 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bin.* filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zstandard filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
tokenizer.json filter=lfs diff=lfs merge=lfs -text
pytorch_model.bin filter=lfs diff=lfs merge=lfs -text
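The patterns above route matching files through Git LFS instead of storing their contents directly in git history. As a rough way to see which repository files a given pattern set catches, Python's `fnmatch` approximates this glob matching for simple filename patterns (the file list below is hypothetical; `saved_model/**/*`-style directory globs follow gitattributes semantics that `fnmatch` does not reproduce):

```python
from fnmatch import fnmatch

# Hypothetical file list; patterns copied from the .gitattributes above.
patterns = ["*.bin", "*.onnx", "*.h5", "tokenizer.json"]
files = ["pytorch_model.bin", "config.json", "tokenizer.json", "model.onnx"]

# A file is LFS-tracked if any pattern matches its name.
lfs_tracked = [f for f in files if any(fnmatch(f, p) for p in patterns)]
print(lfs_tracked)  # ['pytorch_model.bin', 'tokenizer.json', 'model.onnx']
```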
1_Pooling/config.json (Normal file, +7)
@@ -0,0 +1,7 @@
{
  "word_embedding_dimension": 768,
  "pooling_mode_cls_token": false,
  "pooling_mode_mean_tokens": true,
  "pooling_mode_max_tokens": false,
  "pooling_mode_mean_sqrt_len_tokens": false
}
README.md (Normal file, +125)
@@ -0,0 +1,125 @@
---
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---
# {MODEL_NAME}

This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences and paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search.

<!--- Describe your model here -->
## Usage (Sentence-Transformers)

Using this model is straightforward once you have [sentence-transformers](https://www.SBERT.net) installed:

```
pip install -U sentence-transformers
```

Then you can use the model like this:

```python
from sentence_transformers import SentenceTransformer

sentences = ["This is an example sentence", "Each sentence is converted"]

model = SentenceTransformer('{MODEL_NAME}')
embeddings = model.encode(sentences)
print(embeddings)
```
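The resulting embeddings are usually compared with cosine similarity. A minimal numpy sketch of that comparison, with made-up 4-dimensional vectors standing in for the model's 768-dimensional output (the helper name `cosine_similarity` is ours, not part of the library):

```python
import numpy as np

# Made-up 4-dimensional vectors standing in for model.encode() output;
# real embeddings from this model are 768-dimensional.
emb_a = np.array([0.1, 0.3, -0.2, 0.4])
emb_b = np.array([0.1, 0.25, -0.1, 0.5])

def cosine_similarity(a, b):
    # Dot product of the vectors divided by the product of their norms.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

score = cosine_similarity(emb_a, emb_b)
print(round(score, 4))
```

With sentence-transformers installed, `sentence_transformers.util.cos_sim(embeddings[0], embeddings[1])` computes the same quantity on real embeddings.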
## Usage (HuggingFace Transformers)

Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings.

```python
from transformers import AutoTokenizer, AutoModel
import torch


# Mean pooling - take the attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
    token_embeddings = model_output[0]  # first element of model_output contains all token embeddings
    input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)


# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']

# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('{MODEL_NAME}')
model = AutoModel.from_pretrained('{MODEL_NAME}')

# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')

# Compute token embeddings
with torch.no_grad():
    model_output = model(**encoded_input)

# Perform pooling (mean pooling in this case)
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])

print("Sentence embeddings:")
print(sentence_embeddings)
```
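The masking in `mean_pooling` matters: padded positions must not dilute the average. The same arithmetic can be checked on toy numbers with numpy (dummy values, not real model output):

```python
import numpy as np

# Toy batch: two real tokens and one padding token (mask = 0); dummy values.
token_embeddings = np.array([[[1.0, 2.0], [3.0, 4.0], [9.0, 9.0]]])  # (batch, seq, dim)
attention_mask = np.array([[1, 1, 0]])

mask = attention_mask[:, :, None].astype(float)   # expand mask to (batch, seq, 1)
summed = (token_embeddings * mask).sum(axis=1)    # sum only over unmasked tokens
counts = np.clip(mask.sum(axis=1), 1e-9, None)    # same role as torch.clamp(min=1e-9)
mean_pooled = summed / counts
print(mean_pooled)  # [[2. 3.]] - the padded [9, 9] token is ignored
```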
## Evaluation Results

<!--- Describe how your model was evaluated -->

For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name={MODEL_NAME})
## Training

The model was trained with the following parameters:

**DataLoader**:

`torch.utils.data.dataloader.DataLoader` of length 11 with parameters:
```
{'batch_size': 15, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```

**Loss**:

`sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss`

Parameters of the `fit()` method:
```
{
    "epochs": 5,
    "evaluation_steps": 1,
    "evaluator": "sentence_transformers.evaluation.EmbeddingSimilarityEvaluator.EmbeddingSimilarityEvaluator",
    "max_grad_norm": 1,
    "optimizer_class": "<class 'transformers.optimization.AdamW'>",
    "optimizer_params": {
        "lr": 2e-05
    },
    "scheduler": "WarmupLinear",
    "steps_per_epoch": null,
    "warmup_steps": 6,
    "weight_decay": 0.01
}
```
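The `warmup_steps` value is consistent with the common sentence-transformers convention of warming up over 10% of total training steps; whether this run used exactly that formula is an assumption, but the arithmetic lines up:

```python
import math

# Assumed derivation: warm up over 10% of total training steps, rounded up.
steps_per_epoch = 11   # DataLoader length reported above
epochs = 5
warmup_steps = math.ceil(steps_per_epoch * epochs * 0.1)
print(warmup_steps)  # 6, matching "warmup_steps": 6
```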
## Full Model Architecture
```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```

## Citing & Authors

<!--- Describe where people can find more information -->
config.json (Normal file, +29)
@@ -0,0 +1,29 @@
{
  "_name_or_path": "/root/.cache/torch/sentence_transformers/sentence-transformers_paraphrase-multilingual-mpnet-base-v2/",
  "architectures": [
    "XLMRobertaModel"
  ],
  "attention_probs_dropout_prob": 0.1,
  "bos_token_id": 0,
  "classifier_dropout": null,
  "eos_token_id": 2,
  "gradient_checkpointing": false,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "layer_norm_eps": 1e-05,
  "max_position_embeddings": 514,
  "model_type": "xlm-roberta",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "output_past": true,
  "pad_token_id": 1,
  "position_embedding_type": "absolute",
  "torch_dtype": "float32",
  "transformers_version": "4.16.2",
  "type_vocab_size": 1,
  "use_cache": true,
  "vocab_size": 250002
}
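A quick sanity check on these dimensions (plain Python, values copied from the config above; the derived quantities are standard transformer bookkeeping, not fields of the file itself):

```python
# Values copied from config.json above.
hidden_size = 768
num_attention_heads = 12
intermediate_size = 3072

head_dim = hidden_size // num_attention_heads          # dimensions per attention head
ffn_expansion = intermediate_size // hidden_size       # feed-forward expansion factor
print(head_dim, ffn_expansion)  # 64 4
```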
config_sentence_transformers.json (Normal file, +7)
@@ -0,0 +1,7 @@
{
  "__version__": {
    "sentence_transformers": "2.0.0",
    "transformers": "4.7.0",
    "pytorch": "1.9.0+cu102"
  }
}
eval/similarity_evaluation_results.csv (Normal file, +61)
@@ -0,0 +1,61 @@
epoch,steps,cosine_pearson,cosine_spearman,euclidean_pearson,euclidean_spearman,manhattan_pearson,manhattan_spearman,dot_pearson,dot_spearman
0,1,0.24428484492337774,0.1856406898421688,0.2726824721314313,0.17419708567381592,0.20191195613958324,0.18182615511938452,-0.10491629431153585,-0.05976104399028721
0,2,0.20760628195495687,0.11570755325779014,0.23711421801180643,0.15512441205989447,0.1525442605032658,0.0890058101983001,-0.10481777399423998,-0.0762906944556858
0,3,0.09968795484366545,-0.00381453472278429,0.1358901448908031,0.02797325463375146,0.03965947880495599,0.00762906944556858,-0.12308696197824481,-0.10807848381222157
0,4,-0.006361124650332465,-0.052131974544718636,0.034078483687493354,-0.005086046297045721,-0.04812073111354643,-0.11316453010926729,-0.12646118021947494,-0.17546859724807737
0,5,-0.09005348352309914,-0.12333662270335873,-0.062209937933416753,-0.14495231946580303,-0.12570472793922574,-0.19326975928773737,-0.14388218803045597,-0.2937191736543904
0,6,-0.13097095164169384,-0.2949906852286518,-0.1311934698949744,-0.19962731715904453,-0.18221433942604204,-0.26447440744637746,-0.13336804796646234,-0.22632906021853458
0,7,-0.11875749815008411,-0.25175929170376316,-0.14887899381674394,-0.1729255740995545,-0.20652116782366475,-0.3038912662484818,-0.06831298378061908,-0.15385290048563305
0,8,-0.10311282698582203,-0.19326975928773737,-0.14926506878676526,-0.19326975928773737,-0.20866442957211304,-0.2810040579117761,-0.01414150626962834,-0.13986627316875733
0,9,-0.07249097523195104,-0.1729255740995545,-0.13950006250752797,-0.18436917826790739,-0.2085275970325456,-0.2593883611493318,0.10929321828862724,0.05594650926750292
0,10,0.002166526781181519,0.03305930093079718,-0.10233868363195012,-0.07374767130716295,-0.18526337938584458,-0.15003836576284876,0.2939847088769777,0.4132412616349648
0,11,-0.06542853066407825,-0.045774416673411485,-0.15806612686656948,-0.14876685418858734,-0.2397147419212809,-0.19962731715904453,0.23274170943450698,0.41069823848644194
0,-1,-0.06542853066407825,-0.045774416673411485,-0.15806612686656948,-0.14876685418858734,-0.2397147419212809,-0.19962731715904453,0.23274170943450698,0.41069823848644194
1,1,-0.09672644083942251,-0.034330812505058615,-0.18534513057094443,-0.15639592363415591,-0.2673191736430352,-0.19326975928773737,0.2097459552508294,0.41451277320922625
1,2,-0.09676331546160145,0.12587964585188158,-0.1824138803793433,-0.12715115742614302,-0.2591086722938196,-0.2326866180898417,0.19351320233645955,0.3471226597733704
1,3,-0.024634025118860904,0.20598487503035168,-0.10450478022594249,0.08519127547551582,-0.17234332771086547,0.03305930093079718,0.210201179751506,0.35220870607041616
1,4,0.04634169199004134,0.3293214977337104,-0.013141313611438119,0.2682889421691617,-0.06824216400019185,0.1678395278025088,0.22110299379741627,0.4170557963577491
1,5,0.06957049525800159,0.35729475236746183,0.019005325105763755,0.2962621968029132,-0.03134455317816696,0.25430231485228605,0.22544605394032152,0.4068837037636577
1,6,0.09643517201025414,0.38526800700121333,0.0561908178590226,0.3000767315256975,0.010966975961853986,0.3051627778227432,0.23218646038277624,0.4068837037636577
1,7,0.11495367462859014,0.38526800700121333,0.0844196565805232,0.31533487041683467,0.04448688640007318,0.36365231023876904,0.23254304497614164,0.40052614589235047
1,8,0.12187682645363462,0.3420366134763247,0.09469220063298772,0.32296393986240324,0.05491086774544972,0.3420366134763247,0.2353217467609693,0.4068837037636577
1,9,0.11896950359197529,0.33059300930797186,0.09276074762894111,0.33695056717927896,0.05226406371594157,0.28609010420882175,0.2349029468831924,0.43739998154593196
1,10,0.059868676342374715,0.15766743520841733,0.03169227167123969,0.12079359955483586,-0.013329686097654305,0.12715115742614302,0.1846286526783525,0.2962621968029132
1,11,0.012086158504150565,0.057218020841764354,-0.013770028451881562,-0.00890058101983001,-0.05494876227892883,0.0,0.12781653873721924,0.16021045835694017
1,-1,0.012086158504150565,0.057218020841764354,-0.013770028451881562,-0.00890058101983001,-0.05494876227892883,0.0,0.12781653873721924,0.16021045835694017
2,1,-0.010981167682463272,-0.034330812505058615,-0.03619559858893778,-0.0381453472278429,-0.07150618442783947,-0.045774416673411485,0.09540352160754426,0.19962731715904453
2,2,0.021286352996122002,0.06484709028733295,-0.006888578381641333,-0.022887208336705742,-0.03855284498361854,-0.021615696762444313,0.12362310081924338,0.19454127086199882
2,3,0.060044439248193165,0.10807848381222157,0.029620062567784284,0.0419598819506272,0.0002622806184738965,0.0419598819506272,0.15573871761093344,0.19326975928773737
2,4,0.09758794589215579,0.12842266900040444,0.0644019606998401,0.09790639121813012,0.038385393451486914,0.0839197639012544,0.188510335331623,0.2695604537434232
2,5,0.13703526519168727,0.19962731715904453,0.10293594928976885,0.15512441205989447,0.07953741235613597,0.20089882873330597,0.22012517073136859,0.27846103476325323
2,6,0.1855235779875127,0.31406335884257325,0.15278310339181547,0.2657459190206389,0.1327446847118525,0.2606598727235932,0.25623551230120895,0.3038912662484818
2,7,0.20949153030047835,0.3051627778227432,0.1777057479998335,0.2937191736543904,0.15992318625986915,0.29244766208012896,0.2715982446492133,0.3496656829218933
2,8,0.23199992564118438,0.32042091671388034,0.20530707974604018,0.30261975467422036,0.18827658534181113,0.29244766208012896,0.2868592668443518,0.36492382181303046
2,9,0.25014137747916165,0.3115203356940504,0.2301588518815574,0.31660638199109614,0.2143333296828058,0.30134824309995895,0.3003550674173104,0.4081552153379191
2,10,0.24626884574788424,0.3331360324564947,0.228029906198482,0.28481859263456033,0.21309232660224187,0.30134824309995895,0.2993127230241374,0.41069823848644194
2,11,0.2422087368553408,0.354751729218939,0.22519537991730973,0.28481859263456033,0.2113641531010709,0.30134824309995895,0.2978783619603186,0.41069823848644194
2,-1,0.2422087368553408,0.354751729218939,0.22519537991730973,0.28481859263456033,0.2113641531010709,0.30134824309995895,0.2978783619603186,0.41069823848644194
3,1,0.2222182611908739,0.3115203356940504,0.20111996032764892,0.31660638199109614,0.18706879566167503,0.28990463893160606,0.2847293342958604,0.4081552153379191
3,2,0.20241490521995298,0.3496656829218933,0.17719373481351203,0.27718952318899176,0.16305046171228688,0.24285871068393314,0.27123582362445237,0.36110928709024614
3,3,0.17393962117230172,0.2746465000404689,0.14563596641779133,0.2403156875354103,0.1307824825499519,0.21361394447592028,0.24940267570508626,0.35729475236746183
3,4,0.15653031964207764,0.2568453380008089,0.12823087036082723,0.17928313197086163,0.11323176034983784,0.21361394447592028,0.23498577769212176,0.30261975467422036
3,5,0.14822364036965754,0.25175929170376316,0.12079130334922693,0.2021703403075674,0.10481316109886843,0.18818371299069164,0.23141778448304892,0.3356790556050176
3,6,0.13993968709065738,0.2021703403075674,0.11219267163316003,0.1296941805746659,0.095519182817243,0.18182615511938452,0.22686421598973858,0.3356790556050176
3,7,0.12574280209136976,0.2568453380008089,0.09685878939273258,0.1296941805746659,0.08027403806250179,0.13350871529745015,0.21462125272856583,0.30261975467422036
3,8,0.12156265705204788,0.22378603707001168,0.0924892685646278,0.16529650465398593,0.07598754691433504,0.15512441205989447,0.21160405697871526,0.30261975467422036
3,9,0.11857686012065712,0.22632906021853458,0.0894930416573932,0.13732325002023446,0.07301057584552866,0.15512441205989447,0.2113030951579165,0.30261975467422036
3,10,0.10729356864325958,0.20344185188182884,0.07796462720346475,0.10934999538648299,0.061529666363701885,0.12206511112909728,0.20134860071865865,0.3674668449615533
3,11,0.0977944595233388,0.1907267361392145,0.06817358655402952,0.10934999538648299,0.05182744294193961,0.12206511112909728,0.19340887169490328,0.32423545143666466
3,-1,0.0977944595233388,0.1907267361392145,0.06817358655402952,0.10934999538648299,0.05182744294193961,0.12206511112909728,0.19340887169490328,0.32423545143666466
4,1,0.0899870657776459,0.21615696762444314,0.06055077229826266,0.09790639121813012,0.04401623488141881,0.13986627316875733,0.1875674433493838,0.32423545143666466
4,2,0.08102216903053669,0.16275348150546307,0.05168811835506901,0.09790639121813012,0.03476475323733826,0.13986627316875733,0.18046633852061603,0.32423545143666466
4,3,0.07354754425075906,0.16275348150546307,0.044609695006488814,0.09790639121813012,0.027282626375289346,0.13986627316875733,0.17423431357098337,0.32423545143666466
4,4,0.072628455370346,0.16275348150546307,0.04393693486778321,0.10172092594091442,0.02662953353245701,0.13986627316875733,0.17347086403210338,0.31406335884257325
4,5,0.07264751275104477,0.17165406252529306,0.04431500780252348,0.10172092594091442,0.02696535488184998,0.13986627316875733,0.17392088123323024,0.31406335884257325
4,6,0.07309233321051097,0.17165406252529306,0.04502028430787145,0.10172092594091442,0.02768099624495165,0.13986627316875733,0.17428540435135212,0.31406335884257325
4,7,0.07415863502210646,0.17165406252529306,0.04621911511432109,0.10172092594091442,0.029017384120890066,0.13986627316875733,0.1754214881608098,0.31406335884257325
4,8,0.07592167356771871,0.17165406252529306,0.04821802267560702,0.10172092594091442,0.03127579350333767,0.13986627316875733,0.17731038473768557,0.31406335884257325
4,9,0.0772956377426426,0.17928313197086163,0.049788723209415055,0.10172092594091442,0.03298899307007637,0.13986627316875733,0.17878720524908634,0.31406335884257325
4,10,0.07796606514045307,0.17928313197086163,0.050463598796421795,0.10172092594091442,0.03369484305496711,0.13986627316875733,0.17960799863427354,0.31406335884257325
4,11,0.07890410077819265,0.17928313197086163,0.05122711152555231,0.10172092594091442,0.03438548132124625,0.13986627316875733,0.1810031325721923,0.3318645208822333
4,-1,0.07890410077819265,0.17928313197086163,0.05122711152555231,0.10172092594091442,0.03438548132124625,0.13986627316875733,0.1810031325721923,0.3318645208822333
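Each `*_pearson` column in the CSV above is the Pearson correlation between the model's predicted similarity scores and the gold labels at that checkpoint (the `*_spearman` columns are the rank-based analogue). A minimal sketch of the computation on toy scores (not the actual evaluation data):

```python
import numpy as np

# Toy scores, not the actual evaluation data from the CSV above.
predicted = np.array([0.9, 0.2, 0.6, 0.4])  # e.g. cosine similarities
gold = np.array([1.0, 0.1, 0.7, 0.3])       # human-annotated labels

# Pearson correlation of the two score lists.
pearson = np.corrcoef(predicted, gold)[0, 1]
print(round(pearson, 4))
```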
modules.json (Normal file, +14)
@@ -0,0 +1,14 @@
[
  {
    "idx": 0,
    "name": "0",
    "path": "",
    "type": "sentence_transformers.models.Transformer"
  },
  {
    "idx": 1,
    "name": "1",
    "path": "1_Pooling",
    "type": "sentence_transformers.models.Pooling"
  }
]
pytorch_model.bin (Normal file, +3)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:04d9e6f8cb14e977c3ebbfdbba280daf06401aacd62df23562236b38fa4b424d
size 1112255985
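As a rough cross-check, the pointer's `size` is what you would expect for a float32 checkpoint of about 278M parameters, in line with an XLM-RoBERTa-base encoder (back-of-the-envelope only; the state dict also stores small non-parameter buffers):

```python
size_bytes = 1112255985   # "size" field from the LFS pointer above
bytes_per_param = 4       # float32 weights

approx_params = size_bytes / bytes_per_param
print(f"~{approx_params / 1e6:.0f}M parameters")  # ~278M, XLM-RoBERTa-base scale
```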
sentence_bert_config.json (Normal file, +4)
@@ -0,0 +1,4 @@
{
  "max_seq_length": 128,
  "do_lower_case": false
}
sentencepiece.bpe.model (Normal file, +3)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:cfc8146abe2a0488e9e2a0c56de7952f7c11ab059eca145a0a727afce0db2865
size 5069051
similarity_evaluation_sts-test_results.csv (Normal file, +2)
@@ -0,0 +1,2 @@
epoch,steps,cosine_pearson,cosine_spearman,euclidean_pearson,euclidean_spearman,manhattan_pearson,manhattan_spearman,dot_pearson,dot_spearman
-1,-1,0.7597873915352352,0.44841412140826,0.7579338655938147,0.46685834942153726,0.7530452019043229,0.45868438757952973,0.735161000552104,0.3982165704540628
special_tokens_map.json (Normal file, +1)
@@ -0,0 +1 @@
{"bos_token": "<s>", "eos_token": "</s>", "unk_token": "<unk>", "sep_token": "</s>", "pad_token": "<pad>", "cls_token": "<s>", "mask_token": {"content": "<mask>", "single_word": false, "lstrip": true, "rstrip": false, "normalized": false}}
tokenizer.json (Normal file, +3)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:3a3313815c3d2e1b78b5182b09e66e6cd4cdd54df67a35c4a318c23d461821a4
size 17082913
tokenizer_config.json (Normal file, +1)
@@ -0,0 +1 @@
{"bos_token": "<s>", "eos_token": "</s>", "sep_token": "</s>", "cls_token": "<s>", "unk_token": "<unk>", "pad_token": "<pad>", "mask_token": {"content": "<mask>", "single_word": false, "lstrip": true, "rstrip": false, "normalized": true, "__type": "AddedToken"}, "model_max_length": 512, "special_tokens_map_file": null, "name_or_path": "/root/.cache/torch/sentence_transformers/sentence-transformers_paraphrase-multilingual-mpnet-base-v2/", "tokenizer_class": "XLMRobertaTokenizer"}