---
license: cc-by-4.0
datasets:
- clarin-knext/msmarco-pl
- clarin-knext/nq-pl
- clarin-knext/hotpotqa-pl
- clarin-knext/scidocs-pl
- clarin-knext/nfcorpus-pl
- clarin-knext/dbpedia-pl
- clarin-knext/trec-covid-pl
- clarin-knext/quora-pl
- clarin-knext/arguana-pl
- clarin-knext/fiqa-pl
- radlab/wikipedia-pl
- radlab/legal-mc4-pl
language:
- pl
library_name: transformers
tags:
- gpt2
- from-scratch
- polish-gpt2
---

## Description

This is a Polish GPT-2 model in the small architecture variant.

The model was released on 30.11.2023 and is the newest version of `radlab/polish-gpt2-small` (https://huggingface.co/radlab/polish-gpt2-small).

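Since the card declares `library_name: transformers`, the model can presumably be loaded with the standard `transformers` text-generation pipeline. The sketch below is a minimal example; the model id `radlab/polish-gpt2-small-v2` and the prompt are assumptions (this card does not state the repository id), so substitute the actual id of this repository.

```python
from transformers import pipeline

# NOTE: the model id below is a placeholder guess; replace it with this
# repository's actual id from its Hugging Face page.
generator = pipeline("text-generation", model="radlab/polish-gpt2-small-v2")

# Generate a short Polish continuation of the prompt.
result = generator("Stolicą Polski jest", max_new_tokens=20)
print(result[0]["generated_text"])
```

Downloading the weights requires network access on first use.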
## Datasets

Data used to train this model:

- clarin-knext/msmarco-pl
- clarin-knext/nq-pl
- clarin-knext/hotpotqa-pl
- clarin-knext/scidocs-pl
- clarin-knext/nfcorpus-pl
- clarin-knext/dbpedia-pl
- clarin-knext/trec-covid-pl
- clarin-knext/quora-pl
- clarin-knext/arguana-pl
- clarin-knext/fiqa-pl
- radlab/wikipedia-pl
- radlab/legal-mc4-pl
- own corpora, not yet published

In total it is about 30.5 GB of data, which is three times more than the previous version.

## Metrics from W&B




## Changelog

- _2023.11.30_ - new dataset