Sequence to sequence pretraining for a less-resourced Slovenian language
The t5-sl-large model is a Slovene T5 model. It has 24 encoder and 24 decoder layers, with about 750 million parameters in total. It was trained for 3 epochs on the following corpora:
The model is described in detail and evaluated in our paper "Sequence to sequence pretraining for a less-resourced Slovenian language" (https://arxiv.org/abs/2207.13988).
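As a quick orientation, here is a minimal sketch of loading and querying the model with the Hugging Face transformers library. The Hub id "cjvt/t5-sl-large" and the example input are assumptions added for illustration, not part of the original card; adjust them to match where the model is actually published.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed Hub id; change this if the model is hosted under a different name.
model_id = "cjvt/t5-sl-large"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# T5 is a text-to-text model: both the input and the output are plain text.
inputs = tokenizer("Poskusno slovensko besedilo.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```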