t5-sci-en2de-continued-pretraining
Collection · 6 items
GermanT5/t5-efficient-gc4-german-base-nl36, continued pretraining for 15,000 steps on the German portion of the scientific corpus (same preprocessing as the English side). Checkpoint: `cross_lingual_transfer/logs/native_baseline/.../step-step=015000.ckpt`.
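The checkpoint layout is not documented here; as a minimal sketch, assuming it is a standard PyTorch Lightning file whose `state_dict` keys carry a `model.` prefix (the prefix name is an assumption), the weights can be unwrapped for loading into a bare T5 module like this:

```python
# Hypothetical sketch: unwrap a Lightning-style state dict by stripping
# the "model." key prefix (prefix name assumed, not confirmed by the repo).
def strip_prefix(state_dict, prefix="model."):
    """Return a copy of state_dict with `prefix` removed from matching keys."""
    return {
        (k[len(prefix):] if k.startswith(prefix) else k): v
        for k, v in state_dict.items()
    }

# Dummy keys stand in for real tensors; only the key handling matters here.
sd = {"model.shared.weight": 1, "model.encoder.block.0.weight": 2}
print(strip_prefix(sd))
# {'shared.weight': 1, 'encoder.block.0.weight': 2}
```

In practice the dict would come from `torch.load("...step-step=015000.ckpt")["state_dict"]` before unwrapping.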
German split of the Unpaywall-derived corpus, chunked into continued-pretraining windows of 512 tokens with 50% overlap.
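The windowing scheme above (512-token windows, 50% overlap, i.e. a stride of 256) can be sketched as follows; the handling of the final, possibly shorter window is an assumption, since the corpus preprocessing code is not shown here:

```python
# Sketch of 512-token sliding windows with 50% overlap (stride 256).
# Tail policy (keep the shorter last window unpadded) is an assumption.
def sliding_windows(token_ids, window=512, overlap=0.5):
    stride = int(window * (1 - overlap))  # 256 for 50% overlap
    windows = []
    for start in range(0, len(token_ids), stride):
        windows.append(token_ids[start:start + window])
        if start + window >= len(token_ids):
            break  # this window already reaches the end of the document
    return windows

ids = list(range(1000))
wins = sliding_windows(ids)
print(len(wins), len(wins[0]), wins[1][0])  # 3 512 256
```

Each consecutive pair of windows shares 256 tokens, so every token (except near document edges) is seen in two contexts.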
| Category | EN accuracy | DE accuracy |
|---|---|---|
| Overall | 0.2295 | 0.2295 |
| Humanities | 0.2421 | 0.2421 |
| STEM | 0.2125 | 0.2125 |
| Social Sciences | 0.2171 | 0.2171 |
| Other | 0.2398 | 0.2398 |
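For context, the category names match MMLU's four groups; assuming a four-option multiple-choice format (an assumption, since the benchmark is not named here), chance level is 0.25, and the scores above sit at or slightly below it:

```python
# Accuracies from the table above, compared against the 0.25 chance level
# of a four-option multiple-choice benchmark (format assumed, not stated).
scores = {
    "Overall": 0.2295,
    "Humanities": 0.2421,
    "STEM": 0.2125,
    "Social Sciences": 0.2171,
    "Other": 0.2398,
}
chance = 0.25
for name, acc in scores.items():
    print(f"{name}: {acc:.4f} ({acc - chance:+.4f} vs. chance)")
```

This suggests the checkpoint should be read as a starting point for further training rather than as a strong zero-shot evaluator.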
Intended use: a German scientific NLP baseline; compare against WECHSEL-transferred models, or continue fine-tuning on German downstream datasets.