MaziyarPanahi committed on
Commit a9d07a1 · verified · 1 Parent(s): 617a9b1

Upload Italian PII detection model OpenMed-PII-Italian-ModernMed-Large-395M-v1

README.md ADDED
@@ -0,0 +1,305 @@
1
+ ---
2
+ language:
3
+ - it
4
+ license: apache-2.0
5
+ base_model: answerdotai/ModernBERT-large
6
+ tags:
7
+ - token-classification
8
+ - ner
9
+ - pii
10
+ - pii-detection
11
+ - de-identification
12
+ - privacy
13
+ - healthcare
14
+ - medical
15
+ - clinical
16
+ - phi
17
+ - italian
18
+ - pytorch
19
+ - transformers
20
+ - openmed
21
+ pipeline_tag: token-classification
22
+ library_name: transformers
23
+ metrics:
24
+ - f1
25
+ - precision
26
+ - recall
27
+ model-index:
28
+ - name: OpenMed-PII-Italian-ModernMed-Large-395M-v1
29
+ results:
30
+ - task:
31
+ type: token-classification
32
+ name: Named Entity Recognition
33
+ dataset:
34
+ name: AI4Privacy (Italian subset)
35
+ type: ai4privacy/pii-masking-400k
36
+ split: test
37
+ metrics:
38
+ - type: f1
39
+ value: 0.9564
40
+ name: F1 (micro)
41
+ - type: precision
42
+ value: 0.9541
43
+ name: Precision
44
+ - type: recall
45
+ value: 0.9587
46
+ name: Recall
47
+ widget:
48
+ - text: "Dr. Marco Rossi (Codice Fiscale: RSSMRC85C15H501Z) può essere contattato a marco.rossi@ospedale.it o al +39 333 123 4567. Abita in Via Roma 25, 00184 Roma."
49
+ example_title: Clinical Note with PII (Italian)
50
+ ---
51
+
52
+ # OpenMed-PII-Italian-ModernMed-Large-395M-v1
53
+
54
+ **Italian PII Detection Model** | 395M Parameters | Open Source
55
+
56
+ [![F1 Score](https://img.shields.io/badge/F1-95.64%25-brightgreen)]() [![Precision](https://img.shields.io/badge/Precision-95.41%25-blue)]() [![Recall](https://img.shields.io/badge/Recall-95.87%25-orange)]()
57
+
58
+ ## Model Description
59
+
60
+ **OpenMed-PII-Italian-ModernMed-Large-395M-v1** is a transformer-based token classification model fine-tuned for **Personally Identifiable Information (PII) detection in Italian text**. This model identifies and classifies **54 types of sensitive information**, including names, addresses, national identification numbers, bank and payment card details, contact information, and more.
61
+
62
+ ### Key Features
63
+
64
+ - **Italian-Optimized**: Specifically trained on Italian text for optimal performance
65
+ - **High Accuracy**: Achieves strong F1 scores across diverse PII categories
66
+ - **Comprehensive Coverage**: Detects 54 entity types spanning personal, financial, location, and contact information
67
+ - **Privacy-Focused**: Designed for de-identification and compliance with GDPR and other privacy regulations
68
+ - **Production-Ready**: Optimized for real-world text processing pipelines
69
+
70
+ ## Performance
71
+
72
+ Evaluated on the Italian subset of the AI4Privacy dataset:
73
+
74
+ | Metric | Score |
75
+ |:---|:---:|
76
+ | **Micro F1** | **0.9564** |
77
+ | Precision | 0.9541 |
78
+ | Recall | 0.9587 |
79
+ | Macro F1 | 0.9491 |
80
+ | Weighted F1 | 0.9563 |
81
+ | Accuracy | 0.9940 |
82
+
83
+ ### Top 10 Italian PII Models
84
+
85
+ | Rank | Model | F1 | Precision | Recall |
86
+ |:---:|:---|:---:|:---:|:---:|
87
+ | 1 | [OpenMed-PII-Italian-SuperClinical-Large-434M-v1](https://huggingface.co/OpenMed/OpenMed-PII-Italian-SuperClinical-Large-434M-v1) | 0.9728 | 0.9707 | 0.9750 |
88
+ | 2 | [OpenMed-PII-Italian-EuroMed-210M-v1](https://huggingface.co/OpenMed/OpenMed-PII-Italian-EuroMed-210M-v1) | 0.9685 | 0.9663 | 0.9707 |
89
+ | 3 | [OpenMed-PII-Italian-ClinicalBGE-568M-v1](https://huggingface.co/OpenMed/OpenMed-PII-Italian-ClinicalBGE-568M-v1) | 0.9678 | 0.9653 | 0.9703 |
90
+ | 4 | [OpenMed-PII-Italian-SnowflakeMed-Large-568M-v1](https://huggingface.co/OpenMed/OpenMed-PII-Italian-SnowflakeMed-Large-568M-v1) | 0.9678 | 0.9653 | 0.9702 |
91
+ | 5 | [OpenMed-PII-Italian-BigMed-Large-560M-v1](https://huggingface.co/OpenMed/OpenMed-PII-Italian-BigMed-Large-560M-v1) | 0.9671 | 0.9645 | 0.9697 |
92
+ | 6 | [OpenMed-PII-Italian-SuperMedical-Large-355M-v1](https://huggingface.co/OpenMed/OpenMed-PII-Italian-SuperMedical-Large-355M-v1) | 0.9663 | 0.9640 | 0.9686 |
93
+ | 7 | [OpenMed-PII-Italian-mClinicalE5-Large-560M-v1](https://huggingface.co/OpenMed/OpenMed-PII-Italian-mClinicalE5-Large-560M-v1) | 0.9659 | 0.9633 | 0.9684 |
94
+ | 8 | [OpenMed-PII-Italian-NomicMed-Large-395M-v1](https://huggingface.co/OpenMed/OpenMed-PII-Italian-NomicMed-Large-395M-v1) | 0.9656 | 0.9631 | 0.9682 |
95
+ | 9 | [OpenMed-PII-Italian-ClinicalBGE-Large-335M-v1](https://huggingface.co/OpenMed/OpenMed-PII-Italian-ClinicalBGE-Large-335M-v1) | 0.9605 | 0.9575 | 0.9635 |
96
+ | 10 | [OpenMed-PII-Italian-SuperClinical-Base-184M-v1](https://huggingface.co/OpenMed/OpenMed-PII-Italian-SuperClinical-Base-184M-v1) | 0.9596 | 0.9573 | 0.9620 |
97
+
98
+ ## Supported Entity Types
99
+
100
+ This model detects **54 PII entity types** organized into categories:
101
+
102
+ <details>
103
+ <summary><strong>Identifiers</strong> (22 types)</summary>
104
+
105
+ | Entity | Description |
106
+ |:---|:---|
107
+ | `ACCOUNTNAME` | Account name |
108
+ | `BANKACCOUNT` | Bank account number |
109
+ | `BIC` | Bank Identifier Code (BIC/SWIFT) |
110
+ | `BITCOINADDRESS` | Bitcoin address |
111
+ | `CREDITCARD` | Credit card number |
112
+ | `CREDITCARDISSUER` | Credit card issuer |
113
+ | `CVV` | Card verification value (CVV) |
114
+ | `ETHEREUMADDRESS` | Ethereum address |
115
+ | `IBAN` | IBAN |
116
+ | `IMEI` | IMEI number |
117
+ | ... | *and 12 more* |
118
+
119
+ </details>
120
+
121
+ <details>
122
+ <summary><strong>Personal Info</strong> (11 types)</summary>
123
+
124
+ | Entity | Description |
125
+ |:---|:---|
126
+ | `AGE` | Age |
127
+ | `DATEOFBIRTH` | Date of birth |
128
+ | `EYECOLOR` | Eye color |
129
+ | `FIRSTNAME` | First name |
130
+ | `GENDER` | Gender |
131
+ | `HEIGHT` | Height |
132
+ | `LASTNAME` | Last name |
133
+ | `MIDDLENAME` | Middle name |
134
+ | `OCCUPATION` | Occupation |
135
+ | `PREFIX` | Name prefix (title) |
136
+ | ... | *and 1 more* |
137
+
138
+ </details>
139
+
140
+ <details>
141
+ <summary><strong>Contact Info</strong> (2 types)</summary>
142
+
143
+ | Entity | Description |
144
+ |:---|:---|
145
+ | `EMAIL` | Email address |
146
+ | `PHONE` | Phone number |
147
+
148
+ </details>
149
+
150
+ <details>
151
+ <summary><strong>Location</strong> (9 types)</summary>
152
+
153
+ | Entity | Description |
154
+ |:---|:---|
155
+ | `BUILDINGNUMBER` | Building number |
156
+ | `CITY` | City |
157
+ | `COUNTY` | County |
158
+ | `GPSCOORDINATES` | GPS coordinates |
159
+ | `ORDINALDIRECTION` | Ordinal direction |
160
+ | `SECONDARYADDRESS` | Secondary address (apartment, suite, etc.) |
161
+ | `STATE` | State |
162
+ | `STREET` | Street |
163
+ | `ZIPCODE` | Postal (ZIP) code |
164
+
165
+ </details>
166
+
167
+ <details>
168
+ <summary><strong>Organization</strong> (3 types)</summary>
169
+
170
+ | Entity | Description |
171
+ |:---|:---|
172
+ | `JOBDEPARTMENT` | Job department |
173
+ | `JOBTITLE` | Job title |
174
+ | `ORGANIZATION` | Organization |
175
+
176
+ </details>
177
+
178
+ <details>
179
+ <summary><strong>Financial</strong> (5 types)</summary>
180
+
181
+ | Entity | Description |
182
+ |:---|:---|
183
+ | `AMOUNT` | Amount |
184
+ | `CURRENCY` | Currency |
185
+ | `CURRENCYCODE` | Currency code |
186
+ | `CURRENCYNAME` | Currency name |
187
+ | `CURRENCYSYMBOL` | Currency symbol |
188
+
189
+ </details>
190
+
191
+ <details>
192
+ <summary><strong>Temporal</strong> (2 types)</summary>
193
+
194
+ | Entity | Description |
195
+ |:---|:---|
196
+ | `DATE` | Date |
197
+ | `TIME` | Time |
198
+
199
+ </details>
200
+
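+ For downstream use it is often convenient to act on only a subset of these types. Below is a minimal, illustrative sketch (the `HIGH_RISK` set is hand-picked for the example, not an official grouping) that filters pipeline output to a few of the types listed above; the `ner` pipeline and `text` are defined in the Quick Start section below.
+
+ ```python
+ # Illustrative subset of the entity types listed above (not the full 54-type inventory)
+ HIGH_RISK = {"FIRSTNAME", "LASTNAME", "EMAIL", "PHONE", "IBAN", "CREDITCARD", "SSN"}
+
+ entities = ner(text)  # `ner` and `text` as defined in the Quick Start below
+ sensitive = [e for e in entities if e["entity_group"] in HIGH_RISK]
+ print(f"{len(sensitive)} high-risk spans out of {len(entities)} detected entities")
+ ```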
201
+ ## Usage
202
+
203
+ ### Quick Start
204
+
205
+ ```python
206
+ from transformers import pipeline
207
+
208
+ # Load the PII detection pipeline
209
+ ner = pipeline("ner", model="OpenMed/OpenMed-PII-Italian-ModernMed-Large-395M-v1", aggregation_strategy="simple")
210
+
211
+ text = """
212
+ Paziente Marco Bianchi (nato il 15/03/1985, CF: BNCMRC85C15H501Z) è stato visitato oggi.
213
+ Contatto: marco.bianchi@email.it, Telefono: +39 333 123 4567.
214
+ Indirizzo: Via Garibaldi 42, 20121 Milano.
215
+ """
216
+
217
+ entities = ner(text)
218
+ for entity in entities:
219
+     print(f"{entity['entity_group']}: {entity['word']} (score: {entity['score']:.3f})")
220
+ ```
221
+
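+ Each item returned by the pipeline is a dictionary with `entity_group`, `score`, `word`, and the character offsets `start`/`end` (standard `transformers` NER-pipeline output); the offsets are what the de-identification helper below relies on. A quick way to inspect them:
+
+ ```python
+ # Print each detected span with its character offsets and label
+ for ent in entities:
+     print(ent["start"], ent["end"], ent["entity_group"], text[ent["start"]:ent["end"]])
+ ```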
222
+ ### De-identification Example
223
+
224
+ ```python
225
+ def redact_pii(text, entities, placeholder=None):
226
+     """Replace detected PII spans with their entity type, or with a fixed placeholder if given."""
227
+     # Sort entities by start position (descending) so earlier character offsets stay valid
228
+     sorted_entities = sorted(entities, key=lambda x: x['start'], reverse=True)
229
+     redacted = text
230
+     for ent in sorted_entities:
231
+         redacted = redacted[:ent['start']] + (placeholder or f"[{ent['entity_group']}]") + redacted[ent['end']:]
232
+     return redacted
233
+
234
+ # Apply de-identification
235
+ redacted_text = redact_pii(text, entities)
236
+ print(redacted_text)
237
+ ```
238
+
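+ The optional `placeholder` argument switches between the type-preserving labels shown above and uniform masking (a small usage sketch; the exact redacted spans depend on the model's predictions):
+
+ ```python
+ # Mask every detected span with the same token instead of its entity type
+ fully_masked = redact_pii(text, entities, placeholder="[REDACTED]")
+ print(fully_masked)
+ ```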
239
+ ### Batch Processing
240
+
241
+ ```python
242
+ from transformers import AutoModelForTokenClassification, AutoTokenizer
243
+ import torch
244
+
245
+ model_name = "OpenMed/OpenMed-PII-Italian-ModernMed-Large-395M-v1"
246
+ model = AutoModelForTokenClassification.from_pretrained(model_name)
247
+ tokenizer = AutoTokenizer.from_pretrained(model_name)
248
+
249
+ texts = [
250
+ "Paziente Marco Bianchi (nato il 15/03/1985, CF: BNCMRC85C15H501Z) è stato visitato oggi.",
251
+ "Contatto: marco.bianchi@email.it, Telefono: +39 333 123 4567.",
252
+ ]
253
+
254
+ inputs = tokenizer(texts, return_tensors='pt', padding=True, truncation=True)
255
+ with torch.no_grad():
256
+     outputs = model(**inputs)
257
+ predictions = torch.argmax(outputs.logits, dim=-1)
258
+ ```
259
+
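+ The raw `predictions` tensor holds one label index per sub-word token; mapping the indices through the model's `id2label` table yields readable BIO tags. A minimal sketch (for word-level spans with character offsets, the `pipeline` approach above is simpler):
+
+ ```python
+ # Decode per-token label indices into BIO tags, skipping padding and "O"
+ for token_ids, label_ids, mask in zip(inputs["input_ids"], predictions, inputs["attention_mask"]):
+     tokens = tokenizer.convert_ids_to_tokens(token_ids.tolist())
+     for token, label_id, m in zip(tokens, label_ids, mask):
+         label = model.config.id2label[label_id.item()]
+         if m.item() == 1 and label != "O":
+             print(token, label)
+ ```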
260
+ ## Training Details
261
+
262
+ ### Dataset
263
+
264
+ - **Source**: [AI4Privacy PII Masking 400k](https://huggingface.co/datasets/ai4privacy/pii-masking-400k) (Italian subset)
265
+ - **Format**: BIO-tagged token classification
266
+ - **Labels**: 76 total (the `O` tag, 54 `B-` tags, and 21 `I-` tags; only entity types that span multiple tokens carry an `I-` tag, see `config.json` or the snippet below)
267
+
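+ The full label inventory ships with the checkpoint and can be inspected directly from its config (a quick sketch):
+
+ ```python
+ from transformers import AutoConfig
+
+ config = AutoConfig.from_pretrained("OpenMed/OpenMed-PII-Italian-ModernMed-Large-395M-v1")
+ print(len(config.id2label))  # total number of BIO labels
+ entity_types = {label.split("-", 1)[1] for label in config.id2label.values() if label != "O"}
+ print(len(entity_types))     # 54 distinct entity types
+ ```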
268
+ ### Training Configuration
269
+
270
+ - **Max Sequence Length**: 512 tokens
271
+ - **Epochs**: 3
272
+ - **Framework**: Hugging Face Transformers + Trainer API
273
+
274
+ ## Intended Use & Limitations
275
+
276
+ ### Intended Use
277
+
278
+ - **De-identification**: Automated redaction of PII in Italian clinical notes, medical records, and documents
279
+ - **Compliance**: Supporting compliance with GDPR and other privacy regulations
280
+ - **Data Preprocessing**: Preparing datasets for research by removing sensitive information
281
+ - **Audit Support**: Identifying PII in document collections
282
+
283
+ ### Limitations
284
+
285
+ **Important**: This model is intended as an **assistive tool**, not a replacement for human review.
286
+
287
+ - **False Negatives**: Some PII may not be detected; always verify outputs in critical applications (see the review-threshold sketch after this list)
288
+ - **Context Sensitivity**: Performance may vary with domain-specific terminology
289
+ - **Language**: Optimized for Italian text; may not perform well on other languages
290
+
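+ For example, one simple guardrail (an illustration, not a feature of this model) is to auto-redact only high-confidence detections and route the rest to a human reviewer, using the `score` field from the pipeline output:
+
+ ```python
+ REVIEW_THRESHOLD = 0.85  # illustrative value; tune on your own validation data
+
+ auto_redact = [e for e in entities if e["score"] >= REVIEW_THRESHOLD]
+ needs_review = [e for e in entities if e["score"] < REVIEW_THRESHOLD]
+ print(f"{len(auto_redact)} spans auto-redacted, {len(needs_review)} flagged for review")
+ ```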
291
+ ## Citation
292
+
293
+ ```bibtex
294
+ @misc{openmed-pii-2026,
295
+ title = {OpenMed-PII-Italian-ModernMed-Large-395M-v1: Italian PII Detection Model},
296
+ author = {OpenMed Science},
297
+ year = {2026},
298
+ publisher = {Hugging Face},
299
+ url = {https://huggingface.co/OpenMed/OpenMed-PII-Italian-ModernMed-Large-395M-v1}
300
+ }
301
+ ```
302
+
303
+ ## Links
304
+
305
+ - **Organization**: [OpenMed](https://huggingface.co/OpenMed)
all_results.json ADDED
@@ -0,0 +1,28 @@
1
+ {
2
+ "epoch": 3.0,
3
+ "eval_accuracy": 0.994844131159147,
4
+ "eval_f1": 0.9639322033898305,
5
+ "eval_loss": 0.014384633861482143,
6
+ "eval_macro_f1": 0.9541869985075359,
7
+ "eval_precision": 0.9613902224626412,
8
+ "eval_recall": 0.9664876622935219,
9
+ "eval_runtime": 7.6622,
10
+ "eval_samples_per_second": 648.903,
11
+ "eval_steps_per_second": 20.36,
12
+ "eval_weighted_f1": 0.9641413661433497,
13
+ "test_accuracy": 0.9939838776845737,
14
+ "test_f1": 0.9563980415508799,
15
+ "test_loss": 0.01686817966401577,
16
+ "test_macro_f1": 0.9490995419020036,
17
+ "test_precision": 0.9540624381228962,
18
+ "test_recall": 0.9587451084433243,
19
+ "test_runtime": 8.6689,
20
+ "test_samples_per_second": 584.735,
21
+ "test_steps_per_second": 18.341,
22
+ "test_weighted_f1": 0.9562803095255936,
23
+ "total_flos": 1.881641387556864e+16,
24
+ "train_loss": 0.08123073937216153,
25
+ "train_runtime": 961.2851,
26
+ "train_samples_per_second": 127.779,
27
+ "train_steps_per_second": 1.997
28
+ }
classification_report.txt ADDED
@@ -0,0 +1,64 @@
1
+ Classification Report for Italian PII Detection
2
+ Model: answerdotai/ModernBERT-large
3
+ ============================================================
4
+
5
+ precision recall f1-score support
6
+
7
+ ACCOUNTNAME 0.99 1.00 1.00 282
8
+ AGE 0.99 0.99 0.99 338
9
+ AMOUNT 0.98 0.99 0.99 116
10
+ BANKACCOUNT 1.00 1.00 1.00 306
11
+ BIC 1.00 1.00 1.00 77
12
+ BITCOINADDRESS 0.93 0.99 0.96 273
13
+ BUILDINGNUMBER 0.93 0.91 0.92 346
14
+ CITY 0.96 0.91 0.93 280
15
+ COUNTY 0.98 1.00 0.99 327
16
+ CREDITCARD 0.84 0.89 0.86 302
17
+ CREDITCARDISSUER 1.00 1.00 1.00 146
18
+ CURRENCY 0.68 0.82 0.74 187
19
+ CURRENCYCODE 0.98 0.96 0.97 85
20
+ CURRENCYNAME 0.46 0.36 0.40 97
21
+ CURRENCYSYMBOL 0.98 0.98 0.98 308
22
+ CVV 0.96 0.99 0.97 97
23
+ DATE 0.73 0.85 0.79 423
24
+ DATEOFBIRTH 0.73 0.65 0.68 327
25
+ EMAIL 1.00 1.00 1.00 423
26
+ ETHEREUMADDRESS 1.00 1.00 1.00 168
27
+ EYECOLOR 1.00 0.98 0.99 108
28
+ FIRSTNAME 0.97 0.97 0.97 1623
29
+ GENDER 0.99 0.99 0.99 302
30
+ GPSCOORDINATES 1.00 1.00 1.00 223
31
+ HEIGHT 0.99 0.99 0.99 126
32
+ IBAN 0.99 1.00 1.00 230
33
+ IMEI 1.00 1.00 1.00 215
34
+ IPADDRESS 1.00 1.00 1.00 783
35
+ JOBDEPARTMENT 0.99 0.98 0.98 327
36
+ JOBTITLE 0.99 1.00 0.99 279
37
+ LASTNAME 0.95 0.94 0.95 441
38
+ LITECOINADDRESS 0.93 0.76 0.83 83
39
+ MACADDRESS 0.99 0.98 0.99 114
40
+ MASKEDNUMBER 0.82 0.76 0.79 209
41
+ MIDDLENAME 0.90 0.96 0.93 310
42
+ OCCUPATION 1.00 0.99 1.00 323
43
+ ORDINALDIRECTION 1.00 1.00 1.00 152
44
+ ORGANIZATION 0.98 0.99 0.99 272
45
+ PASSWORD 0.99 0.98 0.98 286
46
+ PHONE 0.99 1.00 0.99 303
47
+ PIN 0.90 0.90 0.90 72
48
+ PREFIX 0.98 1.00 0.99 298
49
+ SECONDARYADDRESS 0.99 1.00 1.00 316
50
+ SEX 1.00 1.00 1.00 338
51
+ SSN 1.00 1.00 1.00 259
52
+ STATE 0.98 0.99 0.98 294
53
+ STREET 0.96 0.97 0.97 332
54
+ TIME 0.99 0.99 0.99 296
55
+ URL 1.00 1.00 1.00 244
56
+ USERAGENT 0.99 1.00 0.99 233
57
+ USERNAME 0.99 0.98 0.99 332
58
+ VIN 1.00 1.00 1.00 84
59
+ VRM 1.00 1.00 1.00 98
60
+ ZIPCODE 0.92 0.92 0.92 264
61
+
62
+ micro avg 0.95 0.96 0.96 15077
63
+ macro avg 0.95 0.95 0.95 15077
64
+ weighted avg 0.95 0.96 0.96 15077
config.json ADDED
@@ -0,0 +1,201 @@
1
+ {
2
+ "architectures": [
3
+ "ModernBertForTokenClassification"
4
+ ],
5
+ "attention_bias": false,
6
+ "attention_dropout": 0.0,
7
+ "bos_token_id": null,
8
+ "classifier_activation": "gelu",
9
+ "classifier_bias": false,
10
+ "classifier_dropout": 0.0,
11
+ "classifier_pooling": "mean",
12
+ "cls_token_id": 50281,
13
+ "decoder_bias": true,
14
+ "deterministic_flash_attn": false,
15
+ "dtype": "float32",
16
+ "embedding_dropout": 0.0,
17
+ "eos_token_id": null,
18
+ "global_attn_every_n_layers": 3,
19
+ "global_rope_theta": 160000.0,
20
+ "gradient_checkpointing": false,
21
+ "hidden_activation": "gelu",
22
+ "hidden_size": 1024,
23
+ "id2label": {
24
+ "0": "O",
25
+ "1": "B-ACCOUNTNAME",
26
+ "2": "B-AGE",
27
+ "3": "B-AMOUNT",
28
+ "4": "B-BANKACCOUNT",
29
+ "5": "B-BIC",
30
+ "6": "B-BITCOINADDRESS",
31
+ "7": "B-BUILDINGNUMBER",
32
+ "8": "B-CITY",
33
+ "9": "B-COUNTY",
34
+ "10": "B-CREDITCARD",
35
+ "11": "B-CREDITCARDISSUER",
36
+ "12": "B-CURRENCY",
37
+ "13": "B-CURRENCYCODE",
38
+ "14": "B-CURRENCYNAME",
39
+ "15": "B-CURRENCYSYMBOL",
40
+ "16": "B-CVV",
41
+ "17": "B-DATE",
42
+ "18": "B-DATEOFBIRTH",
43
+ "19": "B-EMAIL",
44
+ "20": "B-ETHEREUMADDRESS",
45
+ "21": "B-EYECOLOR",
46
+ "22": "B-FIRSTNAME",
47
+ "23": "B-GENDER",
48
+ "24": "B-GPSCOORDINATES",
49
+ "25": "B-HEIGHT",
50
+ "26": "B-IBAN",
51
+ "27": "B-IMEI",
52
+ "28": "B-IPADDRESS",
53
+ "29": "B-JOBDEPARTMENT",
54
+ "30": "B-JOBTITLE",
55
+ "31": "B-LASTNAME",
56
+ "32": "B-LITECOINADDRESS",
57
+ "33": "B-MACADDRESS",
58
+ "34": "B-MASKEDNUMBER",
59
+ "35": "B-MIDDLENAME",
60
+ "36": "B-OCCUPATION",
61
+ "37": "B-ORDINALDIRECTION",
62
+ "38": "B-ORGANIZATION",
63
+ "39": "B-PASSWORD",
64
+ "40": "B-PHONE",
65
+ "41": "B-PIN",
66
+ "42": "B-PREFIX",
67
+ "43": "B-SECONDARYADDRESS",
68
+ "44": "B-SEX",
69
+ "45": "B-SSN",
70
+ "46": "B-STATE",
71
+ "47": "B-STREET",
72
+ "48": "B-TIME",
73
+ "49": "B-URL",
74
+ "50": "B-USERAGENT",
75
+ "51": "B-USERNAME",
76
+ "52": "B-VIN",
77
+ "53": "B-VRM",
78
+ "54": "B-ZIPCODE",
79
+ "55": "I-ACCOUNTNAME",
80
+ "56": "I-AGE",
81
+ "57": "I-AMOUNT",
82
+ "58": "I-CITY",
83
+ "59": "I-COUNTY",
84
+ "60": "I-CURRENCY",
85
+ "61": "I-CURRENCYNAME",
86
+ "62": "I-DATE",
87
+ "63": "I-DATEOFBIRTH",
88
+ "64": "I-EYECOLOR",
89
+ "65": "I-GENDER",
90
+ "66": "I-HEIGHT",
91
+ "67": "I-JOBTITLE",
92
+ "68": "I-ORGANIZATION",
93
+ "69": "I-PHONE",
94
+ "70": "I-SECONDARYADDRESS",
95
+ "71": "I-SSN",
96
+ "72": "I-STATE",
97
+ "73": "I-STREET",
98
+ "74": "I-TIME",
99
+ "75": "I-USERAGENT"
100
+ },
101
+ "initializer_cutoff_factor": 2.0,
102
+ "initializer_range": 0.02,
103
+ "intermediate_size": 2624,
104
+ "label2id": {
105
+ "B-ACCOUNTNAME": 1,
106
+ "B-AGE": 2,
107
+ "B-AMOUNT": 3,
108
+ "B-BANKACCOUNT": 4,
109
+ "B-BIC": 5,
110
+ "B-BITCOINADDRESS": 6,
111
+ "B-BUILDINGNUMBER": 7,
112
+ "B-CITY": 8,
113
+ "B-COUNTY": 9,
114
+ "B-CREDITCARD": 10,
115
+ "B-CREDITCARDISSUER": 11,
116
+ "B-CURRENCY": 12,
117
+ "B-CURRENCYCODE": 13,
118
+ "B-CURRENCYNAME": 14,
119
+ "B-CURRENCYSYMBOL": 15,
120
+ "B-CVV": 16,
121
+ "B-DATE": 17,
122
+ "B-DATEOFBIRTH": 18,
123
+ "B-EMAIL": 19,
124
+ "B-ETHEREUMADDRESS": 20,
125
+ "B-EYECOLOR": 21,
126
+ "B-FIRSTNAME": 22,
127
+ "B-GENDER": 23,
128
+ "B-GPSCOORDINATES": 24,
129
+ "B-HEIGHT": 25,
130
+ "B-IBAN": 26,
131
+ "B-IMEI": 27,
132
+ "B-IPADDRESS": 28,
133
+ "B-JOBDEPARTMENT": 29,
134
+ "B-JOBTITLE": 30,
135
+ "B-LASTNAME": 31,
136
+ "B-LITECOINADDRESS": 32,
137
+ "B-MACADDRESS": 33,
138
+ "B-MASKEDNUMBER": 34,
139
+ "B-MIDDLENAME": 35,
140
+ "B-OCCUPATION": 36,
141
+ "B-ORDINALDIRECTION": 37,
142
+ "B-ORGANIZATION": 38,
143
+ "B-PASSWORD": 39,
144
+ "B-PHONE": 40,
145
+ "B-PIN": 41,
146
+ "B-PREFIX": 42,
147
+ "B-SECONDARYADDRESS": 43,
148
+ "B-SEX": 44,
149
+ "B-SSN": 45,
150
+ "B-STATE": 46,
151
+ "B-STREET": 47,
152
+ "B-TIME": 48,
153
+ "B-URL": 49,
154
+ "B-USERAGENT": 50,
155
+ "B-USERNAME": 51,
156
+ "B-VIN": 52,
157
+ "B-VRM": 53,
158
+ "B-ZIPCODE": 54,
159
+ "I-ACCOUNTNAME": 55,
160
+ "I-AGE": 56,
161
+ "I-AMOUNT": 57,
162
+ "I-CITY": 58,
163
+ "I-COUNTY": 59,
164
+ "I-CURRENCY": 60,
165
+ "I-CURRENCYNAME": 61,
166
+ "I-DATE": 62,
167
+ "I-DATEOFBIRTH": 63,
168
+ "I-EYECOLOR": 64,
169
+ "I-GENDER": 65,
170
+ "I-HEIGHT": 66,
171
+ "I-JOBTITLE": 67,
172
+ "I-ORGANIZATION": 68,
173
+ "I-PHONE": 69,
174
+ "I-SECONDARYADDRESS": 70,
175
+ "I-SSN": 71,
176
+ "I-STATE": 72,
177
+ "I-STREET": 73,
178
+ "I-TIME": 74,
179
+ "I-USERAGENT": 75,
180
+ "O": 0
181
+ },
182
+ "layer_norm_eps": 1e-05,
183
+ "local_attention": 128,
184
+ "local_rope_theta": 10000.0,
185
+ "max_position_embeddings": 8192,
186
+ "mlp_bias": false,
187
+ "mlp_dropout": 0.0,
188
+ "model_type": "modernbert",
189
+ "norm_bias": false,
190
+ "norm_eps": 1e-05,
191
+ "num_attention_heads": 16,
192
+ "num_hidden_layers": 28,
193
+ "pad_token_id": 50283,
194
+ "position_embedding_type": "absolute",
195
+ "repad_logits_with_grad": false,
196
+ "sep_token_id": 50282,
197
+ "sparse_pred_ignore_index": -100,
198
+ "sparse_prediction": false,
199
+ "transformers_version": "4.57.3",
200
+ "vocab_size": 50368
201
+ }
eval_results.json ADDED
@@ -0,0 +1,13 @@
1
+ {
2
+ "epoch": 3.0,
3
+ "eval_accuracy": 0.994844131159147,
4
+ "eval_f1": 0.9639322033898305,
5
+ "eval_loss": 0.014384633861482143,
6
+ "eval_macro_f1": 0.9541869985075359,
7
+ "eval_precision": 0.9613902224626412,
8
+ "eval_recall": 0.9664876622935219,
9
+ "eval_runtime": 7.6622,
10
+ "eval_samples_per_second": 648.903,
11
+ "eval_steps_per_second": 20.36,
12
+ "eval_weighted_f1": 0.9641413661433497
13
+ }
model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:50ac4d5b72c540774007281c065e98171a9e91cfa4a63f195dc1a7defc256e61
3
+ size 1583655040
special_tokens_map.json ADDED
@@ -0,0 +1,37 @@
1
+ {
2
+ "cls_token": {
3
+ "content": "[CLS]",
4
+ "lstrip": false,
5
+ "normalized": false,
6
+ "rstrip": false,
7
+ "single_word": false
8
+ },
9
+ "mask_token": {
10
+ "content": "[MASK]",
11
+ "lstrip": true,
12
+ "normalized": false,
13
+ "rstrip": false,
14
+ "single_word": false
15
+ },
16
+ "pad_token": {
17
+ "content": "[PAD]",
18
+ "lstrip": false,
19
+ "normalized": false,
20
+ "rstrip": false,
21
+ "single_word": false
22
+ },
23
+ "sep_token": {
24
+ "content": "[SEP]",
25
+ "lstrip": false,
26
+ "normalized": false,
27
+ "rstrip": false,
28
+ "single_word": false
29
+ },
30
+ "unk_token": {
31
+ "content": "[UNK]",
32
+ "lstrip": false,
33
+ "normalized": false,
34
+ "rstrip": false,
35
+ "single_word": false
36
+ }
37
+ }
test_results.json ADDED
@@ -0,0 +1,12 @@
1
+ {
2
+ "test_accuracy": 0.9939838776845737,
3
+ "test_f1": 0.9563980415508799,
4
+ "test_loss": 0.01686817966401577,
5
+ "test_macro_f1": 0.9490995419020036,
6
+ "test_precision": 0.9540624381228962,
7
+ "test_recall": 0.9587451084433243,
8
+ "test_runtime": 8.6689,
9
+ "test_samples_per_second": 584.735,
10
+ "test_steps_per_second": 18.341,
11
+ "test_weighted_f1": 0.9562803095255936
12
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,945 @@
1
+ {
2
+ "added_tokens_decoder": {
3
+ "0": {
4
+ "content": "|||IP_ADDRESS|||",
5
+ "lstrip": false,
6
+ "normalized": true,
7
+ "rstrip": false,
8
+ "single_word": false,
9
+ "special": false
10
+ },
11
+ "1": {
12
+ "content": "<|padding|>",
13
+ "lstrip": false,
14
+ "normalized": false,
15
+ "rstrip": false,
16
+ "single_word": false,
17
+ "special": true
18
+ },
19
+ "50254": {
20
+ "content": " ",
21
+ "lstrip": false,
22
+ "normalized": true,
23
+ "rstrip": false,
24
+ "single_word": false,
25
+ "special": false
26
+ },
27
+ "50255": {
28
+ "content": " ",
29
+ "lstrip": false,
30
+ "normalized": true,
31
+ "rstrip": false,
32
+ "single_word": false,
33
+ "special": false
34
+ },
35
+ "50256": {
36
+ "content": " ",
37
+ "lstrip": false,
38
+ "normalized": true,
39
+ "rstrip": false,
40
+ "single_word": false,
41
+ "special": false
42
+ },
43
+ "50257": {
44
+ "content": " ",
45
+ "lstrip": false,
46
+ "normalized": true,
47
+ "rstrip": false,
48
+ "single_word": false,
49
+ "special": false
50
+ },
51
+ "50258": {
52
+ "content": " ",
53
+ "lstrip": false,
54
+ "normalized": true,
55
+ "rstrip": false,
56
+ "single_word": false,
57
+ "special": false
58
+ },
59
+ "50259": {
60
+ "content": " ",
61
+ "lstrip": false,
62
+ "normalized": true,
63
+ "rstrip": false,
64
+ "single_word": false,
65
+ "special": false
66
+ },
67
+ "50260": {
68
+ "content": " ",
69
+ "lstrip": false,
70
+ "normalized": true,
71
+ "rstrip": false,
72
+ "single_word": false,
73
+ "special": false
74
+ },
75
+ "50261": {
76
+ "content": " ",
77
+ "lstrip": false,
78
+ "normalized": true,
79
+ "rstrip": false,
80
+ "single_word": false,
81
+ "special": false
82
+ },
83
+ "50262": {
84
+ "content": " ",
85
+ "lstrip": false,
86
+ "normalized": true,
87
+ "rstrip": false,
88
+ "single_word": false,
89
+ "special": false
90
+ },
91
+ "50263": {
92
+ "content": " ",
93
+ "lstrip": false,
94
+ "normalized": true,
95
+ "rstrip": false,
96
+ "single_word": false,
97
+ "special": false
98
+ },
99
+ "50264": {
100
+ "content": " ",
101
+ "lstrip": false,
102
+ "normalized": true,
103
+ "rstrip": false,
104
+ "single_word": false,
105
+ "special": false
106
+ },
107
+ "50265": {
108
+ "content": " ",
109
+ "lstrip": false,
110
+ "normalized": true,
111
+ "rstrip": false,
112
+ "single_word": false,
113
+ "special": false
114
+ },
115
+ "50266": {
116
+ "content": " ",
117
+ "lstrip": false,
118
+ "normalized": true,
119
+ "rstrip": false,
120
+ "single_word": false,
121
+ "special": false
122
+ },
123
+ "50267": {
124
+ "content": " ",
125
+ "lstrip": false,
126
+ "normalized": true,
127
+ "rstrip": false,
128
+ "single_word": false,
129
+ "special": false
130
+ },
131
+ "50268": {
132
+ "content": " ",
133
+ "lstrip": false,
134
+ "normalized": true,
135
+ "rstrip": false,
136
+ "single_word": false,
137
+ "special": false
138
+ },
139
+ "50269": {
140
+ "content": " ",
141
+ "lstrip": false,
142
+ "normalized": true,
143
+ "rstrip": false,
144
+ "single_word": false,
145
+ "special": false
146
+ },
147
+ "50270": {
148
+ "content": " ",
149
+ "lstrip": false,
150
+ "normalized": true,
151
+ "rstrip": false,
152
+ "single_word": false,
153
+ "special": false
154
+ },
155
+ "50271": {
156
+ "content": " ",
157
+ "lstrip": false,
158
+ "normalized": true,
159
+ "rstrip": false,
160
+ "single_word": false,
161
+ "special": false
162
+ },
163
+ "50272": {
164
+ "content": " ",
165
+ "lstrip": false,
166
+ "normalized": true,
167
+ "rstrip": false,
168
+ "single_word": false,
169
+ "special": false
170
+ },
171
+ "50273": {
172
+ "content": " ",
173
+ "lstrip": false,
174
+ "normalized": true,
175
+ "rstrip": false,
176
+ "single_word": false,
177
+ "special": false
178
+ },
179
+ "50274": {
180
+ "content": " ",
181
+ "lstrip": false,
182
+ "normalized": true,
183
+ "rstrip": false,
184
+ "single_word": false,
185
+ "special": false
186
+ },
187
+ "50275": {
188
+ "content": " ",
189
+ "lstrip": false,
190
+ "normalized": true,
191
+ "rstrip": false,
192
+ "single_word": false,
193
+ "special": false
194
+ },
195
+ "50276": {
196
+ "content": " ",
197
+ "lstrip": false,
198
+ "normalized": true,
199
+ "rstrip": false,
200
+ "single_word": false,
201
+ "special": false
202
+ },
203
+ "50277": {
204
+ "content": "|||EMAIL_ADDRESS|||",
205
+ "lstrip": false,
206
+ "normalized": true,
207
+ "rstrip": false,
208
+ "single_word": false,
209
+ "special": false
210
+ },
211
+ "50278": {
212
+ "content": "|||PHONE_NUMBER|||",
213
+ "lstrip": false,
214
+ "normalized": true,
215
+ "rstrip": false,
216
+ "single_word": false,
217
+ "special": false
218
+ },
219
+ "50279": {
220
+ "content": "<|endoftext|>",
221
+ "lstrip": false,
222
+ "normalized": false,
223
+ "rstrip": false,
224
+ "single_word": false,
225
+ "special": true
226
+ },
227
+ "50280": {
228
+ "content": "[UNK]",
229
+ "lstrip": false,
230
+ "normalized": false,
231
+ "rstrip": false,
232
+ "single_word": false,
233
+ "special": true
234
+ },
235
+ "50281": {
236
+ "content": "[CLS]",
237
+ "lstrip": false,
238
+ "normalized": false,
239
+ "rstrip": false,
240
+ "single_word": false,
241
+ "special": true
242
+ },
243
+ "50282": {
244
+ "content": "[SEP]",
245
+ "lstrip": false,
246
+ "normalized": false,
247
+ "rstrip": false,
248
+ "single_word": false,
249
+ "special": true
250
+ },
251
+ "50283": {
252
+ "content": "[PAD]",
253
+ "lstrip": false,
254
+ "normalized": false,
255
+ "rstrip": false,
256
+ "single_word": false,
257
+ "special": true
258
+ },
259
+ "50284": {
260
+ "content": "[MASK]",
261
+ "lstrip": true,
262
+ "normalized": false,
263
+ "rstrip": false,
264
+ "single_word": false,
265
+ "special": true
266
+ },
267
+ "50285": {
268
+ "content": "[unused0]",
269
+ "lstrip": false,
270
+ "normalized": true,
271
+ "rstrip": false,
272
+ "single_word": false,
273
+ "special": false
274
+ },
275
+ "50286": {
276
+ "content": "[unused1]",
277
+ "lstrip": false,
278
+ "normalized": true,
279
+ "rstrip": false,
280
+ "single_word": false,
281
+ "special": false
282
+ },
283
+ "50287": {
284
+ "content": "[unused2]",
285
+ "lstrip": false,
286
+ "normalized": true,
287
+ "rstrip": false,
288
+ "single_word": false,
289
+ "special": false
290
+ },
291
+ "50288": {
292
+ "content": "[unused3]",
293
+ "lstrip": false,
294
+ "normalized": true,
295
+ "rstrip": false,
296
+ "single_word": false,
297
+ "special": false
298
+ },
299
+ "50289": {
300
+ "content": "[unused4]",
301
+ "lstrip": false,
302
+ "normalized": true,
303
+ "rstrip": false,
304
+ "single_word": false,
305
+ "special": false
306
+ },
307
+ "50290": {
308
+ "content": "[unused5]",
309
+ "lstrip": false,
310
+ "normalized": true,
311
+ "rstrip": false,
312
+ "single_word": false,
313
+ "special": false
314
+ },
315
+ "50291": {
316
+ "content": "[unused6]",
317
+ "lstrip": false,
318
+ "normalized": true,
319
+ "rstrip": false,
320
+ "single_word": false,
321
+ "special": false
322
+ },
323
+ "50292": {
324
+ "content": "[unused7]",
325
+ "lstrip": false,
326
+ "normalized": true,
327
+ "rstrip": false,
328
+ "single_word": false,
329
+ "special": false
330
+ },
331
+ "50293": {
332
+ "content": "[unused8]",
333
+ "lstrip": false,
334
+ "normalized": true,
335
+ "rstrip": false,
336
+ "single_word": false,
337
+ "special": false
338
+ },
339
+ "50294": {
340
+ "content": "[unused9]",
341
+ "lstrip": false,
342
+ "normalized": true,
343
+ "rstrip": false,
344
+ "single_word": false,
345
+ "special": false
346
+ },
347
+ "50295": {
348
+ "content": "[unused10]",
349
+ "lstrip": false,
350
+ "normalized": true,
351
+ "rstrip": false,
352
+ "single_word": false,
353
+ "special": false
354
+ },
355
+ "50296": {
356
+ "content": "[unused11]",
357
+ "lstrip": false,
358
+ "normalized": true,
359
+ "rstrip": false,
360
+ "single_word": false,
361
+ "special": false
362
+ },
363
+ "50297": {
364
+ "content": "[unused12]",
365
+ "lstrip": false,
366
+ "normalized": true,
367
+ "rstrip": false,
368
+ "single_word": false,
369
+ "special": false
370
+ },
371
+ "50298": {
372
+ "content": "[unused13]",
373
+ "lstrip": false,
374
+ "normalized": true,
375
+ "rstrip": false,
376
+ "single_word": false,
377
+ "special": false
378
+ },
379
+ "50299": {
380
+ "content": "[unused14]",
381
+ "lstrip": false,
382
+ "normalized": true,
383
+ "rstrip": false,
384
+ "single_word": false,
385
+ "special": false
386
+ },
387
+ "50300": {
388
+ "content": "[unused15]",
389
+ "lstrip": false,
390
+ "normalized": true,
391
+ "rstrip": false,
392
+ "single_word": false,
393
+ "special": false
394
+ },
395
+ "50301": {
396
+ "content": "[unused16]",
397
+ "lstrip": false,
398
+ "normalized": true,
399
+ "rstrip": false,
400
+ "single_word": false,
401
+ "special": false
402
+ },
403
+ "50302": {
404
+ "content": "[unused17]",
405
+ "lstrip": false,
406
+ "normalized": true,
407
+ "rstrip": false,
408
+ "single_word": false,
409
+ "special": false
410
+ },
411
+ "50303": {
412
+ "content": "[unused18]",
413
+ "lstrip": false,
414
+ "normalized": true,
415
+ "rstrip": false,
416
+ "single_word": false,
417
+ "special": false
418
+ },
419
+ "50304": {
420
+ "content": "[unused19]",
421
+ "lstrip": false,
422
+ "normalized": true,
423
+ "rstrip": false,
424
+ "single_word": false,
425
+ "special": false
426
+ },
427
+ "50305": {
428
+ "content": "[unused20]",
429
+ "lstrip": false,
430
+ "normalized": true,
431
+ "rstrip": false,
432
+ "single_word": false,
433
+ "special": false
434
+ },
435
+ "50306": {
436
+ "content": "[unused21]",
437
+ "lstrip": false,
438
+ "normalized": true,
439
+ "rstrip": false,
440
+ "single_word": false,
441
+ "special": false
442
+ },
443
+ "50307": {
444
+ "content": "[unused22]",
445
+ "lstrip": false,
446
+ "normalized": true,
447
+ "rstrip": false,
448
+ "single_word": false,
449
+ "special": false
450
+ },
451
+ "50308": {
452
+ "content": "[unused23]",
453
+ "lstrip": false,
454
+ "normalized": true,
455
+ "rstrip": false,
456
+ "single_word": false,
457
+ "special": false
458
+ },
459
+ "50309": {
460
+ "content": "[unused24]",
461
+ "lstrip": false,
462
+ "normalized": true,
463
+ "rstrip": false,
464
+ "single_word": false,
465
+ "special": false
466
+ },
467
+ "50310": {
468
+ "content": "[unused25]",
469
+ "lstrip": false,
470
+ "normalized": true,
471
+ "rstrip": false,
472
+ "single_word": false,
473
+ "special": false
474
+ },
475
+ "50311": {
476
+ "content": "[unused26]",
477
+ "lstrip": false,
478
+ "normalized": true,
479
+ "rstrip": false,
480
+ "single_word": false,
481
+ "special": false
482
+ },
483
+ "50312": {
484
+ "content": "[unused27]",
485
+ "lstrip": false,
486
+ "normalized": true,
487
+ "rstrip": false,
488
+ "single_word": false,
489
+ "special": false
490
+ },
491
+ "50313": {
492
+ "content": "[unused28]",
493
+ "lstrip": false,
494
+ "normalized": true,
495
+ "rstrip": false,
496
+ "single_word": false,
497
+ "special": false
498
+ },
499
+ "50314": {
500
+ "content": "[unused29]",
501
+ "lstrip": false,
502
+ "normalized": true,
503
+ "rstrip": false,
504
+ "single_word": false,
505
+ "special": false
506
+ },
507
+ "50315": {
508
+ "content": "[unused30]",
509
+ "lstrip": false,
510
+ "normalized": true,
511
+ "rstrip": false,
512
+ "single_word": false,
513
+ "special": false
514
+ },
515
+ "50316": {
516
+ "content": "[unused31]",
517
+ "lstrip": false,
518
+ "normalized": true,
519
+ "rstrip": false,
520
+ "single_word": false,
521
+ "special": false
522
+ },
523
+ "50317": {
524
+ "content": "[unused32]",
525
+ "lstrip": false,
526
+ "normalized": true,
527
+ "rstrip": false,
528
+ "single_word": false,
529
+ "special": false
530
+ },
531
+ "50318": {
532
+ "content": "[unused33]",
533
+ "lstrip": false,
534
+ "normalized": true,
535
+ "rstrip": false,
536
+ "single_word": false,
537
+ "special": false
538
+ },
539
+ "50319": {
540
+ "content": "[unused34]",
541
+ "lstrip": false,
542
+ "normalized": true,
543
+ "rstrip": false,
544
+ "single_word": false,
545
+ "special": false
546
+ },
547
+ "50320": {
548
+ "content": "[unused35]",
549
+ "lstrip": false,
550
+ "normalized": true,
551
+ "rstrip": false,
552
+ "single_word": false,
553
+ "special": false
554
+ },
555
+ "50321": {
556
+ "content": "[unused36]",
557
+ "lstrip": false,
558
+ "normalized": true,
559
+ "rstrip": false,
560
+ "single_word": false,
561
+ "special": false
562
+ },
563
+ "50322": {
564
+ "content": "[unused37]",
565
+ "lstrip": false,
566
+ "normalized": true,
567
+ "rstrip": false,
568
+ "single_word": false,
569
+ "special": false
570
+ },
571
+ "50323": {
572
+ "content": "[unused38]",
573
+ "lstrip": false,
574
+ "normalized": true,
575
+ "rstrip": false,
576
+ "single_word": false,
577
+ "special": false
578
+ },
579
+ "50324": {
580
+ "content": "[unused39]",
581
+ "lstrip": false,
582
+ "normalized": true,
583
+ "rstrip": false,
584
+ "single_word": false,
585
+ "special": false
586
+ },
587
+ "50325": {
588
+ "content": "[unused40]",
589
+ "lstrip": false,
590
+ "normalized": true,
591
+ "rstrip": false,
592
+ "single_word": false,
593
+ "special": false
594
+ },
595
+ "50326": {
596
+ "content": "[unused41]",
597
+ "lstrip": false,
598
+ "normalized": true,
599
+ "rstrip": false,
600
+ "single_word": false,
601
+ "special": false
602
+ },
603
+ "50327": {
604
+ "content": "[unused42]",
605
+ "lstrip": false,
606
+ "normalized": true,
607
+ "rstrip": false,
608
+ "single_word": false,
609
+ "special": false
610
+ },
611
+ "50328": {
612
+ "content": "[unused43]",
613
+ "lstrip": false,
614
+ "normalized": true,
615
+ "rstrip": false,
616
+ "single_word": false,
617
+ "special": false
618
+ },
619
+ "50329": {
620
+ "content": "[unused44]",
621
+ "lstrip": false,
622
+ "normalized": true,
623
+ "rstrip": false,
624
+ "single_word": false,
625
+ "special": false
626
+ },
627
+ "50330": {
628
+ "content": "[unused45]",
629
+ "lstrip": false,
630
+ "normalized": true,
631
+ "rstrip": false,
632
+ "single_word": false,
633
+ "special": false
634
+ },
635
+ "50331": {
636
+ "content": "[unused46]",
637
+ "lstrip": false,
638
+ "normalized": true,
639
+ "rstrip": false,
640
+ "single_word": false,
641
+ "special": false
642
+ },
643
+ "50332": {
644
+ "content": "[unused47]",
645
+ "lstrip": false,
646
+ "normalized": true,
647
+ "rstrip": false,
648
+ "single_word": false,
649
+ "special": false
650
+ },
651
+ "50333": {
652
+ "content": "[unused48]",
653
+ "lstrip": false,
654
+ "normalized": true,
655
+ "rstrip": false,
656
+ "single_word": false,
657
+ "special": false
658
+ },
659
+ "50334": {
660
+ "content": "[unused49]",
661
+ "lstrip": false,
662
+ "normalized": true,
663
+ "rstrip": false,
664
+ "single_word": false,
665
+ "special": false
666
+ },
667
+ "50335": {
668
+ "content": "[unused50]",
669
+ "lstrip": false,
670
+ "normalized": true,
671
+ "rstrip": false,
672
+ "single_word": false,
673
+ "special": false
674
+ },
675
+ "50336": {
676
+ "content": "[unused51]",
677
+ "lstrip": false,
678
+ "normalized": true,
679
+ "rstrip": false,
680
+ "single_word": false,
681
+ "special": false
682
+ },
683
+ "50337": {
684
+ "content": "[unused52]",
685
+ "lstrip": false,
686
+ "normalized": true,
687
+ "rstrip": false,
688
+ "single_word": false,
689
+ "special": false
690
+ },
691
+ "50338": {
692
+ "content": "[unused53]",
693
+ "lstrip": false,
694
+ "normalized": true,
695
+ "rstrip": false,
696
+ "single_word": false,
697
+ "special": false
698
+ },
699
+ "50339": {
700
+ "content": "[unused54]",
701
+ "lstrip": false,
702
+ "normalized": true,
703
+ "rstrip": false,
704
+ "single_word": false,
705
+ "special": false
706
+ },
707
+ "50340": {
708
+ "content": "[unused55]",
709
+ "lstrip": false,
710
+ "normalized": true,
711
+ "rstrip": false,
712
+ "single_word": false,
713
+ "special": false
714
+ },
715
+ "50341": {
716
+ "content": "[unused56]",
717
+ "lstrip": false,
718
+ "normalized": true,
719
+ "rstrip": false,
720
+ "single_word": false,
721
+ "special": false
722
+ },
723
+ "50342": {
724
+ "content": "[unused57]",
725
+ "lstrip": false,
726
+ "normalized": true,
727
+ "rstrip": false,
728
+ "single_word": false,
729
+ "special": false
730
+ },
731
+ "50343": {
732
+ "content": "[unused58]",
733
+ "lstrip": false,
734
+ "normalized": true,
735
+ "rstrip": false,
736
+ "single_word": false,
737
+ "special": false
738
+ },
739
+ "50344": {
740
+ "content": "[unused59]",
741
+ "lstrip": false,
742
+ "normalized": true,
743
+ "rstrip": false,
744
+ "single_word": false,
745
+ "special": false
746
+ },
747
+ "50345": {
748
+ "content": "[unused60]",
749
+ "lstrip": false,
750
+ "normalized": true,
751
+ "rstrip": false,
752
+ "single_word": false,
753
+ "special": false
754
+ },
755
+ "50346": {
756
+ "content": "[unused61]",
757
+ "lstrip": false,
758
+ "normalized": true,
759
+ "rstrip": false,
760
+ "single_word": false,
761
+ "special": false
762
+ },
763
+ "50347": {
764
+ "content": "[unused62]",
765
+ "lstrip": false,
766
+ "normalized": true,
767
+ "rstrip": false,
768
+ "single_word": false,
769
+ "special": false
770
+ },
771
+ "50348": {
772
+ "content": "[unused63]",
773
+ "lstrip": false,
774
+ "normalized": true,
775
+ "rstrip": false,
776
+ "single_word": false,
777
+ "special": false
778
+ },
779
+ "50349": {
780
+ "content": "[unused64]",
781
+ "lstrip": false,
782
+ "normalized": true,
783
+ "rstrip": false,
784
+ "single_word": false,
785
+ "special": false
786
+ },
787
+ "50350": {
788
+ "content": "[unused65]",
789
+ "lstrip": false,
790
+ "normalized": true,
791
+ "rstrip": false,
792
+ "single_word": false,
793
+ "special": false
794
+ },
795
+ "50351": {
796
+ "content": "[unused66]",
797
+ "lstrip": false,
798
+ "normalized": true,
799
+ "rstrip": false,
800
+ "single_word": false,
801
+ "special": false
802
+ },
803
+ "50352": {
804
+ "content": "[unused67]",
805
+ "lstrip": false,
806
+ "normalized": true,
807
+ "rstrip": false,
808
+ "single_word": false,
809
+ "special": false
810
+ },
811
+ "50353": {
812
+ "content": "[unused68]",
813
+ "lstrip": false,
814
+ "normalized": true,
815
+ "rstrip": false,
816
+ "single_word": false,
817
+ "special": false
818
+ },
819
+ "50354": {
820
+ "content": "[unused69]",
821
+ "lstrip": false,
822
+ "normalized": true,
823
+ "rstrip": false,
824
+ "single_word": false,
825
+ "special": false
826
+ },
827
+ "50355": {
828
+ "content": "[unused70]",
829
+ "lstrip": false,
830
+ "normalized": true,
831
+ "rstrip": false,
832
+ "single_word": false,
833
+ "special": false
834
+ },
835
+ "50356": {
836
+ "content": "[unused71]",
837
+ "lstrip": false,
838
+ "normalized": true,
839
+ "rstrip": false,
840
+ "single_word": false,
841
+ "special": false
842
+ },
843
+ "50357": {
844
+ "content": "[unused72]",
845
+ "lstrip": false,
846
+ "normalized": true,
847
+ "rstrip": false,
848
+ "single_word": false,
849
+ "special": false
850
+ },
851
+ "50358": {
852
+ "content": "[unused73]",
853
+ "lstrip": false,
854
+ "normalized": true,
855
+ "rstrip": false,
856
+ "single_word": false,
857
+ "special": false
858
+ },
859
+ "50359": {
860
+ "content": "[unused74]",
861
+ "lstrip": false,
862
+ "normalized": true,
863
+ "rstrip": false,
864
+ "single_word": false,
865
+ "special": false
866
+ },
867
+ "50360": {
868
+ "content": "[unused75]",
869
+ "lstrip": false,
870
+ "normalized": true,
871
+ "rstrip": false,
872
+ "single_word": false,
873
+ "special": false
874
+ },
875
+ "50361": {
876
+ "content": "[unused76]",
877
+ "lstrip": false,
878
+ "normalized": true,
879
+ "rstrip": false,
880
+ "single_word": false,
881
+ "special": false
882
+ },
883
+ "50362": {
884
+ "content": "[unused77]",
885
+ "lstrip": false,
886
+ "normalized": true,
887
+ "rstrip": false,
888
+ "single_word": false,
889
+ "special": false
890
+ },
891
+ "50363": {
892
+ "content": "[unused78]",
893
+ "lstrip": false,
894
+ "normalized": true,
895
+ "rstrip": false,
896
+ "single_word": false,
897
+ "special": false
898
+ },
899
+ "50364": {
900
+ "content": "[unused79]",
901
+ "lstrip": false,
902
+ "normalized": true,
903
+ "rstrip": false,
904
+ "single_word": false,
905
+ "special": false
906
+ },
907
+ "50365": {
908
+ "content": "[unused80]",
909
+ "lstrip": false,
910
+ "normalized": true,
911
+ "rstrip": false,
912
+ "single_word": false,
913
+ "special": false
914
+ },
915
+ "50366": {
916
+ "content": "[unused81]",
917
+ "lstrip": false,
918
+ "normalized": true,
919
+ "rstrip": false,
920
+ "single_word": false,
921
+ "special": false
922
+ },
923
+ "50367": {
924
+ "content": "[unused82]",
925
+ "lstrip": false,
926
+ "normalized": true,
927
+ "rstrip": false,
928
+ "single_word": false,
929
+ "special": false
930
+ }
931
+ },
932
+ "clean_up_tokenization_spaces": true,
933
+ "cls_token": "[CLS]",
934
+ "extra_special_tokens": {},
935
+ "mask_token": "[MASK]",
936
+ "model_input_names": [
937
+ "input_ids",
938
+ "attention_mask"
939
+ ],
940
+ "model_max_length": 8192,
941
+ "pad_token": "[PAD]",
942
+ "sep_token": "[SEP]",
943
+ "tokenizer_class": "PreTrainedTokenizerFast",
944
+ "unk_token": "[UNK]"
945
+ }
train_results.json ADDED
@@ -0,0 +1,8 @@
1
+ {
2
+ "epoch": 3.0,
3
+ "total_flos": 1.881641387556864e+16,
4
+ "train_loss": 0.08123073937216153,
5
+ "train_runtime": 961.2851,
6
+ "train_samples_per_second": 127.779,
7
+ "train_steps_per_second": 1.997
8
+ }