smutuvi committed on
Commit e6363af
1 Parent(s): 0bf45b9

End of training

README.md CHANGED
@@ -1,201 +1,351 @@
  ---
- library_name: transformers
- tags: []
  ---

- # Model Card for Model ID
-
- <!-- Provide a quick summary of what the model is/does. -->
-
-
-
- ## Model Details
-
- ### Model Description
-
- <!-- Provide a longer summary of what this model is. -->
-
- This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
-
- - **Developed by:** [More Information Needed]
- - **Funded by [optional]:** [More Information Needed]
- - **Shared by [optional]:** [More Information Needed]
- - **Model type:** [More Information Needed]
- - **Language(s) (NLP):** [More Information Needed]
- - **License:** [More Information Needed]
- - **Finetuned from model [optional]:** [More Information Needed]
-
- ### Model Sources [optional]
-
- <!-- Provide the basic links for the model. -->
-
- - **Repository:** [More Information Needed]
- - **Paper [optional]:** [More Information Needed]
- - **Demo [optional]:** [More Information Needed]
-
- ## Uses
-
- <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
-
- ### Direct Use
-
- <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
-
- [More Information Needed]
-
- ### Downstream Use [optional]
-
- <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
-
- [More Information Needed]
-
- ### Out-of-Scope Use
-
- <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
-
- [More Information Needed]
-
- ## Bias, Risks, and Limitations
-
- <!-- This section is meant to convey both technical and sociotechnical limitations. -->
-
- [More Information Needed]
-
- ### Recommendations
-
- <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
-
- Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
-
- ## How to Get Started with the Model
-
- Use the code below to get started with the model.
-
- [More Information Needed]
-
- ## Training Details
-
- ### Training Data
-
- <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
-
- [More Information Needed]
-
- ### Training Procedure
-
- <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
-
- #### Preprocessing [optional]
-
- [More Information Needed]
-
-
- #### Training Hyperparameters
-
- - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
-
- #### Speeds, Sizes, Times [optional]
-
- <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
-
- [More Information Needed]
-
- ## Evaluation
-
- <!-- This section describes the evaluation protocols and provides the results. -->
-
- ### Testing Data, Factors & Metrics
-
- #### Testing Data
-
- <!-- This should link to a Dataset Card if possible. -->
-
- [More Information Needed]
-
- #### Factors
-
- <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
-
- [More Information Needed]
-
- #### Metrics
-
- <!-- These are the evaluation metrics being used, ideally with a description of why. -->
-
- [More Information Needed]
-
- ### Results
-
- [More Information Needed]
-
- #### Summary
-
-
-
- ## Model Examination [optional]
-
- <!-- Relevant interpretability work for the model goes here -->
-
- [More Information Needed]
-
- ## Environmental Impact
-
- <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
-
- Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
-
- - **Hardware Type:** [More Information Needed]
- - **Hours used:** [More Information Needed]
- - **Cloud Provider:** [More Information Needed]
- - **Compute Region:** [More Information Needed]
- - **Carbon Emitted:** [More Information Needed]
-
- ## Technical Specifications [optional]
-
- ### Model Architecture and Objective
-
- [More Information Needed]
-
- ### Compute Infrastructure
-
- [More Information Needed]
-
- #### Hardware
-
- [More Information Needed]
-
- #### Software
-
- [More Information Needed]
-
- ## Citation [optional]
-
- <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
-
- **BibTeX:**
-
- [More Information Needed]
-
- **APA:**
-
- [More Information Needed]
-
- ## Glossary [optional]
-
- <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
-
- [More Information Needed]
-
- ## More Information [optional]
-
- [More Information Needed]
-
- ## Model Card Authors [optional]
-
- [More Information Needed]
-
- ## Model Card Contact
-
- [More Information Needed]
-
-
  ---
+ license: apache-2.0
+ base_model: facebook/wav2vec2-large-xlsr-53
+ tags:
+ - generated_from_trainer
+ datasets:
+ - common_voice_16_0
+ metrics:
+ - wer
+ model-index:
+ - name: wav2vec2-large-xlsr-sw-common-voice-16
+   results:
+   - task:
+       name: Automatic Speech Recognition
+       type: automatic-speech-recognition
+     dataset:
+       name: common_voice_16_0
+       type: common_voice_16_0
+       config: sw
+       split: test
+       args: sw
+     metrics:
+     - name: Wer
+       type: wer
+       value: 0.3082326604654753
  ---

+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # wav2vec2-large-xlsr-sw-common-voice-16
+
+ This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on the common_voice_16_0 dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: inf
+ - Wer: 0.3082
+
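Reporting WER against Common Voice implies a CTC-style ASR head on top of wav2vec2. For readers unfamiliar with CTC, greedy decoding simply collapses consecutive repeated tokens and drops blanks; a minimal sketch with a made-up toy vocabulary (the real model's vocabulary and blank id come from its tokenizer, not from this example):

```python
# Minimal sketch of CTC greedy decoding, as used by wav2vec2-style ASR heads.
# Token ids and vocabulary here are illustrative, not the model's real ones.

BLANK = 0  # CTC blank token id (an assumption for this sketch)

def ctc_greedy_decode(ids, blank=BLANK):
    """Collapse consecutive repeats, then drop blank tokens."""
    out = []
    prev = None
    for t in ids:
        if t != prev and t != blank:  # collapse repeats, skip blanks
            out.append(t)
        prev = t
    return out

# frame-level argmax ids "hh-e-ll-l-o" -> "hello"
vocab = {1: "h", 2: "e", 3: "l", 4: "o"}
frames = [1, 1, 0, 2, 0, 3, 3, 0, 3, 4]
print("".join(vocab[i] for i in ctc_greedy_decode(frames)))  # hello
```

In practice `Wav2Vec2Processor.batch_decode` performs this step (plus token-to-text conversion) on the model's argmax logits.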
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 0.0003
+ - train_batch_size: 8
+ - eval_batch_size: 8
+ - seed: 42
+ - gradient_accumulation_steps: 2
+ - total_train_batch_size: 16
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - lr_scheduler_warmup_steps: 500
+ - num_epochs: 30
+ - mixed_precision_training: Native AMP
+
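With `lr_scheduler_type: linear` and 500 warmup steps, the learning rate ramps from 0 to 3e-4 over the first 500 optimizer steps and then decays linearly toward 0; the effective batch size is train_batch_size × gradient_accumulation_steps = 8 × 2 = 16. A rough sketch of that schedule (the total step count is an assumption taken from the last logged step; the real value depends on dataset size):

```python
# Sketch of the linear warmup + linear decay schedule implied by
# lr_scheduler_type: linear and lr_scheduler_warmup_steps: 500.

BASE_LR = 0.0003
WARMUP_STEPS = 500
TOTAL_STEPS = 109_200  # assumed here, roughly the last step in the training log

def linear_schedule_lr(step):
    if step < WARMUP_STEPS:
        # warmup: ramp linearly from 0 to BASE_LR
        return BASE_LR * step / WARMUP_STEPS
    # decay: linearly from BASE_LR at end of warmup to 0 at TOTAL_STEPS
    remaining = (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS)
    return BASE_LR * max(0.0, remaining)

# effective batch size = per-device batch * gradient accumulation steps
assert 8 * 2 == 16

print(linear_schedule_lr(0))            # 0.0
print(linear_schedule_lr(500))          # 0.0003 (peak)
print(linear_schedule_lr(TOTAL_STEPS))  # 0.0 (fully decayed)
```

This mirrors what `transformers.get_linear_schedule_with_warmup` does for the Trainer's `linear` scheduler type.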
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Wer |
+ |:-------------:|:-----:|:------:|:---------------:|:------:|
+ | 5.8691 | 0.11 | 400 | inf | 1.0 |
+ | 1.673 | 0.22 | 800 | inf | 0.7178 |
+ | 0.6454 | 0.33 | 1200 | inf | 0.6274 |
+ | 0.5527 | 0.44 | 1600 | inf | 0.5747 |
+ | 0.4989 | 0.55 | 2000 | inf | 0.5174 |
+ | 0.4827 | 0.66 | 2400 | inf | 0.5302 |
+ | 0.4462 | 0.77 | 2800 | inf | 0.4916 |
+ | 0.4374 | 0.88 | 3200 | inf | 0.4769 |
+ | 0.4183 | 0.99 | 3600 | inf | 0.4687 |
+ | 0.3854 | 1.1 | 4000 | inf | 0.4669 |
+ | 0.3802 | 1.2 | 4400 | inf | 0.4513 |
+ | 0.3727 | 1.31 | 4800 | inf | 0.4505 |
+ | 0.3694 | 1.42 | 5200 | inf | 0.4405 |
+ | 0.3709 | 1.53 | 5600 | inf | 0.4364 |
+ | 0.363 | 1.64 | 6000 | inf | 0.4318 |
+ | 0.3669 | 1.75 | 6400 | inf | 0.4398 |
+ | 0.3597 | 1.86 | 6800 | inf | 0.4353 |
+ | 0.3541 | 1.97 | 7200 | inf | 0.4251 |
+ | 0.3277 | 2.08 | 7600 | inf | 0.4153 |
+ | 0.3211 | 2.19 | 8000 | inf | 0.4178 |
+ | 0.3225 | 2.3 | 8400 | inf | 0.4267 |
+ | 0.3215 | 2.41 | 8800 | inf | 0.4139 |
+ | 0.3224 | 2.52 | 9200 | inf | 0.4054 |
+ | 0.3106 | 2.63 | 9600 | inf | 0.4155 |
+ | 0.3141 | 2.74 | 10000 | inf | 0.4188 |
+ | 0.3189 | 2.85 | 10400 | inf | 0.4036 |
+ | 0.3213 | 2.96 | 10800 | inf | 0.4071 |
+ | 0.3005 | 3.07 | 11200 | inf | 0.3954 |
+ | 0.2872 | 3.18 | 11600 | inf | 0.3974 |
+ | 0.2855 | 3.29 | 12000 | inf | 0.3982 |
+ | 0.2898 | 3.39 | 12400 | inf | 0.3987 |
+ | 0.288 | 3.5 | 12800 | inf | 0.4021 |
+ | 0.2941 | 3.61 | 13200 | inf | 0.3955 |
+ | 0.2951 | 3.72 | 13600 | inf | 0.4022 |
+ | 0.2916 | 3.83 | 14000 | inf | 0.3960 |
+ | 0.2896 | 3.94 | 14400 | inf | 0.3903 |
+ | 0.2794 | 4.05 | 14800 | inf | 0.3918 |
+ | 0.2707 | 4.16 | 15200 | inf | 0.3873 |
+ | 0.2682 | 4.27 | 15600 | inf | 0.3927 |
+ | 0.2654 | 4.38 | 16000 | inf | 0.3844 |
+ | 0.2699 | 4.49 | 16400 | inf | 0.3909 |
+ | 0.2723 | 4.6 | 16800 | inf | 0.3904 |
+ | 0.2762 | 4.71 | 17200 | inf | 0.3857 |
+ | 0.2621 | 4.82 | 17600 | inf | 0.3795 |
+ | 0.26 | 4.93 | 18000 | inf | 0.3764 |
+ | 0.2659 | 5.04 | 18400 | inf | 0.3842 |
+ | 0.2479 | 5.15 | 18800 | inf | 0.3719 |
+ | 0.2518 | 5.26 | 19200 | inf | 0.3822 |
+ | 0.2591 | 5.37 | 19600 | inf | 0.3837 |
+ | 0.2491 | 5.48 | 20000 | inf | 0.3871 |
+ | 0.2466 | 5.59 | 20400 | inf | 0.3747 |
+ | 0.2519 | 5.69 | 20800 | inf | 0.3788 |
+ | 0.2516 | 5.8 | 21200 | inf | 0.3781 |
+ | 0.2422 | 5.91 | 21600 | inf | 0.3751 |
+ | 0.2439 | 6.02 | 22000 | inf | 0.3693 |
+ | 0.2327 | 6.13 | 22400 | inf | 0.3752 |
+ | 0.2327 | 6.24 | 22800 | inf | 0.3706 |
+ | 0.2302 | 6.35 | 23200 | inf | 0.3687 |
+ | 0.2313 | 6.46 | 23600 | inf | 0.3690 |
+ | 0.2363 | 6.57 | 24000 | inf | 0.3686 |
+ | 0.2329 | 6.68 | 24400 | inf | 0.3681 |
+ | 0.2328 | 6.79 | 24800 | inf | 0.3626 |
+ | 0.2329 | 6.9 | 25200 | inf | 0.3652 |
+ | 0.2254 | 7.01 | 25600 | inf | 0.3606 |
+ | 0.2124 | 7.12 | 26000 | inf | 0.3648 |
+ | 0.2206 | 7.23 | 26400 | inf | 0.3686 |
+ | 0.2151 | 7.34 | 26800 | inf | 0.3646 |
+ | 0.2167 | 7.45 | 27200 | inf | 0.3630 |
+ | 0.2196 | 7.56 | 27600 | inf | 0.3597 |
+ | 0.2089 | 7.67 | 28000 | inf | 0.3561 |
+ | 0.2183 | 7.78 | 28400 | inf | 0.3593 |
+ | 0.2148 | 7.89 | 28800 | inf | 0.3580 |
+ | 0.2232 | 7.99 | 29200 | inf | 0.3597 |
+ | 0.2002 | 8.1 | 29600 | inf | 0.3581 |
+ | 0.1924 | 8.21 | 30000 | inf | 0.3585 |
+ | 0.2046 | 8.32 | 30400 | inf | 0.3606 |
+ | 0.2057 | 8.43 | 30800 | inf | 0.3611 |
+ | 0.2042 | 8.54 | 31200 | inf | 0.3618 |
+ | 0.21 | 8.65 | 31600 | inf | 0.3599 |
+ | 0.2076 | 8.76 | 32000 | inf | 0.3568 |
+ | 0.208 | 8.87 | 32400 | inf | 0.3564 |
+ | 0.2154 | 8.98 | 32800 | inf | 0.3566 |
+ | 0.1991 | 9.09 | 33200 | inf | 0.3621 |
+ | 0.1986 | 9.2 | 33600 | inf | 0.3571 |
+ | 0.1898 | 9.31 | 34000 | inf | 0.3515 |
+ | 0.1961 | 9.42 | 34400 | inf | 0.3559 |
+ | 0.1947 | 9.53 | 34800 | inf | 0.3521 |
+ | 0.1886 | 9.64 | 35200 | inf | 0.3500 |
+ | 0.1901 | 9.75 | 35600 | inf | 0.3557 |
+ | 0.1998 | 9.86 | 36000 | inf | 0.3547 |
+ | 0.1873 | 9.97 | 36400 | inf | 0.3498 |
+ | 0.1858 | 10.08 | 36800 | inf | 0.3552 |
+ | 0.1804 | 10.18 | 37200 | inf | 0.3518 |
+ | 0.18 | 10.29 | 37600 | inf | 0.3504 |
+ | 0.1777 | 10.4 | 38000 | inf | 0.3532 |
+ | 0.1777 | 10.51 | 38400 | inf | 0.3530 |
+ | 0.1801 | 10.62 | 38800 | inf | 0.3515 |
+ | 0.1839 | 10.73 | 39200 | inf | 0.3538 |
+ | 0.1913 | 10.84 | 39600 | inf | 0.3554 |
+ | 0.1909 | 10.95 | 40000 | inf | 0.3479 |
+ | 0.1812 | 11.06 | 40400 | inf | 0.3467 |
+ | 0.1664 | 11.17 | 40800 | inf | 0.3491 |
+ | 0.175 | 11.28 | 41200 | inf | 0.3446 |
+ | 0.1733 | 11.39 | 41600 | inf | 0.3464 |
+ | 0.1709 | 11.5 | 42000 | inf | 0.3467 |
+ | 0.1777 | 11.61 | 42400 | inf | 0.3469 |
+ | 0.1735 | 11.72 | 42800 | inf | 0.3452 |
+ | 0.1765 | 11.83 | 43200 | inf | 0.3471 |
+ | 0.1738 | 11.94 | 43600 | inf | 0.3496 |
+ | 0.1649 | 12.05 | 44000 | inf | 0.3445 |
+ | 0.1601 | 12.16 | 44400 | inf | 0.3464 |
+ | 0.1603 | 12.27 | 44800 | inf | 0.3416 |
+ | 0.1634 | 12.38 | 45200 | inf | 0.3445 |
+ | 0.1628 | 12.48 | 45600 | inf | 0.3452 |
+ | 0.1621 | 12.59 | 46000 | inf | 0.3403 |
+ | 0.1596 | 12.7 | 46400 | inf | 0.3394 |
+ | 0.1589 | 12.81 | 46800 | inf | 0.3401 |
+ | 0.1632 | 12.92 | 47200 | inf | 0.3403 |
+ | 0.163 | 13.03 | 47600 | inf | 0.3429 |
+ | 0.1516 | 13.14 | 48000 | inf | 0.3417 |
+ | 0.1506 | 13.25 | 48400 | inf | 0.3417 |
+ | 0.1568 | 13.36 | 48800 | inf | 0.3410 |
+ | 0.1543 | 13.47 | 49200 | inf | 0.3409 |
+ | 0.1574 | 13.58 | 49600 | inf | 0.3408 |
+ | 0.1555 | 13.69 | 50000 | inf | 0.3424 |
+ | 0.1535 | 13.8 | 50400 | inf | 0.3395 |
+ | 0.1539 | 13.91 | 50800 | inf | 0.3409 |
+ | 0.1528 | 14.02 | 51200 | inf | 0.3406 |
+ | 0.1411 | 14.13 | 51600 | inf | 0.3366 |
+ | 0.1413 | 14.24 | 52000 | inf | 0.3402 |
+ | 0.1477 | 14.35 | 52400 | inf | 0.3386 |
+ | 0.1433 | 14.46 | 52800 | inf | 0.3356 |
+ | 0.1446 | 14.57 | 53200 | inf | 0.3357 |
+ | 0.1427 | 14.67 | 53600 | inf | 0.3378 |
+ | 0.1462 | 14.78 | 54000 | inf | 0.3328 |
+ | 0.1436 | 14.89 | 54400 | inf | 0.3358 |
+ | 0.1434 | 15.0 | 54800 | inf | 0.3366 |
+ | 0.135 | 15.11 | 55200 | inf | 0.3354 |
+ | 0.1375 | 15.22 | 55600 | inf | 0.3355 |
+ | 0.1366 | 15.33 | 56000 | inf | 0.3356 |
+ | 0.1389 | 15.44 | 56400 | inf | 0.3336 |
+ | 0.1378 | 15.55 | 56800 | inf | 0.3364 |
+ | 0.1362 | 15.66 | 57200 | inf | 0.3325 |
+ | 0.1376 | 15.77 | 57600 | inf | 0.3361 |
+ | 0.1323 | 15.88 | 58000 | inf | 0.3364 |
+ | 0.1343 | 15.99 | 58400 | inf | 0.3332 |
+ | 0.1257 | 16.1 | 58800 | inf | 0.3339 |
+ | 0.1239 | 16.21 | 59200 | inf | 0.3316 |
+ | 0.1292 | 16.32 | 59600 | inf | 0.3313 |
+ | 0.1297 | 16.43 | 60000 | inf | 0.3332 |
+ | 0.1265 | 16.54 | 60400 | inf | 0.3313 |
+ | 0.1271 | 16.65 | 60800 | inf | 0.3310 |
+ | 0.1315 | 16.76 | 61200 | inf | 0.3307 |
+ | 0.1271 | 16.87 | 61600 | inf | 0.3337 |
+ | 0.1298 | 16.97 | 62000 | inf | 0.3318 |
+ | 0.1211 | 17.08 | 62400 | inf | 0.3326 |
+ | 0.1192 | 17.19 | 62800 | inf | 0.3290 |
+ | 0.1232 | 17.3 | 63200 | inf | 0.3291 |
+ | 0.1229 | 17.41 | 63600 | inf | 0.3349 |
+ | 0.1162 | 17.52 | 64000 | inf | 0.3281 |
+ | 0.1207 | 17.63 | 64400 | inf | 0.3308 |
+ | 0.1179 | 17.74 | 64800 | inf | 0.3257 |
+ | 0.1207 | 17.85 | 65200 | inf | 0.3290 |
+ | 0.1256 | 17.96 | 65600 | inf | 0.3297 |
+ | 0.119 | 18.07 | 66000 | inf | 0.3279 |
+ | 0.1111 | 18.18 | 66400 | inf | 0.3302 |
+ | 0.1086 | 18.29 | 66800 | inf | 0.3285 |
+ | 0.1179 | 18.4 | 67200 | inf | 0.3274 |
+ | 0.1099 | 18.51 | 67600 | inf | 0.3281 |
+ | 0.1141 | 18.62 | 68000 | inf | 0.3281 |
+ | 0.1091 | 18.73 | 68400 | inf | 0.3301 |
+ | 0.1147 | 18.84 | 68800 | inf | 0.3270 |
+ | 0.1158 | 18.95 | 69200 | inf | 0.3246 |
+ | 0.1111 | 19.06 | 69600 | inf | 0.3227 |
+ | 0.1075 | 19.16 | 70000 | inf | 0.3249 |
+ | 0.1051 | 19.27 | 70400 | inf | 0.3253 |
+ | 0.1029 | 19.38 | 70800 | inf | 0.3252 |
+ | 0.1039 | 19.49 | 71200 | inf | 0.3264 |
+ | 0.1063 | 19.6 | 71600 | inf | 0.3242 |
+ | 0.1071 | 19.71 | 72000 | inf | 0.3250 |
+ | 0.1063 | 19.82 | 72400 | inf | 0.3248 |
+ | 0.1085 | 19.93 | 72800 | inf | 0.3247 |
+ | 0.1038 | 20.04 | 73200 | inf | 0.3242 |
+ | 0.1017 | 20.15 | 73600 | inf | 0.3255 |
+ | 0.099 | 20.26 | 74000 | inf | 0.3247 |
+ | 0.0971 | 20.37 | 74400 | inf | 0.3258 |
+ | 0.1002 | 20.48 | 74800 | inf | 0.3223 |
+ | 0.1013 | 20.59 | 75200 | inf | 0.3230 |
+ | 0.1018 | 20.7 | 75600 | inf | 0.3232 |
+ | 0.0967 | 20.81 | 76000 | inf | 0.3215 |
+ | 0.1008 | 20.92 | 76400 | inf | 0.3212 |
+ | 0.0975 | 21.03 | 76800 | inf | 0.3191 |
+ | 0.0893 | 21.14 | 77200 | inf | 0.3210 |
+ | 0.0911 | 21.25 | 77600 | inf | 0.3206 |
+ | 0.0959 | 21.36 | 78000 | inf | 0.3211 |
+ | 0.094 | 21.46 | 78400 | inf | 0.3198 |
+ | 0.0939 | 21.57 | 78800 | inf | 0.3202 |
+ | 0.0936 | 21.68 | 79200 | inf | 0.3202 |
+ | 0.0938 | 21.79 | 79600 | inf | 0.3195 |
+ | 0.0938 | 21.9 | 80000 | inf | 0.3184 |
+ | 0.0916 | 22.01 | 80400 | inf | 0.3185 |
+ | 0.0858 | 22.12 | 80800 | inf | 0.3177 |
+ | 0.0909 | 22.23 | 81200 | inf | 0.3211 |
+ | 0.0915 | 22.34 | 81600 | inf | 0.3222 |
+ | 0.088 | 22.45 | 82000 | inf | 0.3194 |
+ | 0.0902 | 22.56 | 82400 | inf | 0.3199 |
+ | 0.0868 | 22.67 | 82800 | inf | 0.3174 |
+ | 0.0871 | 22.78 | 83200 | inf | 0.3201 |
+ | 0.0908 | 22.89 | 83600 | inf | 0.3177 |
+ | 0.0842 | 23.0 | 84000 | inf | 0.3187 |
+ | 0.0842 | 23.11 | 84400 | inf | 0.3168 |
+ | 0.0815 | 23.22 | 84800 | inf | 0.3187 |
+ | 0.084 | 23.33 | 85200 | inf | 0.3201 |
+ | 0.0835 | 23.44 | 85600 | inf | 0.3185 |
+ | 0.0821 | 23.55 | 86000 | inf | 0.3189 |
+ | 0.0836 | 23.66 | 86400 | inf | 0.3179 |
+ | 0.0816 | 23.76 | 86800 | inf | 0.3174 |
+ | 0.0847 | 23.87 | 87200 | inf | 0.3172 |
+ | 0.0828 | 23.98 | 87600 | inf | 0.3178 |
+ | 0.0796 | 24.09 | 88000 | inf | 0.3144 |
+ | 0.0793 | 24.2 | 88400 | inf | 0.3149 |
+ | 0.0773 | 24.31 | 88800 | inf | 0.3165 |
+ | 0.0808 | 24.42 | 89200 | inf | 0.3154 |
+ | 0.0743 | 24.53 | 89600 | inf | 0.3159 |
+ | 0.078 | 24.64 | 90000 | inf | 0.3145 |
+ | 0.0792 | 24.75 | 90400 | inf | 0.3170 |
+ | 0.0775 | 24.86 | 90800 | inf | 0.3134 |
+ | 0.0763 | 24.97 | 91200 | inf | 0.3144 |
+ | 0.0705 | 25.08 | 91600 | inf | 0.3138 |
+ | 0.0724 | 25.19 | 92000 | inf | 0.3156 |
+ | 0.0732 | 25.3 | 92400 | inf | 0.3158 |
+ | 0.0743 | 25.41 | 92800 | inf | 0.3144 |
+ | 0.0729 | 25.52 | 93200 | inf | 0.3133 |
+ | 0.071 | 25.63 | 93600 | inf | 0.3139 |
+ | 0.0764 | 25.74 | 94000 | inf | 0.3122 |
+ | 0.0726 | 25.85 | 94400 | inf | 0.3128 |
+ | 0.0714 | 25.95 | 94800 | inf | 0.3135 |
+ | 0.0725 | 26.06 | 95200 | inf | 0.3147 |
+ | 0.0711 | 26.17 | 95600 | inf | 0.3130 |
+ | 0.0684 | 26.28 | 96000 | inf | 0.3125 |
+ | 0.0683 | 26.39 | 96400 | inf | 0.3144 |
+ | 0.0698 | 26.5 | 96800 | inf | 0.3135 |
+ | 0.0687 | 26.61 | 97200 | inf | 0.3131 |
+ | 0.0675 | 26.72 | 97600 | inf | 0.3119 |
+ | 0.0678 | 26.83 | 98000 | inf | 0.3105 |
+ | 0.0677 | 26.94 | 98400 | inf | 0.3102 |
+ | 0.068 | 27.05 | 98800 | inf | 0.3128 |
+ | 0.0694 | 27.16 | 99200 | inf | 0.3111 |
+ | 0.0681 | 27.27 | 99600 | inf | 0.3118 |
+ | 0.0656 | 27.38 | 100000 | inf | 0.3110 |
+ | 0.065 | 27.49 | 100400 | inf | 0.3113 |
+ | 0.0649 | 27.6 | 100800 | inf | 0.3113 |
+ | 0.0643 | 27.71 | 101200 | inf | 0.3107 |
+ | 0.0651 | 27.82 | 101600 | inf | 0.3102 |
+ | 0.0643 | 27.93 | 102000 | inf | 0.3109 |
+ | 0.063 | 28.04 | 102400 | inf | 0.3110 |
+ | 0.0604 | 28.15 | 102800 | inf | 0.3108 |
+ | 0.062 | 28.25 | 103200 | inf | 0.3110 |
+ | 0.0623 | 28.36 | 103600 | inf | 0.3106 |
+ | 0.063 | 28.47 | 104000 | inf | 0.3102 |
+ | 0.0619 | 28.58 | 104400 | inf | 0.3101 |
+ | 0.0636 | 28.69 | 104800 | inf | 0.3108 |
+ | 0.0636 | 28.8 | 105200 | inf | 0.3099 |
+ | 0.0643 | 28.91 | 105600 | inf | 0.3089 |
+ | 0.0607 | 29.02 | 106000 | inf | 0.3094 |
+ | 0.0597 | 29.13 | 106400 | inf | 0.3091 |
+ | 0.0616 | 29.24 | 106800 | inf | 0.3087 |
+ | 0.0594 | 29.35 | 107200 | inf | 0.3087 |
+ | 0.0614 | 29.46 | 107600 | inf | 0.3087 |
+ | 0.06 | 29.57 | 108000 | inf | 0.3082 |
+ | 0.0617 | 29.68 | 108400 | inf | 0.3085 |
+ | 0.0574 | 29.79 | 108800 | inf | 0.3082 |
+ | 0.06 | 29.9 | 109200 | inf | 0.3082 |
+
+
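The Wer column above is word error rate: word-level edit distance (substitutions, insertions, and deletions) divided by the number of reference words. The Trainer typically computes it via the `evaluate`/`jiwer` packages; a self-contained sketch of the metric itself (the Swahili phrases are purely illustrative):

```python
# Minimal word error rate (WER) sketch: Levenshtein distance over words,
# normalized by reference length. Real pipelines usually use the
# `evaluate` or `jiwer` libraries instead of hand-rolled code like this.

def wer(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[len(ref)][len(hyp)] / len(ref)

# one substituted word out of three reference words -> WER = 1/3
print(wer("habari ya asubuhi", "habari za asubuhi"))
```

A final WER of 0.3082 therefore means roughly 31 word errors per 100 reference words on the Common Voice 16.0 Swahili test split.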
+ ### Framework versions
+
+ - Transformers 4.37.1
+ - Pytorch 2.2.0+cu121
+ - Datasets 2.16.1
+ - Tokenizers 0.15.0
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:bf6a318ef99dae0fdce2546aca2126503ccdc167d33f25de145183d79ff26c74
+ oid sha256:4a9dfcad33e91cb23a9b398f820a81379715fab3cc32799a63fae2a1769a2122
  size 1262102680
runs/Mar01_18-22-57_hades-prod01/events.out.tfevents.1709318753.hades-prod01.2912126.0 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:49a1d524e8f320def085ffb7b3c928edadf26292af10e40a333c0a6096007c49
- size 137857
+ oid sha256:09585ff6cfafc8a573d26e174e3d35652be6f3487e2e6224092348309102ce16
+ size 138217