Model save
README.md
CHANGED
@@ -17,8 +17,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.
-- Wer: 0.
+- Loss: 0.0992
+- Wer: 0.2605
 
 ## Model description
 
@@ -37,7 +37,7 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
-- learning_rate:
+- learning_rate: 2e-05
 - train_batch_size: 16
 - eval_batch_size: 16
 - seed: 42
@@ -46,7 +46,7 @@ The following hyperparameters were used during training:
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_steps: 200
-- num_epochs:
+- num_epochs: 40
 - mixed_precision_training: Native AMP
 
 ### Training results
@@ -175,6 +175,129 @@ The following hyperparameters were used during training:
 | 0.0668 | 19.54 | 12000 | 0.1086 | 0.2813 |
 | 0.069 | 19.7 | 12100 | 0.1087 | 0.2810 |
 | 0.0683 | 19.87 | 12200 | 0.1085 | 0.2807 |
+| 0.1116 | 20.03 | 12300 | 0.1221 | 0.2978 |
+| 0.0752 | 20.19 | 12400 | 0.1161 | 0.2956 |
+| 0.0787 | 20.36 | 12500 | 0.1128 | 0.2927 |
+| 0.0741 | 20.52 | 12600 | 0.1100 | 0.2922 |
+| 0.0764 | 20.68 | 12700 | 0.1081 | 0.2906 |
+| 0.0747 | 20.85 | 12800 | 0.1082 | 0.2896 |
+| 0.0876 | 21.01 | 12900 | 0.1052 | 0.2896 |
+| 0.0878 | 21.17 | 13000 | 0.1110 | 0.2950 |
+| 0.0895 | 21.33 | 13100 | 0.1095 | 0.2934 |
+| 0.0953 | 21.5 | 13200 | 0.1122 | 0.2981 |
+| 0.0787 | 21.66 | 13300 | 0.1072 | 0.2896 |
+| 0.0774 | 21.82 | 13400 | 0.1076 | 0.2880 |
+| 0.0908 | 21.98 | 13500 | 0.1113 | 0.2916 |
+| 0.0737 | 22.15 | 13600 | 0.1067 | 0.2870 |
+| 0.0714 | 22.31 | 13700 | 0.1096 | 0.2864 |
+| 0.0775 | 22.47 | 13800 | 0.1085 | 0.2868 |
+| 0.0761 | 22.64 | 13900 | 0.1040 | 0.2852 |
+| 0.0675 | 22.8 | 14000 | 0.1090 | 0.2836 |
+| 0.0829 | 22.96 | 14100 | 0.1066 | 0.2814 |
+| 0.0731 | 23.12 | 14200 | 0.1057 | 0.2835 |
+| 0.058 | 23.29 | 14300 | 0.1059 | 0.2834 |
+| 0.0833 | 23.45 | 14400 | 0.1056 | 0.2847 |
+| 0.1007 | 23.62 | 14500 | 0.1225 | 0.3059 |
+| 0.0896 | 23.78 | 14600 | 0.1088 | 0.2899 |
+| 0.084 | 23.94 | 14700 | 0.1056 | 0.2834 |
+| 0.0684 | 24.1 | 14800 | 0.1070 | 0.2865 |
+| 0.0646 | 24.27 | 14900 | 0.1109 | 0.2862 |
+| 0.0728 | 24.43 | 15000 | 0.1081 | 0.2876 |
+| 0.0615 | 24.59 | 15100 | 0.1077 | 0.2846 |
+| 0.0642 | 24.75 | 15200 | 0.1062 | 0.2842 |
+| 0.0736 | 24.92 | 15300 | 0.1058 | 0.2864 |
+| 0.0801 | 25.08 | 15400 | 0.1106 | 0.2844 |
+| 0.0687 | 25.24 | 15500 | 0.1104 | 0.2836 |
+| 0.0852 | 25.41 | 15600 | 0.1055 | 0.2826 |
+| 0.078 | 25.57 | 15700 | 0.1069 | 0.2817 |
+| 0.0815 | 25.73 | 15800 | 0.1040 | 0.2799 |
+| 0.0863 | 25.89 | 15900 | 0.1074 | 0.2801 |
+| 0.0603 | 26.06 | 16000 | 0.1044 | 0.2779 |
+| 0.0625 | 26.22 | 16100 | 0.1036 | 0.2796 |
+| 0.057 | 26.38 | 16200 | 0.1086 | 0.2802 |
+| 0.0632 | 26.54 | 16300 | 0.1057 | 0.2790 |
+| 0.0644 | 26.71 | 16400 | 0.1022 | 0.2750 |
+| 0.0645 | 26.87 | 16500 | 0.1003 | 0.2766 |
+| 0.0536 | 27.03 | 16600 | 0.1051 | 0.2786 |
+| 0.058 | 27.2 | 16700 | 0.1051 | 0.2790 |
+| 0.052 | 27.36 | 16800 | 0.1034 | 0.2748 |
+| 0.0514 | 27.52 | 16900 | 0.1027 | 0.2751 |
+| 0.0593 | 27.68 | 17000 | 0.1036 | 0.2795 |
+| 0.0577 | 27.85 | 17100 | 0.1025 | 0.2770 |
+| 0.0694 | 28.01 | 17200 | 0.1008 | 0.2733 |
+| 0.0641 | 28.17 | 17300 | 0.1088 | 0.2760 |
+| 0.0566 | 28.33 | 17400 | 0.1092 | 0.2759 |
+| 0.073 | 28.5 | 17500 | 0.1120 | 0.2788 |
+| 0.0632 | 28.66 | 17600 | 0.1056 | 0.2764 |
+| 0.0674 | 28.82 | 17700 | 0.1021 | 0.2739 |
+| 0.0663 | 28.99 | 17800 | 0.1033 | 0.2733 |
+| 0.0544 | 29.15 | 17900 | 0.1053 | 0.2721 |
+| 0.0583 | 29.31 | 18000 | 0.1033 | 0.2732 |
+| 0.0652 | 29.47 | 18100 | 0.1015 | 0.2728 |
+| 0.0577 | 29.64 | 18200 | 0.1029 | 0.2730 |
+| 0.1068 | 29.8 | 18300 | 0.1297 | 0.2950 |
+| 0.0805 | 29.97 | 18400 | 0.1113 | 0.2792 |
+| 0.0689 | 30.13 | 18500 | 0.1077 | 0.2789 |
+| 0.0688 | 30.29 | 18600 | 0.1069 | 0.2777 |
+| 0.0589 | 30.45 | 18700 | 0.1071 | 0.2757 |
+| 0.049 | 30.62 | 18800 | 0.1077 | 0.2749 |
+| 0.0534 | 30.78 | 18900 | 0.1046 | 0.2703 |
+| 0.0506 | 30.94 | 19000 | 0.1039 | 0.2728 |
+| 0.0534 | 31.11 | 19100 | 0.1036 | 0.2719 |
+| 0.0453 | 31.27 | 19200 | 0.1064 | 0.2717 |
+| 0.0514 | 31.43 | 19300 | 0.1034 | 0.2712 |
+| 0.0579 | 31.59 | 19400 | 0.1065 | 0.2726 |
+| 0.0491 | 31.76 | 19500 | 0.1054 | 0.2749 |
+| 0.0547 | 31.92 | 19600 | 0.1023 | 0.2720 |
+| 0.08 | 32.08 | 19700 | 0.1037 | 0.2707 |
+| 0.0649 | 32.24 | 19800 | 0.1037 | 0.2702 |
+| 0.064 | 32.41 | 19900 | 0.1053 | 0.2714 |
+| 0.064 | 32.57 | 20000 | 0.1035 | 0.2691 |
+| 0.0658 | 32.73 | 20100 | 0.1017 | 0.2663 |
+| 0.0636 | 32.9 | 20200 | 0.1031 | 0.2680 |
+| 0.0439 | 33.06 | 20300 | 0.1010 | 0.2668 |
+| 0.0518 | 33.22 | 20400 | 0.1016 | 0.2691 |
+| 0.0498 | 33.38 | 20500 | 0.1028 | 0.2682 |
+| 0.0516 | 33.55 | 20600 | 0.1009 | 0.2679 |
+| 0.0534 | 33.71 | 20700 | 0.1022 | 0.2672 |
+| 0.0464 | 33.87 | 20800 | 0.1029 | 0.2661 |
+| 0.0522 | 34.03 | 20900 | 0.1002 | 0.2668 |
+| 0.0458 | 34.2 | 21000 | 0.0981 | 0.2644 |
+| 0.0425 | 34.36 | 21100 | 0.1004 | 0.2659 |
+| 0.0461 | 34.52 | 21200 | 0.1009 | 0.2650 |
+| 0.0436 | 34.69 | 21300 | 0.1007 | 0.2652 |
+| 0.0507 | 34.85 | 21400 | 0.1005 | 0.2655 |
+| 0.0437 | 35.01 | 21500 | 0.0992 | 0.2648 |
+| 0.0492 | 35.17 | 21600 | 0.1022 | 0.2655 |
+| 0.0456 | 35.34 | 21700 | 0.1030 | 0.2639 |
+| 0.0421 | 35.5 | 21800 | 0.1054 | 0.2639 |
+| 0.0759 | 35.67 | 21900 | 0.1253 | 0.2760 |
+| 0.059 | 35.83 | 22000 | 0.1125 | 0.2710 |
+| 0.0515 | 35.99 | 22100 | 0.1073 | 0.2667 |
+| 0.0583 | 36.16 | 22200 | 0.1085 | 0.2671 |
+| 0.0603 | 36.32 | 22300 | 0.1047 | 0.2658 |
+| 0.0575 | 36.48 | 22400 | 0.1034 | 0.2652 |
+| 0.0605 | 36.64 | 22500 | 0.1044 | 0.2656 |
+| 0.0545 | 36.81 | 22600 | 0.1057 | 0.2649 |
+| 0.0583 | 36.97 | 22700 | 0.1033 | 0.2641 |
+| 0.0492 | 37.13 | 22800 | 0.1039 | 0.2641 |
+| 0.0561 | 37.29 | 22900 | 0.1027 | 0.2640 |
+| 0.0447 | 37.46 | 23000 | 0.1023 | 0.2631 |
+| 0.0521 | 37.62 | 23100 | 0.1010 | 0.2636 |
+| 0.0482 | 37.78 | 23200 | 0.1021 | 0.2635 |
+| 0.0468 | 37.95 | 23300 | 0.0999 | 0.2631 |
+| 0.0473 | 38.11 | 23400 | 0.1016 | 0.2629 |
+| 0.0416 | 38.27 | 23500 | 0.1003 | 0.2621 |
+| 0.0491 | 38.43 | 23600 | 0.1022 | 0.2618 |
+| 0.0394 | 38.6 | 23700 | 0.1017 | 0.2622 |
+| 0.0389 | 38.76 | 23800 | 0.1011 | 0.2620 |
+| 0.0381 | 38.92 | 23900 | 0.0992 | 0.2608 |
+| 0.0557 | 39.08 | 24000 | 0.0999 | 0.2613 |
+| 0.0545 | 39.25 | 24100 | 0.1002 | 0.2608 |
+| 0.0633 | 39.41 | 24200 | 0.0997 | 0.2607 |
+| 0.0471 | 39.57 | 24300 | 0.0994 | 0.2609 |
+| 0.0672 | 39.74 | 24400 | 0.0991 | 0.2606 |
+| 0.066 | 39.9 | 24500 | 0.0992 | 0.2605 |
 
 
 ### Framework versions
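The hyperparameters recorded in this diff correspond one-to-one to fields of the `transformers` `TrainingArguments` class. As a hedged sketch of how that configuration might be expressed (the `output_dir` name is a placeholder, not from the diff; the actual training script is not shown here):

```python
from transformers import TrainingArguments

# Sketch of a TrainingArguments matching the hyperparameters above.
# "wav2vec2-finetuned" is a hypothetical output directory.
args = TrainingArguments(
    output_dir="wav2vec2-finetuned",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,          # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,       # and epsilon=1e-08
    lr_scheduler_type="linear",
    warmup_steps=200,
    num_train_epochs=40,
    fp16=True,               # "Native AMP" mixed-precision training
)
```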
model.safetensors
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:c66c9d78a1c791ee364d68a05c18e682f18d74f9c3b39e3d7f07f49a2e2b78a2
 size 1261975580
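The `model.safetensors` entry above is a Git LFS pointer file: the repository stores only the `version`/`oid`/`size` lines, and the oid is the SHA-256 of the actual weights blob. A minimal sketch of verifying a downloaded blob against such a pointer (the example blob is hypothetical, not the real weights):

```python
import hashlib

def parse_lfs_pointer(text: str) -> dict:
    """Parse a git-lfs pointer file (version / oid / size lines) into a dict."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

def matches_pointer(data: bytes, pointer: dict) -> bool:
    """Check that a blob's SHA-256 and byte size match what the pointer declares."""
    oid = pointer["oid"].split(":", 1)[1]  # strip the "sha256:" prefix
    return (hashlib.sha256(data).hexdigest() == oid
            and len(data) == int(pointer["size"]))

# Hypothetical 5-byte blob, standing in for the real 1.26 GB weights file.
blob = b"hello"
ptr = parse_lfs_pointer(
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824\n"
    "size 5\n"
)
print(matches_pointer(blob, ptr))  # True
```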
runs/Nov28_16-50-44_4053f9fc2c05/events.out.tfevents.1701191936.4053f9fc2c05.1153.0
CHANGED
Binary files a/runs/Nov28_16-50-44_4053f9fc2c05/events.out.tfevents.1701191936.4053f9fc2c05.1153.0 and b/runs/Nov28_16-50-44_4053f9fc2c05/events.out.tfevents.1701191936.4053f9fc2c05.1153.0 differ
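The Wer values reported in the README's results table are word error rates. The training run presumably computed them with a library metric (which one is not shown in this diff); a minimal pure-Python sketch of the metric itself, word-level Levenshtein distance normalized by reference length, is:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance over reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edits needed to turn ref[:i] into hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,          # deletion
                           dp[i][j - 1] + 1,          # insertion
                           dp[i - 1][j - 1] + cost)   # substitution
    return dp[-1][-1] / len(ref)

print(wer("the cat sat", "the bat sat"))  # one substitution in three words
```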