rossevine committed
Commit 41b4abb
Parent: e838977

update model card README.md

Files changed (1)
  1. README.md +98 -0
README.md ADDED
@@ -0,0 +1,98 @@
---
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-300m
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: Model_S_D_Wav2Vec2
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# Model_S_D_Wav2Vec2

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on an unspecified dataset.
It achieves the following results on the evaluation set (a usage sketch follows the list):
- Loss: 0.0464
- Wer: 0.2319
- Cer: 0.0598

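The checkpoint can be loaded for inference through the `transformers` ASR pipeline. The snippet below is a minimal sketch rather than part of the original card: the Hub id `rossevine/Model_S_D_Wav2Vec2` and the audio file name are assumptions, and the input should be 16 kHz mono audio as expected by XLS-R models.

```python
from transformers import pipeline

# Load the fine-tuned CTC checkpoint via the ASR pipeline (repository id assumed).
asr = pipeline(
    "automatic-speech-recognition",
    model="rossevine/Model_S_D_Wav2Vec2",
)

# Transcribe a local 16 kHz mono audio file (file name is a placeholder).
result = asr("sample.wav")
print(result["text"])
```
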
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 30

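Translated into code, these settings map roughly onto `transformers.TrainingArguments` as shown below. This is an illustrative sketch only: `output_dir` and the evaluation cadence are assumptions, and the listed optimizer settings are the Trainer's AdamW defaults.

```python
from transformers import TrainingArguments

# Sketch mirroring the hyperparameters listed above; values not in the list are assumed.
training_args = TrainingArguments(
    output_dir="Model_S_D_Wav2Vec2",    # assumed output directory
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,      # 16 x 2 = total train batch size of 32
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=30,
    evaluation_strategy="steps",        # assumed; the results table evaluates every 400 steps
    eval_steps=400,                     # assumed from the Step column below
)
```
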
### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|
| 3.5768 | 0.85 | 400 | 0.6152 | 0.5812 | 0.1905 |
| 0.3226 | 1.71 | 800 | 0.1026 | 0.3195 | 0.0722 |
| 0.1827 | 2.56 | 1200 | 0.0725 | 0.2048 | 0.0454 |
| 0.129 | 3.41 | 1600 | 0.0671 | 0.2393 | 0.0525 |
| 0.1075 | 4.26 | 2000 | 0.0556 | 0.2312 | 0.0497 |
| 0.0924 | 5.12 | 2400 | 0.0572 | 0.2040 | 0.0478 |
| 0.076 | 5.97 | 2800 | 0.0596 | 0.1472 | 0.0346 |
| 0.0695 | 6.82 | 3200 | 0.0608 | 0.2274 | 0.0510 |
| 0.0707 | 7.68 | 3600 | 0.0490 | 0.2665 | 0.0660 |
| 0.0597 | 8.53 | 4000 | 0.0509 | 0.2442 | 0.0593 |
| 0.0557 | 9.38 | 4400 | 0.0501 | 0.2533 | 0.0610 |
| 0.0503 | 10.23 | 4800 | 0.0519 | 0.2534 | 0.0622 |
| 0.0471 | 11.09 | 5200 | 0.0512 | 0.2585 | 0.0638 |
| 0.0417 | 11.94 | 5600 | 0.0497 | 0.2522 | 0.0610 |
| 0.0415 | 12.79 | 6000 | 0.0508 | 0.2547 | 0.0629 |
| 0.0372 | 13.65 | 6400 | 0.0497 | 0.2580 | 0.0643 |
| 0.0364 | 14.5 | 6800 | 0.0448 | 0.2498 | 0.0600 |
| 0.034 | 15.35 | 7200 | 0.0522 | 0.2419 | 0.0593 |
| 0.0306 | 16.2 | 7600 | 0.0510 | 0.2433 | 0.0560 |
| 0.0345 | 17.06 | 8000 | 0.0503 | 0.2610 | 0.0657 |
| 0.0266 | 17.91 | 8400 | 0.0462 | 0.2434 | 0.0620 |
| 0.0273 | 18.76 | 8800 | 0.0507 | 0.2456 | 0.0622 |
| 0.0216 | 19.62 | 9200 | 0.0466 | 0.2214 | 0.0531 |
| 0.0208 | 20.47 | 9600 | 0.0497 | 0.2396 | 0.0598 |
| 0.0201 | 21.32 | 10000 | 0.0470 | 0.2332 | 0.0559 |
| 0.0174 | 22.17 | 10400 | 0.0418 | 0.2346 | 0.0590 |
| 0.0198 | 23.03 | 10800 | 0.0472 | 0.2386 | 0.0602 |
| 0.0149 | 23.88 | 11200 | 0.0490 | 0.2446 | 0.0638 |
| 0.0133 | 24.73 | 11600 | 0.0497 | 0.2430 | 0.0632 |
| 0.0118 | 25.59 | 12000 | 0.0498 | 0.2368 | 0.0620 |
| 0.0106 | 26.44 | 12400 | 0.0453 | 0.2309 | 0.0590 |
| 0.0104 | 27.29 | 12800 | 0.0452 | 0.2296 | 0.0583 |
| 0.0085 | 28.14 | 13200 | 0.0467 | 0.2352 | 0.0604 |
| 0.0081 | 29.0 | 13600 | 0.0470 | 0.2310 | 0.0592 |
| 0.0079 | 29.85 | 14000 | 0.0464 | 0.2319 | 0.0598 |

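Wer and Cer in the table above are word and character error rates on a 0-1 scale. Scores of this kind can be reproduced with the `jiwer` package; the card does not say which tooling was used, so the toy example below is only an illustrative sketch.

```python
import jiwer

# Toy reference/hypothesis pair to show what the metrics measure.
reference = "hello world"
hypothesis = "hello word"

# WER: word-level edit distance divided by the number of reference words.
print(jiwer.wer(reference, hypothesis))  # 0.5 -> one wrong word out of two
# CER: the same edit distance computed over characters.
print(jiwer.cer(reference, hypothesis))  # ~0.09 -> one edit over eleven characters
```
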
### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu117
- Datasets 1.18.3
- Tokenizers 0.13.3