---
license: apache-2.0
base_model: facebook/wav2vec2-large-xlsr-53
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: XLS-R_Jibbali_lang
  results: []
---


# XLS-R_Jibbali_lang

This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1752
- WER: 0.1926
- CER: 0.0770
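
WER and CER are the standard word- and character-level edit-distance error rates. A minimal sketch of how they can be computed with the `evaluate` library (the transcripts below are placeholders for illustration, not drawn from the actual evaluation set):

```python
import evaluate

# Word and character error rate metrics, as reported above.
wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Placeholder transcripts for illustration only.
predictions = ["an example model transcription"]
references = ["an example reference transcription"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```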

## Model description

More information needed

## Intended uses & limitations

More information needed
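
A minimal transcription sketch, assuming the checkpoint is published with its CTC head and processor on the Hub; the repo id and audio path below are placeholder assumptions:

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "your-username/XLS-R_Jibbali_lang"  # placeholder repo id
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load audio and resample to the 16 kHz rate XLSR models expect.
waveform, sample_rate = torchaudio.load("sample.wav")  # placeholder path
waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze(0), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: most likely token per frame, then collapse repeats/blanks.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```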

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 20
- mixed_precision_training: Native AMP
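
As a rough reproduction aid, these settings map onto Hugging Face `TrainingArguments` approximately as below (a sketch; `output_dir` and the per-epoch evaluation strategy are assumptions, since the training script itself is not shown):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="XLS-R_Jibbali_lang",  # assumption
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,    # 16 x 2 = effective train batch size 32
    num_train_epochs=20,
    lr_scheduler_type="linear",
    warmup_steps=500,
    seed=42,
    fp16=True,                        # Native AMP mixed precision
    evaluation_strategy="epoch",      # assumption: results table shows per-epoch eval
)
```

The listed optimizer settings (betas=(0.9, 0.999), epsilon=1e-08) are the `Trainer` defaults, so they need no explicit arguments.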

### Training results

| Training Loss | Epoch | Step | Validation Loss | WER    | CER    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 16.5992       | 0.99  | 56   | 15.4050         | 1.0    | 0.9812 |
| 3.79          | 2.0   | 113  | 3.3988          | 1.0    | 0.9812 |
| 3.1872        | 2.99  | 169  | 3.1498          | 1.0    | 0.9812 |
| 3.1705        | 4.0   | 226  | 3.1354          | 1.0    | 0.9812 |
| 3.1147        | 4.99  | 282  | 3.0947          | 1.0    | 0.9812 |
| 3.0616        | 6.0   | 339  | 2.9447          | 1.0    | 0.9460 |
| 2.8239        | 6.99  | 395  | 2.6661          | 1.0    | 0.9106 |
| 1.5494        | 8.0   | 452  | 1.0992          | 0.8684 | 0.3804 |
| 0.5291        | 8.99  | 508  | 0.2822          | 0.3026 | 0.1004 |
| 0.2022        | 10.0  | 565  | 0.2019          | 0.2080 | 0.0665 |
| 0.1721        | 10.99 | 621  | 0.2067          | 0.2032 | 0.0841 |
| 0.1705        | 12.0  | 678  | 0.1968          | 0.1996 | 0.0728 |
| 0.0989        | 12.99 | 734  | 0.2038          | 0.1955 | 0.0821 |
| 0.1299        | 14.0  | 791  | 0.1814          | 0.1963 | 0.0837 |
| 0.1352        | 14.99 | 847  | 0.1896          | 0.1941 | 0.0768 |
| 0.0487        | 16.0  | 904  | 0.1951          | 0.1933 | 0.0749 |
| 0.1412        | 16.99 | 960  | 0.1650          | 0.1970 | 0.0818 |
| 0.1027        | 18.0  | 1017 | 0.1720          | 0.1941 | 0.0783 |
| 0.0791        | 18.99 | 1073 | 0.1730          | 0.1933 | 0.0767 |
| 0.0406        | 19.82 | 1120 | 0.1752          | 0.1926 | 0.0770 |


### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.2