---
license: apache-2.0
base_model: facebook/wav2vec2-base
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: wav2vec2-classifier
  results: []
---


<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# wav2vec2-classifier

This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8841
- Accuracy: 0.8015
- Precision: 0.8244
- Recall: 0.8015
- F1: 0.7954
- Binary: 0.8613

## Model description

More information needed

## Intended uses & limitations

More information needed
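
Pending a fuller description, a minimal inference sketch is shown below. The repository id is a placeholder, and the `audio-classification` pipeline task is an assumption based on the base model and the reported classification metrics:

```python
from transformers import pipeline

# Placeholder model id: substitute the actual Hub repo id or a local
# directory containing this fine-tuned checkpoint.
classifier = pipeline("audio-classification", model="your-username/wav2vec2-classifier")

# wav2vec2-base expects 16 kHz mono audio; the pipeline resamples
# file inputs automatically when given a path.
for prediction in classifier("sample.wav", top_k=5):
    print(f"{prediction['label']}: {prediction['score']:.3f}")
```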

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a code sketch reproducing them follows the list):
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
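
Translated into the standard `Trainer` API, these values correspond roughly to the sketch below. The numeric settings are taken from the list above; `output_dir` and the evaluation cadence (the results table logs metrics every 50 steps) are assumptions, and the Adam betas/epsilon match the `TrainingArguments` defaults, so they are not set explicitly:

```python
from transformers import TrainingArguments

# Sketch reconstructing the reported configuration; not the exact
# command used to train this model.
training_args = TrainingArguments(
    output_dir="wav2vec2-classifier",  # assumed
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,     # 32 * 4 = 128 total train batch size
    num_train_epochs=10,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                         # "Native AMP" mixed precision
    evaluation_strategy="steps",       # assumed from the 50-step eval cadence
    eval_steps=50,
)
```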



### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     | Binary |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|:------:|
| No log        | 0.17  | 50   | 4.2299          | 0.0680   | 0.0274    | 0.0680 | 0.0275 | 0.3352 |
| No log        | 0.35  | 100  | 3.9259          | 0.0631   | 0.0167    | 0.0631 | 0.0191 | 0.3408 |
| No log        | 0.52  | 150  | 3.6877          | 0.1117   | 0.0535    | 0.1117 | 0.0606 | 0.3738 |
| No log        | 0.69  | 200  | 3.4911          | 0.1650   | 0.0958    | 0.1650 | 0.1017 | 0.4119 |
| No log        | 0.86  | 250  | 3.3070          | 0.2233   | 0.1497    | 0.2233 | 0.1517 | 0.4541 |
| 3.8291        | 1.04  | 300  | 3.1733          | 0.2354   | 0.1675    | 0.2354 | 0.1635 | 0.4619 |
| 3.8291        | 1.21  | 350  | 3.0042          | 0.3034   | 0.2792    | 0.3034 | 0.2378 | 0.5087 |
| 3.8291        | 1.38  | 400  | 2.8631          | 0.3519   | 0.3257    | 0.3519 | 0.2808 | 0.5434 |
| 3.8291        | 1.55  | 450  | 2.7172          | 0.3981   | 0.4132    | 0.3981 | 0.3454 | 0.5765 |
| 3.8291        | 1.73  | 500  | 2.5629          | 0.4733   | 0.4595    | 0.4733 | 0.4250 | 0.6299 |
| 3.8291        | 1.9   | 550  | 2.4316          | 0.4927   | 0.4601    | 0.4927 | 0.4352 | 0.6420 |
| 2.9187        | 2.07  | 600  | 2.3206          | 0.5218   | 0.5337    | 0.5218 | 0.4800 | 0.6638 |
| 2.9187        | 2.24  | 650  | 2.1817          | 0.5801   | 0.5746    | 0.5801 | 0.5292 | 0.7039 |
| 2.9187        | 2.42  | 700  | 2.0852          | 0.5631   | 0.5165    | 0.5631 | 0.5089 | 0.6927 |
| 2.9187        | 2.59  | 750  | 1.9932          | 0.5752   | 0.5836    | 0.5752 | 0.5310 | 0.7005 |
| 2.9187        | 2.76  | 800  | 1.8989          | 0.6189   | 0.6288    | 0.6189 | 0.5850 | 0.7318 |
| 2.9187        | 2.93  | 850  | 1.8100          | 0.6529   | 0.6389    | 0.6529 | 0.6181 | 0.7563 |
| 2.291         | 3.11  | 900  | 1.7138          | 0.6650   | 0.7058    | 0.6650 | 0.6360 | 0.7641 |
| 2.291         | 3.28  | 950  | 1.6582          | 0.6869   | 0.7094    | 0.6869 | 0.6585 | 0.7786 |
| 2.291         | 3.45  | 1000 | 1.5810          | 0.7039   | 0.7509    | 0.7039 | 0.6886 | 0.7913 |
| 2.291         | 3.62  | 1050 | 1.5116          | 0.7306   | 0.7799    | 0.7306 | 0.7263 | 0.8100 |
| 2.291         | 3.8   | 1100 | 1.4638          | 0.7039   | 0.7450    | 0.7039 | 0.6850 | 0.7920 |
| 2.291         | 3.97  | 1150 | 1.4173          | 0.7233   | 0.7744    | 0.7233 | 0.7099 | 0.8056 |
| 1.8674        | 4.14  | 1200 | 1.4021          | 0.6869   | 0.7375    | 0.6869 | 0.6707 | 0.7794 |
| 1.8674        | 4.31  | 1250 | 1.3271          | 0.7282   | 0.7796    | 0.7282 | 0.7240 | 0.8090 |
| 1.8674        | 4.49  | 1300 | 1.2851          | 0.7403   | 0.7903    | 0.7403 | 0.7305 | 0.8175 |
| 1.8674        | 4.66  | 1350 | 1.2666          | 0.7257   | 0.7796    | 0.7257 | 0.7162 | 0.8066 |
| 1.8674        | 4.83  | 1400 | 1.2354          | 0.7379   | 0.7785    | 0.7379 | 0.7301 | 0.8158 |
| 1.5849        | 5.0   | 1450 | 1.1930          | 0.7451   | 0.7913    | 0.7451 | 0.7420 | 0.8201 |
| 1.5849        | 5.18  | 1500 | 1.1529          | 0.7549   | 0.8124    | 0.7549 | 0.7527 | 0.8277 |
| 1.5849        | 5.35  | 1550 | 1.1293          | 0.7621   | 0.8236    | 0.7621 | 0.7626 | 0.8328 |
| 1.5849        | 5.52  | 1600 | 1.0915          | 0.7646   | 0.8132    | 0.7646 | 0.7609 | 0.8345 |
| 1.5849        | 5.69  | 1650 | 1.1038          | 0.7549   | 0.8013    | 0.7549 | 0.7518 | 0.8277 |
| 1.5849        | 5.87  | 1700 | 1.0632          | 0.7670   | 0.8125    | 0.7670 | 0.7604 | 0.8362 |
| 1.379         | 6.04  | 1750 | 1.0175          | 0.7767   | 0.8151    | 0.7767 | 0.7708 | 0.8430 |
| 1.379         | 6.21  | 1800 | 0.9889          | 0.7791   | 0.8137    | 0.7791 | 0.7746 | 0.8447 |
| 1.379         | 6.38  | 1850 | 0.9825          | 0.7816   | 0.8255    | 0.7816 | 0.7793 | 0.8464 |
| 1.379         | 6.56  | 1900 | 0.9725          | 0.7864   | 0.8442    | 0.7864 | 0.7826 | 0.8498 |
| 1.379         | 6.73  | 1950 | 0.9357          | 0.7961   | 0.8341    | 0.7961 | 0.7943 | 0.8566 |
| 1.379         | 6.9   | 2000 | 0.9351          | 0.7888   | 0.8315    | 0.7888 | 0.7848 | 0.8515 |
| 1.2346        | 7.08  | 2050 | 0.9187          | 0.7888   | 0.8374    | 0.7888 | 0.7895 | 0.8515 |
| 1.2346        | 7.25  | 2100 | 0.9028          | 0.7840   | 0.8361    | 0.7840 | 0.7835 | 0.8481 |
| 1.2346        | 7.42  | 2150 | 0.8773          | 0.7961   | 0.8419    | 0.7961 | 0.7909 | 0.8566 |
| 1.2346        | 7.59  | 2200 | 0.8816          | 0.7985   | 0.8399    | 0.7985 | 0.8016 | 0.8583 |
| 1.2346        | 7.77  | 2250 | 0.8604          | 0.7913   | 0.8294    | 0.7913 | 0.7919 | 0.8532 |
| 1.2346        | 7.94  | 2300 | 0.8579          | 0.8010   | 0.8381    | 0.8010 | 0.8003 | 0.8600 |
| 1.1154        | 8.11  | 2350 | 0.8552          | 0.7985   | 0.8439    | 0.7985 | 0.7996 | 0.8583 |
| 1.1154        | 8.28  | 2400 | 0.8493          | 0.7985   | 0.8461    | 0.7985 | 0.8000 | 0.8583 |
| 1.1154        | 8.46  | 2450 | 0.8421          | 0.7985   | 0.8421    | 0.7985 | 0.8009 | 0.8583 |
| 1.1154        | 8.63  | 2500 | 0.8416          | 0.8010   | 0.8424    | 0.8010 | 0.8013 | 0.8600 |
| 1.1154        | 8.8   | 2550 | 0.8375          | 0.8010   | 0.8460    | 0.8010 | 0.8010 | 0.8600 |
| 1.1154        | 8.97  | 2600 | 0.8304          | 0.8058   | 0.8444    | 0.8058 | 0.8072 | 0.8633 |
| 1.0661        | 9.15  | 2650 | 0.8183          | 0.8107   | 0.8471    | 0.8107 | 0.8076 | 0.8667 |
| 1.0661        | 9.32  | 2700 | 0.8093          | 0.8131   | 0.8521    | 0.8131 | 0.8123 | 0.8684 |
| 1.0661        | 9.49  | 2750 | 0.8104          | 0.8155   | 0.8544    | 0.8155 | 0.8148 | 0.8701 |
| 1.0661        | 9.66  | 2800 | 0.8106          | 0.8204   | 0.8581    | 0.8204 | 0.8208 | 0.8735 |
| 1.0661        | 9.84  | 2850 | 0.8094          | 0.8131   | 0.8485    | 0.8131 | 0.8126 | 0.8684 |
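
Recall equals Accuracy at every logged step, which is consistent with weighted-average metrics (weighted recall reduces to accuracy). A hedged reconstruction of such a `compute_metrics` function is sketched below; the `Binary` column is a custom metric whose definition is not recoverable from this card, so it is omitted:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Hypothetical reconstruction of the metrics reported above.

    Assumes weighted averaging, which would explain Recall == Accuracy
    at every logged step.
    """
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```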

### Framework versions

- Transformers 4.38.2
- Pytorch 2.3.0
- Datasets 2.19.1
- Tokenizers 0.15.1