---

license: apache-2.0
base_model: facebook/hubert-base-ls960
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: hubert-classifier-aug-fold-1
  results: []
---


<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# hubert-classifier-aug-fold-1

This model is a fine-tuned version of [facebook/hubert-base-ls960](https://huggingface.co/facebook/hubert-base-ls960) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5721
- Accuracy: 0.8706
- Precision: 0.8813
- Recall: 0.8706
- F1: 0.8644
- Binary: 0.9094
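
No usage snippet is included in this auto-generated card. Since the checkpoint is a fine-tuned HuBERT sequence classifier, it can presumably be loaded with the standard Hugging Face Transformers audio-classification classes; the sketch below is an illustration under that assumption. The checkpoint path, the 16 kHz mono input, and the example file name are placeholders, not values documented in this card.

```python
# Minimal, untested inference sketch. Assumptions: the model is available at
# the path below, and it expects 16 kHz mono audio (HuBERT base was
# pretrained on 16 kHz LibriSpeech audio).
import torch
import torchaudio
from transformers import AutoFeatureExtractor, AutoModelForAudioClassification

model_id = "hubert-classifier-aug-fold-1"  # hypothetical local or Hub path
extractor = AutoFeatureExtractor.from_pretrained(model_id)
model = AutoModelForAudioClassification.from_pretrained(model_id)
model.eval()

# Load an arbitrary clip and resample it to the assumed 16 kHz rate.
waveform, sr = torchaudio.load("example.wav")
waveform = torchaudio.functional.resample(waveform, sr, 16_000)

inputs = extractor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = int(logits.argmax(dim=-1))
print(model.config.id2label[predicted_id])
```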

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
- mixed_precision_training: Native AMP
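
For reference, these settings roughly correspond to a Transformers `TrainingArguments` configuration along the following lines. This is a sketch, not the exact training script: the output directory and the logging/evaluation cadence are assumptions (the 50-step evaluation interval is inferred from the results table below).

```python
# Hedged reconstruction of the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hubert-classifier-aug-fold-1",  # assumption
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,   # 32 x 4 = 128 total train batch size
    num_train_epochs=30,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                       # "Native AMP" mixed precision
    evaluation_strategy="steps",     # assumption, matches the per-50-step eval log
    eval_steps=50,
    logging_steps=50,
)
# The optimizer listed above (Adam, betas=(0.9, 0.999), eps=1e-08) matches the
# Trainer defaults, so it is not set explicitly here.
```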



### Training results



| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     | Binary |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|:------:|
| No log        | 0.19  | 50   | 4.4259          | 0.0135   | 0.0002    | 0.0135 | 0.0004 | 0.1278 |
| No log        | 0.38  | 100  | 3.8225          | 0.0485   | 0.0033    | 0.0485 | 0.0060 | 0.3151 |
| No log        | 0.58  | 150  | 3.3928          | 0.0701   | 0.0104    | 0.0701 | 0.0169 | 0.3450 |
| No log        | 0.77  | 200  | 3.2088          | 0.0836   | 0.0166    | 0.0836 | 0.0258 | 0.3553 |
| No log        | 0.96  | 250  | 3.0752          | 0.1078   | 0.0279    | 0.1078 | 0.0388 | 0.3714 |
| 3.8204        | 1.15  | 300  | 2.9617          | 0.1186   | 0.0338    | 0.1186 | 0.0443 | 0.3776 |
| 3.8204        | 1.34  | 350  | 2.7928          | 0.1617   | 0.0704    | 0.1617 | 0.0848 | 0.4100 |
| 3.8204        | 1.53  | 400  | 2.5904          | 0.2237   | 0.1640    | 0.2237 | 0.1427 | 0.4542 |
| 3.8204        | 1.73  | 450  | 2.3895          | 0.2615   | 0.1634    | 0.2615 | 0.1771 | 0.4814 |
| 3.8204        | 1.92  | 500  | 2.2567          | 0.2992   | 0.2170    | 0.2992 | 0.2160 | 0.5097 |
| 2.9288        | 2.11  | 550  | 2.0903          | 0.3801   | 0.2993    | 0.3801 | 0.3008 | 0.5668 |
| 2.9288        | 2.3   | 600  | 1.9624          | 0.4151   | 0.3638    | 0.4151 | 0.3389 | 0.5900 |
| 2.9288        | 2.49  | 650  | 1.7353          | 0.5148   | 0.4641    | 0.5148 | 0.4447 | 0.6625 |
| 2.9288        | 2.68  | 700  | 1.6687          | 0.5013   | 0.4658    | 0.5013 | 0.4433 | 0.6526 |
| 2.9288        | 2.88  | 750  | 1.5726          | 0.5391   | 0.5299    | 0.5391 | 0.4914 | 0.6817 |
| 2.2908        | 3.07  | 800  | 1.4471          | 0.5714   | 0.5785    | 0.5714 | 0.5269 | 0.7003 |
| 2.2908        | 3.26  | 850  | 1.3350          | 0.6334   | 0.6432    | 0.6334 | 0.6040 | 0.7445 |
| 2.2908        | 3.45  | 900  | 1.1787          | 0.6685   | 0.6811    | 0.6685 | 0.6349 | 0.7690 |
| 2.2908        | 3.64  | 950  | 1.1315          | 0.6846   | 0.6962    | 0.6846 | 0.6524 | 0.7803 |
| 2.2908        | 3.84  | 1000 | 1.0283          | 0.7493   | 0.7751    | 0.7493 | 0.7298 | 0.8248 |
| 1.8601        | 4.03  | 1050 | 1.0553          | 0.7089   | 0.7311    | 0.7089 | 0.6824 | 0.7984 |
| 1.8601        | 4.22  | 1100 | 0.9262          | 0.7385   | 0.7627    | 0.7385 | 0.7190 | 0.8170 |
| 1.8601        | 4.41  | 1150 | 0.9209          | 0.7385   | 0.7586    | 0.7385 | 0.7221 | 0.8170 |
| 1.8601        | 4.6   | 1200 | 0.9163          | 0.7520   | 0.8027    | 0.7520 | 0.7416 | 0.8280 |
| 1.8601        | 4.79  | 1250 | 0.7954          | 0.8140   | 0.8394    | 0.8140 | 0.8028 | 0.8682 |
| 1.8601        | 4.99  | 1300 | 0.8244          | 0.7709   | 0.7999    | 0.7709 | 0.7651 | 0.8388 |
| 1.624         | 5.18  | 1350 | 0.8025          | 0.7655   | 0.8100    | 0.7655 | 0.7599 | 0.8364 |
| 1.624         | 5.37  | 1400 | 0.7521          | 0.7925   | 0.8238    | 0.7925 | 0.7827 | 0.8558 |
| 1.624         | 5.56  | 1450 | 0.7821          | 0.7898   | 0.8231    | 0.7898 | 0.7824 | 0.8528 |
| 1.624         | 5.75  | 1500 | 0.6841          | 0.7951   | 0.8237    | 0.7951 | 0.7887 | 0.8577 |
| 1.624         | 5.94  | 1550 | 0.6757          | 0.7978   | 0.8276    | 0.7978 | 0.7935 | 0.8596 |
| 1.4343        | 6.14  | 1600 | 0.6709          | 0.8140   | 0.8415    | 0.8140 | 0.8094 | 0.8709 |
| 1.4343        | 6.33  | 1650 | 0.6361          | 0.8113   | 0.8364    | 0.8113 | 0.8045 | 0.8690 |
| 1.4343        | 6.52  | 1700 | 0.6413          | 0.8275   | 0.8479    | 0.8275 | 0.8231 | 0.8814 |
| 1.4343        | 6.71  | 1750 | 0.6074          | 0.8302   | 0.8484    | 0.8302 | 0.8244 | 0.8803 |
| 1.4343        | 6.9   | 1800 | 0.6286          | 0.8005   | 0.8321    | 0.8005 | 0.7964 | 0.8606 |
| 1.3175        | 7.09  | 1850 | 0.5431          | 0.8356   | 0.8558    | 0.8356 | 0.8312 | 0.8860 |
| 1.3175        | 7.29  | 1900 | 0.5612          | 0.8491   | 0.8828    | 0.8491 | 0.8499 | 0.8927 |
| 1.3175        | 7.48  | 1950 | 0.5324          | 0.8491   | 0.8795    | 0.8491 | 0.8492 | 0.8954 |
| 1.3175        | 7.67  | 2000 | 0.5793          | 0.8383   | 0.8589    | 0.8383 | 0.8345 | 0.8871 |
| 1.3175        | 7.86  | 2050 | 0.5722          | 0.8248   | 0.8575    | 0.8248 | 0.8258 | 0.8757 |
| 1.2154        | 8.05  | 2100 | 0.6362          | 0.8167   | 0.8511    | 0.8167 | 0.8149 | 0.8701 |
| 1.2154        | 8.25  | 2150 | 0.5846          | 0.8302   | 0.8588    | 0.8302 | 0.8293 | 0.8814 |
| 1.2154        | 8.44  | 2200 | 0.6121          | 0.8113   | 0.8477    | 0.8113 | 0.8065 | 0.8704 |
| 1.2154        | 8.63  | 2250 | 0.5895          | 0.8356   | 0.8671    | 0.8356 | 0.8343 | 0.8871 |
| 1.2154        | 8.82  | 2300 | 0.5404          | 0.8437   | 0.8756    | 0.8437 | 0.8407 | 0.8927 |
| 1.1306        | 9.01  | 2350 | 0.5433          | 0.8410   | 0.8657    | 0.8410 | 0.8394 | 0.8900 |
| 1.1306        | 9.2   | 2400 | 0.5535          | 0.8383   | 0.8645    | 0.8383 | 0.8380 | 0.8881 |
| 1.1306        | 9.4   | 2450 | 0.5201          | 0.8518   | 0.8850    | 0.8518 | 0.8518 | 0.8957 |
| 1.1306        | 9.59  | 2500 | 0.5464          | 0.8383   | 0.8680    | 0.8383 | 0.8373 | 0.8881 |
| 1.1306        | 9.78  | 2550 | 0.5960          | 0.8329   | 0.8586    | 0.8329 | 0.8304 | 0.8841 |
| 1.1306        | 9.97  | 2600 | 0.5304          | 0.8518   | 0.8790    | 0.8518 | 0.8506 | 0.8957 |
| 1.0743        | 10.16 | 2650 | 0.4804          | 0.8706   | 0.8937    | 0.8706 | 0.8703 | 0.9086 |
| 1.0743        | 10.35 | 2700 | 0.5004          | 0.8652   | 0.8908    | 0.8652 | 0.8640 | 0.9059 |
| 1.0743        | 10.55 | 2750 | 0.4730          | 0.8652   | 0.8921    | 0.8652 | 0.8643 | 0.9070 |
| 1.0743        | 10.74 | 2800 | 0.4958          | 0.8383   | 0.8663    | 0.8383 | 0.8372 | 0.8879 |
| 1.0743        | 10.93 | 2850 | 0.4672          | 0.8544   | 0.8814    | 0.8544 | 0.8538 | 0.8973 |
| 1.0274        | 11.12 | 2900 | 0.5339          | 0.8571   | 0.8807    | 0.8571 | 0.8565 | 0.9003 |
| 1.0274        | 11.31 | 2950 | 0.5013          | 0.8491   | 0.8698    | 0.8491 | 0.8462 | 0.8954 |
| 1.0274        | 11.51 | 3000 | 0.4882          | 0.8679   | 0.8904    | 0.8679 | 0.8677 | 0.9078 |
| 1.0274        | 11.7  | 3050 | 0.5059          | 0.8518   | 0.8837    | 0.8518 | 0.8519 | 0.8965 |
| 1.0274        | 11.89 | 3100 | 0.4636          | 0.8679   | 0.8864    | 0.8679 | 0.8666 | 0.9075 |
| 0.9585        | 12.08 | 3150 | 0.4667          | 0.8787   | 0.8955    | 0.8787 | 0.8776 | 0.9143 |
| 0.9585        | 12.27 | 3200 | 0.5159          | 0.8544   | 0.8734    | 0.8544 | 0.8534 | 0.8976 |
| 0.9585        | 12.46 | 3250 | 0.5177          | 0.8518   | 0.8748    | 0.8518 | 0.8528 | 0.8987 |
| 0.9585        | 12.66 | 3300 | 0.4435          | 0.8841   | 0.9040    | 0.8841 | 0.8835 | 0.9189 |
| 0.9585        | 12.85 | 3350 | 0.5116          | 0.8544   | 0.8851    | 0.8544 | 0.8558 | 0.8992 |
| 0.9352        | 13.04 | 3400 | 0.4538          | 0.8706   | 0.8888    | 0.8706 | 0.8697 | 0.9105 |
| 0.9352        | 13.23 | 3450 | 0.4973          | 0.8706   | 0.8944    | 0.8706 | 0.8684 | 0.9086 |
| 0.9352        | 13.42 | 3500 | 0.4465          | 0.8760   | 0.8937    | 0.8760 | 0.8741 | 0.9135 |
| 0.9352        | 13.61 | 3550 | 0.4691          | 0.8814   | 0.9042    | 0.8814 | 0.8806 | 0.9154 |
| 0.9352        | 13.81 | 3600 | 0.5010          | 0.8652   | 0.8916    | 0.8652 | 0.8641 | 0.9051 |
| 0.9352        | 14.0  | 3650 | 0.5133          | 0.8410   | 0.8728    | 0.8410 | 0.8396 | 0.8879 |
| 0.8941        | 14.19 | 3700 | 0.4476          | 0.8706   | 0.8961    | 0.8706 | 0.8729 | 0.9086 |
| 0.8941        | 14.38 | 3750 | 0.4321          | 0.8679   | 0.8915    | 0.8679 | 0.8681 | 0.9067 |
| 0.8941        | 14.57 | 3800 | 0.4033          | 0.8841   | 0.8991    | 0.8841 | 0.8835 | 0.9181 |
| 0.8941        | 14.77 | 3850 | 0.4599          | 0.8841   | 0.9052    | 0.8841 | 0.8827 | 0.9181 |
| 0.8941        | 14.96 | 3900 | 0.4673          | 0.8625   | 0.8883    | 0.8625 | 0.8631 | 0.9040 |
| 0.8574        | 15.15 | 3950 | 0.4906          | 0.8760   | 0.8993    | 0.8760 | 0.8749 | 0.9135 |
| 0.8574        | 15.34 | 4000 | 0.5055          | 0.8544   | 0.8836    | 0.8544 | 0.8518 | 0.8984 |
| 0.8574        | 15.53 | 4050 | 0.4119          | 0.8841   | 0.8985    | 0.8841 | 0.8831 | 0.9191 |
| 0.8574        | 15.72 | 4100 | 0.4684          | 0.8760   | 0.8989    | 0.8760 | 0.8752 | 0.9135 |
| 0.8574        | 15.92 | 4150 | 0.4453          | 0.8787   | 0.8999    | 0.8787 | 0.8776 | 0.9151 |
| 0.8344        | 16.11 | 4200 | 0.4928          | 0.8787   | 0.9000    | 0.8787 | 0.8783 | 0.9143 |
| 0.8344        | 16.3  | 4250 | 0.4535          | 0.8868   | 0.9067    | 0.8868 | 0.8863 | 0.9191 |
| 0.8344        | 16.49 | 4300 | 0.4259          | 0.8787   | 0.8986    | 0.8787 | 0.8781 | 0.9154 |
| 0.8344        | 16.68 | 4350 | 0.4289          | 0.8787   | 0.8970    | 0.8787 | 0.8784 | 0.9154 |
| 0.8344        | 16.87 | 4400 | 0.4828          | 0.8814   | 0.9013    | 0.8814 | 0.8813 | 0.9154 |
| 0.8066        | 17.07 | 4450 | 0.4866          | 0.8787   | 0.8969    | 0.8787 | 0.8792 | 0.9135 |
| 0.8066        | 17.26 | 4500 | 0.4388          | 0.8760   | 0.8934    | 0.8760 | 0.8768 | 0.9116 |
| 0.8066        | 17.45 | 4550 | 0.5018          | 0.8787   | 0.8993    | 0.8787 | 0.8775 | 0.9143 |
| 0.8066        | 17.64 | 4600 | 0.4838          | 0.8814   | 0.8981    | 0.8814 | 0.8808 | 0.9154 |
| 0.8066        | 17.83 | 4650 | 0.5394          | 0.8679   | 0.8893    | 0.8679 | 0.8662 | 0.9059 |
| 0.7757        | 18.02 | 4700 | 0.4628          | 0.8814   | 0.8964    | 0.8814 | 0.8800 | 0.9162 |
| 0.7757        | 18.22 | 4750 | 0.5456          | 0.8733   | 0.8907    | 0.8733 | 0.8719 | 0.9097 |
| 0.7757        | 18.41 | 4800 | 0.4858          | 0.8814   | 0.8970    | 0.8814 | 0.8804 | 0.9154 |
| 0.7757        | 18.6  | 4850 | 0.5836          | 0.8571   | 0.8776    | 0.8571 | 0.8568 | 0.8984 |
| 0.7757        | 18.79 | 4900 | 0.5008          | 0.8787   | 0.8985    | 0.8787 | 0.8781 | 0.9143 |
| 0.7757        | 18.98 | 4950 | 0.5259          | 0.8760   | 0.8950    | 0.8760 | 0.8749 | 0.9116 |
| 0.7595        | 19.18 | 5000 | 0.5906          | 0.8652   | 0.8869    | 0.8652 | 0.8657 | 0.9040 |
| 0.7595        | 19.37 | 5050 | 0.4905          | 0.8841   | 0.8993    | 0.8841 | 0.8839 | 0.9173 |
| 0.7595        | 19.56 | 5100 | 0.5958          | 0.8598   | 0.8804    | 0.8598 | 0.8596 | 0.9003 |
| 0.7595        | 19.75 | 5150 | 0.5466          | 0.8679   | 0.8924    | 0.8679 | 0.8666 | 0.9059 |
| 0.7595        | 19.94 | 5200 | 0.4639          | 0.8841   | 0.9008    | 0.8841 | 0.8834 | 0.9173 |
| 0.7257        | 20.13 | 5250 | 0.5094          | 0.8787   | 0.9015    | 0.8787 | 0.8795 | 0.9135 |
| 0.7257        | 20.33 | 5300 | 0.5310          | 0.8733   | 0.8973    | 0.8733 | 0.8737 | 0.9097 |
| 0.7257        | 20.52 | 5350 | 0.5118          | 0.8733   | 0.8925    | 0.8733 | 0.8734 | 0.9097 |
| 0.7257        | 20.71 | 5400 | 0.5166          | 0.8814   | 0.9017    | 0.8814 | 0.8814 | 0.9154 |
| 0.7257        | 20.9  | 5450 | 0.4850          | 0.8814   | 0.8984    | 0.8814 | 0.8807 | 0.9164 |
| 0.7185        | 21.09 | 5500 | 0.5161          | 0.8841   | 0.9018    | 0.8841 | 0.8842 | 0.9183 |
| 0.7185        | 21.28 | 5550 | 0.5197          | 0.8706   | 0.8904    | 0.8706 | 0.8694 | 0.9086 |
| 0.7185        | 21.48 | 5600 | 0.5297          | 0.8733   | 0.8921    | 0.8733 | 0.8728 | 0.9097 |
| 0.7185        | 21.67 | 5650 | 0.5317          | 0.8706   | 0.8913    | 0.8706 | 0.8694 | 0.9078 |
| 0.7185        | 21.86 | 5700 | 0.5120          | 0.8625   | 0.8809    | 0.8625 | 0.8610 | 0.9022 |
| 0.6968        | 22.05 | 5750 | 0.5144          | 0.8760   | 0.8960    | 0.8760 | 0.8753 | 0.9116 |
| 0.6968        | 22.24 | 5800 | 0.5688          | 0.8733   | 0.8911    | 0.8733 | 0.8721 | 0.9097 |
| 0.6968        | 22.44 | 5850 | 0.5430          | 0.8733   | 0.8916    | 0.8733 | 0.8724 | 0.9097 |





### Framework versions



- Transformers 4.38.2
- PyTorch 2.3.0
- Datasets 2.19.1
- Tokenizers 0.15.1