---
license: apache-2.0
base_model: facebook/hubert-base-ls960
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: hubert-classifier
  results: []
---



# hubert-classifier

This model is a fine-tuned version of [facebook/hubert-base-ls960](https://huggingface.co/facebook/hubert-base-ls960) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1058
- Accuracy: 0.7748
- Precision: 0.8018
- Recall: 0.7748
- F1: 0.7651
- Binary: 0.8455
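
Accuracy and recall are identical here (and throughout the results table below), which is consistent with class-frequency-weighted multiclass metrics: weighted recall reduces to accuracy. For reference, a minimal `compute_metrics` sketch that would yield the four standard metrics above; the weighted averaging mode is an assumption inferred from that identity, and the `Binary` metric's definition is not recorded in this card, so it is omitted:

```python
# Sketch only: reconstructs the four standard metrics reported above.
# "weighted" averaging is an assumption inferred from accuracy == recall;
# the "Binary" metric is not defined in this card and is omitted.
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```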

## Model description

More information needed

## Intended uses & limitations

More information needed
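
Absent further documentation, the checkpoint can at least be loaded for inference with the standard `transformers` audio-classification pipeline. A minimal sketch, assuming the checkpoint is available locally or on the Hub as `hubert-classifier` (the model id and `sample.wav` are placeholders, and the label set depends on the unspecified training dataset):

```python
# Minimal inference sketch; "hubert-classifier" and "sample.wav" are
# placeholders, and the labels depend on the unspecified training dataset.
from transformers import pipeline

classifier = pipeline("audio-classification", model="hubert-classifier")

# The pipeline decodes and resamples the input to the 16 kHz sampling rate
# expected by the facebook/hubert-base-ls960 feature extractor.
predictions = classifier("sample.wav", top_k=5)
for p in predictions:
    print(f"{p['label']}: {p['score']:.4f}")
```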

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
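
The total train batch size of 128 follows from the per-device batch size times the accumulation steps (32 × 4), and the listed Adam betas and epsilon are the library defaults. A hedged reconstruction of the corresponding `TrainingArguments` follows; the original training script is not included in this card, so treat it as a sketch, not the exact configuration:

```python
# Sketch of TrainingArguments matching the list above; output_dir is a
# placeholder and arguments not listed in the card are left at defaults.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hubert-classifier",
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,   # effective train batch: 32 * 4 = 128
    num_train_epochs=10,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                       # "Native AMP" mixed-precision training
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the defaults
)
```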



### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     | Binary |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|:------:|
| No log        | 0.17  | 50   | 4.2665          | 0.0412   | 0.0107    | 0.0412 | 0.0127 | 0.2923 |
| No log        | 0.35  | 100  | 3.9427          | 0.0339   | 0.0016    | 0.0339 | 0.0030 | 0.3172 |
| No log        | 0.52  | 150  | 3.7412          | 0.0363   | 0.0025    | 0.0363 | 0.0041 | 0.3206 |
| No log        | 0.69  | 200  | 3.6193          | 0.0654   | 0.0238    | 0.0654 | 0.0259 | 0.3373 |
| No log        | 0.86  | 250  | 3.4784          | 0.1041   | 0.0460    | 0.1041 | 0.0459 | 0.3663 |
| No log        | 1.04  | 300  | 3.3705          | 0.1211   | 0.0602    | 0.1211 | 0.0466 | 0.3789 |
| No log        | 1.21  | 350  | 3.2597          | 0.1768   | 0.0811    | 0.1768 | 0.0894 | 0.4218 |
| No log        | 1.38  | 400  | 3.1606          | 0.2082   | 0.1867    | 0.2082 | 0.1416 | 0.4424 |
| No log        | 1.55  | 450  | 3.0720          | 0.1913   | 0.1490    | 0.1913 | 0.1296 | 0.4312 |
| 3.6525        | 1.73  | 500  | 2.9557          | 0.2446   | 0.1432    | 0.2446 | 0.1609 | 0.4671 |
| 3.6525        | 1.9   | 550  | 2.8287          | 0.2857   | 0.2265    | 0.2857 | 0.2059 | 0.4973 |
| 3.6525        | 2.07  | 600  | 2.7005          | 0.3075   | 0.2103    | 0.3075 | 0.2154 | 0.5136 |
| 3.6525        | 2.24  | 650  | 2.6183          | 0.3414   | 0.2398    | 0.3414 | 0.2486 | 0.5341 |
| 3.6525        | 2.42  | 700  | 2.5133          | 0.3632   | 0.2942    | 0.3632 | 0.2732 | 0.5516 |
| 3.6525        | 2.59  | 750  | 2.4277          | 0.3753   | 0.3322    | 0.3753 | 0.2948 | 0.5615 |
| 3.6525        | 2.76  | 800  | 2.3329          | 0.4092   | 0.3538    | 0.4092 | 0.3338 | 0.5845 |
| 3.6525        | 2.93  | 850  | 2.2465          | 0.4407   | 0.4125    | 0.4407 | 0.3745 | 0.6073 |
| 3.6525        | 3.11  | 900  | 2.1792          | 0.4600   | 0.4329    | 0.4600 | 0.3995 | 0.6203 |
| 3.6525        | 3.28  | 950  | 2.1004          | 0.5109   | 0.4995    | 0.5109 | 0.4540 | 0.6550 |
| 2.6844        | 3.45  | 1000 | 2.0314          | 0.5109   | 0.4799    | 0.5109 | 0.4520 | 0.6557 |
| 2.6844        | 3.62  | 1050 | 1.9561          | 0.5400   | 0.5309    | 0.5400 | 0.4859 | 0.6743 |
| 2.6844        | 3.8   | 1100 | 1.9362          | 0.5472   | 0.5441    | 0.5472 | 0.5066 | 0.6804 |
| 2.6844        | 3.97  | 1150 | 1.8666          | 0.5642   | 0.5647    | 0.5642 | 0.5232 | 0.6930 |
| 2.6844        | 4.14  | 1200 | 1.8204          | 0.5811   | 0.5716    | 0.5811 | 0.5416 | 0.7048 |
| 2.6844        | 4.31  | 1250 | 1.7494          | 0.5908   | 0.6153    | 0.5908 | 0.5618 | 0.7109 |
| 2.6844        | 4.49  | 1300 | 1.6973          | 0.6126   | 0.6062    | 0.6126 | 0.5804 | 0.7291 |
| 2.6844        | 4.66  | 1350 | 1.6615          | 0.6053   | 0.5864    | 0.6053 | 0.5707 | 0.7211 |
| 2.6844        | 4.83  | 1400 | 1.6120          | 0.6295   | 0.6304    | 0.6295 | 0.6000 | 0.7385 |
| 2.6844        | 5.0   | 1450 | 1.5620          | 0.6610   | 0.6605    | 0.6610 | 0.6333 | 0.7615 |
| 2.1096        | 5.18  | 1500 | 1.5330          | 0.6538   | 0.6424    | 0.6538 | 0.6223 | 0.7581 |
| 2.1096        | 5.35  | 1550 | 1.5112          | 0.6707   | 0.6830    | 0.6707 | 0.6484 | 0.7707 |
| 2.1096        | 5.52  | 1600 | 1.4732          | 0.6659   | 0.6793    | 0.6659 | 0.6430 | 0.7685 |
| 2.1096        | 5.69  | 1650 | 1.4420          | 0.6755   | 0.6969    | 0.6755 | 0.6538 | 0.7734 |
| 2.1096        | 5.87  | 1700 | 1.4011          | 0.7094   | 0.7461    | 0.7094 | 0.6929 | 0.7988 |
| 2.1096        | 6.04  | 1750 | 1.3924          | 0.6780   | 0.6835    | 0.6780 | 0.6557 | 0.7760 |
| 2.1096        | 6.21  | 1800 | 1.3604          | 0.7022   | 0.7116    | 0.7022 | 0.6838 | 0.7937 |
| 2.1096        | 6.38  | 1850 | 1.3271          | 0.7070   | 0.7079    | 0.7070 | 0.6882 | 0.7954 |
| 2.1096        | 6.56  | 1900 | 1.3104          | 0.7264   | 0.7338    | 0.7264 | 0.7110 | 0.8099 |
| 2.1096        | 6.73  | 1950 | 1.2804          | 0.7312   | 0.7591    | 0.7312 | 0.7159 | 0.8131 |
| 1.7648        | 6.9   | 2000 | 1.2722          | 0.7312   | 0.7739    | 0.7312 | 0.7185 | 0.8131 |
| 1.7648        | 7.08  | 2050 | 1.2777          | 0.7240   | 0.7581    | 0.7240 | 0.7109 | 0.8099 |
| 1.7648        | 7.25  | 2100 | 1.2319          | 0.7288   | 0.7373    | 0.7288 | 0.7114 | 0.8123 |
| 1.7648        | 7.42  | 2150 | 1.2074          | 0.7433   | 0.7717    | 0.7433 | 0.7317 | 0.8215 |
| 1.7648        | 7.59  | 2200 | 1.2150          | 0.7433   | 0.7850    | 0.7433 | 0.7348 | 0.8235 |
| 1.7648        | 7.77  | 2250 | 1.1787          | 0.7603   | 0.7930    | 0.7603 | 0.7462 | 0.8344 |
| 1.7648        | 7.94  | 2300 | 1.1815          | 0.7676   | 0.7932    | 0.7676 | 0.7576 | 0.8404 |
| 1.7648        | 8.11  | 2350 | 1.1578          | 0.7676   | 0.7972    | 0.7676 | 0.7601 | 0.8404 |
| 1.7648        | 8.28  | 2400 | 1.1605          | 0.7651   | 0.7982    | 0.7651 | 0.7560 | 0.8387 |
| 1.7648        | 8.46  | 2450 | 1.1563          | 0.7627   | 0.7937    | 0.7627 | 0.7548 | 0.8370 |
| 1.5781        | 8.63  | 2500 | 1.1303          | 0.7579   | 0.7847    | 0.7579 | 0.7476 | 0.8337 |
| 1.5781        | 8.8   | 2550 | 1.1217          | 0.7797   | 0.8117    | 0.7797 | 0.7702 | 0.8489 |
| 1.5781        | 8.97  | 2600 | 1.1278          | 0.7724   | 0.8025    | 0.7724 | 0.7640 | 0.8438 |
| 1.5781        | 9.15  | 2650 | 1.1188          | 0.7748   | 0.8022    | 0.7748 | 0.7653 | 0.8455 |
| 1.5781        | 9.32  | 2700 | 1.1161          | 0.7676   | 0.7979    | 0.7676 | 0.7588 | 0.8404 |
| 1.5781        | 9.49  | 2750 | 1.1078          | 0.7748   | 0.8012    | 0.7748 | 0.7650 | 0.8446 |
| 1.5781        | 9.66  | 2800 | 1.1104          | 0.7724   | 0.7973    | 0.7724 | 0.7632 | 0.8429 |
| 1.5781        | 9.84  | 2850 | 1.1058          | 0.7748   | 0.8018    | 0.7748 | 0.7651 | 0.8455 |





### Framework versions

- Transformers 4.38.2
- Pytorch 2.3.0
- Datasets 2.19.1
- Tokenizers 0.15.1