---
license: apache-2.0
base_model: facebook/hubert-base-ls960
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: hubert-classifier-aug
  results: []
---


<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# hubert-classifier-aug

This model is a fine-tuned version of [facebook/hubert-base-ls960](https://huggingface.co/facebook/hubert-base-ls960) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5817
- Accuracy: 0.8356
- Precision: 0.8647
- Recall: 0.8356
- F1: 0.8286
- Binary: 0.8852
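
The snippet below is a minimal inference sketch, not part of the original training code: it assumes the checkpoint loads as a standard `transformers` audio-classification model, that `"hubert-classifier-aug"` stands in for the actual hub repo id or local path, and that inputs are 16 kHz mono audio as expected by the HuBERT base backbone.

```python
# Hedged usage sketch: the model path, audio file name, and label set are placeholders.
import torch
import librosa
from transformers import AutoFeatureExtractor, AutoModelForAudioClassification

model_id = "hubert-classifier-aug"  # replace with the actual hub repo id or local path
feature_extractor = AutoFeatureExtractor.from_pretrained(model_id)
model = AutoModelForAudioClassification.from_pretrained(model_id)
model.eval()

# facebook/hubert-base-ls960 was pretrained on 16 kHz speech, so resample to 16 kHz mono.
waveform, _ = librosa.load("sample.wav", sr=16_000, mono=True)
inputs = feature_extractor(waveform, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits
predicted_id = int(logits.argmax(dim=-1))
print(model.config.id2label[predicted_id])
```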

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch reconstructing them follows the list):
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
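
As a rough reconstruction (not the author's actual script), these settings map onto `transformers` `TrainingArguments` as sketched below. The `output_dir`, the 50-step evaluation interval, and the 500-step logging interval are inferred from the results table rather than stated explicitly; the Adam betas and epsilon above are the optimizer defaults.

```python
# Sketch only: reconstructs the hyperparameters listed above.
# output_dir, eval_steps, and logging_steps are inferences, not documented values.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hubert-classifier-aug",
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,   # 32 * 4 = total train batch size of 128
    num_train_epochs=100,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                       # "Native AMP" mixed precision
    evaluation_strategy="steps",
    eval_steps=50,                   # the table below evaluates every 50 steps
    logging_steps=500,               # training loss first appears at step 500
)
```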



### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     | Binary |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|:------:|
| No log        | 0.19  | 50   | 4.0265          | 0.0404   | 0.0071    | 0.0404 | 0.0079 | 0.3011 |
| No log        | 0.38  | 100  | 3.5428          | 0.0485   | 0.0048    | 0.0485 | 0.0080 | 0.3286 |
| No log        | 0.58  | 150  | 3.3405          | 0.0836   | 0.0175    | 0.0836 | 0.0251 | 0.3555 |
| No log        | 0.77  | 200  | 3.2238          | 0.0809   | 0.0143    | 0.0809 | 0.0216 | 0.3512 |
| No log        | 0.96  | 250  | 3.1041          | 0.0728   | 0.0135    | 0.0728 | 0.0202 | 0.3493 |
| No log        | 1.15  | 300  | 2.9851          | 0.1078   | 0.0517    | 0.1078 | 0.0480 | 0.3730 |
| No log        | 1.34  | 350  | 2.8525          | 0.1779   | 0.0780    | 0.1779 | 0.0876 | 0.4197 |
| No log        | 1.53  | 400  | 2.7647          | 0.1752   | 0.1108    | 0.1752 | 0.1064 | 0.4221 |
| No log        | 1.73  | 450  | 2.5521          | 0.2291   | 0.1539    | 0.2291 | 0.1450 | 0.4547 |
| 3.3693        | 1.92  | 500  | 2.4121          | 0.2372   | 0.1655    | 0.2372 | 0.1618 | 0.4668 |
| 3.3693        | 2.11  | 550  | 2.2312          | 0.2992   | 0.2286    | 0.2992 | 0.2185 | 0.5081 |
| 3.3693        | 2.3   | 600  | 2.0065          | 0.4124   | 0.2985    | 0.4124 | 0.3133 | 0.5865 |
| 3.3693        | 2.49  | 650  | 1.8816          | 0.4313   | 0.3359    | 0.4313 | 0.3461 | 0.6013 |
| 3.3693        | 2.68  | 700  | 1.8069          | 0.4906   | 0.4702    | 0.4906 | 0.4308 | 0.6426 |
| 3.3693        | 2.88  | 750  | 1.6310          | 0.5418   | 0.4981    | 0.5418 | 0.4728 | 0.6803 |
| 3.3693        | 3.07  | 800  | 1.5274          | 0.5580   | 0.5219    | 0.5580 | 0.5002 | 0.6908 |
| 3.3693        | 3.26  | 850  | 1.3417          | 0.6415   | 0.6343    | 0.6415 | 0.5980 | 0.7544 |
| 3.3693        | 3.45  | 900  | 1.3121          | 0.6173   | 0.6059    | 0.6173 | 0.5690 | 0.7334 |
| 3.3693        | 3.64  | 950  | 1.2298          | 0.6523   | 0.6501    | 0.6523 | 0.6183 | 0.7577 |
| 2.2303        | 3.84  | 1000 | 1.1427          | 0.7197   | 0.7323    | 0.7197 | 0.6897 | 0.8040 |
| 2.2303        | 4.03  | 1050 | 1.0947          | 0.6765   | 0.6891    | 0.6765 | 0.6387 | 0.7741 |
| 2.2303        | 4.22  | 1100 | 1.1233          | 0.6361   | 0.6473    | 0.6361 | 0.6054 | 0.7447 |
| 2.2303        | 4.41  | 1150 | 0.9765          | 0.7547   | 0.7606    | 0.7547 | 0.7331 | 0.8296 |
| 2.2303        | 4.6   | 1200 | 0.9206          | 0.7547   | 0.7546    | 0.7547 | 0.7270 | 0.8305 |
| 2.2303        | 4.79  | 1250 | 0.8658          | 0.7790   | 0.7868    | 0.7790 | 0.7625 | 0.8456 |
| 2.2303        | 4.99  | 1300 | 0.8961          | 0.7385   | 0.7576    | 0.7385 | 0.7254 | 0.8186 |
| 2.2303        | 5.18  | 1350 | 0.7709          | 0.8005   | 0.8185    | 0.8005 | 0.7912 | 0.8596 |
| 2.2303        | 5.37  | 1400 | 0.7638          | 0.7925   | 0.8118    | 0.7925 | 0.7760 | 0.8547 |
| 2.2303        | 5.56  | 1450 | 0.7085          | 0.8194   | 0.8415    | 0.8194 | 0.8081 | 0.8741 |
| 1.6078        | 5.75  | 1500 | 0.7230          | 0.7790   | 0.8195    | 0.7790 | 0.7739 | 0.8456 |
| 1.6078        | 5.94  | 1550 | 0.6475          | 0.7951   | 0.8174    | 0.7951 | 0.7813 | 0.8558 |
| 1.6078        | 6.14  | 1600 | 0.6910          | 0.7844   | 0.8082    | 0.7844 | 0.7686 | 0.8504 |
| 1.6078        | 6.33  | 1650 | 0.6233          | 0.8194   | 0.8462    | 0.8194 | 0.8111 | 0.8730 |
| 1.6078        | 6.52  | 1700 | 0.6599          | 0.8059   | 0.8429    | 0.8059 | 0.8031 | 0.8633 |
| 1.6078        | 6.71  | 1750 | 0.6999          | 0.7925   | 0.8119    | 0.7925 | 0.7751 | 0.8550 |
| 1.6078        | 6.9   | 1800 | 0.6271          | 0.8140   | 0.8266    | 0.8140 | 0.8018 | 0.8701 |
| 1.6078        | 7.09  | 1850 | 0.5545          | 0.8329   | 0.8557    | 0.8329 | 0.8288 | 0.8822 |
| 1.6078        | 7.29  | 1900 | 0.6343          | 0.8032   | 0.8179    | 0.8032 | 0.7930 | 0.8625 |
| 1.6078        | 7.48  | 1950 | 0.6007          | 0.8194   | 0.8447    | 0.8194 | 0.8136 | 0.8728 |
| 1.2974        | 7.67  | 2000 | 0.5878          | 0.8356   | 0.8674    | 0.8356 | 0.8333 | 0.8841 |
| 1.2974        | 7.86  | 2050 | 0.6410          | 0.8086   | 0.8344    | 0.8086 | 0.8011 | 0.8652 |
| 1.2974        | 8.05  | 2100 | 0.6430          | 0.8005   | 0.8201    | 0.8005 | 0.7894 | 0.8598 |
| 1.2974        | 8.25  | 2150 | 0.5540          | 0.8221   | 0.8414    | 0.8221 | 0.8177 | 0.8747 |
| 1.2974        | 8.44  | 2200 | 0.5511          | 0.8356   | 0.8635    | 0.8356 | 0.8317 | 0.8833 |
| 1.2974        | 8.63  | 2250 | 0.5817          | 0.8356   | 0.8647    | 0.8356 | 0.8286 | 0.8852 |
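
For reference, a `compute_metrics` function along the following lines would produce the accuracy/precision/recall/F1 columns above. This is a sketch assumed from the metric names, not the author's code: the weighted averaging mode is a guess, and the "Binary" column is omitted because its definition is not documented in this card.

```python
# Hedged sketch of the evaluation metrics; the averaging mode is assumed,
# and the "Binary" column is not reproduced because its definition is unknown.
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```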





### Framework versions

- Transformers 4.38.2
- Pytorch 2.3.0
- Datasets 2.19.1
- Tokenizers 0.15.1
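
To confirm a local environment matches these versions before loading the model, a quick check using the packages' standard `__version__` attributes (nothing model-specific) is:

```python
# Compare installed package versions against the versions listed above.
import datasets
import tokenizers
import torch
import transformers

expected = {
    "Transformers": (transformers.__version__, "4.38.2"),
    "Pytorch": (torch.__version__, "2.3.0"),
    "Datasets": (datasets.__version__, "2.19.1"),
    "Tokenizers": (tokenizers.__version__, "0.15.1"),
}
for name, (installed, wanted) in expected.items():
    status = "OK" if installed.startswith(wanted) else f"expected {wanted}"
    print(f"{name}: {installed} ({status})")
```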