---

license: apache-2.0
base_model: facebook/hubert-base-ls960
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: hubert-classifier-aug
  results: []
---


<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# hubert-classifier-aug

This model is a fine-tuned version of [facebook/hubert-base-ls960](https://huggingface.co/facebook/hubert-base-ls960) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 3.0783
- Accuracy: 0.2075
- Precision: 0.1563
- Recall: 0.2075
- F1: 0.1504
- Binary: 0.4396
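
As a quick smoke test, the checkpoint can be exercised with the `audio-classification` pipeline. This is a minimal sketch: the model path `hubert-classifier-aug` and the input `sample.wav` are placeholders, and the label set of the classifier head is not documented in this card.

```python
# Minimal inference sketch; replace the placeholder model path with the
# actual repo id or a local checkpoint directory.
from transformers import pipeline

classifier = pipeline(
    "audio-classification",
    model="hubert-classifier-aug",  # placeholder path to this checkpoint
)
print(classifier("sample.wav", top_k=5))  # placeholder audio file
```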

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
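
These settings map directly onto `transformers.TrainingArguments`; a hedged sketch is below (the output path is a placeholder, and the model/dataset setup is omitted since the training data is not documented):

```python
# Reconstruction of the training configuration from the list above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hubert-classifier-aug",  # placeholder output path
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,       # 32 x 4 = total train batch size of 128
    num_train_epochs=10,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                           # "Native AMP" mixed precision
    # The listed Adam betas/epsilon match the Trainer's optimizer defaults,
    # so no explicit optimizer argument is needed.
)
```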



### Training results



| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     | Binary |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|:------:|
| No log        | 0.19  | 50   | 4.4148          | 0.0243   | 0.0244    | 0.0243 | 0.0114 | 0.1755 |
| No log        | 0.38  | 100  | 4.3445          | 0.0404   | 0.0101    | 0.0404 | 0.0123 | 0.2865 |
| No log        | 0.58  | 150  | 4.2100          | 0.0350   | 0.0154    | 0.0350 | 0.0080 | 0.3043 |
| No log        | 0.77  | 200  | 4.1002          | 0.0350   | 0.0185    | 0.0350 | 0.0088 | 0.3140 |
| No log        | 0.96  | 250  | 4.0141          | 0.0512   | 0.0295    | 0.0512 | 0.0232 | 0.3253 |
| No log        | 1.15  | 300  | 3.9438          | 0.0593   | 0.0321    | 0.0593 | 0.0293 | 0.3318 |
| No log        | 1.34  | 350  | 3.8728          | 0.0647   | 0.0290    | 0.0647 | 0.0281 | 0.3372 |
| No log        | 1.53  | 400  | 3.8297          | 0.0755   | 0.0242    | 0.0755 | 0.0331 | 0.3423 |
| No log        | 1.73  | 450  | 3.7627          | 0.0620   | 0.0186    | 0.0620 | 0.0263 | 0.3385 |
| 4.147         | 1.92  | 500  | 3.7166          | 0.0728   | 0.0364    | 0.0728 | 0.0360 | 0.3437 |
| 4.147         | 2.11  | 550  | 3.6779          | 0.0889   | 0.0425    | 0.0889 | 0.0486 | 0.3558 |
| 4.147         | 2.3   | 600  | 3.6396          | 0.0755   | 0.0345    | 0.0755 | 0.0407 | 0.3447 |
| 4.147         | 2.49  | 650  | 3.6005          | 0.0889   | 0.0336    | 0.0889 | 0.0412 | 0.3550 |
| 4.147         | 2.68  | 700  | 3.5602          | 0.0970   | 0.0314    | 0.0970 | 0.0420 | 0.3631 |
| 4.147         | 2.88  | 750  | 3.5309          | 0.0997   | 0.0473    | 0.0997 | 0.0507 | 0.3642 |
| 4.147         | 3.07  | 800  | 3.5331          | 0.1051   | 0.0385    | 0.1051 | 0.0490 | 0.3615 |
| 4.147         | 3.26  | 850  | 3.4774          | 0.1105   | 0.0507    | 0.1105 | 0.0604 | 0.3701 |
| 4.147         | 3.45  | 900  | 3.4571          | 0.1159   | 0.0568    | 0.1159 | 0.0611 | 0.3730 |
| 4.147         | 3.64  | 950  | 3.4265          | 0.1132   | 0.0431    | 0.1132 | 0.0582 | 0.3736 |
| 3.6862        | 3.84  | 1000 | 3.4260          | 0.0970   | 0.0406    | 0.0970 | 0.0502 | 0.3582 |
| 3.6862        | 4.03  | 1050 | 3.3821          | 0.1105   | 0.0421    | 0.1105 | 0.0542 | 0.3709 |
| 3.6862        | 4.22  | 1100 | 3.3825          | 0.1186   | 0.0448    | 0.1186 | 0.0578 | 0.3725 |
| 3.6862        | 4.41  | 1150 | 3.3575          | 0.1213   | 0.0507    | 0.1213 | 0.0634 | 0.3776 |
| 3.6862        | 4.6   | 1200 | 3.3453          | 0.1267   | 0.0659    | 0.1267 | 0.0653 | 0.3790 |
| 3.6862        | 4.79  | 1250 | 3.3205          | 0.1321   | 0.0592    | 0.1321 | 0.0736 | 0.3871 |
| 3.6862        | 4.99  | 1300 | 3.2912          | 0.1294   | 0.0552    | 0.1294 | 0.0724 | 0.3868 |
| 3.6862        | 5.18  | 1350 | 3.2741          | 0.1536   | 0.0731    | 0.1536 | 0.0880 | 0.4022 |
| 3.6862        | 5.37  | 1400 | 3.2767          | 0.1509   | 0.0723    | 0.1509 | 0.0893 | 0.3978 |
| 3.6862        | 5.56  | 1450 | 3.2485          | 0.1509   | 0.0743    | 0.1509 | 0.0907 | 0.4003 |
| 3.4619        | 5.75  | 1500 | 3.2421          | 0.1509   | 0.0783    | 0.1509 | 0.0855 | 0.4003 |
| 3.4619        | 5.94  | 1550 | 3.2366          | 0.1375   | 0.0686    | 0.1375 | 0.0754 | 0.3892 |
| 3.4619        | 6.14  | 1600 | 3.2102          | 0.1456   | 0.0959    | 0.1456 | 0.0862 | 0.3965 |
| 3.4619        | 6.33  | 1650 | 3.1962          | 0.1456   | 0.0688    | 0.1456 | 0.0858 | 0.3957 |
| 3.4619        | 6.52  | 1700 | 3.1917          | 0.1590   | 0.1160    | 0.1590 | 0.0994 | 0.4035 |
| 3.4619        | 6.71  | 1750 | 3.1746          | 0.1590   | 0.0922    | 0.1590 | 0.0978 | 0.4051 |
| 3.4619        | 6.9   | 1800 | 3.1791          | 0.1590   | 0.0671    | 0.1590 | 0.0863 | 0.4059 |
| 3.4619        | 7.09  | 1850 | 3.1714          | 0.1725   | 0.0952    | 0.1725 | 0.1028 | 0.4135 |
| 3.4619        | 7.29  | 1900 | 3.1427          | 0.1725   | 0.1084    | 0.1725 | 0.1090 | 0.4194 |
| 3.4619        | 7.48  | 1950 | 3.1410          | 0.1833   | 0.1313    | 0.1833 | 0.1221 | 0.4226 |
| 3.3361        | 7.67  | 2000 | 3.1334          | 0.1806   | 0.1385    | 0.1806 | 0.1239 | 0.4197 |
| 3.3361        | 7.86  | 2050 | 3.1246          | 0.1806   | 0.1474    | 0.1806 | 0.1193 | 0.4208 |
| 3.3361        | 8.05  | 2100 | 3.1151          | 0.1995   | 0.1582    | 0.1995 | 0.1388 | 0.4332 |
| 3.3361        | 8.25  | 2150 | 3.1085          | 0.2049   | 0.1578    | 0.2049 | 0.1439 | 0.4377 |
| 3.3361        | 8.44  | 2200 | 3.0897          | 0.2102   | 0.1546    | 0.2102 | 0.1483 | 0.4445 |
| 3.3361        | 8.63  | 2250 | 3.0934          | 0.2210   | 0.1511    | 0.2210 | 0.1541 | 0.4469 |
| 3.3361        | 8.82  | 2300 | 3.0906          | 0.2102   | 0.1625    | 0.2102 | 0.1535 | 0.4394 |
| 3.3361        | 9.01  | 2350 | 3.0792          | 0.2129   | 0.1586    | 0.2129 | 0.1573 | 0.4437 |
| 3.3361        | 9.2   | 2400 | 3.0849          | 0.2049   | 0.1442    | 0.2049 | 0.1446 | 0.4358 |
| 3.3361        | 9.4   | 2450 | 3.0794          | 0.2102   | 0.1576    | 0.2102 | 0.1532 | 0.4396 |
| 3.2647        | 9.59  | 2500 | 3.0801          | 0.2129   | 0.1560    | 0.2129 | 0.1552 | 0.4415 |
| 3.2647        | 9.78  | 2550 | 3.0823          | 0.2075   | 0.1669    | 0.2075 | 0.1521 | 0.4396 |
| 3.2647        | 9.97  | 2600 | 3.0783          | 0.2075   | 0.1563    | 0.2075 | 0.1504 | 0.4396 |
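
Recall equals accuracy in every row, which is consistent with support-weighted averaging. Below is a hedged `compute_metrics` sketch that would reproduce these columns (the custom `Binary` metric is not documented in this card and is omitted):

```python
# Sketch of a compute_metrics function matching the reported columns,
# assuming weighted-average precision/recall/F1.
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```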





### Framework versions

- Transformers 4.38.2
- Pytorch 2.3.0
- Datasets 2.19.1
- Tokenizers 0.15.1
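
To reproduce the reported numbers, pinning these exact versions is safest. A minimal runtime check, assuming the standard PyPI package names:

```python
# Verify the installed versions against those reported above.
import datasets
import tokenizers
import torch
import transformers

expected = {
    "transformers": (transformers.__version__, "4.38.2"),
    "torch": (torch.__version__, "2.3.0"),
    "datasets": (datasets.__version__, "2.19.1"),
    "tokenizers": (tokenizers.__version__, "0.15.1"),
}
for name, (installed, reported) in expected.items():
    if installed != reported:
        print(f"Warning: {name} {installed} differs from reported {reported}")
```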