---

license: apache-2.0
base_model: facebook/hubert-base-ls960
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: hubert-classifier
  results: []
---



# hubert-classifier

This model is a fine-tuned version of [facebook/hubert-base-ls960](https://huggingface.co/facebook/hubert-base-ls960) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1320
- Accuracy: 0.7724
- Precision: 0.8107
- Recall: 0.7724
- F1: 0.7633
- Binary: 0.8448
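
The `Binary` column is a custom metric emitted by the training script and is not documented in this card. For the four standard metrics, recall matching accuracy exactly suggests weighted averaging. A hedged sketch of a `compute_metrics` function that would reproduce them (the weighted averaging is an inference from the numbers, not confirmed by the card):

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Plausible Trainer metric function; weighted averaging is an assumption."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
        # the card's custom "Binary" metric is undocumented and omitted here
    }
```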

## Model description

This checkpoint fine-tunes [facebook/hubert-base-ls960](https://huggingface.co/facebook/hubert-base-ls960) with a classification head for an audio classification task. The target label set and the fine-tuning objective are not documented in this card.

## Intended uses & limitations

More information needed
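
Pending details from the authors, the checkpoint should load like any `transformers` audio-classification model. Below is a minimal inference sketch: the repository id is a placeholder for the actual Hub path, and the random waveform merely stands in for real 16 kHz mono audio (HuBERT's expected sampling rate).

```python
import numpy as np
import torch
from transformers import AutoFeatureExtractor, AutoModelForAudioClassification

model_id = "hubert-classifier"  # placeholder: substitute the actual Hub repo id

feature_extractor = AutoFeatureExtractor.from_pretrained(model_id)
model = AutoModelForAudioClassification.from_pretrained(model_id)
model.eval()

# Stand-in for one second of real audio: HuBERT expects 16 kHz mono float input.
waveform = np.random.randn(16000).astype(np.float32)

inputs = feature_extractor(waveform, sampling_rate=16000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = int(logits.argmax(dim=-1))
print(model.config.id2label[predicted_id])
```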

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
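
These settings map onto `transformers` 4.38 `TrainingArguments` roughly as follows. This is a sketch reconstructed from the list above, not the authors' actual script: the output directory is a placeholder, and the Adam settings listed are the `Trainer` optimizer defaults rather than explicit arguments.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hubert-classifier",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,   # 32 x 4 = total train batch size 128
    lr_scheduler_type="linear",
    num_train_epochs=10,
    fp16=True,                       # Native AMP mixed precision
)
```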



### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     | Binary |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|:------:|
| No log        | 0.17  | 50   | 4.2981          | 0.0242   | 0.0171    | 0.0242 | 0.0108 | 0.2760 |
| No log        | 0.35  | 100  | 3.9571          | 0.0315   | 0.0038    | 0.0315 | 0.0059 | 0.3133 |
| No log        | 0.52  | 150  | 3.7311          | 0.0678   | 0.0243    | 0.0678 | 0.0277 | 0.3424 |
| No log        | 0.69  | 200  | 3.5663          | 0.0896   | 0.0557    | 0.0896 | 0.0513 | 0.3593 |
| No log        | 0.86  | 250  | 3.4934          | 0.0944   | 0.0408    | 0.0944 | 0.0477 | 0.3545 |
| No log        | 1.04  | 300  | 3.3705          | 0.1235   | 0.0864    | 0.1235 | 0.0706 | 0.3748 |
| No log        | 1.21  | 350  | 3.2630          | 0.1429   | 0.1042    | 0.1429 | 0.0811 | 0.3906 |
| No log        | 1.38  | 400  | 3.1551          | 0.1574   | 0.1413    | 0.1574 | 0.1118 | 0.4029 |
| No log        | 1.55  | 450  | 3.0426          | 0.2349   | 0.1580    | 0.2349 | 0.1585 | 0.4593 |
| 3.6339        | 1.73  | 500  | 2.9462          | 0.2542   | 0.1854    | 0.2542 | 0.1806 | 0.4736 |
| 3.6339        | 1.9   | 550  | 2.8439          | 0.2663   | 0.2105    | 0.2663 | 0.2084 | 0.4814 |
| 3.6339        | 2.07  | 600  | 2.7192          | 0.3051   | 0.2797    | 0.3051 | 0.2411 | 0.5092 |
| 3.6339        | 2.24  | 650  | 2.6390          | 0.3293   | 0.3167    | 0.3293 | 0.2650 | 0.5274 |
| 3.6339        | 2.42  | 700  | 2.5491          | 0.3753   | 0.3731    | 0.3753 | 0.3290 | 0.5600 |
| 3.6339        | 2.59  | 750  | 2.4728          | 0.4092   | 0.3891    | 0.4092 | 0.3595 | 0.5823 |
| 3.6339        | 2.76  | 800  | 2.3395          | 0.4431   | 0.4205    | 0.4431 | 0.3917 | 0.6082 |
| 3.6339        | 2.93  | 850  | 2.2685          | 0.4552   | 0.4355    | 0.4552 | 0.4028 | 0.6160 |
| 3.6339        | 3.11  | 900  | 2.1883          | 0.4915   | 0.4680    | 0.4915 | 0.4387 | 0.6414 |
| 3.6339        | 3.28  | 950  | 2.1182          | 0.4843   | 0.5102    | 0.4843 | 0.4440 | 0.6363 |
| 2.6665        | 3.45  | 1000 | 2.0197          | 0.5448   | 0.5629    | 0.5448 | 0.5028 | 0.6804 |
| 2.6665        | 3.62  | 1050 | 1.9782          | 0.5327   | 0.5532    | 0.5327 | 0.4935 | 0.6712 |
| 2.6665        | 3.8   | 1100 | 1.9313          | 0.5593   | 0.5486    | 0.5593 | 0.5156 | 0.6930 |
| 2.6665        | 3.97  | 1150 | 1.8627          | 0.5908   | 0.5893    | 0.5908 | 0.5513 | 0.7119 |
| 2.6665        | 4.14  | 1200 | 1.8169          | 0.5908   | 0.5834    | 0.5908 | 0.5543 | 0.7128 |
| 2.6665        | 4.31  | 1250 | 1.7702          | 0.5835   | 0.5843    | 0.5835 | 0.5487 | 0.7077 |
| 2.6665        | 4.49  | 1300 | 1.7007          | 0.6344   | 0.6857    | 0.6344 | 0.6124 | 0.7438 |
| 2.6665        | 4.66  | 1350 | 1.6638          | 0.6199   | 0.6156    | 0.6199 | 0.5850 | 0.7354 |
| 2.6665        | 4.83  | 1400 | 1.6198          | 0.6368   | 0.6325    | 0.6368 | 0.6004 | 0.7482 |
| 2.6665        | 5.0   | 1450 | 1.5672          | 0.6804   | 0.6888    | 0.6804 | 0.6529 | 0.7753 |
| 2.0909        | 5.18  | 1500 | 1.5308          | 0.6683   | 0.6870    | 0.6683 | 0.6437 | 0.7692 |
| 2.0909        | 5.35  | 1550 | 1.4946          | 0.6877   | 0.6969    | 0.6877 | 0.6632 | 0.7811 |
| 2.0909        | 5.52  | 1600 | 1.4698          | 0.6755   | 0.6767    | 0.6755 | 0.6454 | 0.7743 |
| 2.0909        | 5.69  | 1650 | 1.4228          | 0.6804   | 0.7066    | 0.6804 | 0.6612 | 0.7785 |
| 2.0909        | 5.87  | 1700 | 1.3709          | 0.7312   | 0.7432    | 0.7312 | 0.7128 | 0.8140 |
| 2.0909        | 6.04  | 1750 | 1.3780          | 0.7215   | 0.7356    | 0.7215 | 0.7010 | 0.8082 |
| 2.0909        | 6.21  | 1800 | 1.3291          | 0.7215   | 0.7370    | 0.7215 | 0.7007 | 0.8090 |
| 2.0909        | 6.38  | 1850 | 1.3296          | 0.7191   | 0.7333    | 0.7191 | 0.7028 | 0.8056 |
| 2.0909        | 6.56  | 1900 | 1.3195          | 0.7191   | 0.7584    | 0.7191 | 0.7069 | 0.8048 |
| 2.0909        | 6.73  | 1950 | 1.2939          | 0.7191   | 0.7609    | 0.7191 | 0.7019 | 0.8065 |
| 1.75          | 6.9   | 2000 | 1.2800          | 0.7191   | 0.7353    | 0.7191 | 0.7018 | 0.8065 |
| 1.75          | 7.08  | 2050 | 1.2767          | 0.7094   | 0.7175    | 0.7094 | 0.6920 | 0.7998 |
| 1.75          | 7.25  | 2100 | 1.2280          | 0.7264   | 0.7689    | 0.7264 | 0.7148 | 0.8116 |
| 1.75          | 7.42  | 2150 | 1.2231          | 0.7385   | 0.7585    | 0.7385 | 0.7246 | 0.8201 |
| 1.75          | 7.59  | 2200 | 1.2198          | 0.7385   | 0.7563    | 0.7385 | 0.7248 | 0.8201 |
| 1.75          | 7.77  | 2250 | 1.1782          | 0.7482   | 0.7634    | 0.7482 | 0.7352 | 0.8269 |
| 1.75          | 7.94  | 2300 | 1.1848          | 0.7579   | 0.7900    | 0.7579 | 0.7519 | 0.8337 |
| 1.75          | 8.11  | 2350 | 1.1773          | 0.7579   | 0.7875    | 0.7579 | 0.7484 | 0.8346 |
| 1.75          | 8.28  | 2400 | 1.1752          | 0.7676   | 0.7965    | 0.7676 | 0.7594 | 0.8404 |
| 1.75          | 8.46  | 2450 | 1.1563          | 0.7724   | 0.8048    | 0.7724 | 0.7649 | 0.8438 |
| 1.5635        | 8.63  | 2500 | 1.1320          | 0.7724   | 0.8107    | 0.7724 | 0.7633 | 0.8448 |
| 1.5635        | 8.8   | 2550 | 1.1194          | 0.7700   | 0.8018    | 0.7700 | 0.7601 | 0.8421 |
| 1.5635        | 8.97  | 2600 | 1.1268          | 0.7554   | 0.7756    | 0.7554 | 0.7448 | 0.8329 |
| 1.5635        | 9.15  | 2650 | 1.1176          | 0.7676   | 0.7844    | 0.7676 | 0.7567 | 0.8404 |

### Framework versions

- Transformers 4.38.2
- PyTorch 2.3.0
- Datasets 2.19.1
- Tokenizers 0.15.1