---
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: tabert-2k-naamapadam
  results: []
---

# tabert-2k-naamapadam

This model is a fine-tuned version of [livinNector/tabert-2k](https://huggingface.co/livinNector/tabert-2k) on the [Naamapadam](https://huggingface.co/datasets/ai4bharat/naamapadam) named entity recognition dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2850
- Precision: 0.7765
- Recall: 0.8041
- F1: 0.7901
- Accuracy: 0.9065
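
The precision, recall, and F1 figures above (and in the table below) are entity-level scores of the kind produced by seqeval. The exact metric code is not part of this card, but a minimal sketch of how such scores are typically computed with the `evaluate` library, using hypothetical BIO labels, looks like this:

```python
import evaluate

# seqeval scores whole entities, not individual tokens, from BIO-tagged sequences.
seqeval = evaluate.load("seqeval")

# Illustrative gold and predicted tag sequences (hypothetical labels).
references = [["B-PER", "I-PER", "O", "B-LOC"]]
predictions = [["B-PER", "I-PER", "O", "O"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results["overall_precision"], results["overall_recall"],
      results["overall_f1"], results["overall_accuracy"])
```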

## Model description

This checkpoint fine-tunes [livinNector/tabert-2k](https://huggingface.co/livinNector/tabert-2k), a BERT-style language model pre-trained on Tamil text, with a token-classification head for named entity recognition. See the base model's page for pre-training details.

## Intended uses & limitations

The model is intended for named entity recognition (token classification) on Tamil text. It inherits the biases and coverage limitations of its base model and training data, so evaluate it on your target domain before relying on its predictions.
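
As a starting point, the model can be loaded like any Transformers token-classification checkpoint. The sketch below assumes the checkpoint is published under `livinNector/tabert-2k-naamapadam` (a repo id inferred from the model name), and the example sentence is illustrative Tamil text:

```python
from transformers import pipeline

# Repo id inferred from the model name; adjust if the checkpoint lives elsewhere.
ner = pipeline(
    "token-classification",
    model="livinNector/tabert-2k-naamapadam",
    aggregation_strategy="simple",  # merge word-piece tokens into whole entities
)

# "Chennai is the capital of Tamil Nadu."
print(ner("சென்னை தமிழ்நாட்டின் தலைநகரம்."))
```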

## Training and evaluation data

The model was fine-tuned and evaluated on the Naamapadam named entity recognition corpus, presumably its Tamil split given the Tamil base model; the exact splits and preprocessing are not recorded in this card.
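
If the data is indeed the Tamil split of Naamapadam, it can be loaded as below; the `ai4bharat/naamapadam` repo id and the `ta` config name are assumptions based on the public dataset:

```python
from datasets import load_dataset

# Tamil split of Naamapadam; repo and config names assumed, verify before use.
dataset = load_dataset("ai4bharat/naamapadam", "ta")
print(dataset["train"][0])  # tokens plus NER tags in BIO format
```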

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
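
For reference, a minimal `TrainingArguments` sketch that mirrors the values above (the output directory is a placeholder, and model/dataset setup is assumed to happen elsewhere):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is a placeholder.
args = TrainingArguments(
    output_dir="tabert-2k-naamapadam",
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=128,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=2,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-8 is the Trainer default.
)
```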

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.4679        | 0.05  | 400   | 0.3991          | 0.7155    | 0.6561 | 0.6845 | 0.8720   |
| 0.3907        | 0.1   | 800   | 0.3632          | 0.7181    | 0.7233 | 0.7207 | 0.8822   |
| 0.3663        | 0.15  | 1200  | 0.3483          | 0.7271    | 0.7371 | 0.7321 | 0.8857   |
| 0.3557        | 0.21  | 1600  | 0.3457          | 0.7286    | 0.7506 | 0.7395 | 0.8874   |
| 0.3533        | 0.26  | 2000  | 0.3413          | 0.7371    | 0.7435 | 0.7403 | 0.8895   |
| 0.3396        | 0.31  | 2400  | 0.3326          | 0.7435    | 0.7546 | 0.7490 | 0.8910   |
| 0.3302        | 0.36  | 2800  | 0.3264          | 0.7528    | 0.7553 | 0.7540 | 0.8937   |
| 0.3344        | 0.41  | 3200  | 0.3231          | 0.7503    | 0.7720 | 0.7610 | 0.8951   |
| 0.3262        | 0.46  | 3600  | 0.3228          | 0.7387    | 0.7762 | 0.7570 | 0.8941   |
| 0.3186        | 0.51  | 4000  | 0.3158          | 0.7699    | 0.7666 | 0.7683 | 0.8986   |
| 0.3163        | 0.57  | 4400  | 0.3130          | 0.7453    | 0.7798 | 0.7622 | 0.8955   |
| 0.3143        | 0.62  | 4800  | 0.3150          | 0.7572    | 0.7751 | 0.7660 | 0.8961   |
| 0.3088        | 0.67  | 5200  | 0.3151          | 0.7543    | 0.7828 | 0.7683 | 0.8972   |
| 0.3115        | 0.72  | 5600  | 0.3141          | 0.7708    | 0.7706 | 0.7707 | 0.8977   |
| 0.3095        | 0.77  | 6000  | 0.3043          | 0.7657    | 0.7831 | 0.7743 | 0.8991   |
| 0.3044        | 0.82  | 6400  | 0.3087          | 0.7526    | 0.7881 | 0.7699 | 0.8972   |
| 0.2964        | 0.87  | 6800  | 0.3070          | 0.7644    | 0.7928 | 0.7783 | 0.8992   |
| 0.2972        | 0.93  | 7200  | 0.3102          | 0.7692    | 0.7738 | 0.7715 | 0.8999   |
| 0.2985        | 0.98  | 7600  | 0.3016          | 0.7731    | 0.7858 | 0.7794 | 0.9018   |
| 0.2822        | 1.03  | 8000  | 0.3049          | 0.7734    | 0.7909 | 0.7820 | 0.9031   |
| 0.2764        | 1.08  | 8400  | 0.3059          | 0.7575    | 0.7976 | 0.7770 | 0.9011   |
| 0.2752        | 1.13  | 8800  | 0.3052          | 0.7553    | 0.7996 | 0.7768 | 0.9015   |
| 0.2689        | 1.18  | 9200  | 0.2990          | 0.7642    | 0.7982 | 0.7808 | 0.9037   |
| 0.2738        | 1.23  | 9600  | 0.2985          | 0.7698    | 0.7987 | 0.7840 | 0.9035   |
| 0.2731        | 1.29  | 10000 | 0.2950          | 0.7713    | 0.7982 | 0.7845 | 0.9037   |
| 0.2694        | 1.34  | 10400 | 0.2920          | 0.7743    | 0.8017 | 0.7878 | 0.9059   |
| 0.2727        | 1.39  | 10800 | 0.2931          | 0.7693    | 0.7979 | 0.7834 | 0.9040   |
| 0.2622        | 1.44  | 11200 | 0.2946          | 0.7702    | 0.7942 | 0.7820 | 0.9032   |
| 0.2672        | 1.49  | 11600 | 0.2894          | 0.7724    | 0.8062 | 0.7890 | 0.9060   |
| 0.2601        | 1.54  | 12000 | 0.2907          | 0.7706    | 0.8010 | 0.7855 | 0.9058   |
| 0.2629        | 1.59  | 12400 | 0.2930          | 0.7628    | 0.8150 | 0.7880 | 0.9052   |
| 0.2635        | 1.65  | 12800 | 0.2907          | 0.7775    | 0.7970 | 0.7871 | 0.9047   |
| 0.2673        | 1.7   | 13200 | 0.2909          | 0.7753    | 0.7982 | 0.7866 | 0.9045   |
| 0.2726        | 1.75  | 13600 | 0.2880          | 0.7714    | 0.8048 | 0.7877 | 0.9054   |
| 0.2607        | 1.8   | 14000 | 0.2850          | 0.7760    | 0.8010 | 0.7883 | 0.9053   |
| 0.2684        | 1.85  | 14400 | 0.2847          | 0.7709    | 0.8077 | 0.7889 | 0.9059   |
| 0.2625        | 1.9   | 14800 | 0.2849          | 0.7742    | 0.8079 | 0.7907 | 0.9067   |
| 0.2631        | 1.95  | 15200 | 0.2850          | 0.7765    | 0.8041 | 0.7901 | 0.9065   |


### Framework versions

- Transformers 4.29.2
- Pytorch 2.0.0
- Datasets 2.12.0
- Tokenizers 0.13.3