pingkeest committed on
Commit
823f2f7
1 Parent(s): 9078ef7

Model save

README.md CHANGED
@@ -19,10 +19,10 @@ should probably proofread and complete it, then remove this comment. -->
19
 
20
  This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
21
  It achieves the following results on the evaluation set:
22
- - Loss: 0.1276
23
- - F1: 0.0
24
- - Roc Auc: 0.4994
25
- - Accuracy: 0.0
26
 
27
  ## Model description
28
 
@@ -41,7 +41,7 @@ More information needed
41
  ### Training hyperparameters
42
 
43
  The following hyperparameters were used during training:
44
- - learning_rate: 5e-06
45
  - train_batch_size: 8
46
  - eval_batch_size: 8
47
  - seed: 42
@@ -53,13 +53,206 @@ The following hyperparameters were used during training:
53
 
54
  | Training Loss | Epoch | Step | Validation Loss | F1 | Roc Auc | Accuracy |
55
  |:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|:--------:|
56
- | 0.2255 | 1.0 | 22 | 0.2126 | 0.0590 | 0.5616 | 0.0 |
57
- | 0.1991 | 2.0 | 44 | 0.1911 | 0.0634 | 0.5695 | 0.0 |
58
- | 0.1838 | 3.0 | 66 | 0.1751 | 0.0596 | 0.5482 | 0.0 |
59
- | 0.1701 | 4.0 | 88 | 0.1614 | 0.0305 | 0.4835 | 0.0 |
60
- | 0.153 | 5.0 | 110 | 0.1491 | 0.0244 | 0.4878 | 0.0 |
61
- | 0.1427 | 6.0 | 132 | 0.1378 | 0.0408 | 0.5079 | 0.05 |
62
- | 0.1332 | 7.0 | 154 | 0.1276 | 0.0 | 0.4994 | 0.0 |
63
 
64
 
65
  ### Framework versions
 
19
 
20
  This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
21
  It achieves the following results on the evaluation set:
22
+ - Loss: 0.0565
23
+ - F1: 0.6207
24
+ - Roc Auc: 0.725
25
+ - Accuracy: 0.45
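The combination of a moderate F1 and ROC AUC with a much lower accuracy is consistent with a multi-label setup scored with exact-match (subset) accuracy. Below is a minimal sketch of how metrics like these are commonly computed with scikit-learn; the 0.5 threshold, micro averaging, and the `y_true` / `probs` arrays are illustrative assumptions, since the card does not state how its metrics were derived.

```python
# Sketch of one common way to compute F1, ROC AUC and accuracy for a
# multi-label classifier. The threshold, averaging mode and input arrays
# are assumptions for illustration, not taken from this model card.
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

def compute_metrics(y_true: np.ndarray, probs: np.ndarray, threshold: float = 0.5):
    """y_true and probs are (n_samples, n_labels) arrays of {0,1} labels and sigmoid scores."""
    y_pred = (probs >= threshold).astype(int)
    return {
        "f1": f1_score(y_true, y_pred, average="micro"),
        "roc_auc": roc_auc_score(y_true, probs, average="micro"),
        "accuracy": accuracy_score(y_true, y_pred),  # exact-match (subset) accuracy
    }
```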
26
 
27
  ## Model description
28
 
 
41
  ### Training hyperparameters
42
 
43
  The following hyperparameters were used during training:
44
+ - learning_rate: 2e-05
45
  - train_batch_size: 8
46
  - eval_batch_size: 8
47
  - seed: 42
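For context, a minimal sketch of how the hyperparameters above would map onto `TrainingArguments` from transformers. The output directory is a placeholder, `num_train_epochs=200` is only inferred from the epoch column of the results table below, and options not listed in this hunk (optimizer, scheduler) are left at library defaults.

```python
# Hedged sketch of the training configuration implied by the list above.
# output_dir is a hypothetical path; num_train_epochs is inferred from the
# results table; unlisted options fall back to transformers defaults.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert-finetune-output",   # placeholder, not from the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=200,                # inferred from the table below
)
```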
 
53
 
54
  | Training Loss | Epoch | Step | Validation Loss | F1 | Roc Auc | Accuracy |
55
  |:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|:--------:|
56
+ | No log | 1.0 | 22 | 0.4798 | 0.0 | 0.4957 | 0.0 |
57
+ | No log | 2.0 | 44 | 0.3421 | 0.0 | 0.5 | 0.0 |
58
+ | No log | 3.0 | 66 | 0.2614 | 0.0 | 0.5 | 0.0 |
59
+ | No log | 4.0 | 88 | 0.2105 | 0.0 | 0.5 | 0.0 |
60
+ | No log | 5.0 | 110 | 0.1780 | 0.0 | 0.5 | 0.0 |
61
+ | No log | 6.0 | 132 | 0.1579 | 0.0 | 0.5 | 0.0 |
62
+ | No log | 7.0 | 154 | 0.1448 | 0.0 | 0.5 | 0.0 |
63
+ | No log | 8.0 | 176 | 0.1356 | 0.0 | 0.5 | 0.0 |
64
+ | No log | 9.0 | 198 | 0.1295 | 0.0 | 0.5 | 0.0 |
65
+ | No log | 10.0 | 220 | 0.1252 | 0.0 | 0.5 | 0.0 |
66
+ | No log | 11.0 | 242 | 0.1221 | 0.0 | 0.5 | 0.0 |
67
+ | No log | 12.0 | 264 | 0.1199 | 0.0 | 0.5 | 0.0 |
68
+ | No log | 13.0 | 286 | 0.1184 | 0.0 | 0.5 | 0.0 |
69
+ | No log | 14.0 | 308 | 0.1169 | 0.0 | 0.5 | 0.0 |
70
+ | No log | 15.0 | 330 | 0.1161 | 0.0 | 0.5 | 0.0 |
71
+ | No log | 16.0 | 352 | 0.1153 | 0.0 | 0.5 | 0.0 |
72
+ | No log | 17.0 | 374 | 0.1147 | 0.0 | 0.5 | 0.0 |
73
+ | No log | 18.0 | 396 | 0.1142 | 0.0 | 0.5 | 0.0 |
74
+ | No log | 19.0 | 418 | 0.1136 | 0.0 | 0.5 | 0.0 |
75
+ | No log | 20.0 | 440 | 0.1129 | 0.0 | 0.5 | 0.0 |
76
+ | No log | 21.0 | 462 | 0.1127 | 0.0 | 0.5 | 0.0 |
77
+ | No log | 22.0 | 484 | 0.1121 | 0.0 | 0.5 | 0.0 |
78
+ | 0.1722 | 23.0 | 506 | 0.1113 | 0.0 | 0.5 | 0.0 |
79
+ | 0.1722 | 24.0 | 528 | 0.1112 | 0.0 | 0.5 | 0.0 |
80
+ | 0.1722 | 25.0 | 550 | 0.1104 | 0.0 | 0.5 | 0.0 |
81
+ | 0.1722 | 26.0 | 572 | 0.1098 | 0.0 | 0.5 | 0.0 |
82
+ | 0.1722 | 27.0 | 594 | 0.1099 | 0.0 | 0.5 | 0.0 |
83
+ | 0.1722 | 28.0 | 616 | 0.1086 | 0.0 | 0.5 | 0.0 |
84
+ | 0.1722 | 29.0 | 638 | 0.1081 | 0.0 | 0.5 | 0.0 |
85
+ | 0.1722 | 30.0 | 660 | 0.1074 | 0.0 | 0.5 | 0.0 |
86
+ | 0.1722 | 31.0 | 682 | 0.1058 | 0.0 | 0.5 | 0.0 |
87
+ | 0.1722 | 32.0 | 704 | 0.1056 | 0.0 | 0.5 | 0.0 |
88
+ | 0.1722 | 33.0 | 726 | 0.1049 | 0.0 | 0.5 | 0.0 |
89
+ | 0.1722 | 34.0 | 748 | 0.1041 | 0.0 | 0.5 | 0.0 |
90
+ | 0.1722 | 35.0 | 770 | 0.1030 | 0.0 | 0.5 | 0.0 |
91
+ | 0.1722 | 36.0 | 792 | 0.1013 | 0.0 | 0.5 | 0.0 |
92
+ | 0.1722 | 37.0 | 814 | 0.1006 | 0.0 | 0.5 | 0.0 |
93
+ | 0.1722 | 38.0 | 836 | 0.1000 | 0.0 | 0.5 | 0.0 |
94
+ | 0.1722 | 39.0 | 858 | 0.0981 | 0.0 | 0.5 | 0.0 |
95
+ | 0.1722 | 40.0 | 880 | 0.0977 | 0.0 | 0.5 | 0.0 |
96
+ | 0.1722 | 41.0 | 902 | 0.0968 | 0.0 | 0.5 | 0.0 |
97
+ | 0.1722 | 42.0 | 924 | 0.0959 | 0.0 | 0.5 | 0.0 |
98
+ | 0.1722 | 43.0 | 946 | 0.0944 | 0.0 | 0.5 | 0.0 |
99
+ | 0.1722 | 44.0 | 968 | 0.0938 | 0.0 | 0.5 | 0.0 |
100
+ | 0.1722 | 45.0 | 990 | 0.0928 | 0.0 | 0.5 | 0.0 |
101
+ | 0.0886 | 46.0 | 1012 | 0.0927 | 0.0 | 0.5 | 0.0 |
102
+ | 0.0886 | 47.0 | 1034 | 0.0911 | 0.0 | 0.5 | 0.0 |
103
+ | 0.0886 | 48.0 | 1056 | 0.0894 | 0.0 | 0.5 | 0.0 |
104
+ | 0.0886 | 49.0 | 1078 | 0.0888 | 0.0 | 0.5 | 0.0 |
105
+ | 0.0886 | 50.0 | 1100 | 0.0885 | 0.0 | 0.5 | 0.0 |
106
+ | 0.0886 | 51.0 | 1122 | 0.0861 | 0.0 | 0.5 | 0.0 |
107
+ | 0.0886 | 52.0 | 1144 | 0.0869 | 0.0 | 0.5 | 0.0 |
108
+ | 0.0886 | 53.0 | 1166 | 0.0853 | 0.0 | 0.5 | 0.0 |
109
+ | 0.0886 | 54.0 | 1188 | 0.0850 | 0.0 | 0.5 | 0.0 |
110
+ | 0.0886 | 55.0 | 1210 | 0.0853 | 0.0 | 0.5 | 0.0 |
111
+ | 0.0886 | 56.0 | 1232 | 0.0845 | 0.0 | 0.5 | 0.0 |
112
+ | 0.0886 | 57.0 | 1254 | 0.0822 | 0.0952 | 0.525 | 0.05 |
113
+ | 0.0886 | 58.0 | 1276 | 0.0815 | 0.0952 | 0.525 | 0.05 |
114
+ | 0.0886 | 59.0 | 1298 | 0.0796 | 0.2609 | 0.575 | 0.15 |
115
+ | 0.0886 | 60.0 | 1320 | 0.0797 | 0.2609 | 0.575 | 0.15 |
116
+ | 0.0886 | 61.0 | 1342 | 0.0800 | 0.1818 | 0.55 | 0.1 |
117
+ | 0.0886 | 62.0 | 1364 | 0.0794 | 0.1818 | 0.55 | 0.1 |
118
+ | 0.0886 | 63.0 | 1386 | 0.0785 | 0.2609 | 0.575 | 0.15 |
119
+ | 0.0886 | 64.0 | 1408 | 0.0778 | 0.2609 | 0.575 | 0.15 |
120
+ | 0.0886 | 65.0 | 1430 | 0.0754 | 0.2609 | 0.575 | 0.15 |
121
+ | 0.0886 | 66.0 | 1452 | 0.0761 | 0.2609 | 0.575 | 0.15 |
122
+ | 0.0886 | 67.0 | 1474 | 0.0744 | 0.3333 | 0.6 | 0.2 |
123
+ | 0.0886 | 68.0 | 1496 | 0.0747 | 0.2609 | 0.575 | 0.15 |
124
+ | 0.0559 | 69.0 | 1518 | 0.0740 | 0.3333 | 0.6 | 0.2 |
125
+ | 0.0559 | 70.0 | 1540 | 0.0742 | 0.2609 | 0.575 | 0.15 |
126
+ | 0.0559 | 71.0 | 1562 | 0.0729 | 0.3333 | 0.6 | 0.2 |
127
+ | 0.0559 | 72.0 | 1584 | 0.0727 | 0.3333 | 0.6 | 0.2 |
128
+ | 0.0559 | 73.0 | 1606 | 0.0698 | 0.4 | 0.625 | 0.25 |
129
+ | 0.0559 | 74.0 | 1628 | 0.0707 | 0.3333 | 0.6 | 0.2 |
130
+ | 0.0559 | 75.0 | 1650 | 0.0712 | 0.3333 | 0.6 | 0.2 |
131
+ | 0.0559 | 76.0 | 1672 | 0.0698 | 0.4 | 0.625 | 0.25 |
132
+ | 0.0559 | 77.0 | 1694 | 0.0693 | 0.3333 | 0.6 | 0.2 |
133
+ | 0.0559 | 78.0 | 1716 | 0.0698 | 0.4 | 0.625 | 0.25 |
134
+ | 0.0559 | 79.0 | 1738 | 0.0694 | 0.4 | 0.625 | 0.25 |
135
+ | 0.0559 | 80.0 | 1760 | 0.0690 | 0.4 | 0.625 | 0.25 |
136
+ | 0.0559 | 81.0 | 1782 | 0.0683 | 0.4615 | 0.65 | 0.3 |
137
+ | 0.0559 | 82.0 | 1804 | 0.0673 | 0.4615 | 0.65 | 0.3 |
138
+ | 0.0559 | 83.0 | 1826 | 0.0680 | 0.4615 | 0.65 | 0.3 |
139
+ | 0.0559 | 84.0 | 1848 | 0.0679 | 0.4 | 0.625 | 0.25 |
140
+ | 0.0559 | 85.0 | 1870 | 0.0674 | 0.4615 | 0.65 | 0.3 |
141
+ | 0.0559 | 86.0 | 1892 | 0.0676 | 0.4 | 0.625 | 0.25 |
142
+ | 0.0559 | 87.0 | 1914 | 0.0660 | 0.4615 | 0.65 | 0.3 |
143
+ | 0.0559 | 88.0 | 1936 | 0.0658 | 0.4615 | 0.65 | 0.3 |
144
+ | 0.0559 | 89.0 | 1958 | 0.0671 | 0.4615 | 0.65 | 0.3 |
145
+ | 0.0559 | 90.0 | 1980 | 0.0654 | 0.4615 | 0.65 | 0.3 |
146
+ | 0.0369 | 91.0 | 2002 | 0.0651 | 0.4615 | 0.65 | 0.3 |
147
+ | 0.0369 | 92.0 | 2024 | 0.0653 | 0.4615 | 0.65 | 0.3 |
148
+ | 0.0369 | 93.0 | 2046 | 0.0654 | 0.4615 | 0.65 | 0.3 |
149
+ | 0.0369 | 94.0 | 2068 | 0.0650 | 0.4615 | 0.65 | 0.3 |
150
+ | 0.0369 | 95.0 | 2090 | 0.0653 | 0.4615 | 0.65 | 0.3 |
151
+ | 0.0369 | 96.0 | 2112 | 0.0639 | 0.4615 | 0.65 | 0.3 |
152
+ | 0.0369 | 97.0 | 2134 | 0.0645 | 0.4615 | 0.65 | 0.3 |
153
+ | 0.0369 | 98.0 | 2156 | 0.0636 | 0.4615 | 0.65 | 0.3 |
154
+ | 0.0369 | 99.0 | 2178 | 0.0638 | 0.4615 | 0.65 | 0.3 |
155
+ | 0.0369 | 100.0 | 2200 | 0.0625 | 0.4615 | 0.65 | 0.3 |
156
+ | 0.0369 | 101.0 | 2222 | 0.0628 | 0.4615 | 0.65 | 0.3 |
157
+ | 0.0369 | 102.0 | 2244 | 0.0620 | 0.4615 | 0.65 | 0.3 |
158
+ | 0.0369 | 103.0 | 2266 | 0.0632 | 0.4615 | 0.65 | 0.3 |
159
+ | 0.0369 | 104.0 | 2288 | 0.0616 | 0.4615 | 0.65 | 0.3 |
160
+ | 0.0369 | 105.0 | 2310 | 0.0624 | 0.4615 | 0.65 | 0.3 |
161
+ | 0.0369 | 106.0 | 2332 | 0.0616 | 0.4615 | 0.65 | 0.3 |
162
+ | 0.0369 | 107.0 | 2354 | 0.0620 | 0.4615 | 0.65 | 0.3 |
163
+ | 0.0369 | 108.0 | 2376 | 0.0617 | 0.4615 | 0.65 | 0.3 |
164
+ | 0.0369 | 109.0 | 2398 | 0.0615 | 0.4615 | 0.65 | 0.3 |
165
+ | 0.0369 | 110.0 | 2420 | 0.0621 | 0.4615 | 0.65 | 0.3 |
166
+ | 0.0369 | 111.0 | 2442 | 0.0604 | 0.4615 | 0.65 | 0.3 |
167
+ | 0.0369 | 112.0 | 2464 | 0.0621 | 0.4615 | 0.65 | 0.3 |
168
+ | 0.0369 | 113.0 | 2486 | 0.0605 | 0.5185 | 0.675 | 0.35 |
169
+ | 0.0269 | 114.0 | 2508 | 0.0608 | 0.5185 | 0.675 | 0.35 |
170
+ | 0.0269 | 115.0 | 2530 | 0.0606 | 0.5185 | 0.675 | 0.35 |
171
+ | 0.0269 | 116.0 | 2552 | 0.0608 | 0.5185 | 0.675 | 0.35 |
172
+ | 0.0269 | 117.0 | 2574 | 0.0606 | 0.5185 | 0.675 | 0.35 |
173
+ | 0.0269 | 118.0 | 2596 | 0.0593 | 0.5185 | 0.675 | 0.35 |
174
+ | 0.0269 | 119.0 | 2618 | 0.0599 | 0.5185 | 0.675 | 0.35 |
175
+ | 0.0269 | 120.0 | 2640 | 0.0596 | 0.5185 | 0.675 | 0.35 |
176
+ | 0.0269 | 121.0 | 2662 | 0.0600 | 0.5185 | 0.675 | 0.35 |
177
+ | 0.0269 | 122.0 | 2684 | 0.0598 | 0.5185 | 0.675 | 0.35 |
178
+ | 0.0269 | 123.0 | 2706 | 0.0592 | 0.5185 | 0.675 | 0.35 |
179
+ | 0.0269 | 124.0 | 2728 | 0.0603 | 0.5185 | 0.675 | 0.35 |
180
+ | 0.0269 | 125.0 | 2750 | 0.0586 | 0.5714 | 0.7 | 0.4 |
181
+ | 0.0269 | 126.0 | 2772 | 0.0592 | 0.5714 | 0.7 | 0.4 |
182
+ | 0.0269 | 127.0 | 2794 | 0.0584 | 0.5714 | 0.7 | 0.4 |
183
+ | 0.0269 | 128.0 | 2816 | 0.0585 | 0.5185 | 0.675 | 0.35 |
184
+ | 0.0269 | 129.0 | 2838 | 0.0600 | 0.5185 | 0.675 | 0.35 |
185
+ | 0.0269 | 130.0 | 2860 | 0.0590 | 0.5185 | 0.675 | 0.35 |
186
+ | 0.0269 | 131.0 | 2882 | 0.0591 | 0.5714 | 0.7 | 0.4 |
187
+ | 0.0269 | 132.0 | 2904 | 0.0585 | 0.6207 | 0.725 | 0.45 |
188
+ | 0.0269 | 133.0 | 2926 | 0.0592 | 0.5714 | 0.7 | 0.4 |
189
+ | 0.0269 | 134.0 | 2948 | 0.0575 | 0.6207 | 0.725 | 0.45 |
190
+ | 0.0269 | 135.0 | 2970 | 0.0579 | 0.6207 | 0.725 | 0.45 |
191
+ | 0.0269 | 136.0 | 2992 | 0.0581 | 0.5714 | 0.7 | 0.4 |
192
+ | 0.0211 | 137.0 | 3014 | 0.0580 | 0.6207 | 0.725 | 0.45 |
193
+ | 0.0211 | 138.0 | 3036 | 0.0589 | 0.5714 | 0.7 | 0.4 |
194
+ | 0.0211 | 139.0 | 3058 | 0.0576 | 0.6207 | 0.725 | 0.45 |
195
+ | 0.0211 | 140.0 | 3080 | 0.0583 | 0.5714 | 0.7 | 0.4 |
196
+ | 0.0211 | 141.0 | 3102 | 0.0578 | 0.6207 | 0.725 | 0.45 |
197
+ | 0.0211 | 142.0 | 3124 | 0.0579 | 0.5714 | 0.7 | 0.4 |
198
+ | 0.0211 | 143.0 | 3146 | 0.0576 | 0.6207 | 0.725 | 0.45 |
199
+ | 0.0211 | 144.0 | 3168 | 0.0581 | 0.6207 | 0.725 | 0.45 |
200
+ | 0.0211 | 145.0 | 3190 | 0.0572 | 0.6207 | 0.725 | 0.45 |
201
+ | 0.0211 | 146.0 | 3212 | 0.0574 | 0.6207 | 0.725 | 0.45 |
202
+ | 0.0211 | 147.0 | 3234 | 0.0572 | 0.5714 | 0.7 | 0.4 |
203
+ | 0.0211 | 148.0 | 3256 | 0.0573 | 0.6207 | 0.725 | 0.45 |
204
+ | 0.0211 | 149.0 | 3278 | 0.0570 | 0.6207 | 0.725 | 0.45 |
205
+ | 0.0211 | 150.0 | 3300 | 0.0564 | 0.6207 | 0.725 | 0.45 |
206
+ | 0.0211 | 151.0 | 3322 | 0.0568 | 0.6207 | 0.725 | 0.45 |
207
+ | 0.0211 | 152.0 | 3344 | 0.0565 | 0.6207 | 0.725 | 0.45 |
208
+ | 0.0211 | 153.0 | 3366 | 0.0567 | 0.6207 | 0.725 | 0.45 |
209
+ | 0.0211 | 154.0 | 3388 | 0.0578 | 0.6207 | 0.725 | 0.45 |
210
+ | 0.0211 | 155.0 | 3410 | 0.0571 | 0.6207 | 0.725 | 0.45 |
211
+ | 0.0211 | 156.0 | 3432 | 0.0572 | 0.6207 | 0.725 | 0.45 |
212
+ | 0.0211 | 157.0 | 3454 | 0.0565 | 0.6207 | 0.725 | 0.45 |
213
+ | 0.0211 | 158.0 | 3476 | 0.0576 | 0.6207 | 0.725 | 0.45 |
214
+ | 0.0211 | 159.0 | 3498 | 0.0565 | 0.6207 | 0.725 | 0.45 |
215
+ | 0.0178 | 160.0 | 3520 | 0.0575 | 0.6207 | 0.725 | 0.45 |
216
+ | 0.0178 | 161.0 | 3542 | 0.0566 | 0.6207 | 0.725 | 0.45 |
217
+ | 0.0178 | 162.0 | 3564 | 0.0565 | 0.6207 | 0.725 | 0.45 |
218
+ | 0.0178 | 163.0 | 3586 | 0.0565 | 0.6207 | 0.725 | 0.45 |
219
+ | 0.0178 | 164.0 | 3608 | 0.0571 | 0.6207 | 0.725 | 0.45 |
220
+ | 0.0178 | 165.0 | 3630 | 0.0574 | 0.6207 | 0.725 | 0.45 |
221
+ | 0.0178 | 166.0 | 3652 | 0.0567 | 0.6207 | 0.725 | 0.45 |
222
+ | 0.0178 | 167.0 | 3674 | 0.0572 | 0.6207 | 0.725 | 0.45 |
223
+ | 0.0178 | 168.0 | 3696 | 0.0573 | 0.6207 | 0.725 | 0.45 |
224
+ | 0.0178 | 169.0 | 3718 | 0.0568 | 0.6207 | 0.725 | 0.45 |
225
+ | 0.0178 | 170.0 | 3740 | 0.0572 | 0.6207 | 0.725 | 0.45 |
226
+ | 0.0178 | 171.0 | 3762 | 0.0568 | 0.6207 | 0.725 | 0.45 |
227
+ | 0.0178 | 172.0 | 3784 | 0.0559 | 0.6207 | 0.725 | 0.45 |
228
+ | 0.0178 | 173.0 | 3806 | 0.0566 | 0.6207 | 0.725 | 0.45 |
229
+ | 0.0178 | 174.0 | 3828 | 0.0570 | 0.6207 | 0.725 | 0.45 |
230
+ | 0.0178 | 175.0 | 3850 | 0.0565 | 0.6207 | 0.725 | 0.45 |
231
+ | 0.0178 | 176.0 | 3872 | 0.0571 | 0.6207 | 0.725 | 0.45 |
232
+ | 0.0178 | 177.0 | 3894 | 0.0564 | 0.6207 | 0.725 | 0.45 |
233
+ | 0.0178 | 178.0 | 3916 | 0.0570 | 0.6207 | 0.725 | 0.45 |
234
+ | 0.0178 | 179.0 | 3938 | 0.0571 | 0.6207 | 0.725 | 0.45 |
235
+ | 0.0178 | 180.0 | 3960 | 0.0568 | 0.6207 | 0.725 | 0.45 |
236
+ | 0.0178 | 181.0 | 3982 | 0.0568 | 0.6207 | 0.725 | 0.45 |
237
+ | 0.0158 | 182.0 | 4004 | 0.0564 | 0.6207 | 0.725 | 0.45 |
238
+ | 0.0158 | 183.0 | 4026 | 0.0567 | 0.6207 | 0.725 | 0.45 |
239
+ | 0.0158 | 184.0 | 4048 | 0.0567 | 0.6207 | 0.725 | 0.45 |
240
+ | 0.0158 | 185.0 | 4070 | 0.0568 | 0.6207 | 0.725 | 0.45 |
241
+ | 0.0158 | 186.0 | 4092 | 0.0560 | 0.6207 | 0.725 | 0.45 |
242
+ | 0.0158 | 187.0 | 4114 | 0.0563 | 0.6207 | 0.725 | 0.45 |
243
+ | 0.0158 | 188.0 | 4136 | 0.0565 | 0.6207 | 0.725 | 0.45 |
244
+ | 0.0158 | 189.0 | 4158 | 0.0564 | 0.6207 | 0.725 | 0.45 |
245
+ | 0.0158 | 190.0 | 4180 | 0.0566 | 0.6207 | 0.725 | 0.45 |
246
+ | 0.0158 | 191.0 | 4202 | 0.0566 | 0.6207 | 0.725 | 0.45 |
247
+ | 0.0158 | 192.0 | 4224 | 0.0568 | 0.6207 | 0.725 | 0.45 |
248
+ | 0.0158 | 193.0 | 4246 | 0.0567 | 0.6207 | 0.725 | 0.45 |
249
+ | 0.0158 | 194.0 | 4268 | 0.0565 | 0.6207 | 0.725 | 0.45 |
250
+ | 0.0158 | 195.0 | 4290 | 0.0563 | 0.6207 | 0.725 | 0.45 |
251
+ | 0.0158 | 196.0 | 4312 | 0.0563 | 0.6207 | 0.725 | 0.45 |
252
+ | 0.0158 | 197.0 | 4334 | 0.0564 | 0.6207 | 0.725 | 0.45 |
253
+ | 0.0158 | 198.0 | 4356 | 0.0565 | 0.6207 | 0.725 | 0.45 |
254
+ | 0.0158 | 199.0 | 4378 | 0.0565 | 0.6207 | 0.725 | 0.45 |
255
+ | 0.0158 | 200.0 | 4400 | 0.0565 | 0.6207 | 0.725 | 0.45 |
256
 
257
 
258
  ### Framework versions
model.safetensors CHANGED
@@ -1,3 +1,3 @@
1
  version https://git-lfs.github.com/spec/v1
2
- oid sha256:e17856c88d1b32bc0242b64cfc59a225686d690ad178378c90bfab3424a7e6a7
3
  size 438081688
 
1
  version https://git-lfs.github.com/spec/v1
2
+ oid sha256:d1d1252a6cdeccb7dd33c4a12290ee360ab4070e11c190f9281143742d2d7295
3
  size 438081688
runs/Nov03_17-12-40_PingkeePC/events.out.tfevents.1730625163.PingkeePC CHANGED
@@ -1,3 +1,3 @@
1
  version https://git-lfs.github.com/spec/v1
2
- oid sha256:4ea2270774b6fe477753418a417b703998a6ebbc847c6289c56df73208837b94
3
- size 80913
 
1
  version https://git-lfs.github.com/spec/v1
2
+ oid sha256:1dc641d30cc5dff4d3934b85fbf5d3c1d48c0c1065f71d7b858252850095ab07
3
+ size 92398