MF21377197 committed
Commit 9f97fd5
1 Parent(s): 62825fd

End of training
README.md CHANGED
@@ -15,12 +15,12 @@ should probably proofread and complete it, then remove this comment. -->

  This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Loss: 2.9804
- - Mean Iou: 0.0270
- - Mean Accuracy: 0.0832
- - Overall Accuracy: 0.5068
- - Per Category Iou: [0.8021452931267776, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.00757909387963703, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.000748031223377361, 0.0, 0.15502171352863242, nan, 0.0, nan, 0.056221004367996964, nan, 0.0, nan, 0.0, nan, 0.040391599317767694, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.07047672462142457, 0.0, nan, nan, 0.05062781753814023, nan, 0.0, 0.03166986564299424, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.08493484197676761, 0.0, nan, 0.04267983360952349, nan, 0.0003925481280361144, 0.0, nan, nan, 0.00012044806680852773, 0.0, 0.0, 0.002362410370862814, nan, nan, nan, nan, 0.0022562095679258846, 0.0, 0.0]
- - Per Category Accuracy: [0.8694985349631393, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, 0.008096131396611705, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0010361203050798676, nan, 0.8231816690745898, nan, 0.0, nan, 0.11201351351351352, nan, 0.0, nan, nan, nan, 0.07139561448697561, nan, nan, nan, nan, nan, nan, nan, 0.10334561484308179, 0.0, nan, nan, 0.12090652522991582, nan, 0.0, 0.037541019470575365, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.8687054097111588, 0.0, nan, 0.05824436171194732, nan, 0.00040998308819761186, nan, nan, nan, 0.00012146732528949713, 0.0, 0.0, 0.0023685118842166654, nan, nan, nan, nan, 0.0027728020023065785, nan, 0.0]

  ## Model description

@@ -40,20 +40,27 @@ More information needed

  The following hyperparameters were used during training:
  - learning_rate: 5e-05
- - train_batch_size: 1
- - eval_batch_size: 1
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
- - num_epochs: 3

  ### Training results

- | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
- |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|
- | 4.0562 | 1.0 | 80 | 3.4450 | 0.0153 | 0.0686 | 0.4812 | [0.7793520968900188, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.02432875250844616, nan, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.0032801739760782893, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.12812692281805033, nan, 0.0, 0.0, 0.05380656279838841, nan, 0.0, nan, nan, nan, 0.0003018650007218511, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.008481858011825258, 0.0003758691975192633, nan, 0.0, 0.010826130083604932, 0.0, 0.0, 0.005665119052689867, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.07572116691493873, 0.0009061488673139158, nan, 0.00028572287888357543, 0.0, 0.0005691934965994337, 0.0, nan, nan, 0.0, 0.005145141665218147, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0016498912362201822, 0.0, 0.0] | [0.8320357969624665, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, 0.044499683472624736, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0033331798720132593, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.843680423236109, nan, 0.0, nan, 0.1561081081081081, nan, 0.0, nan, nan, nan, 0.00030909131632584497, nan, nan, nan, nan, nan, nan, nan, 0.016399105204289756, 0.0004157427937915743, nan, nan, 0.01858790326504793, nan, 0.0, 0.005819295558958652, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.6029646048347995, 0.0009201324990798675, nan, 0.00029229869850158456, nan, 0.0006662225183211193, nan, nan, nan, 0.0, 0.0084153067606755, 0.0, 0.0, nan, nan, nan, nan, 0.002196157338110078, nan, 0.0] |
- | 3.4584 | 2.0 | 160 | 3.1139 | 0.0241 | 0.0797 | 0.4821 | [0.7782469073708838, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.010029985991945368, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0017241143388842517, 0.0, 0.1557832833948993, nan, 0.0, nan, 0.07580358123695485, nan, 0.0, nan, 0.0, nan, 0.028521816696650947, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.021143006947175254, 0.0, nan, nan, 0.018894200915169017, nan, 0.0, 0.0064420428056791695, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.06671080655150016, 0.0, nan, 0.016629320979883684, nan, 0.0010633250993799996, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.000702695105998854, 0.0, 0.0] | [0.8244354154654462, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, 0.01064577392395212, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.002417614045186358, nan, 0.8166523587177286, nan, 0.0, nan, 0.23312162162162162, nan, 0.0, nan, nan, nan, 0.050991108025175265, nan, nan, nan, nan, nan, nan, nan, 0.026531350746759656, 0.0, nan, nan, 0.03812466546640066, nan, 0.0, 0.0066506234959527455, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.9177460691279352, 0.0, nan, 0.02032245161687333, nan, 0.0011103708638685321, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0007974873015483523, nan, 0.0] |
- | 3.1748 | 3.0 | 240 | 2.9804 | 0.0270 | 0.0832 | 0.5068 | [0.8021452931267776, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.00757909387963703, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.000748031223377361, 0.0, 0.15502171352863242, nan, 0.0, nan, 0.056221004367996964, nan, 0.0, nan, 0.0, nan, 0.040391599317767694, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.07047672462142457, 0.0, nan, nan, 0.05062781753814023, nan, 0.0, 0.03166986564299424, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.08493484197676761, 0.0, nan, 0.04267983360952349, nan, 0.0003925481280361144, 0.0, nan, nan, 0.00012044806680852773, 0.0, 0.0, 0.002362410370862814, nan, nan, nan, nan, 0.0022562095679258846, 0.0, 0.0] | [0.8694985349631393, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, 0.008096131396611705, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0010361203050798676, nan, 0.8231816690745898, nan, 0.0, nan, 0.11201351351351352, nan, 0.0, nan, nan, nan, 0.07139561448697561, nan, nan, nan, nan, nan, nan, nan, 0.10334561484308179, 0.0, nan, nan, 0.12090652522991582, nan, 0.0, 0.037541019470575365, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.8687054097111588, 0.0, nan, 0.05824436171194732, nan, 0.00040998308819761186, nan, nan, nan, 0.00012146732528949713, 0.0, 0.0, 0.0023685118842166654, nan, nan, nan, nan, 0.0027728020023065785, nan, 0.0] |

  ### Framework versions
 

  This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset.
  It achieves the following results on the evaluation set:
+ - Loss: 2.2049
+ - Mean Iou: 0.0529
+ - Mean Accuracy: 0.1563
+ - Overall Accuracy: 0.5690
+ - Per Category Iou: [0.8411342620340232, nan, nan, 0.00440539598785405, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0009979755353425907, nan, nan, nan, 0.05215870239325753, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.02585601844888956, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0023056841784332783, 0.0, 0.15720250055027643, nan, 0.0, nan, 0.010010171425736188, nan, 0.0, nan, nan, nan, 0.3278824063816368, 0.0, nan, nan, nan, nan, nan, nan, 0.16577780294023287, 0.0, nan, nan, 0.07446100952634326, nan, 0.0, 0.07748760838659272, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.19954901550549783, 0.0, nan, 0.17496882019195398, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, 0.00033325795355812374, nan, nan, nan, nan, 0.000906274111177789, nan, 0.0]
+ - Per Category Accuracy: [0.9215607553461405, nan, nan, 0.00444544906570223, 0.0, 0.0, nan, nan, nan, 0.0, 0.0010163723058874091, nan, nan, nan, 0.052159531836397176, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.027675301503830005, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.002321676979901185, nan, 0.46607271229855457, nan, 0.0, nan, 0.010905405405405405, nan, 0.0, nan, nan, nan, 0.8989002620556812, nan, nan, nan, nan, nan, nan, nan, 0.722448845318771, 0.0, nan, nan, 0.08852367281397498, nan, 0.0, 0.7941369503390943, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.8918750432915425, 0.0, nan, 0.8999876926863789, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, 0.00033325795355812374, nan, nan, nan, nan, 0.000907908620224278, nan, 0.0]

  ## Model description


  The following hyperparameters were used during training:
  - learning_rate: 5e-05
+ - train_batch_size: 2
+ - eval_batch_size: 2
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
+ - num_epochs: 10

  ### Training results

+ | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
+ |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|
+ | 3.0427 | 1.0 | 40 | 2.9270 | 0.0309 | 0.0973 | 0.5028 | [0.7941427394940275, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.009428203215085125, nan, nan, 0.0, 0.0013435124156353233, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.001934640522875817, 0.0, 0.16045113550578763, nan, 0.0, nan, 0.03580332494192905, nan, 0.0, nan, nan, nan, 0.061999426140432926, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0010348020260334404, 2.388401920275144e-05, nan, nan, 0.03135198802522555, nan, 0.0, 0.020500368822227687, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.07843223883410502, 0.0, nan, 0.188387511552216, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0035724679130164274, 0.0, 0.0] | [0.8432045749709368, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, 0.009687480035543991, nan, nan, nan, 0.001351695184784683, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0024847699908859787, nan, 0.8544672208379572, nan, 0.0, nan, 0.039783783783783784, nan, 0.0, nan, nan, nan, 0.12292875221179475, nan, nan, nan, nan, nan, nan, nan, 0.0012500822422527797, 2.5983924611973392e-05, nan, nan, 0.046627901318670624, nan, 0.0, 0.02918398599868738, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.9222483895546166, 0.0, nan, 0.7212701147656995, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.004367776605403283, nan, 0.0] |
+ | 3.1428 | 2.0 | 80 | 2.6637 | 0.0401 | 0.1177 | 0.5226 | [0.8074618180190414, nan, nan, 0.0016479812230018227, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.012663610529154308, nan, nan, nan, 0.00022241639526570815, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0017822941051287508, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, 0.1701888230556834, nan, 0.0, nan, 0.09306181364230402, nan, 0.0, nan, nan, nan, 0.16484283173158512, 0.0, nan, nan, nan, nan, nan, 0.0, 0.15211849507735584, 0.0, nan, nan, 8.05134457454395e-05, nan, 0.0, 0.07005691912505234, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.09765688073394495, 0.0, nan, 0.14866396338688687, nan, 0.0002550933641712867, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0030113184036551174, 0.0, 0.0] | [0.863908907671061, nan, nan, 0.0016576250753465944, 0.0, 0.0, nan, nan, nan, 0.0, 0.014182749548440303, nan, nan, nan, 0.00022263214808218308, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0017976201655420362, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.6607759793965535, nan, 0.0, nan, 0.12894594594594594, nan, 0.0, nan, nan, nan, 0.4699263108383542, nan, nan, nan, nan, nan, nan, nan, 0.2277123494966774, 0.0, nan, nan, 8.515400710427716e-05, nan, 0.0, 0.2708816451542332, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.9216423079587172, 0.0, nan, 0.7890680286760408, nan, 0.0002562394301235074, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0032022182416018455, nan, 0.0] |
+ | 2.3713 | 3.0 | 120 | 2.4315 | 0.0467 | 0.1308 | 0.5506 | [0.816217909207174, nan, nan, 0.010999749436231521, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.019900776410572637, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.00397223196832239, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, 0.1398504132771441, nan, 0.0, nan, 0.0910590813486316, nan, 0.0, nan, nan, nan, 0.22628639476729678, 0.0, nan, nan, nan, nan, nan, nan, 0.20183578232866675, 0.0, nan, nan, 0.01505267877157588, nan, 0.0, 0.10048424832689483, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.14180992148191232, 0.0, nan, 0.14801565612477238, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, 0.0] | [0.9126766191641265, nan, nan, 0.011025718304199317, 0.0, 0.0, nan, nan, nan, 0.0, 0.024741405846173504, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.004252579048334518, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.4705326046748606, nan, 0.0, nan, 0.0977027027027027, nan, 0.0, nan, nan, nan, 0.6945953815485922, nan, nan, nan, nan, nan, nan, nan, 0.43040660569774325, 0.0, nan, nan, 0.016337404505863462, nan, 0.0, 0.4040253773791293, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.8950959340583223, 0.0, nan, 0.879065259530476, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, 0.0] |
+ | 2.2125 | 4.0 | 160 | 2.3685 | 0.0463 | 0.1304 | 0.5546 | [0.8240177612989271, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.006803304462167339, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.00824945727254786, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, 0.1381139431859967, nan, 0.0, nan, 0.039363159391072856, nan, 0.0, nan, nan, nan, 0.24887735034466427, 0.0, nan, nan, nan, nan, nan, nan, 0.17423648403782163, 0.0, nan, nan, 0.05281855212576441, nan, 0.0, 0.07896994964159726, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.11143188600880262, 0.0, nan, 0.16601047612497993, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.001870436315240459, nan, 0.0] | [0.9216902158885502, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, 0.006911331680034383, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.008411252565633258, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.41289250891629503, nan, 0.0, nan, 0.04116216216216216, nan, 0.0, nan, nan, nan, 0.7721101081819607, nan, nan, nan, nan, nan, nan, nan, 0.4349463780511876, 0.0, nan, nan, 0.061675830859812174, nan, 0.0, 0.3986436228396412, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.8715799681374247, 0.0, nan, 0.891280268299437, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0019139695237160455, nan, 0.0] |
+ | 2.8488 | 5.0 | 200 | 2.2311 | 0.0486 | 0.1392 | 0.5544 | [0.831052781471127, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0028386300524529465, nan, nan, nan, 0.026585319592843142, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.02941326205931017, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 5.7327944506549715e-05, 0.0, 0.11295426868350211, nan, 0.0, nan, 0.02263840296357275, nan, 0.0, nan, nan, nan, 0.19558659801687306, 0.0, nan, nan, nan, nan, nan, nan, 0.19623804567041833, 0.0, nan, nan, 0.06682466334807528, nan, 0.0, 0.10155024946543122, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.1748498883719982, 0.0, nan, 0.18346828477051688, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, 0.0] | [0.9226013038098442, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, 0.0029387679244515944, nan, nan, nan, 0.02674766236244514, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.031002240317668996, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 5.7562239171103755e-05, nan, 0.2952009045415524, nan, 0.0, nan, 0.023783783783783784, nan, 0.0, nan, nan, nan, 0.8719376441866195, nan, nan, nan, nan, nan, nan, nan, 0.39691756036581355, 0.0, nan, nan, 0.08197897912510341, nan, 0.0, 0.7480638809888427, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.8910265290572834, 0.0, nan, 0.859604319867081, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, 0.0] |
+ | 2.3995 | 6.0 | 240 | 2.2327 | 0.0477 | 0.1453 | 0.5525 | [0.8316146337349258, nan, nan, 0.006053989678443827, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0008784419998622052, nan, nan, nan, 0.0005562000413177173, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.011662171611227649, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0024991809800159123, 0.0, 0.08550018416572924, nan, 0.0, nan, 0.057912407483680935, nan, 0.0, nan, nan, nan, 0.25940642629384353, 0.0, nan, nan, nan, nan, nan, nan, 0.1372000698181386, 0.0, nan, nan, 0.05451270188941885, nan, 0.0, 0.11116843080097706, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.1955015275622071, 0.0, nan, 0.15274004891223666, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0021475461844449326, nan, 0.0] | [0.9186460223248345, nan, nan, 0.00612818967249347, 0.0, 0.0, nan, nan, nan, 0.0, 0.000888599787432992, nan, nan, nan, 0.0005565803702054576, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.012529680855345237, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.002561519643114117, nan, 0.23978028574020951, nan, 0.0, nan, 0.0627027027027027, nan, 0.0, nan, nan, nan, 0.8764575447398482, nan, nan, nan, nan, nan, nan, nan, 0.6076715573392987, 0.0, nan, nan, 0.06479003454819716, nan, 0.0, 0.7825858674250711, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.8964639468033525, 0.0, nan, 0.9022030091381804, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.002159350231884769, nan, 0.0] |
+ | 2.518 | 7.0 | 280 | 2.1968 | 0.0483 | 0.1448 | 0.5565 | [0.835294581918955, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0004969440795110527, nan, nan, nan, 0.0070991820852854765, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.024552328387392183, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0008260853051768012, 0.0, 0.10751554522165623, nan, 0.0, nan, 0.0497325466320919, nan, 0.0, nan, nan, nan, 0.26028636001713373, 0.0, nan, nan, nan, nan, nan, nan, 0.1383557089002794, 0.0, nan, nan, 0.056956059603313076, nan, 0.0, 0.08924809533015103, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.1890732919633464, 0.0, nan, 0.17234102039159804, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, 0.0008926552327449743, nan, nan, nan, nan, 0.000489913897632491, nan, 0.0] | [0.918687180300745, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, 0.0005052822320697406, nan, nan, nan, 0.0071083264423382735, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.026615510510711936, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0008346524679810045, nan, 0.3170885766731576, nan, 0.0, nan, 0.05364864864864865, nan, 0.0, nan, nan, nan, 0.9064438820077497, nan, nan, nan, nan, nan, nan, nan, 0.5270083558128824, 0.0, nan, nan, 0.06909639433604205, nan, 0.0, 0.787792605556771, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.858956154325691, 0.0, nan, 0.8819267099473862, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, 0.0008926552327449743, nan, nan, nan, nan, 0.0004907614163374475, nan, 0.0] |
+ | 2.4214 | 8.0 | 320 | 2.1804 | 0.0526 | 0.1562 | 0.5716 | [0.8397840862162049, nan, nan, 0.007671739510707949, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.00209624394322959, nan, nan, nan, 0.035521210964828596, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.02139811148339933, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.000916669053825661, 0.0, 0.16681998970302855, nan, 0.0, nan, 0.009098397009440538, nan, 0.0, nan, nan, nan, 0.3631096079535375, 0.0, nan, nan, nan, nan, nan, nan, 0.1466350801069115, 0.0, nan, nan, 0.07112335603264543, nan, 0.0, 0.08216884509226945, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.1942664510008243, 0.0, nan, 0.16464797567919703, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0006617565961201455, nan, 0.0] | [0.925969897056419, nan, nan, 0.007710468153506128, 0.0, 0.0, nan, nan, nan, 0.0, 0.0021256700797416673, nan, nan, nan, 0.03552572991539978, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.022617817903760246, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0009209958267376601, nan, 0.5144859258638861, nan, 0.0, nan, 0.009702702702702702, nan, 0.0, nan, nan, nan, 0.8435639572647659, nan, nan, nan, nan, nan, nan, nan, 0.7859892098164353, 0.0, nan, nan, 0.0832197946571943, nan, 0.0, 0.7880551301684533, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.8651901364549421, 0.0, nan, 0.8923417740992585, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0006625279120555542, nan, 0.0] |
+ | 1.9603 | 9.0 | 360 | 2.1835 | 0.0518 | 0.1540 | 0.5683 | [0.8380163682298087, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0006993929040283885, nan, nan, nan, 0.029337910428784846, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.02851986446005009, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.005728912725499756, 0.0, 0.1444569157593463, nan, 0.0, nan, 0.012090736673986824, nan, 0.0, nan, nan, nan, 0.31104943195798573, 0.0, nan, nan, nan, nan, nan, nan, 0.15770486109880608, 0.0, nan, nan, 0.08617187894710251, nan, 0.0, 0.0785529760713944, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.20691699604743083, 0.0, nan, 0.16953245084769092, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, 0.0008450469536652424, nan, nan, nan, nan, 0.003599703267703608, nan, 0.0] | [0.9288517036969964, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, 0.0007085566932472224, nan, nan, nan, 0.029355638954264997, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.031163221228016045, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.005861754688924066, nan, 0.3911967726850786, nan, 0.0, nan, 0.013094594594594595, nan, 0.0, nan, nan, nan, 0.8890675745290837, nan, nan, nan, nan, nan, nan, nan, 0.7319724981906705, 0.0, nan, nan, 0.10374191036932509, nan, 0.0, 0.7902865893677532, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.8702639052434716, 0.0, nan, 0.9094797083166671, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, 0.0008450469536652424, nan, nan, nan, nan, 0.003631634480897112, nan, 0.0] |
+ | 2.0252 | 10.0 | 400 | 2.2049 | 0.0529 | 0.1563 | 0.5690 | [0.8411342620340232, nan, nan, 0.00440539598785405, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0009979755353425907, nan, nan, nan, 0.05215870239325753, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.02585601844888956, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0023056841784332783, 0.0, 0.15720250055027643, nan, 0.0, nan, 0.010010171425736188, nan, 0.0, nan, nan, nan, 0.3278824063816368, 0.0, nan, nan, nan, nan, nan, nan, 0.16577780294023287, 0.0, nan, nan, 0.07446100952634326, nan, 0.0, 0.07748760838659272, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.19954901550549783, 0.0, nan, 0.17496882019195398, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, 0.00033325795355812374, nan, nan, nan, nan, 0.000906274111177789, nan, 0.0] | [0.9215607553461405, nan, nan, 0.00444544906570223, 0.0, 0.0, nan, nan, nan, 0.0, 0.0010163723058874091, nan, nan, nan, 0.052159531836397176, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.027675301503830005, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.002321676979901185, nan, 0.46607271229855457, nan, 0.0, nan, 0.010905405405405405, nan, 0.0, nan, nan, nan, 0.8989002620556812, nan, nan, nan, nan, nan, nan, nan, 0.722448845318771, 0.0, nan, nan, 0.08852367281397498, nan, 0.0, 0.7941369503390943, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.8918750432915425, 0.0, nan, 0.8999876926863789, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, 0.00033325795355812374, nan, nan, nan, nan, 0.000907908620224278, nan, 0.0] |

  ### Framework versions
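The hyperparameter changes above (batch size 1 → 2, 3 → 10 epochs) follow the usual `transformers` `Trainer` recipe for SegFormer fine-tuning. Below is a minimal sketch of how such a configuration might be expressed; the output directory, label count, and dataset objects are placeholders, not values recorded in this commit.

```python
# Sketch only: reproduces the hyperparameters listed in the README diff above.
# Dataset, label count, and output directory are placeholders (assumptions).
from transformers import SegformerForSemanticSegmentation, TrainingArguments, Trainer

model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b0",
    num_labels=150,                # placeholder: the actual label count is not stated
    ignore_mismatched_sizes=True,  # the decode head is re-initialised for the new labels
)

args = TrainingArguments(
    output_dir="segformer-finetune",   # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer default optimizer.
)

# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_ds, eval_dataset=eval_ds,  # prepared segmentation splits
#                   compute_metrics=compute_metrics)               # e.g. evaluate.load("mean_iou")
# trainer.train()
```

The Mean Iou, Mean Accuracy, Overall Accuracy, and Per Category columns in the results table match the fields reported by the `mean_iou` metric in the `evaluate` library, which is the typical `compute_metrics` choice for this setup.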
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:0fa79fe674d5d4eb6bf68351e9c64d82ad6dfaa1518d87fd9920d54dc5527ae5
  size 14989656

  version https://git-lfs.github.com/spec/v1
+ oid sha256:5a63965920bef98762e667999c40e37c6a68bb9d6643863c57e354dc9c734c3a
  size 14989656
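The new `model.safetensors` pointer is the retrained checkpoint. A minimal inference sketch follows, assuming a local clone of this repository; the checkpoint path and test image are placeholders, and the image processor is loaded from the base `nvidia/mit-b0` because this commit does not show a processor config.

```python
# Sketch only: run the fine-tuned checkpoint on one image.
# "path/to/local/clone" and "example.jpg" are placeholders (assumptions).
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

processor = SegformerImageProcessor.from_pretrained("nvidia/mit-b0")
model = SegformerForSemanticSegmentation.from_pretrained("path/to/local/clone")
model.eval()

image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, h, w) at 1/4 of the processed resolution

# Upsample to the original image resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class indices
```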
runs/Apr13_18-15-26_b1fdc3b2e8ea/events.out.tfevents.1713032133.b1fdc3b2e8ea.3243.1 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:820964fd4759f94de7435bf702c87d61afb596d80a31e9f1d428218aad3aa61f
+ size 98089
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:5d804272fd4a0307298bef737f7a1d6c1e36a5f7ba235fe73ba429e600f12a19
  size 4920

  version https://git-lfs.github.com/spec/v1
+ oid sha256:369c0ce5fc7fc48bf7c3ced052533c5b8e16242ae1362b472303aa7088f72cec
  size 4920
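In each Git LFS pointer above, `oid` is the SHA-256 of the file contents and `size` is its length in bytes, so a downloaded artifact can be checked against its pointer directly. A small sketch, with the file name as a placeholder:

```python
# Sketch only: verify a downloaded file against its Git LFS pointer.
import hashlib
from pathlib import Path

def lfs_digest(path: str) -> tuple[str, int]:
    """Return (sha256 hex digest, size in bytes) for a local file."""
    data = Path(path).read_bytes()
    return hashlib.sha256(data).hexdigest(), len(data)

digest, size = lfs_digest("training_args.bin")   # placeholder path
# For this commit the expected values would be
# 369c0ce5fc7fc48bf7c3ced052533c5b8e16242ae1362b472303aa7088f72cec and 4920.
print(digest, size)
```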