# segformer-b0-finetuned-busigt2
This model is a fine-tuned version of nvidia/mit-b1 on the kasumi222/busigt5 dataset. It achieves the following results on the evaluation set (an inference sketch follows the results list):
- Loss: 0.2904
- Mean Iou: 0.4458
- Mean Accuracy: 0.6980
- Overall Accuracy: 0.6969
- Per Category Iou: [0.0, 0.6551336334577343, 0.6821319425157643]
- Per Category Accuracy: [nan, 0.6913100552356098, 0.70464740289276]
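The card does not include usage code, so the snippet below is only a minimal inference sketch. The Hub id is an assumption based on the card title (substitute the real `<user>/<repo>` id), the image path is a placeholder, and `SegformerFeatureExtractor` is used to stay consistent with the Transformers 4.21.3 version listed under framework versions.

```python
# Minimal inference sketch (not the authors' code). Model id and image path are placeholders.
import torch
from PIL import Image
from transformers import SegformerFeatureExtractor, SegformerForSemanticSegmentation

model_id = "segformer-b0-finetuned-busigt2"  # assumed Hub id based on the card title
feature_extractor = SegformerFeatureExtractor.from_pretrained(model_id)
model = SegformerForSemanticSegmentation.from_pretrained(model_id)
model.eval()

image = Image.open("ultrasound.png").convert("RGB")  # placeholder input image
inputs = feature_extractor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Upsample the logits to the input resolution and take a per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
label_map = upsampled.argmax(dim=1)[0]  # (H, W) predicted class ids
```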
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
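Although this section is left empty, the card names kasumi222/busigt5 above. A minimal sketch of loading that dataset with the datasets library follows; split and column names are not documented here and should be checked against the dataset card.

```python
# Minimal sketch: load the dataset referenced by this card and inspect its structure.
from datasets import load_dataset

ds = load_dataset("kasumi222/busigt5")
print(ds)  # shows the available splits and columns (assumed, not documented in this card)
```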
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a matching TrainingArguments sketch follows the list):
- learning_rate: 0.00013
- train_batch_size: 20
- eval_batch_size: 20
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
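For reference, here is a minimal sketch of TrainingArguments mirroring the values above. The output directory, evaluation and logging cadence are assumptions inferred from the results table, not the authors' training script; the Adam betas/epsilon and linear schedule listed above are the Trainer defaults.

```python
# Sketch only: TrainingArguments mirroring the listed hyperparameters.
# output_dir, evaluation_strategy, eval_steps, and logging_steps are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-busigt2",  # placeholder
    learning_rate=1.3e-4,
    per_device_train_batch_size=20,
    per_device_eval_batch_size=20,
    seed=42,
    lr_scheduler_type="linear",   # Adam betas/epsilon above are the Trainer defaults
    num_train_epochs=50,
    evaluation_strategy="steps",  # assumption: the table below evaluates every 20 steps
    eval_steps=20,
    logging_steps=20,
)
```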
### Training results
Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
---|---|---|---|---|---|---|---|---|
0.1095 | 0.77 | 20 | 0.2086 | 0.4674 | 0.7410 | 0.7419 | [0.0, 0.6978460673452154, 0.704309291034096] | [nan, 0.7461995349612959, 0.7357650020760118] |
0.1156 | 1.54 | 40 | 0.1980 | 0.4186 | 0.6721 | 0.6783 | [0.0, 0.6446507442278364, 0.6112330250576428] | [nan, 0.7089917293749448, 0.635300900559587] |
0.1039 | 2.31 | 60 | 0.1987 | 0.3706 | 0.5810 | 0.5757 | [0.0, 0.5345322994102119, 0.5773860979625277] | [nan, 0.5495831330265778, 0.6123860258526792] |
0.0672 | 3.08 | 80 | 0.1960 | 0.4099 | 0.6407 | 0.6439 | [0.0, 0.6194380206711395, 0.6103561290824698] | [nan, 0.6596136450596995, 0.6218662960315686] |
0.0992 | 3.85 | 100 | 0.1969 | 0.4201 | 0.6684 | 0.6695 | [0.0, 0.6251984513525223, 0.6351366565306488] | [nan, 0.675036447653713, 0.661700391303438] |
0.085 | 4.62 | 120 | 0.2075 | 0.4383 | 0.6997 | 0.6964 | [0.0, 0.6407576836532538, 0.6742246105299582] | [nan, 0.6804532655724195, 0.718889834811138] |
0.0561 | 5.38 | 140 | 0.2037 | 0.4401 | 0.7033 | 0.7071 | [0.0, 0.6545188689920507, 0.665783897448558] | [nan, 0.7263735810923504, 0.6801427547189345] |
0.0841 | 6.15 | 160 | 0.2119 | 0.3651 | 0.5891 | 0.5934 | [0.0, 0.5494216923933923, 0.5458843877102458] | [nan, 0.6146571565924632, 0.5634664881039569] |
0.1034 | 6.92 | 180 | 0.2371 | 0.3684 | 0.6193 | 0.6367 | [0.0, 0.6047004430113216, 0.5003660220404046] | [nan, 0.7229919452156935, 0.5156554415186935] |
0.0691 | 7.69 | 200 | 0.2266 | 0.4285 | 0.6991 | 0.7117 | [0.0, 0.6730686627556878, 0.6124621276402561] | [nan, 0.7742042834577688, 0.6240342690621383] |
0.0601 | 8.46 | 220 | 0.2106 | 0.4198 | 0.6674 | 0.6704 | [0.0, 0.6308213023617786, 0.6287108585057931] | [nan, 0.6851880267250091, 0.6497046776895365] |
0.0647 | 9.23 | 240 | 0.2234 | 0.4229 | 0.6746 | 0.6777 | [0.0, 0.6338885508159525, 0.6349404984513296] | [nan, 0.6928998204597407, 0.6563077167064432] |
0.0626 | 10.0 | 260 | 0.2322 | 0.3991 | 0.6540 | 0.6655 | [0.0, 0.6267222060572648, 0.570544858752452] | [nan, 0.7227113522422911, 0.5852409330048426] |
0.0604 | 10.77 | 280 | 0.2021 | 0.4660 | 0.7283 | 0.7288 | [0.0, 0.6990308020264264, 0.6989818924111941] | [nan, 0.7310753774760368, 0.7255727204344536] |
0.0573 | 11.54 | 300 | 0.2227 | 0.4513 | 0.7014 | 0.6951 | [0.0, 0.6488805486358904, 0.7049138389320693] | [nan, 0.6638350976679388, 0.7389417956785915] |
0.0474 | 12.31 | 320 | 0.2108 | 0.4781 | 0.7468 | 0.7371 | [0.0, 0.6761855871787447, 0.7580093480444655] | [nan, 0.6890590324447889, 0.8044529075728725] |
0.0805 | 13.08 | 340 | 0.2257 | 0.4325 | 0.6902 | 0.6940 | [0.0, 0.6550347525850334, 0.6423545682885212] | [nan, 0.7128733309133007, 0.6675247882412931] |
0.0545 | 13.85 | 360 | 0.2155 | 0.4609 | 0.7230 | 0.7167 | [0.0, 0.6629649481906197, 0.7196967289093881] | [nan, 0.6853650161390015, 0.7606061073292577] |
0.0628 | 14.62 | 380 | 0.2397 | 0.4150 | 0.6561 | 0.6611 | [0.0, 0.6377593821077956, 0.6070948266377257] | [nan, 0.6861969841160831, 0.6259296622984148] |
0.0576 | 15.38 | 400 | 0.2177 | 0.4661 | 0.7274 | 0.7272 | [0.0, 0.6936915190759695, 0.7046022162863222] | [nan, 0.7263017649886684, 0.7284576609239519] |
0.0808 | 16.15 | 420 | 0.2263 | 0.4248 | 0.6707 | 0.6740 | [0.0, 0.6438773235874202, 0.6304024210524071] | [nan, 0.6904172594111472, 0.6510802419847774] |
0.0458 | 16.92 | 440 | 0.2342 | 0.4006 | 0.6449 | 0.6525 | [0.0, 0.6208902028936363, 0.5809796433249929] | [nan, 0.6898132977523129, 0.6000533044931062] |
0.0477 | 17.69 | 460 | 0.2683 | 0.3789 | 0.6170 | 0.6232 | [0.0, 0.5741692028709614, 0.5625631837395161] | [nan, 0.6539633266945951, 0.5800762342358019] |
0.0501 | 18.46 | 480 | 0.2364 | 0.4280 | 0.6700 | 0.6675 | [0.0, 0.6223049989658083, 0.6617065588280534] | [nan, 0.6552936905824757, 0.6846169180090992] |
0.039 | 19.23 | 500 | 0.2378 | 0.4500 | 0.7052 | 0.6986 | [0.0, 0.6391919313721981, 0.7106968345576296] | [nan, 0.665670921345669, 0.7446979100013106] |
0.041 | 20.0 | 520 | 0.2477 | 0.4142 | 0.6612 | 0.6659 | [0.0, 0.6273087938535062, 0.6153514032911991] | [nan, 0.6890233206118104, 0.6333526433632052] |
0.0331 | 20.77 | 540 | 0.2488 | 0.4353 | 0.6814 | 0.6778 | [0.0, 0.6267198588955959, 0.6791644212315564] | [nan, 0.6603973431966015, 0.7023153313193633] |
0.0316 | 21.54 | 560 | 0.2468 | 0.4500 | 0.7025 | 0.6974 | [0.0, 0.6405571933079939, 0.7093320446678179] | [nan, 0.6719456081313097, 0.7331179494069875] |
0.0333 | 22.31 | 580 | 0.2477 | 0.4384 | 0.6899 | 0.6906 | [0.0, 0.6520329743081146, 0.6630535380613215] | [nan, 0.6937796658392771, 0.6860558089232162] |
0.0269 | 23.08 | 600 | 0.2603 | 0.4477 | 0.7018 | 0.6996 | [0.0, 0.6514078130357787, 0.6916101875532822] | [nan, 0.6888588892050193, 0.7147725032516842] |
0.033 | 23.85 | 620 | 0.2424 | 0.4499 | 0.7061 | 0.6986 | [0.0, 0.6447352671115818, 0.7048670621273163] | [nan, 0.6616131152687708, 0.750523958937919] |
0.0555 | 24.62 | 640 | 0.2471 | 0.4342 | 0.6830 | 0.6823 | [0.0, 0.636756610371055, 0.6659104633164847] | [nan, 0.6791280033749645, 0.6868014110272018] |
0.0583 | 25.38 | 660 | 0.2517 | 0.4434 | 0.6922 | 0.6879 | [0.0, 0.6386719513699022, 0.6913843141331489] | [nan, 0.6666374954624388, 0.7178391636040445] |
0.154 | 26.15 | 680 | 0.2535 | 0.4235 | 0.6597 | 0.6487 | [0.0, 0.5750726006840868, 0.695285501846172] | [nan, 0.5943477194462704, 0.7250215035171054] |
0.0292 | 26.92 | 700 | 0.2768 | 0.3679 | 0.6035 | 0.6135 | [0.0, 0.5756677002657924, 0.5279750019379379] | [nan, 0.6631412677700708, 0.5438385402498483] |
0.0288 | 27.69 | 720 | 0.2455 | 0.4676 | 0.7235 | 0.7188 | [0.0, 0.6761224569996822, 0.7268002447671437] | [nan, 0.6954373227898398, 0.7515024928661187] |
0.0321 | 28.46 | 740 | 0.2618 | 0.4324 | 0.6745 | 0.6691 | [0.0, 0.6201514037000198, 0.6770266576179022] | [nan, 0.6425218048210974, 0.7064552401951121] |
0.0309 | 29.23 | 760 | 0.2742 | 0.3944 | 0.6348 | 0.6407 | [0.0, 0.6008533572398147, 0.5822751024176394] | [nan, 0.6701804232440864, 0.599451426280657] |
0.0244 | 30.0 | 780 | 0.2667 | 0.4386 | 0.6819 | 0.6750 | [0.0, 0.6224630782821559, 0.693390305711243] | [nan, 0.6412495217165226, 0.7224713681082742] |
0.0642 | 30.77 | 800 | 0.2501 | 0.4581 | 0.7121 | 0.7096 | [0.0, 0.6722145834845955, 0.7021141065136746] | [nan, 0.6976031865943273, 0.7265325317101161] |
0.0481 | 31.54 | 820 | 0.2685 | 0.4137 | 0.6689 | 0.6766 | [0.0, 0.6379976664903103, 0.6031984018650592] | [nan, 0.7145859291453688, 0.6231961550279683] |
0.0311 | 32.31 | 840 | 0.2570 | 0.4284 | 0.6804 | 0.6832 | [0.0, 0.6426329055663264, 0.6425854743219936] | [nan, 0.6969752862342657, 0.6639063603053335] |
0.0389 | 33.08 | 860 | 0.2795 | 0.3918 | 0.6456 | 0.6590 | [0.0, 0.6244554318979076, 0.5508200429573112] | [nan, 0.7254125011037311, 0.5658618862962298] |
0.0282 | 33.85 | 880 | 0.2568 | 0.4242 | 0.6759 | 0.6775 | [0.0, 0.6282787291971401, 0.6442735430594793] | [nan, 0.6857107537747603, 0.6660974613184492] |
0.0245 | 34.62 | 900 | 0.2635 | 0.4503 | 0.7043 | 0.7037 | [0.0, 0.6658605581388065, 0.6850412042515538] | [nan, 0.7008356961354695, 0.7076892832638209] |
0.0315 | 35.38 | 920 | 0.2769 | 0.4443 | 0.7038 | 0.7055 | [0.0, 0.6610872730365329, 0.6718978137221756] | [nan, 0.7138198907060935, 0.6938235070611933] |
0.0283 | 36.15 | 940 | 0.2697 | 0.4392 | 0.6920 | 0.6907 | [0.0, 0.6405508279799802, 0.6769668218170816] | [nan, 0.6841213809883544, 0.6998318265269149] |
0.0257 | 36.92 | 960 | 0.2712 | 0.4562 | 0.7099 | 0.7082 | [0.0, 0.6720494469697227, 0.6964887349332429] | [nan, 0.6999154296702542, 0.7197879714666775] |
0.0188 | 37.69 | 980 | 0.2857 | 0.4300 | 0.6763 | 0.6771 | [0.0, 0.6397832221652129, 0.6501046733477022] | [nan, 0.6811686795451647, 0.6713607293464362] |
0.0259 | 38.46 | 1000 | 0.2812 | 0.4368 | 0.6851 | 0.6838 | [0.0, 0.6396217765000503, 0.6707000380577134] | [nan, 0.6772780519391329, 0.6929027930893589] |
0.0169 | 39.23 | 1020 | 0.2795 | 0.4542 | 0.7084 | 0.7054 | [0.0, 0.6598929743362643, 0.7028156867427239] | [nan, 0.6906225043413423, 0.7260947520404938] |
0.0296 | 40.0 | 1040 | 0.2834 | 0.4470 | 0.7015 | 0.7013 | [0.0, 0.6608002641121026, 0.6801095152287282] | [nan, 0.7006602764723773, 0.7022773353480376] |
0.0183 | 40.77 | 1060 | 0.2874 | 0.4386 | 0.6909 | 0.6903 | [0.0, 0.6432231900832152, 0.6726091072738183] | [nan, 0.6874296310104291, 0.694422081276136] |
0.0199 | 41.54 | 1080 | 0.2741 | 0.4594 | 0.7175 | 0.7154 | [0.0, 0.6721657359810768, 0.7061664449453671] | [nan, 0.7051238631569653, 0.7298866398455491] |
0.0162 | 42.31 | 1100 | 0.2883 | 0.4414 | 0.6921 | 0.6913 | [0.0, 0.6492915338226911, 0.6750215527697642] | [nan, 0.6870752597447193, 0.6971930338516571] |
0.0179 | 43.08 | 1120 | 0.2927 | 0.4425 | 0.6936 | 0.6927 | [0.0, 0.651082790586508, 0.6764744769464034] | [nan, 0.6884633119781804, 0.6987260886947118] |
0.0228 | 43.85 | 1140 | 0.2954 | 0.4273 | 0.6807 | 0.6841 | [0.0, 0.6418083531582984, 0.6399672125377378] | [nan, 0.7006630235364526, 0.6608033559804007] |
0.0164 | 44.62 | 1160 | 0.2954 | 0.4264 | 0.6740 | 0.6756 | [0.0, 0.6356634502412776, 0.6436554266840772] | [nan, 0.6834636553611899, 0.6644801545389767] |
0.0158 | 45.38 | 1180 | 0.2906 | 0.4433 | 0.6956 | 0.6951 | [0.0, 0.6536928350497138, 0.6760836624911459] | [nan, 0.6927067410990219, 0.6985223421818058] |
0.0198 | 46.15 | 1200 | 0.2881 | 0.4441 | 0.6969 | 0.6961 | [0.0, 0.6527988151987781, 0.6794425179962712] | [nan, 0.6919179412716945, 0.7019810769049473] |
0.018 | 46.92 | 1220 | 0.2961 | 0.4350 | 0.6844 | 0.6839 | [0.0, 0.6395287774950378, 0.6655290939553297] | [nan, 0.6815206961845243, 0.6872821426644097] |
0.0179 | 47.69 | 1240 | 0.2898 | 0.4459 | 0.6987 | 0.6982 | [0.0, 0.6581945977423002, 0.6796217960953337] | [nan, 0.6955130632707722, 0.701934270273604] |
0.0213 | 48.46 | 1260 | 0.2902 | 0.4469 | 0.7004 | 0.6998 | [0.0, 0.6595482974648909, 0.6811920247361126] | [nan, 0.6971510983350829, 0.7036303223269834] |
0.0227 | 49.23 | 1280 | 0.2888 | 0.4452 | 0.6967 | 0.6953 | [0.0, 0.6532891096762087, 0.6823149709479772] | [nan, 0.6885578894699147, 0.7047801134592744] |
0.0266 | 50.0 | 1300 | 0.2904 | 0.4458 | 0.6980 | 0.6969 | [0.0, 0.6551336334577343, 0.6821319425157643] | [nan, 0.6913100552356098, 0.70464740289276] |
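The column names above (Mean Iou, Mean/Overall Accuracy, Per Category Iou/Accuracy) match the output of the Hugging Face mean_iou metric. A minimal sketch of computing such values is shown below; the use of this metric, num_labels=3, ignore_index=255, and the placeholder arrays are assumptions, not details taken from this card.

```python
# Sketch: compute mean IoU / accuracy metrics like those reported above.
# num_labels, ignore_index, and the arrays are assumptions / placeholders.
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")
predictions = [np.zeros((512, 512), dtype=np.int64)]  # placeholder predicted label maps
references = [np.zeros((512, 512), dtype=np.int64)]   # placeholder ground-truth label maps

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=3,        # assumption: 3 classes, matching the 3 per-category entries
    ignore_index=255,
    reduce_labels=False,
)
print(results["mean_iou"], results["per_category_iou"], results["per_category_accuracy"])
```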
### Framework versions
- Transformers 4.21.3
- Pytorch 1.12.1+cu113
- Datasets 2.4.0
- Tokenizers 0.12.1