
segformer-b1-finetuned-segments-sidewalks-6

This model is a fine-tuned version of nvidia/mit-b1 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how such metrics can be computed follows the list):

  • Loss: 0.0281
  • Mean IoU: 0.7734
  • Mean Accuracy: 0.8242
  • Overall Accuracy: 0.9921
  • Accuracy Bkg: 0.9973
  • Accuracy Knife: 0.6511
  • Accuracy Gun: nan
  • IoU Bkg: 0.9920
  • IoU Knife: 0.5548
  • IoU Gun: nan
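
These are standard semantic-segmentation metrics; nan values typically mean a class was absent from the evaluation masks, so no score could be computed for it. A minimal sketch of how such numbers can be reproduced, assuming the mean_iou metric from the Hugging Face evaluate library (not listed under the framework versions below, so this is an assumption about how the values were produced) and dummy label maps:

```python
import numpy as np
import evaluate  # assumed dependency; not listed in the framework versions below

metric = evaluate.load("mean_iou")

# Dummy 2x2 label maps standing in for predicted and ground-truth masks
# over the three classes: 0 = Bkg, 1 = Knife, 2 = Gun.
predictions = [np.array([[0, 0], [1, 1]])]
references = [np.array([[0, 0], [1, 2]])]

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=3,
    ignore_index=255,   # commonly used for unlabeled pixels
    reduce_labels=False,
)

# Keys include mean_iou, mean_accuracy, overall_accuracy,
# per_category_iou and per_category_accuracy (ordered Bkg, Knife, Gun).
print(results["mean_iou"], results["per_category_iou"])
```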

Model description

This is a SegFormer semantic-segmentation model with an MiT-B1 encoder (about 13.7M parameters, stored in F32), fine-tuned to segment three classes: background (Bkg), Knife, and Gun. The author has not provided further details.

Intended uses & limitations

More information needed
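
Although intended uses are not documented, the checkpoint can be loaded with the standard SegFormer classes in transformers. A minimal inference sketch, assuming the repository includes a preprocessor config (if not, the processor from the base nvidia/mit-b1 checkpoint can be used instead); the example image path is hypothetical:

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

repo_id = "uwaisasghar/segformer-b1-finetuned-segments-sidewalks-6"
processor = SegformerImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)
model.eval()

image = Image.open("example.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Upsample to the original resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class indices
```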

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of a matching configuration follows the list):

  • learning_rate: 6e-05
  • train_batch_size: 6
  • eval_batch_size: 6
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 30
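
As a hedged reconstruction, these values map onto a TrainingArguments configuration like the one below. The output directory and the 20-step evaluation cadence are inferred from this card; the Adam settings match the Trainer's default optimizer, so no explicit optimizer flag is shown.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the setup implied by the values above.
training_args = TrainingArguments(
    output_dir="segformer-b1-finetuned-segments-sidewalks-6",
    learning_rate=6e-5,
    per_device_train_batch_size=6,
    per_device_eval_batch_size=6,
    num_train_epochs=30,
    lr_scheduler_type="linear",
    seed=42,
    eval_strategy="steps",  # the results table reports metrics every 20 steps
    eval_steps=20,
    logging_steps=20,
    # Optimizer: Trainer default AdamW with betas=(0.9, 0.999), eps=1e-8,
    # which matches the values listed above.
)
```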

Training results

Training Loss Epoch Step Validation Loss Mean IoU Mean Accuracy Overall Accuracy Accuracy Bkg Accuracy Knife Accuracy Gun IoU Bkg IoU Knife IoU Gun
0.755 0.1613 20 0.6655 0.3281 0.4997 0.9841 0.9993 0.0002 nan 0.9841 0.0002 0.0
0.5966 0.3226 40 0.4548 0.3440 0.5334 0.9799 0.9939 0.0728 nan 0.9799 0.0521 0.0
0.5055 0.4839 60 0.4195 0.3744 0.6108 0.9791 0.9906 0.2311 nan 0.9790 0.1441 0.0
0.3988 0.6452 80 0.3316 0.5972 0.6199 0.9861 0.9975 0.2422 nan 0.9860 0.2085 nan
0.3471 0.8065 100 0.2676 0.6187 0.6461 0.9867 0.9973 0.2949 nan 0.9866 0.2509 nan
0.2608 0.9677 120 0.2266 0.6353 0.6712 0.9868 0.9966 0.3457 nan 0.9867 0.2838 nan
0.2139 1.1290 140 0.1955 0.6424 0.6891 0.9864 0.9957 0.3825 nan 0.9863 0.2985 nan
0.1642 1.2903 160 0.1454 0.6391 0.6643 0.9878 0.9979 0.3307 nan 0.9877 0.2904 nan
0.1391 1.4516 180 0.1125 0.6450 0.6657 0.9883 0.9984 0.3330 nan 0.9883 0.3017 nan
0.1115 1.6129 200 0.0950 0.6473 0.6632 0.9888 0.9989 0.3274 nan 0.9887 0.3059 nan
0.1074 1.7742 220 0.0867 0.6669 0.6906 0.9890 0.9983 0.3828 nan 0.9889 0.3449 nan
0.0838 1.9355 240 0.0774 0.6813 0.7068 0.9894 0.9983 0.4154 nan 0.9894 0.3732 nan
0.0762 2.0968 260 0.0717 0.6959 0.7398 0.9892 0.9969 0.4827 nan 0.9891 0.4027 nan
0.0708 2.2581 280 0.0756 0.7036 0.7833 0.9880 0.9944 0.5722 nan 0.9879 0.4193 nan
0.0596 2.4194 300 0.0599 0.7063 0.7545 0.9894 0.9968 0.5123 nan 0.9894 0.4233 nan
0.0655 2.5806 320 0.0621 0.7189 0.7816 0.9894 0.9959 0.5673 nan 0.9893 0.4485 nan
0.0578 2.7419 340 0.0568 0.7269 0.7940 0.9897 0.9958 0.5923 nan 0.9896 0.4643 nan
0.0481 2.9032 360 0.0506 0.7208 0.7571 0.9905 0.9978 0.5165 nan 0.9904 0.4511 nan
0.0479 3.0645 380 0.0490 0.7181 0.7532 0.9904 0.9978 0.5087 nan 0.9903 0.4458 nan
0.0471 3.2258 400 0.0473 0.7237 0.7575 0.9907 0.9980 0.5170 nan 0.9906 0.4568 nan
0.044 3.3871 420 0.0451 0.7329 0.7725 0.9908 0.9977 0.5474 nan 0.9908 0.4751 nan
0.039 3.5484 440 0.0465 0.7370 0.7848 0.9907 0.9972 0.5725 nan 0.9907 0.4834 nan
0.0372 3.7097 460 0.0429 0.7366 0.7860 0.9907 0.9971 0.5749 nan 0.9906 0.4826 nan
0.0389 3.8710 480 0.0399 0.7297 0.7668 0.9908 0.9978 0.5358 nan 0.9907 0.4687 nan
0.0347 4.0323 500 0.0390 0.7306 0.7607 0.9911 0.9983 0.5232 nan 0.9910 0.4701 nan
0.0331 4.1935 520 0.0408 0.7430 0.7949 0.9909 0.9970 0.5927 nan 0.9908 0.4953 nan
0.0417 4.3548 540 0.0367 0.7454 0.7885 0.9912 0.9975 0.5795 nan 0.9911 0.4996 nan
0.0379 4.5161 560 0.0381 0.7515 0.8102 0.9910 0.9966 0.6237 nan 0.9909 0.5120 nan
0.029 4.6774 580 0.0352 0.7479 0.7862 0.9915 0.9979 0.5745 nan 0.9914 0.5045 nan
0.0405 4.8387 600 0.0375 0.7547 0.8267 0.9908 0.9959 0.6574 nan 0.9907 0.5187 nan
0.0284 5.0 620 0.0365 0.7498 0.8161 0.9907 0.9962 0.6361 nan 0.9906 0.5091 nan
0.0294 5.1613 640 0.0362 0.7533 0.8184 0.9909 0.9963 0.6406 nan 0.9908 0.5157 nan
0.0296 5.3226 660 0.0347 0.7486 0.8032 0.9910 0.9969 0.6095 nan 0.9909 0.5062 nan
0.0228 5.4839 680 0.0340 0.7434 0.7820 0.9913 0.9978 0.5663 nan 0.9912 0.4955 nan
0.0251 5.6452 700 0.0339 0.7509 0.7983 0.9913 0.9973 0.5992 nan 0.9912 0.5106 nan
0.0255 5.8065 720 0.0319 0.7388 0.7663 0.9915 0.9985 0.5342 nan 0.9914 0.4862 nan
0.0248 5.9677 740 0.0324 0.7450 0.7835 0.9913 0.9978 0.5692 nan 0.9913 0.4988 nan
0.0249 6.1290 760 0.0322 0.7556 0.8044 0.9914 0.9973 0.6116 nan 0.9914 0.5199 nan
0.0291 6.2903 780 0.0339 0.7524 0.8363 0.9903 0.9951 0.6775 nan 0.9902 0.5146 nan
0.0191 6.4516 800 0.0334 0.7549 0.8350 0.9905 0.9954 0.6747 nan 0.9904 0.5194 nan
0.0228 6.6129 820 0.0311 0.7556 0.8047 0.9914 0.9973 0.6121 nan 0.9914 0.5199 nan
0.0233 6.7742 840 0.0304 0.7588 0.8101 0.9915 0.9972 0.6229 nan 0.9914 0.5262 nan
0.0215 6.9355 860 0.0308 0.7604 0.8271 0.9912 0.9963 0.6579 nan 0.9911 0.5297 nan
0.0197 7.0968 880 0.0305 0.7626 0.8258 0.9913 0.9965 0.6551 nan 0.9913 0.5338 nan
0.0206 7.2581 900 0.0294 0.7534 0.7929 0.9916 0.9978 0.5879 nan 0.9915 0.5152 nan
0.0311 7.4194 920 0.0298 0.7618 0.8185 0.9915 0.9969 0.6402 nan 0.9914 0.5322 nan
0.0175 7.5806 940 0.0299 0.7534 0.7971 0.9915 0.9976 0.5967 nan 0.9914 0.5154 nan
0.0228 7.7419 960 0.0294 0.7545 0.7956 0.9916 0.9977 0.5934 nan 0.9915 0.5175 nan
0.0215 7.9032 980 0.0294 0.7612 0.8147 0.9915 0.9971 0.6323 nan 0.9915 0.5310 nan
0.0177 8.0645 1000 0.0291 0.7608 0.8152 0.9915 0.9970 0.6334 nan 0.9914 0.5302 nan
0.0201 8.2258 1020 0.0295 0.7594 0.8148 0.9914 0.9969 0.6326 nan 0.9913 0.5274 nan
0.0182 8.3871 1040 0.0289 0.7617 0.8175 0.9915 0.9969 0.6380 nan 0.9914 0.5319 nan
0.019 8.5484 1060 0.0287 0.7612 0.8154 0.9915 0.9970 0.6337 nan 0.9914 0.5310 nan
0.0165 8.7097 1080 0.0280 0.7624 0.8129 0.9917 0.9972 0.6286 nan 0.9916 0.5331 nan
0.02 8.8710 1100 0.0289 0.7617 0.8253 0.9913 0.9965 0.6540 nan 0.9912 0.5322 nan
0.0211 9.0323 1120 0.0283 0.7582 0.8006 0.9917 0.9977 0.6035 nan 0.9916 0.5248 nan
0.0139 9.1935 1140 0.0282 0.7632 0.8152 0.9917 0.9972 0.6333 nan 0.9916 0.5348 nan
0.0225 9.3548 1160 0.0281 0.7614 0.8072 0.9918 0.9975 0.6168 nan 0.9917 0.5310 nan
0.02 9.5161 1180 0.0287 0.7605 0.8199 0.9914 0.9967 0.6432 nan 0.9913 0.5297 nan
0.0184 9.6774 1200 0.0289 0.7557 0.8004 0.9916 0.9975 0.6033 nan 0.9915 0.5199 nan
0.0192 9.8387 1220 0.0286 0.7593 0.8088 0.9916 0.9973 0.6203 nan 0.9915 0.5270 nan
0.0188 10.0 1240 0.0286 0.7645 0.8252 0.9915 0.9967 0.6537 nan 0.9914 0.5375 nan
0.0167 10.1613 1260 0.0284 0.7670 0.8390 0.9913 0.9961 0.6820 nan 0.9912 0.5427 nan
0.0162 10.3226 1280 0.0273 0.7667 0.8271 0.9916 0.9967 0.6574 nan 0.9915 0.5420 nan
0.0154 10.4839 1300 0.0276 0.7610 0.8048 0.9918 0.9976 0.6119 nan 0.9917 0.5303 nan
0.015 10.6452 1320 0.0279 0.7612 0.8102 0.9917 0.9973 0.6231 nan 0.9916 0.5308 nan
0.0151 10.8065 1340 0.0281 0.7630 0.8207 0.9915 0.9968 0.6445 nan 0.9914 0.5346 nan
0.0157 10.9677 1360 0.0278 0.7648 0.8234 0.9916 0.9968 0.6501 nan 0.9915 0.5381 nan
0.0145 11.1290 1380 0.0273 0.7642 0.8145 0.9917 0.9973 0.6317 nan 0.9917 0.5367 nan
0.0136 11.2903 1400 0.0275 0.7670 0.8284 0.9916 0.9967 0.6602 nan 0.9915 0.5425 nan
0.0125 11.4516 1420 0.0274 0.7680 0.8253 0.9917 0.9969 0.6537 nan 0.9916 0.5444 nan
0.0147 11.6129 1440 0.0279 0.7668 0.8289 0.9915 0.9966 0.6613 nan 0.9915 0.5421 nan
0.0188 11.7742 1460 0.0274 0.7667 0.8295 0.9915 0.9966 0.6624 nan 0.9914 0.5420 nan
0.0141 11.9355 1480 0.0272 0.7664 0.8232 0.9917 0.9969 0.6495 nan 0.9916 0.5413 nan
0.0148 12.0968 1500 0.0276 0.7636 0.8092 0.9918 0.9975 0.6208 nan 0.9918 0.5354 nan
0.012 12.2581 1520 0.0268 0.7697 0.8302 0.9917 0.9967 0.6636 nan 0.9916 0.5477 nan
0.0106 12.4194 1540 0.0269 0.7715 0.8350 0.9917 0.9966 0.6735 nan 0.9916 0.5514 nan
0.0137 12.5806 1560 0.0269 0.7705 0.8228 0.9919 0.9972 0.6484 nan 0.9919 0.5492 nan
0.018 12.7419 1580 0.0270 0.7721 0.8315 0.9918 0.9968 0.6661 nan 0.9917 0.5525 nan
0.012 12.9032 1600 0.0269 0.7672 0.8245 0.9917 0.9969 0.6521 nan 0.9916 0.5429 nan
0.0151 13.0645 1620 0.0270 0.7670 0.8174 0.9919 0.9973 0.6374 nan 0.9918 0.5423 nan
0.0133 13.2258 1640 0.0270 0.7695 0.8183 0.9920 0.9974 0.6393 nan 0.9919 0.5470 nan
0.0146 13.3871 1660 0.0278 0.7687 0.8440 0.9913 0.9959 0.6921 nan 0.9912 0.5462 nan
0.0114 13.5484 1680 0.0269 0.7705 0.8326 0.9917 0.9967 0.6686 nan 0.9916 0.5495 nan
0.0159 13.7097 1700 0.0272 0.7690 0.8306 0.9916 0.9967 0.6646 nan 0.9916 0.5464 nan
0.0193 13.8710 1720 0.0272 0.7677 0.8167 0.9919 0.9974 0.6360 nan 0.9918 0.5435 nan
0.0112 14.0323 1740 0.0271 0.7695 0.8215 0.9919 0.9972 0.6457 nan 0.9918 0.5471 nan
0.0117 14.1935 1760 0.0278 0.7618 0.8048 0.9918 0.9977 0.6120 nan 0.9918 0.5318 nan
0.0169 14.3548 1780 0.0276 0.7686 0.8455 0.9913 0.9958 0.6952 nan 0.9912 0.5461 nan
0.0117 14.5161 1800 0.0274 0.7711 0.8412 0.9915 0.9962 0.6862 nan 0.9914 0.5508 nan
0.0139 14.6774 1820 0.0266 0.7718 0.8304 0.9918 0.9969 0.6639 nan 0.9918 0.5518 nan
0.0138 14.8387 1840 0.0273 0.7703 0.8285 0.9918 0.9969 0.6602 nan 0.9917 0.5489 nan
0.0142 15.0 1860 0.0271 0.7679 0.8128 0.9920 0.9976 0.6279 nan 0.9919 0.5439 nan
0.0113 15.1613 1880 0.0273 0.7707 0.8353 0.9916 0.9965 0.6741 nan 0.9916 0.5499 nan
0.0093 15.3226 1900 0.0269 0.7664 0.8087 0.9920 0.9978 0.6196 nan 0.9920 0.5407 nan
0.0119 15.4839 1920 0.0279 0.7623 0.7984 0.9920 0.9981 0.5986 nan 0.9920 0.5327 nan
0.0149 15.6452 1940 0.0270 0.7679 0.8138 0.9920 0.9976 0.6301 nan 0.9919 0.5439 nan
0.0128 15.8065 1960 0.0269 0.7700 0.8248 0.9919 0.9971 0.6524 nan 0.9918 0.5483 nan
0.0145 15.9677 1980 0.0273 0.7702 0.8212 0.9920 0.9973 0.6450 nan 0.9919 0.5485 nan
0.0101 16.1290 2000 0.0268 0.7713 0.8250 0.9919 0.9971 0.6530 nan 0.9919 0.5508 nan
0.0128 16.2903 2020 0.0276 0.7685 0.8195 0.9919 0.9973 0.6417 nan 0.9918 0.5451 nan
0.0097 16.4516 2040 0.0274 0.7675 0.8247 0.9917 0.9969 0.6525 nan 0.9916 0.5434 nan
0.0129 16.6129 2060 0.0274 0.7685 0.8196 0.9919 0.9973 0.6420 nan 0.9918 0.5452 nan
0.0122 16.7742 2080 0.0269 0.7707 0.8249 0.9919 0.9971 0.6528 nan 0.9918 0.5495 nan
0.0143 16.9355 2100 0.0273 0.7695 0.8264 0.9918 0.9970 0.6558 nan 0.9917 0.5473 nan
0.0091 17.0968 2120 0.0277 0.7699 0.8273 0.9918 0.9969 0.6577 nan 0.9917 0.5482 nan
0.0096 17.2581 2140 0.0268 0.7712 0.8267 0.9919 0.9970 0.6563 nan 0.9918 0.5505 nan
0.0094 17.4194 2160 0.0270 0.7707 0.8235 0.9919 0.9972 0.6499 nan 0.9919 0.5496 nan
0.0108 17.5806 2180 0.0270 0.7718 0.8240 0.9920 0.9972 0.6508 nan 0.9919 0.5517 nan
0.0123 17.7419 2200 0.0283 0.7646 0.8030 0.9921 0.9980 0.6079 nan 0.9920 0.5372 nan
0.0097 17.9032 2220 0.0268 0.7731 0.8300 0.9919 0.9970 0.6631 nan 0.9918 0.5544 nan
0.0103 18.0645 2240 0.0272 0.7725 0.8291 0.9919 0.9970 0.6612 nan 0.9918 0.5532 nan
0.01 18.2258 2260 0.0268 0.7717 0.8275 0.9919 0.9970 0.6579 nan 0.9918 0.5516 nan
0.0132 18.3871 2280 0.0270 0.7725 0.8441 0.9915 0.9962 0.6921 nan 0.9915 0.5536 nan
0.0086 18.5484 2300 0.0269 0.7716 0.8270 0.9919 0.9971 0.6569 nan 0.9918 0.5513 nan
0.011 18.7097 2320 0.0269 0.7737 0.8299 0.9920 0.9970 0.6628 nan 0.9919 0.5554 nan
0.0152 18.8710 2340 0.0270 0.7721 0.8241 0.9920 0.9973 0.6509 nan 0.9919 0.5523 nan
0.0103 19.0323 2360 0.0273 0.7719 0.8231 0.9920 0.9973 0.6489 nan 0.9919 0.5518 nan
0.0087 19.1935 2380 0.0272 0.7686 0.8116 0.9921 0.9977 0.6255 nan 0.9920 0.5452 nan
0.0108 19.3548 2400 0.0270 0.7726 0.8236 0.9921 0.9973 0.6500 nan 0.9920 0.5532 nan
0.0064 19.5161 2420 0.0274 0.7725 0.8220 0.9921 0.9974 0.6465 nan 0.9920 0.5529 nan
0.0135 19.6774 2440 0.0278 0.7658 0.8053 0.9921 0.9979 0.6127 nan 0.9920 0.5396 nan
0.0109 19.8387 2460 0.0274 0.7691 0.8116 0.9921 0.9978 0.6255 nan 0.9921 0.5461 nan
0.0078 20.0 2480 0.0273 0.7707 0.8197 0.9920 0.9974 0.6420 nan 0.9920 0.5494 nan
0.0087 20.1613 2500 0.0276 0.7677 0.8083 0.9921 0.9979 0.6187 nan 0.9921 0.5433 nan
0.0101 20.3226 2520 0.0276 0.7688 0.8111 0.9921 0.9978 0.6245 nan 0.9920 0.5456 nan
0.0126 20.4839 2540 0.0271 0.7721 0.8190 0.9921 0.9975 0.6404 nan 0.9921 0.5522 nan
0.0102 20.6452 2560 0.0276 0.7688 0.8133 0.9921 0.9976 0.6290 nan 0.9920 0.5455 nan
0.0112 20.8065 2580 0.0273 0.7732 0.8376 0.9917 0.9966 0.6786 nan 0.9917 0.5546 nan
0.0099 20.9677 2600 0.0272 0.7729 0.8360 0.9918 0.9966 0.6754 nan 0.9917 0.5541 nan
0.0103 21.1290 2620 0.0279 0.7686 0.8144 0.9920 0.9976 0.6313 nan 0.9919 0.5452 nan
0.0089 21.2903 2640 0.0273 0.7719 0.8269 0.9919 0.9971 0.6567 nan 0.9918 0.5520 nan
0.0094 21.4516 2660 0.0278 0.7687 0.8169 0.9920 0.9974 0.6364 nan 0.9919 0.5456 nan
0.0077 21.6129 2680 0.0276 0.7697 0.8199 0.9920 0.9973 0.6424 nan 0.9919 0.5476 nan
0.0106 21.7742 2700 0.0277 0.7688 0.8125 0.9921 0.9977 0.6272 nan 0.9920 0.5456 nan
0.0128 21.9355 2720 0.0277 0.7727 0.8277 0.9920 0.9971 0.6583 nan 0.9919 0.5535 nan
0.0098 22.0968 2740 0.0275 0.7723 0.8258 0.9920 0.9972 0.6544 nan 0.9919 0.5528 nan
0.0102 22.2581 2760 0.0279 0.7717 0.8249 0.9920 0.9972 0.6526 nan 0.9919 0.5515 nan
0.0093 22.4194 2780 0.0275 0.7733 0.8310 0.9919 0.9969 0.6650 nan 0.9918 0.5548 nan
0.0092 22.5806 2800 0.0277 0.7715 0.8219 0.9920 0.9973 0.6465 nan 0.9919 0.5510 nan
0.0131 22.7419 2820 0.0279 0.7704 0.8227 0.9919 0.9972 0.6482 nan 0.9919 0.5490 nan
0.0136 22.9032 2840 0.0276 0.7719 0.8236 0.9920 0.9973 0.6500 nan 0.9919 0.5519 nan
0.0108 23.0645 2860 0.0278 0.7709 0.8220 0.9920 0.9973 0.6467 nan 0.9919 0.5499 nan
0.0098 23.2258 2880 0.0289 0.7641 0.8033 0.9920 0.9979 0.6087 nan 0.9920 0.5363 nan
0.0103 23.3871 2900 0.0280 0.7697 0.8174 0.9920 0.9975 0.6373 nan 0.9919 0.5474 nan
0.0074 23.5484 2920 0.0280 0.7709 0.8204 0.9920 0.9974 0.6434 nan 0.9919 0.5498 nan
0.0133 23.7097 2940 0.0276 0.7729 0.8285 0.9920 0.9971 0.6599 nan 0.9919 0.5539 nan
0.0092 23.8710 2960 0.0274 0.7734 0.8317 0.9919 0.9969 0.6665 nan 0.9918 0.5551 nan
0.0088 24.0323 2980 0.0279 0.7723 0.8212 0.9921 0.9974 0.6450 nan 0.9920 0.5525 nan
0.0092 24.1935 3000 0.0282 0.7698 0.8136 0.9921 0.9977 0.6295 nan 0.9920 0.5475 nan
0.0088 24.3548 3020 0.0278 0.7716 0.8186 0.9921 0.9975 0.6397 nan 0.9920 0.5511 nan
0.0086 24.5161 3040 0.0279 0.7727 0.8252 0.9920 0.9972 0.6532 nan 0.9919 0.5535 nan
0.0091 24.6774 3060 0.0282 0.7707 0.8189 0.9920 0.9975 0.6404 nan 0.9920 0.5494 nan
0.0078 24.8387 3080 0.0278 0.7718 0.8223 0.9920 0.9973 0.6473 nan 0.9920 0.5516 nan
0.0081 25.0 3100 0.0281 0.7708 0.8206 0.9920 0.9974 0.6439 nan 0.9919 0.5497 nan
0.0075 25.1613 3120 0.0283 0.7707 0.8199 0.9920 0.9974 0.6423 nan 0.9919 0.5494 nan
0.0087 25.3226 3140 0.0276 0.7734 0.8296 0.9920 0.9970 0.6621 nan 0.9919 0.5548 nan
0.011 25.4839 3160 0.0284 0.7691 0.8131 0.9921 0.9977 0.6284 nan 0.9920 0.5462 nan
0.0106 25.6452 3180 0.0278 0.7729 0.8219 0.9921 0.9974 0.6463 nan 0.9920 0.5537 nan
0.0093 25.8065 3200 0.0277 0.7740 0.8285 0.9920 0.9971 0.6599 nan 0.9919 0.5561 nan
0.0118 25.9677 3220 0.0274 0.7748 0.8317 0.9920 0.9970 0.6664 nan 0.9919 0.5577 nan
0.0176 26.1290 3240 0.0280 0.7702 0.8149 0.9921 0.9977 0.6322 nan 0.9920 0.5484 nan
0.0101 26.2903 3260 0.0282 0.7718 0.8222 0.9920 0.9973 0.6471 nan 0.9920 0.5517 nan
0.0124 26.4516 3280 0.0281 0.7725 0.8232 0.9921 0.9973 0.6491 nan 0.9920 0.5531 nan
0.0069 26.6129 3300 0.0282 0.7716 0.8206 0.9921 0.9974 0.6438 nan 0.9920 0.5513 nan
0.0088 26.7742 3320 0.0282 0.7716 0.8207 0.9921 0.9974 0.6440 nan 0.9920 0.5512 nan
0.0119 26.9355 3340 0.0279 0.7729 0.8238 0.9921 0.9973 0.6503 nan 0.9920 0.5538 nan
0.0071 27.0968 3360 0.0278 0.7737 0.8274 0.9920 0.9972 0.6577 nan 0.9920 0.5555 nan
0.009 27.2581 3380 0.0282 0.7719 0.8201 0.9921 0.9975 0.6428 nan 0.9920 0.5519 nan
0.0089 27.4194 3400 0.0283 0.7720 0.8185 0.9921 0.9976 0.6394 nan 0.9921 0.5519 nan
0.008 27.5806 3420 0.0278 0.7749 0.8277 0.9921 0.9972 0.6581 nan 0.9920 0.5578 nan
0.0083 27.7419 3440 0.0279 0.7734 0.8216 0.9922 0.9975 0.6458 nan 0.9921 0.5547 nan
0.0085 27.9032 3460 0.0281 0.7725 0.8205 0.9921 0.9975 0.6436 nan 0.9920 0.5530 nan
0.0093 28.0645 3480 0.0282 0.7727 0.8217 0.9921 0.9974 0.6461 nan 0.9920 0.5533 nan
0.0096 28.2258 3500 0.0281 0.7741 0.8260 0.9921 0.9973 0.6547 nan 0.9920 0.5562 nan
0.0085 28.3871 3520 0.0280 0.7737 0.8238 0.9921 0.9974 0.6503 nan 0.9920 0.5554 nan
0.0099 28.5484 3540 0.0279 0.7729 0.8206 0.9921 0.9975 0.6437 nan 0.9921 0.5537 nan
0.0107 28.7097 3560 0.0280 0.7738 0.8247 0.9921 0.9973 0.6521 nan 0.9920 0.5556 nan
0.0093 28.8710 3580 0.0279 0.7733 0.8227 0.9921 0.9974 0.6480 nan 0.9920 0.5545 nan
0.0092 29.0323 3600 0.0281 0.7720 0.8192 0.9921 0.9975 0.6409 nan 0.9920 0.5520 nan
0.0086 29.1935 3620 0.0279 0.7732 0.8231 0.9921 0.9974 0.6488 nan 0.9920 0.5545 nan
0.0083 29.3548 3640 0.0280 0.7731 0.8220 0.9921 0.9974 0.6467 nan 0.9920 0.5541 nan
0.0083 29.5161 3660 0.0281 0.7726 0.8207 0.9921 0.9975 0.6439 nan 0.9920 0.5531 nan
0.0079 29.6774 3680 0.0281 0.7728 0.8218 0.9921 0.9974 0.6461 nan 0.9920 0.5536 nan
0.0083 29.8387 3700 0.0281 0.7731 0.8229 0.9921 0.9974 0.6484 nan 0.9920 0.5542 nan
0.0072 30.0 3720 0.0281 0.7734 0.8242 0.9921 0.9973 0.6511 nan 0.9920 0.5548 nan

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.1+cu121
  • Datasets 3.0.1
  • Tokenizers 0.19.1