
segformer-b1-improved

This model is a fine-tuned version of nvidia/mit-b1 on the samitizerxu/kelp_data dataset. It achieves the following results on the evaluation set:

  • Iou Kelp: 0.0
  • Loss: 0.0024
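
For reference, the kelp IoU is the intersection-over-union of the predicted kelp mask against the ground-truth mask. The sketch below (plain NumPy, not the trainer's actual metric implementation) also illustrates one reading of the numbers above: a model that predicts background everywhere scores an IoU of 0.0 on any image that contains kelp, even while the pixel-wise loss stays small, because kelp pixels are rare.

```python
import numpy as np

def kelp_iou(pred: np.ndarray, target: np.ndarray) -> float:
    """Binary IoU for the kelp class: |pred AND target| / |pred OR target|."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    union = np.logical_or(pred, target).sum()
    if union == 0:
        # Both masks empty: nothing to miss, define IoU as 1.0.
        return 1.0
    return float(np.logical_and(pred, target).sum() / union)

# An all-background prediction against a ground truth with one kelp pixel:
empty_pred = np.zeros((4, 4), dtype=bool)
sparse_gt = np.zeros((4, 4), dtype=bool)
sparse_gt[0, 0] = True
print(kelp_iou(empty_pred, sparse_gt))  # 0.0
```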

Model description

More information needed

Intended uses & limitations

More information needed
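
Since no usage example is documented, here is a minimal post-processing sketch. In practice the model would be loaded with the standard `transformers` API (`SegformerForSemanticSegmentation.from_pretrained("samitizerxu/segformer-b1-improved")`); to keep the sketch runnable offline, random logits of the shape SegFormer emits (batch, num_labels, H/4, W/4) stand in for real outputs, and treating label index 1 as kelp is an assumption, not documented behavior.

```python
import torch
import torch.nn.functional as F

# Stand-in for model(**inputs).logits; SegFormer predicts at 1/4 resolution.
logits = torch.randn(1, 2, 64, 64)  # 2 labels assumed: background, kelp

# Upsample logits back to the input resolution, then take per-pixel argmax.
upsampled = F.interpolate(logits, size=(256, 256),
                          mode="bilinear", align_corners=False)
mask = upsampled.argmax(dim=1)   # per-pixel class ids, shape (1, 256, 256)
kelp_mask = (mask == 1)          # assumes label 1 == kelp
print(kelp_mask.shape)           # torch.Size([1, 256, 256])
```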

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.2
  • num_epochs: 40
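
The optimizer and schedule above can be reproduced in plain PyTorch as a sketch. The tiny `Conv2d` stand-in replaces the real SegFormer network, and the step count is derived from the results table (11280 total steps over 40 epochs, i.e. 282 steps per epoch); the original run presumably used the Hugging Face `Trainer`, which wires up the same linear-warmup schedule internally.

```python
import torch

# Hypothetical stand-in network; the real run fine-tuned nvidia/mit-b1.
model = torch.nn.Conv2d(3, 2, kernel_size=1)

steps_per_epoch = 282                  # 11280 total steps / 40 epochs
num_epochs = 40
total_steps = steps_per_epoch * num_epochs
warmup_steps = int(0.2 * total_steps)  # lr_scheduler_warmup_ratio: 0.2

torch.manual_seed(42)                  # seed: 42
optimizer = torch.optim.Adam(model.parameters(), lr=2e-5,
                             betas=(0.9, 0.999), eps=1e-8)

# Linear schedule: ramp up over the warmup steps, then decay to zero.
def linear_with_warmup(step: int) -> float:
    if step < warmup_steps:
        return step / max(1, warmup_steps)
    return max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, linear_with_warmup)
```

Call `scheduler.step()` once per optimizer step (not per epoch) to advance the schedule.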

Training results

The trainer logged an evaluation every 30 steps; each row below lists the training loss, epoch, step, kelp IoU, and validation loss.

Training Loss Epoch Step Iou Kelp Validation Loss
0.0769 0.11 30 0.0 0.0650
0.075 0.21 60 0.0 0.0710
0.0701 0.32 90 0.0 0.0738
0.0704 0.43 120 0.0 0.0710
0.0669 0.53 150 0.0 0.0699
0.0694 0.64 180 0.0 0.0679
0.0635 0.74 210 0.0 0.0676
0.0584 0.85 240 0.0 0.0647
0.0552 0.96 270 0.0 0.0631
0.0525 1.06 300 0.0 0.0592
0.052 1.17 330 0.0 0.0540
0.0449 1.28 360 0.0 0.0517
0.0443 1.38 390 0.0 0.0459
0.0364 1.49 420 0.0 0.0422
0.0361 1.6 450 0.0 0.0374
0.0339 1.7 480 0.0 0.0352
0.0307 1.81 510 0.0 0.0319
0.0283 1.91 540 0.0 0.0283
0.0302 2.02 570 0.0 0.0275
0.0277 2.13 600 0.0 0.0252
0.0231 2.23 630 0.0 0.0223
0.0261 2.34 660 0.0 0.0219
0.025 2.45 690 0.0 0.0183
0.0262 2.55 720 0.0 0.0163
0.0182 2.66 750 0.0 0.0175
0.0186 2.77 780 0.0 0.0147
0.0156 2.87 810 0.0 0.0151
0.0171 2.98 840 0.0 0.0148
0.0164 3.09 870 0.0 0.0161
0.0155 3.19 900 0.0 0.0139
0.0127 3.3 930 0.0 0.0126
0.0111 3.4 960 0.0 0.0105
0.0139 3.51 990 0.0 0.0103
0.0113 3.62 1020 0.0 0.0096
0.0093 3.72 1050 0.0 0.0088
0.0099 3.83 1080 0.0 0.0096
0.0092 3.94 1110 0.0 0.0096
0.0095 4.04 1140 0.0 0.0083
0.0113 4.15 1170 0.0 0.0086
0.0084 4.26 1200 0.0 0.0073
0.0066 4.36 1230 0.0 0.0081
0.01 4.47 1260 0.0 0.0074
0.0062 4.57 1290 0.0 0.0067
0.0053 4.68 1320 0.0 0.0065
0.0073 4.79 1350 0.0 0.0058
0.0069 4.89 1380 0.0 0.0065
0.0063 5.0 1410 0.0 0.0060
0.0062 5.11 1440 0.0 0.0055
0.0069 5.21 1470 0.0 0.0058
0.0049 5.32 1500 0.0 0.0055
0.0047 5.43 1530 0.0 0.0050
0.0092 5.53 1560 0.0 0.0054
0.0041 5.64 1590 0.0 0.0049
0.0042 5.74 1620 0.0 0.0049
0.005 5.85 1650 0.0 0.0044
0.0036 5.96 1680 0.0 0.0047
0.0059 6.06 1710 0.0 0.0048
0.0078 6.17 1740 0.0 0.0045
0.0036 6.28 1770 0.0 0.0042
0.004 6.38 1800 0.0 0.0039
0.0038 6.49 1830 0.0 0.0041
0.0038 6.6 1860 0.0 0.0042
0.0028 6.7 1890 0.0 0.0042
0.0036 6.81 1920 0.0 0.0036
0.0032 6.91 1950 0.0 0.0037
0.0029 7.02 1980 0.0 0.0035
0.0022 7.13 2010 0.0 0.0036
0.0025 7.23 2040 0.0 0.0036
0.0024 7.34 2070 0.0 0.0035
0.0022 7.45 2100 0.0 0.0032
0.0031 7.55 2130 0.0 0.0033
0.0028 7.66 2160 0.0 0.0031
0.0022 7.77 2190 0.0 0.0032
0.0023 7.87 2220 0.0 0.0034
0.0021 7.98 2250 0.0 0.0033
0.0018 8.09 2280 0.0 0.0031
0.0021 8.19 2310 0.0 0.0032
0.0038 8.3 2340 0.0 0.0030
0.0016 8.4 2370 0.0 0.0031
0.0066 8.51 2400 0.0 0.0029
0.0015 8.62 2430 0.0 0.0030
0.0022 8.72 2460 0.0 0.0028
0.0022 8.83 2490 0.0 0.0028
0.0012 8.94 2520 0.0 0.0027
0.0017 9.04 2550 0.0 0.0028
0.0019 9.15 2580 0.0 0.0027
0.0014 9.26 2610 0.0 0.0028
0.0014 9.36 2640 0.0 0.0028
0.0013 9.47 2670 0.0 0.0027
0.0014 9.57 2700 0.0 0.0027
0.0015 9.68 2730 0.0 0.0028
0.0014 9.79 2760 0.0 0.0027
0.001 9.89 2790 0.0 0.0026
0.0013 10.0 2820 0.0 0.0026
0.0014 10.11 2850 0.0 0.0026
0.0014 10.21 2880 0.0 0.0026
0.0031 10.32 2910 0.0 0.0025
0.0013 10.43 2940 0.0 0.0026
0.0012 10.53 2970 0.0 0.0026
0.0011 10.64 3000 0.0 0.0025
0.0009 10.74 3030 0.0 0.0025
0.0016 10.85 3060 0.0 0.0025
0.0014 10.96 3090 0.0 0.0025
0.0008 11.06 3120 0.0 0.0025
0.0008 11.17 3150 0.0 0.0025
0.0019 11.28 3180 0.0 0.0026
0.001 11.38 3210 0.0 0.0025
0.0009 11.49 3240 0.0 0.0025
0.0014 11.6 3270 0.0 0.0026
0.0014 11.7 3300 0.0 0.0024
0.0008 11.81 3330 0.0 0.0025
0.001 11.91 3360 0.0 0.0024
0.0012 12.02 3390 0.0 0.0024
0.0009 12.13 3420 0.0 0.0024
0.0007 12.23 3450 0.0 0.0025
0.0011 12.34 3480 0.0 0.0024
0.0012 12.45 3510 0.0 0.0023
0.0008 12.55 3540 0.0 0.0023
0.0005 12.66 3570 0.0 0.0024
0.0007 12.77 3600 0.0 0.0023
0.0006 12.87 3630 0.0 0.0024
0.0006 12.98 3660 0.0 0.0024
0.0014 13.09 3690 0.0 0.0025
0.0008 13.19 3720 0.0 0.0023
0.0007 13.3 3750 0.0 0.0022
0.0005 13.4 3780 0.0 0.0022
0.0006 13.51 3810 0.0 0.0022
0.0005 13.62 3840 0.0 0.0023
0.0007 13.72 3870 0.0 0.0022
0.0009 13.83 3900 0.0 0.0023
0.0005 13.94 3930 0.0 0.0022
0.0006 14.04 3960 0.0 0.0023
0.0005 14.15 3990 0.0 0.0023
0.0005 14.26 4020 0.0 0.0023
0.0008 14.36 4050 0.0 0.0022
0.0006 14.47 4080 0.0 0.0022
0.001 14.57 4110 0.0 0.0022
0.0004 14.68 4140 0.0 0.0021
0.0005 14.79 4170 0.0 0.0022
0.0006 14.89 4200 0.0 0.0023
0.0006 15.0 4230 0.0 0.0023
0.0007 15.11 4260 0.0 0.0022
0.0006 15.21 4290 0.0 0.0022
0.0005 15.32 4320 0.0 0.0021
0.0004 15.43 4350 0.0 0.0021
0.0008 15.53 4380 0.0 0.0021
0.0005 15.64 4410 0.0 0.0021
0.0005 15.74 4440 0.0 0.0022
0.0003 15.85 4470 0.0 0.0021
0.0005 15.96 4500 0.0 0.0021
0.0004 16.06 4530 0.0 0.0023
0.0004 16.17 4560 0.0 0.0022
0.0004 16.28 4590 0.0 0.0021
0.0005 16.38 4620 0.0 0.0020
0.0007 16.49 4650 0.0 0.0022
0.0005 16.6 4680 0.0 0.0021
0.0011 16.7 4710 0.0 0.0020
0.0005 16.81 4740 0.0 0.0020
0.0005 16.91 4770 0.0 0.0022
0.0004 17.02 4800 0.0 0.0021
0.0004 17.13 4830 0.0 0.0022
0.0004 17.23 4860 0.0 0.0021
0.0007 17.34 4890 0.0 0.0021
0.0006 17.45 4920 0.0 0.0021
0.0006 17.55 4950 0.0 0.0021
0.001 17.66 4980 0.0 0.0021
0.0003 17.77 5010 0.0 0.0020
0.0002 17.87 5040 0.0 0.0021
0.0005 17.98 5070 0.0 0.0021
0.0005 18.09 5100 0.0 0.0021
0.0004 18.19 5130 0.0 0.0021
0.0008 18.3 5160 0.0 0.0023
0.0004 18.4 5190 0.0 0.0020
0.0002 18.51 5220 0.0 0.0021
0.0003 18.62 5250 0.0 0.0020
0.0006 18.72 5280 0.0 0.0021
0.0004 18.83 5310 0.0 0.0021
0.0005 18.94 5340 0.0 0.0020
0.0004 19.04 5370 0.0 0.0020
0.0007 19.15 5400 0.0 0.0020
0.0005 19.26 5430 0.0 0.0021
0.0006 19.36 5460 0.0 0.0020
0.0004 19.47 5490 0.0 0.0021
0.0004 19.57 5520 0.0 0.0021
0.0002 19.68 5550 0.0 0.0021
0.0008 19.79 5580 0.0 0.0020
0.0006 19.89 5610 0.0 0.0024
0.0005 20.0 5640 0.0 0.0022
0.0002 20.11 5670 0.0 0.0021
0.0002 20.21 5700 0.0 0.0022
0.0002 20.32 5730 0.0 0.0022
0.0004 20.43 5760 0.0 0.0022
0.0002 20.53 5790 0.0 0.0023
0.0008 20.64 5820 0.0 0.0022
0.0002 20.74 5850 0.0 0.0023
0.0002 20.85 5880 0.0 0.0023
0.0021 20.96 5910 0.0 0.0023
0.0002 21.06 5940 0.0 0.0023
0.0002 21.17 5970 0.0 0.0023
0.0003 21.28 6000 0.0 0.0022
0.0002 21.38 6030 0.0 0.0023
0.0004 21.49 6060 0.0 0.0022
0.0002 21.6 6090 0.0 0.0023
0.0002 21.7 6120 0.0 0.0021
0.0007 21.81 6150 0.0 0.0022
0.0004 21.91 6180 0.0 0.0021
0.0002 22.02 6210 0.0 0.0021
0.0004 22.13 6240 0.0 0.0023
0.0004 22.23 6270 0.0 0.0023
0.0002 22.34 6300 0.0 0.0023
0.0002 22.45 6330 0.0 0.0024
0.0006 22.55 6360 0.0 0.0023
0.0002 22.66 6390 0.0 0.0021
0.0002 22.77 6420 0.0 0.0022
0.0002 22.87 6450 0.0 0.0023
0.0003 22.98 6480 0.0 0.0022
0.0004 23.09 6510 0.0 0.0021
0.0001 23.19 6540 0.0 0.0021
0.0004 23.3 6570 0.0 0.0023
0.0002 23.4 6600 0.0 0.0023
0.0002 23.51 6630 0.0 0.0021
0.0007 23.62 6660 0.0 0.0023
0.0001 23.72 6690 0.0 0.0022
0.0005 23.83 6720 0.0 0.0022
0.0003 23.94 6750 0.0 0.0022
0.0003 24.04 6780 0.0 0.0023
0.0013 24.15 6810 0.0 0.0022
0.0001 24.26 6840 0.0 0.0023
0.0003 24.36 6870 0.0 0.0022
0.0003 24.47 6900 0.0 0.0019
0.0001 24.57 6930 0.0 0.0023
0.0002 24.68 6960 0.0 0.0023
0.0002 24.79 6990 0.0 0.0021
0.0003 24.89 7020 0.0 0.0022
0.0003 25.0 7050 0.0 0.0022
0.0001 25.11 7080 0.0 0.0023
0.0002 25.21 7110 0.0 0.0022
0.0002 25.32 7140 0.0 0.0022
0.0002 25.43 7170 0.0 0.0022
0.0002 25.53 7200 0.0 0.0022
0.0003 25.64 7230 0.0 0.0023
0.0001 25.74 7260 0.0 0.0024
0.0001 25.85 7290 0.0 0.0023
0.0002 25.96 7320 0.0 0.0022
0.0002 26.06 7350 0.0 0.0021
0.0004 26.17 7380 0.0 0.0023
0.0002 26.28 7410 0.0 0.0023
0.0001 26.38 7440 0.0 0.0023
0.0002 26.49 7470 0.0 0.0020
0.0006 26.6 7500 0.0 0.0027
0.0002 26.7 7530 0.0 0.0022
0.0002 26.81 7560 0.0 0.0022
0.0006 26.91 7590 0.0 0.0020
0.0002 27.02 7620 0.0 0.0023
0.0004 27.13 7650 0.0 0.0022
0.0002 27.23 7680 0.0 0.0020
0.0004 27.34 7710 0.0 0.0021
0.0002 27.45 7740 0.0 0.0023
0.0001 27.55 7770 0.0 0.0023
0.0005 27.66 7800 0.0 0.0022
0.0002 27.77 7830 0.0 0.0023
0.0001 27.87 7860 0.0 0.0021
0.0001 27.98 7890 0.0 0.0022
0.0001 28.09 7920 0.0 0.0022
0.0002 28.19 7950 0.0 0.0021
0.0002 28.3 7980 0.0 0.0020
0.0001 28.4 8010 0.0 0.0023
0.0006 28.51 8040 0.0 0.0022
0.0001 28.62 8070 0.0 0.0023
0.0001 28.72 8100 0.0 0.0022
0.0001 28.83 8130 0.0 0.0021
0.0001 28.94 8160 0.0 0.0021
0.0002 29.04 8190 0.0 0.0023
0.0002 29.15 8220 0.0 0.0022
0.0001 29.26 8250 0.0 0.0020
0.0003 29.36 8280 0.0 0.0023
0.0001 29.47 8310 0.0 0.0022
0.0001 29.57 8340 0.0 0.0023
0.0002 29.68 8370 0.0 0.0023
0.0002 29.79 8400 0.0 0.0021
0.0002 29.89 8430 0.0 0.0023
0.0001 30.0 8460 0.0 0.0023
0.0001 30.11 8490 0.0 0.0025
0.0001 30.21 8520 0.0 0.0024
0.0001 30.32 8550 0.0 0.0024
0.0002 30.43 8580 0.0 0.0024
0.0003 30.53 8610 0.0 0.0024
0.0004 30.64 8640 0.0 0.0023
0.0003 30.74 8670 0.0 0.0021
0.0001 30.85 8700 0.0 0.0019
0.0001 30.96 8730 0.0 0.0021
0.0001 31.06 8760 0.0 0.0023
0.0004 31.17 8790 0.0 0.0022
0.0002 31.28 8820 0.0 0.0019
0.0001 31.38 8850 0.0 0.0023
0.0005 31.49 8880 0.0 0.0024
0.0001 31.6 8910 0.0 0.0024
0.0003 31.7 8940 0.0 0.0024
0.0005 31.81 8970 0.0 0.0024
0.0002 31.91 9000 0.0 0.0022
0.0003 32.02 9030 0.0 0.0023
0.0001 32.13 9060 0.0 0.0023
0.0003 32.23 9090 0.0 0.0023
0.0002 32.34 9120 0.0 0.0021
0.0002 32.45 9150 0.0 0.0023
0.0006 32.55 9180 0.0 0.0022
0.0006 32.66 9210 0.0 0.0023
0.0014 32.77 9240 0.0 0.0023
0.0001 32.87 9270 0.0 0.0022
0.0001 32.98 9300 0.0 0.0021
0.0001 33.09 9330 0.0 0.0021
0.0002 33.19 9360 0.0 0.0025
0.0001 33.3 9390 0.0 0.0024
0.0001 33.4 9420 0.0 0.0022
0.0002 33.51 9450 0.0 0.0023
0.002 33.62 9480 0.0 0.0024
0.0003 33.72 9510 0.0 0.0022
0.0003 33.83 9540 0.0 0.0023
0.0006 33.94 9570 0.0 0.0023
0.0001 34.04 9600 0.0 0.0024
0.0001 34.15 9630 0.0 0.0023
0.0002 34.26 9660 0.0 0.0023
0.0002 34.36 9690 0.0 0.0021
0.0001 34.47 9720 0.0 0.0023
0.0001 34.57 9750 0.0 0.0023
0.0003 34.68 9780 0.0 0.0024
0.0002 34.79 9810 0.0 0.0023
0.0002 34.89 9840 0.0 0.0023
0.0001 35.0 9870 0.0 0.0023
0.0003 35.11 9900 0.0 0.0024
0.0001 35.21 9930 0.0 0.0024
0.0002 35.32 9960 0.0 0.0022
0.0001 35.43 9990 0.0 0.0024
0.0002 35.53 10020 0.0 0.0023
0.0001 35.64 10050 0.0 0.0024
0.0001 35.74 10080 0.0 0.0024
0.0001 35.85 10110 0.0 0.0024
0.0001 35.96 10140 0.0 0.0022
0.0002 36.06 10170 0.0 0.0023
0.0002 36.17 10200 0.0 0.0025
0.0001 36.28 10230 0.0 0.0025
0.0001 36.38 10260 0.0 0.0027
0.0001 36.49 10290 0.0 0.0024
0.0001 36.6 10320 0.0 0.0024
0.0004 36.7 10350 0.0 0.0024
0.0002 36.81 10380 0.0 0.0023
0.0002 36.91 10410 0.0 0.0024
0.0003 37.02 10440 0.0 0.0024
0.0004 37.13 10470 0.0 0.0025
0.0001 37.23 10500 0.0 0.0024
0.0001 37.34 10530 0.0 0.0024
0.0001 37.45 10560 0.0 0.0025
0.0002 37.55 10590 0.0 0.0025
0.0002 37.66 10620 0.0 0.0024
0.0001 37.77 10650 0.0 0.0025
0.0005 37.87 10680 0.0 0.0024
0.0001 37.98 10710 0.0 0.0025
0.0 38.09 10740 0.0 0.0025
0.0001 38.19 10770 0.0 0.0026
0.0001 38.3 10800 0.0 0.0025
0.0001 38.4 10830 0.0 0.0025
0.0001 38.51 10860 0.0 0.0025
0.0002 38.62 10890 0.0 0.0024
0.0001 38.72 10920 0.0 0.0025
0.0 38.83 10950 0.0 0.0025
0.0001 38.94 10980 0.0 0.0024
0.0001 39.04 11010 0.0 0.0024
0.0003 39.15 11040 0.0 0.0024
0.0005 39.26 11070 0.0 0.0025
0.0 39.36 11100 0.0 0.0024
0.0001 39.47 11130 0.0 0.0024
0.0002 39.57 11160 0.0 0.0024
0.0004 39.68 11190 0.0 0.0024
0.0002 39.79 11220 0.0 0.0025
0.0003 39.89 11250 0.0 0.0024
0.0 40.0 11280 0.0 0.0024

Framework versions

  • Transformers 4.37.1
  • Pytorch 2.1.2
  • Datasets 2.16.1
  • Tokenizers 0.15.1

Model size

  • 13.7M parameters (safetensors, F32)