
segformer-b5-p142-cvat-vgs

This model is a fine-tuned version of nvidia/mit-b5 on the vigneshgs7/segformer_open_cv_RGB_L_0_1 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0131
  • Mean Iou: 0.4961
  • Mean Accuracy: 0.9922
  • Overall Accuracy: 0.9922
  • Accuracy Background: nan
  • Accuracy Object: 0.9922
  • Iou Background: 0.0
  • Iou Object: 0.9922
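
Note that the background class is degenerate in these numbers: its accuracy is nan and its IoU is 0.0, which typically indicates that background pixels were absent or ignored during evaluation, so the mean IoU is roughly half the object IoU.

For reference, a minimal inference sketch using the standard Transformers SegFormer API; the image path is a placeholder, and the post-processing follows the usual SegFormer recipe (logits come out at 1/4 of the input resolution and are upsampled before the argmax):

```python
import torch
from PIL import Image
from transformers import SegformerForSemanticSegmentation, SegformerImageProcessor

repo = "vigneshgs7/segformer-b5-p142-cvat-vgs"
processor = SegformerImageProcessor.from_pretrained(repo)
model = SegformerForSemanticSegmentation.from_pretrained(repo)
model.eval()

image = Image.open("example.png").convert("RGB")  # placeholder input path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Upsample the logits to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
mask = upsampled.argmax(dim=1)[0]  # (H, W) integer label map
```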

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
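
A minimal sketch of how these values map onto transformers.TrainingArguments; output_dir is an illustrative assumption, while every other value is taken verbatim from the list above:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b5-p142-cvat-vgs",  # assumed, not from the original run
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```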

Training results

Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Object | Iou Background | Iou Object
0.2847 0.06 20 0.3843 0.4662 0.9324 0.9324 nan 0.9324 0.0 0.9324
0.1681 0.11 40 0.1983 0.4704 0.9408 0.9408 nan 0.9408 0.0 0.9408
0.1592 0.17 60 0.1303 0.4745 0.9489 0.9489 nan 0.9489 0.0 0.9489
0.1177 0.23 80 0.0922 0.4944 0.9888 0.9888 nan 0.9888 0.0 0.9888
0.062 0.29 100 0.0745 0.4946 0.9892 0.9892 nan 0.9892 0.0 0.9892
0.0767 0.34 120 0.0545 0.4852 0.9703 0.9703 nan 0.9703 0.0 0.9703
0.0984 0.4 140 0.0621 0.4938 0.9875 0.9875 nan 0.9875 0.0 0.9875
0.1779 0.46 160 0.0504 0.4961 0.9921 0.9921 nan 0.9921 0.0 0.9921
0.0468 0.52 180 0.0407 0.4904 0.9807 0.9807 nan 0.9807 0.0 0.9807
0.0618 0.57 200 0.0390 0.4936 0.9873 0.9873 nan 0.9873 0.0 0.9873
0.062 0.63 220 0.0348 0.4947 0.9894 0.9894 nan 0.9894 0.0 0.9894
0.0357 0.69 240 0.0341 0.4914 0.9828 0.9828 nan 0.9828 0.0 0.9828
0.0304 0.74 260 0.0351 0.4960 0.9920 0.9920 nan 0.9920 0.0 0.9920
0.0267 0.8 280 0.0311 0.4938 0.9877 0.9877 nan 0.9877 0.0 0.9877
0.0536 0.86 300 0.0282 0.4904 0.9807 0.9807 nan 0.9807 0.0 0.9807
0.049 0.92 320 0.0274 0.4928 0.9855 0.9855 nan 0.9855 0.0 0.9855
0.0304 0.97 340 0.0262 0.4936 0.9872 0.9872 nan 0.9872 0.0 0.9872
0.0232 1.03 360 0.0251 0.4923 0.9847 0.9847 nan 0.9847 0.0 0.9847
0.0304 1.09 380 0.0240 0.4917 0.9835 0.9835 nan 0.9835 0.0 0.9835
0.0451 1.15 400 0.0261 0.4964 0.9927 0.9927 nan 0.9927 0.0 0.9927
0.0254 1.2 420 0.0234 0.4929 0.9859 0.9859 nan 0.9859 0.0 0.9859
0.0354 1.26 440 0.0229 0.4931 0.9861 0.9861 nan 0.9861 0.0 0.9861
0.2103 1.32 460 0.0224 0.4951 0.9902 0.9902 nan 0.9902 0.0 0.9902
0.041 1.38 480 0.0222 0.4920 0.9839 0.9839 nan 0.9839 0.0 0.9839
0.0297 1.43 500 0.0223 0.4950 0.9900 0.9900 nan 0.9900 0.0 0.9900
0.0299 1.49 520 0.0227 0.4961 0.9923 0.9923 nan 0.9923 0.0 0.9923
0.0213 1.55 540 0.0209 0.4947 0.9895 0.9895 nan 0.9895 0.0 0.9895
0.0269 1.6 560 0.0214 0.4909 0.9817 0.9817 nan 0.9817 0.0 0.9817
0.2199 1.66 580 0.0216 0.4956 0.9912 0.9912 nan 0.9912 0.0 0.9912
0.0191 1.72 600 0.0208 0.4935 0.9869 0.9869 nan 0.9869 0.0 0.9869
0.0265 1.78 620 0.0201 0.4941 0.9882 0.9882 nan 0.9882 0.0 0.9882
0.0244 1.83 640 0.0213 0.4910 0.9820 0.9820 nan 0.9820 0.0 0.9820
0.0172 1.89 660 0.0199 0.4929 0.9858 0.9858 nan 0.9858 0.0 0.9858
0.0339 1.95 680 0.0190 0.4930 0.9859 0.9859 nan 0.9859 0.0 0.9859
0.027 2.01 700 0.0192 0.4953 0.9906 0.9906 nan 0.9906 0.0 0.9906
0.0221 2.06 720 0.0195 0.4915 0.9830 0.9830 nan 0.9830 0.0 0.9830
0.0461 2.12 740 0.0188 0.4953 0.9905 0.9905 nan 0.9905 0.0 0.9905
0.0444 2.18 760 0.0189 0.4957 0.9914 0.9914 nan 0.9914 0.0 0.9914
0.0211 2.23 780 0.0184 0.4949 0.9898 0.9898 nan 0.9898 0.0 0.9898
0.0221 2.29 800 0.0186 0.4963 0.9925 0.9925 nan 0.9925 0.0 0.9925
0.0165 2.35 820 0.0181 0.4942 0.9883 0.9883 nan 0.9883 0.0 0.9883
0.0171 2.41 840 0.0181 0.4923 0.9846 0.9846 nan 0.9846 0.0 0.9846
0.0202 2.46 860 0.0178 0.4958 0.9915 0.9915 nan 0.9915 0.0 0.9915
0.0222 2.52 880 0.0178 0.4922 0.9844 0.9844 nan 0.9844 0.0 0.9844
0.018 2.58 900 0.0162 0.4949 0.9898 0.9898 nan 0.9898 0.0 0.9898
0.0288 2.64 920 0.0168 0.4943 0.9887 0.9887 nan 0.9887 0.0 0.9887
0.016 2.69 940 0.0178 0.4968 0.9936 0.9936 nan 0.9936 0.0 0.9936
0.0184 2.75 960 0.0172 0.4935 0.9870 0.9870 nan 0.9870 0.0 0.9870
0.0172 2.81 980 0.0175 0.4950 0.9900 0.9900 nan 0.9900 0.0 0.9900
0.0168 2.87 1000 0.0172 0.4951 0.9902 0.9902 nan 0.9902 0.0 0.9902
0.0197 2.92 1020 0.0169 0.4961 0.9923 0.9923 nan 0.9923 0.0 0.9923
0.0177 2.98 1040 0.0170 0.4961 0.9922 0.9922 nan 0.9922 0.0 0.9922
0.0377 3.04 1060 0.0163 0.4944 0.9888 0.9888 nan 0.9888 0.0 0.9888
0.0168 3.09 1080 0.0162 0.4953 0.9906 0.9906 nan 0.9906 0.0 0.9906
0.0167 3.15 1100 0.0166 0.4961 0.9922 0.9922 nan 0.9922 0.0 0.9922
0.0213 3.21 1120 0.0164 0.4948 0.9895 0.9895 nan 0.9895 0.0 0.9895
0.0195 3.27 1140 0.0162 0.4947 0.9894 0.9894 nan 0.9894 0.0 0.9894
0.014 3.32 1160 0.0160 0.4950 0.9900 0.9900 nan 0.9900 0.0 0.9900
0.0221 3.38 1180 0.0164 0.4961 0.9922 0.9922 nan 0.9922 0.0 0.9922
0.0162 3.44 1200 0.0159 0.4945 0.9890 0.9890 nan 0.9890 0.0 0.9890
0.0153 3.5 1220 0.0152 0.4957 0.9914 0.9914 nan 0.9914 0.0 0.9914
0.0145 3.55 1240 0.0161 0.4935 0.9871 0.9871 nan 0.9871 0.0 0.9871
0.0139 3.61 1260 0.0155 0.4951 0.9902 0.9902 nan 0.9902 0.0 0.9902
0.0153 3.67 1280 0.0157 0.4942 0.9884 0.9884 nan 0.9884 0.0 0.9884
0.0156 3.72 1300 0.0157 0.4949 0.9898 0.9898 nan 0.9898 0.0 0.9898
0.033 3.78 1320 0.0157 0.4952 0.9903 0.9903 nan 0.9903 0.0 0.9903
0.0219 3.84 1340 0.0153 0.4957 0.9915 0.9915 nan 0.9915 0.0 0.9915
0.0166 3.9 1360 0.0162 0.4935 0.9871 0.9871 nan 0.9871 0.0 0.9871
0.0168 3.95 1380 0.0157 0.4949 0.9897 0.9897 nan 0.9897 0.0 0.9897
0.0177 4.01 1400 0.0153 0.4966 0.9932 0.9932 nan 0.9932 0.0 0.9932
0.0136 4.07 1420 0.0150 0.4952 0.9905 0.9905 nan 0.9905 0.0 0.9905
0.0334 4.13 1440 0.0156 0.4956 0.9912 0.9912 nan 0.9912 0.0 0.9912
0.019 4.18 1460 0.0154 0.4950 0.9899 0.9899 nan 0.9899 0.0 0.9899
0.0147 4.24 1480 0.0148 0.4960 0.9920 0.9920 nan 0.9920 0.0 0.9920
0.0135 4.3 1500 0.0146 0.4951 0.9902 0.9902 nan 0.9902 0.0 0.9902
0.0186 4.36 1520 0.0143 0.4966 0.9933 0.9933 nan 0.9933 0.0 0.9933
0.0153 4.41 1540 0.0141 0.4954 0.9909 0.9909 nan 0.9909 0.0 0.9909
0.0181 4.47 1560 0.0145 0.4954 0.9908 0.9908 nan 0.9908 0.0 0.9908
0.0266 4.53 1580 0.0146 0.4953 0.9907 0.9907 nan 0.9907 0.0 0.9907
0.0141 4.58 1600 0.0147 0.4952 0.9904 0.9904 nan 0.9904 0.0 0.9904
0.0145 4.64 1620 0.0150 0.4947 0.9894 0.9894 nan 0.9894 0.0 0.9894
0.0128 4.7 1640 0.0151 0.4964 0.9928 0.9928 nan 0.9928 0.0 0.9928
0.0119 4.76 1660 0.0143 0.4948 0.9897 0.9897 nan 0.9897 0.0 0.9897
0.0133 4.81 1680 0.0144 0.4950 0.9900 0.9900 nan 0.9900 0.0 0.9900
0.0151 4.87 1700 0.0143 0.4956 0.9911 0.9911 nan 0.9911 0.0 0.9911
0.0211 4.93 1720 0.0149 0.4965 0.9930 0.9930 nan 0.9930 0.0 0.9930
0.0136 4.99 1740 0.0144 0.4964 0.9928 0.9928 nan 0.9928 0.0 0.9928
0.0129 5.04 1760 0.0142 0.4967 0.9934 0.9934 nan 0.9934 0.0 0.9934
0.0176 5.1 1780 0.0142 0.4965 0.9930 0.9930 nan 0.9930 0.0 0.9930
0.0119 5.16 1800 0.0141 0.4958 0.9916 0.9916 nan 0.9916 0.0 0.9916
0.021 5.21 1820 0.0143 0.4960 0.9920 0.9920 nan 0.9920 0.0 0.9920
0.0146 5.27 1840 0.0137 0.4961 0.9922 0.9922 nan 0.9922 0.0 0.9922
0.0158 5.33 1860 0.0138 0.4953 0.9905 0.9905 nan 0.9905 0.0 0.9905
0.014 5.39 1880 0.0142 0.4956 0.9913 0.9913 nan 0.9913 0.0 0.9913
0.0145 5.44 1900 0.0145 0.4952 0.9905 0.9905 nan 0.9905 0.0 0.9905
0.019 5.5 1920 0.0145 0.4960 0.9920 0.9920 nan 0.9920 0.0 0.9920
0.0134 5.56 1940 0.0143 0.4958 0.9915 0.9915 nan 0.9915 0.0 0.9915
0.011 5.62 1960 0.0141 0.4955 0.9910 0.9910 nan 0.9910 0.0 0.9910
0.0159 5.67 1980 0.0143 0.4971 0.9942 0.9942 nan 0.9942 0.0 0.9942
0.0132 5.73 2000 0.0140 0.4966 0.9933 0.9933 nan 0.9933 0.0 0.9933
0.017 5.79 2020 0.0136 0.4964 0.9928 0.9928 nan 0.9928 0.0 0.9928
0.0156 5.85 2040 0.0139 0.4951 0.9902 0.9902 nan 0.9902 0.0 0.9902
0.0169 5.9 2060 0.0142 0.4943 0.9887 0.9887 nan 0.9887 0.0 0.9887
0.0337 5.96 2080 0.0145 0.4967 0.9933 0.9933 nan 0.9933 0.0 0.9933
0.0158 6.02 2100 0.0141 0.4949 0.9898 0.9898 nan 0.9898 0.0 0.9898
0.0401 6.07 2120 0.0139 0.4956 0.9912 0.9912 nan 0.9912 0.0 0.9912
0.0629 6.13 2140 0.0138 0.4952 0.9904 0.9904 nan 0.9904 0.0 0.9904
0.0143 6.19 2160 0.0142 0.4967 0.9935 0.9935 nan 0.9935 0.0 0.9935
0.0133 6.25 2180 0.0135 0.4957 0.9915 0.9915 nan 0.9915 0.0 0.9915
0.0326 6.3 2200 0.0139 0.4963 0.9925 0.9925 nan 0.9925 0.0 0.9925
0.0141 6.36 2220 0.0133 0.4955 0.9910 0.9910 nan 0.9910 0.0 0.9910
0.0119 6.42 2240 0.0134 0.4958 0.9915 0.9915 nan 0.9915 0.0 0.9915
0.0133 6.48 2260 0.0139 0.4962 0.9924 0.9924 nan 0.9924 0.0 0.9924
0.0123 6.53 2280 0.0138 0.4967 0.9934 0.9934 nan 0.9934 0.0 0.9934
0.014 6.59 2300 0.0138 0.4962 0.9925 0.9925 nan 0.9925 0.0 0.9925
0.0137 6.65 2320 0.0136 0.4958 0.9916 0.9916 nan 0.9916 0.0 0.9916
0.0173 6.7 2340 0.0138 0.4964 0.9928 0.9928 nan 0.9928 0.0 0.9928
0.0137 6.76 2360 0.0136 0.4953 0.9905 0.9905 nan 0.9905 0.0 0.9905
0.0153 6.82 2380 0.0134 0.4958 0.9916 0.9916 nan 0.9916 0.0 0.9916
0.0135 6.88 2400 0.0137 0.4963 0.9926 0.9926 nan 0.9926 0.0 0.9926
0.0151 6.93 2420 0.0137 0.4952 0.9904 0.9904 nan 0.9904 0.0 0.9904
0.0122 6.99 2440 0.0134 0.4959 0.9918 0.9918 nan 0.9918 0.0 0.9918
0.013 7.05 2460 0.0135 0.4970 0.9941 0.9941 nan 0.9941 0.0 0.9941
0.0134 7.11 2480 0.0133 0.4964 0.9928 0.9928 nan 0.9928 0.0 0.9928
0.0145 7.16 2500 0.0134 0.4962 0.9924 0.9924 nan 0.9924 0.0 0.9924
0.028 7.22 2520 0.0135 0.4962 0.9924 0.9924 nan 0.9924 0.0 0.9924
0.0288 7.28 2540 0.0137 0.4967 0.9933 0.9933 nan 0.9933 0.0 0.9933
0.0117 7.34 2560 0.0135 0.4964 0.9927 0.9927 nan 0.9927 0.0 0.9927
0.013 7.39 2580 0.0136 0.4966 0.9932 0.9932 nan 0.9932 0.0 0.9932
0.0158 7.45 2600 0.0134 0.4950 0.9899 0.9899 nan 0.9899 0.0 0.9899
0.0135 7.51 2620 0.0134 0.4964 0.9928 0.9928 nan 0.9928 0.0 0.9928
0.0136 7.56 2640 0.0140 0.4967 0.9935 0.9935 nan 0.9935 0.0 0.9935
0.0396 7.62 2660 0.0133 0.4961 0.9922 0.9922 nan 0.9922 0.0 0.9922
0.0109 7.68 2680 0.0134 0.4963 0.9925 0.9925 nan 0.9925 0.0 0.9925
0.0148 7.74 2700 0.0133 0.4963 0.9925 0.9925 nan 0.9925 0.0 0.9925
0.0121 7.79 2720 0.0140 0.4945 0.9890 0.9890 nan 0.9890 0.0 0.9890
0.0109 7.85 2740 0.0139 0.4957 0.9913 0.9913 nan 0.9913 0.0 0.9913
0.014 7.91 2760 0.0135 0.4957 0.9915 0.9915 nan 0.9915 0.0 0.9915
0.0199 7.97 2780 0.0134 0.4959 0.9917 0.9917 nan 0.9917 0.0 0.9917
0.0119 8.02 2800 0.0136 0.4958 0.9916 0.9916 nan 0.9916 0.0 0.9916
0.0129 8.08 2820 0.0136 0.4962 0.9924 0.9924 nan 0.9924 0.0 0.9924
0.0108 8.14 2840 0.0134 0.4959 0.9917 0.9917 nan 0.9917 0.0 0.9917
0.0209 8.19 2860 0.0136 0.4960 0.9920 0.9920 nan 0.9920 0.0 0.9920
0.0154 8.25 2880 0.0137 0.4964 0.9928 0.9928 nan 0.9928 0.0 0.9928
0.0141 8.31 2900 0.0132 0.4965 0.9929 0.9929 nan 0.9929 0.0 0.9929
0.0187 8.37 2920 0.0131 0.4956 0.9912 0.9912 nan 0.9912 0.0 0.9912
0.0124 8.42 2940 0.0133 0.4959 0.9918 0.9918 nan 0.9918 0.0 0.9918
0.0135 8.48 2960 0.0132 0.4963 0.9926 0.9926 nan 0.9926 0.0 0.9926
0.0283 8.54 2980 0.0131 0.4958 0.9917 0.9917 nan 0.9917 0.0 0.9917
0.0691 8.6 3000 0.0131 0.4965 0.9930 0.9930 nan 0.9930 0.0 0.9930
0.0142 8.65 3020 0.0131 0.4965 0.9929 0.9929 nan 0.9929 0.0 0.9929
0.0155 8.71 3040 0.0130 0.4966 0.9931 0.9931 nan 0.9931 0.0 0.9931
0.0115 8.77 3060 0.0129 0.4966 0.9932 0.9932 nan 0.9932 0.0 0.9932
0.0095 8.83 3080 0.0130 0.4963 0.9927 0.9927 nan 0.9927 0.0 0.9927
0.012 8.88 3100 0.0132 0.4954 0.9907 0.9907 nan 0.9907 0.0 0.9907
0.0153 8.94 3120 0.0132 0.4965 0.9930 0.9930 nan 0.9930 0.0 0.9930
0.0141 9.0 3140 0.0134 0.4958 0.9917 0.9917 nan 0.9917 0.0 0.9917
0.0141 9.05 3160 0.0133 0.4958 0.9915 0.9915 nan 0.9915 0.0 0.9915
0.016 9.11 3180 0.0133 0.4964 0.9929 0.9929 nan 0.9929 0.0 0.9929
0.017 9.17 3200 0.0132 0.4965 0.9929 0.9929 nan 0.9929 0.0 0.9929
0.0245 9.23 3220 0.0132 0.4961 0.9921 0.9921 nan 0.9921 0.0 0.9921
0.0101 9.28 3240 0.0132 0.4962 0.9924 0.9924 nan 0.9924 0.0 0.9924
0.012 9.34 3260 0.0133 0.4959 0.9917 0.9917 nan 0.9917 0.0 0.9917
0.0111 9.4 3280 0.0133 0.4964 0.9928 0.9928 nan 0.9928 0.0 0.9928
0.0148 9.46 3300 0.0132 0.4962 0.9925 0.9925 nan 0.9925 0.0 0.9925
0.0124 9.51 3320 0.0135 0.4967 0.9934 0.9934 nan 0.9934 0.0 0.9934
0.0209 9.57 3340 0.0133 0.4963 0.9926 0.9926 nan 0.9926 0.0 0.9926
0.0134 9.63 3360 0.0132 0.4960 0.9920 0.9920 nan 0.9920 0.0 0.9920
0.0146 9.68 3380 0.0132 0.4958 0.9916 0.9916 nan 0.9916 0.0 0.9916
0.0217 9.74 3400 0.0132 0.4961 0.9923 0.9923 nan 0.9923 0.0 0.9923
0.0142 9.8 3420 0.0131 0.4961 0.9923 0.9923 nan 0.9923 0.0 0.9923
0.0134 9.86 3440 0.0131 0.4959 0.9918 0.9918 nan 0.9918 0.0 0.9918
0.0131 9.91 3460 0.0131 0.4960 0.9920 0.9920 nan 0.9920 0.0 0.9920
0.0136 9.97 3480 0.0131 0.4961 0.9922 0.9922 nan 0.9922 0.0 0.9922
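
The per-class columns above follow the Hugging Face mean_iou semantic-segmentation metric. A sketch of how such numbers can be reproduced with the evaluate library; the synthetic masks, num_labels=2, and ignore_index=255 are assumptions for illustration, not values confirmed by the original run:

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Synthetic masks for illustration: a 100x100 "object" square, with the
# prediction shifted by 10 pixels so the object IoU is below 1.
true_mask = np.zeros((512, 512), dtype=np.int64)
pred_mask = np.zeros((512, 512), dtype=np.int64)
true_mask[100:200, 100:200] = 1
pred_mask[110:210, 100:200] = 1

results = metric.compute(
    predictions=[pred_mask],
    references=[true_mask],
    num_labels=2,        # background + object (assumed label set)
    ignore_index=255,    # assumed id for unlabeled pixels
    reduce_labels=False,
)
print(results["mean_iou"], results["per_category_iou"])
```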

Framework versions

  • Transformers 4.35.0
  • Pytorch 2.2.2
  • Datasets 2.14.6
  • Tokenizers 0.14.1
