
diff_based_error_tagger

This model is a fine-tuned version of csebuetnlp/banglabert on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0009
  • 5 Err Precision: 0.9667
  • 5 Err Recall: 1.0
  • 5 Err F1: 0.9831
  • 5 Err Number: 29
  • Precision: 0.9922
  • Recall: 0.9916
  • F1: 0.9919
  • Number: 9932
  • Err Precision: 0.9695
  • Err Recall: 1.0
  • Err F1: 0.9845
  • Err Number: 286
  • Egin Err Precision: 0.9938
  • Egin Err Recall: 0.9964
  • Egin Err F1: 0.9951
  • Egin Err Number: 1126
  • El Err Precision: 0.9957
  • El Err Recall: 0.9942
  • El Err F1: 0.9949
  • El Err Number: 1384
  • Nd Err Precision: 0.9932
  • Nd Err Recall: 0.9941
  • Nd Err F1: 0.9937
  • Nd Err Number: 1183
  • Ne Word Err Precision: 0.9978
  • Ne Word Err Recall: 0.9942
  • Ne Word Err F1: 0.9960
  • Ne Word Err Number: 8248
  • Unc Insert Err Precision: 0.9956
  • Unc Insert Err Recall: 0.9978
  • Unc Insert Err F1: 0.9967
  • Unc Insert Err Number: 903
  • Micro Avg Precision: 0.9944
  • Micro Avg Recall: 0.9934
  • Micro Avg F1: 0.9939
  • Micro Avg Number: 23091
  • Macro Avg Precision: 0.9881
  • Macro Avg Recall: 0.9960
  • Macro Avg F1: 0.9920
  • Macro Avg Number: 23091
  • Weighted Avg Precision: 0.9944
  • Weighted Avg Recall: 0.9934
  • Weighted Avg F1: 0.9939
  • Weighted Avg Number: 23091
  • Overall Accuracy: 0.9994

Model description

More information needed

Intended uses & limitations

More information needed
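
The evaluation metrics above are per-tag precision/recall/F1 scores, which indicates a token-classification (sequence-tagging) checkpoint. Below is a minimal loading sketch using the Transformers pipeline API; the checkpoint path is a placeholder, not a confirmed Hub id, and the example sentence is only illustrative:

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

# Placeholder path: substitute the actual Hub repo id or a local checkpoint directory.
checkpoint = "path/to/diff_based_error_tagger"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForTokenClassification.from_pretrained(checkpoint)

# aggregation_strategy="simple" merges sub-word pieces into word-level tags.
tagger = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

# Returns a list of dicts: {"entity_group", "score", "word", "start", "end"}.
for span in tagger("এখানে একটি বাংলা বাক্য লিখুন।"):
    print(span["entity_group"], span["word"], round(span["score"], 4))
```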

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 40.0
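
For reference, here is a minimal sketch of how these values map onto `transformers.TrainingArguments`, assuming the standard Trainer API was used; the output directory is a placeholder and everything not listed above is an assumption:

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above. The output_dir and any
# data/model wiring are assumptions, not taken from the original card.
training_args = TrainingArguments(
    output_dir="diff_based_error_tagger",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=2,  # effective train batch size: 16 * 2 = 32
    num_train_epochs=40.0,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```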

Training results

| Training Loss | Epoch | Step | Validation Loss | 5 Err Precision | 5 Err Recall | 5 Err F1 | 5 Err Number | Precision | Recall | F1 | Number | Err Precision | Err Recall | Err F1 | Err Number | Egin Err Precision | Egin Err Recall | Egin Err F1 | Egin Err Number | El Err Precision | El Err Recall | El Err F1 | El Err Number | Nd Err Precision | Nd Err Recall | Nd Err F1 | Nd Err Number | Ne Word Err Precision | Ne Word Err Recall | Ne Word Err F1 | Ne Word Err Number | Unc Insert Err Precision | Unc Insert Err Recall | Unc Insert Err F1 | Unc Insert Err Number | Micro Avg Precision | Micro Avg Recall | Micro Avg F1 | Micro Avg Number | Macro Avg Precision | Macro Avg Recall | Macro Avg F1 | Macro Avg Number | Weighted Avg Precision | Weighted Avg Recall | Weighted Avg F1 | Weighted Avg Number | Overall Accuracy |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 0.8517 | 1.0 | 575 | 0.2995 | 0.0 | 0.0 | 0.0 | 29 | 0.2579 | 0.0859 | 0.1289 | 9932 | 0.0 | 0.0 | 0.0 | 286 | 0.0 | 0.0 | 0.0 | 1126 | 0.0 | 0.0 | 0.0 | 1384 | 0.0 | 0.0 | 0.0 | 1183 | 0.6479 | 0.2836 | 0.3945 | 8248 | 0.0 | 0.0 | 0.0 | 903 | 0.4615 | 0.1382 | 0.2127 | 23091 | 0.1132 | 0.0462 | 0.0654 | 23091 | 0.3424 | 0.1382 | 0.1963 | 23091 | 0.9291 |
| 0.2635 | 2.0 | 1150 | 0.2054 | 0.0 | 0.0 | 0.0 | 29 | 0.3578 | 0.2270 | 0.2778 | 9932 | 0.0 | 0.0 | 0.0 | 286 | 0.8148 | 0.0195 | 0.0382 | 1126 | 0.8571 | 0.0650 | 0.1209 | 1384 | 0.7344 | 0.2384 | 0.3599 | 1183 | 0.6604 | 0.5507 | 0.6006 | 8248 | 0.0 | 0.0 | 0.0 | 903 | 0.5250 | 0.3114 | 0.3910 | 23091 | 0.4281 | 0.1376 | 0.1747 | 23091 | 0.5185 | 0.3114 | 0.3616 | 23091 | 0.9422 |
| 0.2027 | 3.0 | 1725 | 0.1546 | 0.0 | 0.0 | 0.0 | 29 | 0.4421 | 0.3362 | 0.3819 | 9932 | 0.0 | 0.0 | 0.0 | 286 | 0.6649 | 0.5639 | 0.6103 | 1126 | 0.8 | 0.2168 | 0.3411 | 1384 | 0.6406 | 0.5740 | 0.6054 | 1183 | 0.7271 | 0.6519 | 0.6875 | 8248 | 0.6094 | 0.0864 | 0.1513 | 903 | 0.5959 | 0.4507 | 0.5133 | 23091 | 0.4855 | 0.3036 | 0.3472 | 23091 | 0.5869 | 0.4507 | 0.4970 | 23091 | 0.9526 |
| 0.1655 | 4.0 | 2300 | 0.1197 | 0.0 | 0.0 | 0.0 | 29 | 0.5546 | 0.4326 | 0.4861 | 9932 | 1.0 | 0.0315 | 0.0610 | 286 | 0.8037 | 0.6874 | 0.7410 | 1126 | 0.8553 | 0.3374 | 0.4839 | 1384 | 0.8139 | 0.6653 | 0.7321 | 1183 | 0.8075 | 0.7180 | 0.7601 | 8248 | 0.6805 | 0.1816 | 0.2867 | 903 | 0.6974 | 0.5379 | 0.6073 | 23091 | 0.6894 | 0.3817 | 0.4439 | 23091 | 0.6981 | 0.5379 | 0.5952 | 23091 | 0.9636 |
| 0.1321 | 5.0 | 2875 | 0.0841 | 0.0 | 0.0 | 0.0 | 29 | 0.6868 | 0.6291 | 0.6567 | 9932 | 0.8431 | 0.1503 | 0.2552 | 286 | 0.8635 | 0.7869 | 0.8234 | 1126 | 0.7739 | 0.7197 | 0.7458 | 1384 | 0.8690 | 0.7625 | 0.8122 | 1183 | 0.8646 | 0.8339 | 0.8490 | 8248 | 0.6517 | 0.3378 | 0.4449 | 903 | 0.7771 | 0.7041 | 0.7388 | 23091 | 0.6941 | 0.5275 | 0.5734 | 23091 | 0.7732 | 0.7041 | 0.7327 | 23091 | 0.9756 |
| 0.0998 | 6.0 | 3450 | 0.0578 | 0.0 | 0.0 | 0.0 | 29 | 0.8054 | 0.7739 | 0.7893 | 9932 | 0.8182 | 0.2832 | 0.4208 | 286 | 0.8971 | 0.8597 | 0.8780 | 1126 | 0.8313 | 0.8259 | 0.8286 | 1384 | 0.8867 | 0.8335 | 0.8593 | 1183 | 0.9104 | 0.9089 | 0.9097 | 8248 | 0.7280 | 0.6224 | 0.6710 | 903 | 0.8518 | 0.8195 | 0.8353 | 23091 | 0.7346 | 0.6384 | 0.6696 | 23091 | 0.8492 | 0.8195 | 0.8324 | 23091 | 0.9845 |
| 0.0768 | 7.0 | 4025 | 0.0410 | 0.0 | 0.0 | 0.0 | 29 | 0.8790 | 0.8519 | 0.8652 | 9932 | 0.8889 | 0.3916 | 0.5437 | 286 | 0.9032 | 0.9032 | 0.9032 | 1126 | 0.9332 | 0.8374 | 0.8827 | 1384 | 0.8774 | 0.8588 | 0.8680 | 1183 | 0.9427 | 0.9377 | 0.9402 | 8248 | 0.7850 | 0.7885 | 0.7867 | 903 | 0.9027 | 0.8753 | 0.8888 | 23091 | 0.7762 | 0.6961 | 0.7237 | 23091 | 0.9014 | 0.8753 | 0.8869 | 23091 | 0.9897 |
| 0.0601 | 8.0 | 4600 | 0.0294 | 0.0 | 0.0 | 0.0 | 29 | 0.9161 | 0.8936 | 0.9047 | 9932 | 0.8775 | 0.6259 | 0.7306 | 286 | 0.9336 | 0.9245 | 0.9290 | 1126 | 0.9555 | 0.8526 | 0.9011 | 1384 | 0.9115 | 0.8791 | 0.8950 | 1183 | 0.9606 | 0.9630 | 0.9618 | 8248 | 0.8757 | 0.8505 | 0.8629 | 903 | 0.9333 | 0.9106 | 0.9218 | 23091 | 0.8038 | 0.7487 | 0.7731 | 23091 | 0.9317 | 0.9106 | 0.9206 | 23091 | 0.9928 |
| 0.0465 | 9.0 | 5175 | 0.0233 | 0.0 | 0.0 | 0.0 | 29 | 0.9420 | 0.9258 | 0.9338 | 9932 | 0.8583 | 0.7413 | 0.7955 | 286 | 0.9158 | 0.9369 | 0.9263 | 1126 | 0.9421 | 0.9162 | 0.9289 | 1384 | 0.8985 | 0.8977 | 0.8981 | 1183 | 0.9781 | 0.9622 | 0.9701 | 8248 | 0.8934 | 0.9280 | 0.9104 | 903 | 0.9484 | 0.9340 | 0.9411 | 23091 | 0.8035 | 0.7885 | 0.7954 | 23091 | 0.9473 | 0.9340 | 0.9405 | 23091 | 0.9944 |
| 0.037 | 10.0 | 5750 | 0.0167 | 0.0 | 0.0 | 0.0 | 29 | 0.9539 | 0.9528 | 0.9534 | 9932 | 0.8418 | 0.8741 | 0.8576 | 286 | 0.9517 | 0.9449 | 0.9483 | 1126 | 0.9699 | 0.9321 | 0.9506 | 1384 | 0.9330 | 0.9298 | 0.9314 | 1183 | 0.9726 | 0.9787 | 0.9756 | 8248 | 0.9411 | 0.9557 | 0.9484 | 903 | 0.9585 | 0.9572 | 0.9578 | 23091 | 0.8205 | 0.8210 | 0.8207 | 23091 | 0.9573 | 0.9572 | 0.9572 | 23091 | 0.9960 |
| 0.0295 | 11.0 | 6325 | 0.0141 | 0.0 | 0.0 | 0.0 | 29 | 0.9551 | 0.9578 | 0.9565 | 9932 | 0.8571 | 0.9021 | 0.8790 | 286 | 0.9607 | 0.9547 | 0.9577 | 1126 | 0.9791 | 0.9473 | 0.9629 | 1384 | 0.9367 | 0.9374 | 0.9371 | 1183 | 0.9868 | 0.9807 | 0.9838 | 8248 | 0.8456 | 0.9767 | 0.9065 | 903 | 0.9609 | 0.9630 | 0.9619 | 23091 | 0.8151 | 0.8321 | 0.8229 | 23091 | 0.9605 | 0.9630 | 0.9616 | 23091 | 0.9964 |
| 0.0249 | 12.0 | 6900 | 0.0102 | 1.0 | 0.0690 | 0.1290 | 29 | 0.9775 | 0.9723 | 0.9749 | 9932 | 0.9231 | 0.8811 | 0.9016 | 286 | 0.9453 | 0.9671 | 0.9561 | 1126 | 0.9708 | 0.9624 | 0.9666 | 1384 | 0.9456 | 0.9544 | 0.9499 | 1183 | 0.9896 | 0.9850 | 0.9873 | 8248 | 0.9671 | 0.9779 | 0.9725 | 903 | 0.9771 | 0.9730 | 0.9751 | 23091 | 0.9649 | 0.8461 | 0.8547 | 23091 | 0.9772 | 0.9730 | 0.9746 | 23091 | 0.9975 |
| 0.0203 | 13.0 | 7475 | 0.0084 | 1.0 | 0.1379 | 0.2424 | 29 | 0.9787 | 0.9723 | 0.9755 | 9932 | 0.9357 | 0.9161 | 0.9258 | 286 | 0.9733 | 0.9716 | 0.9724 | 1126 | 0.9904 | 0.9646 | 0.9773 | 1384 | 0.9574 | 0.9687 | 0.9630 | 1183 | 0.9924 | 0.9879 | 0.9902 | 8248 | 0.9757 | 0.9767 | 0.9762 | 903 | 0.9823 | 0.9756 | 0.9789 | 23091 | 0.9755 | 0.8620 | 0.8779 | 23091 | 0.9823 | 0.9756 | 0.9785 | 23091 | 0.9980 |
| 0.0181 | 14.0 | 8050 | 0.0066 | 1.0 | 0.2069 | 0.3429 | 29 | 0.9827 | 0.9785 | 0.9806 | 9932 | 0.9627 | 0.9021 | 0.9314 | 286 | 0.9743 | 0.9760 | 0.9752 | 1126 | 0.9804 | 0.9776 | 0.9790 | 1384 | 0.9662 | 0.9653 | 0.9658 | 1183 | 0.9934 | 0.9905 | 0.9920 | 8248 | 0.9738 | 0.9889 | 0.9813 | 903 | 0.9846 | 0.9804 | 0.9825 | 23091 | 0.9792 | 0.8732 | 0.8935 | 23091 | 0.9846 | 0.9804 | 0.9822 | 23091 | 0.9983 |
| 0.0149 | 15.0 | 8625 | 0.0060 | 1.0 | 0.3448 | 0.5128 | 29 | 0.9842 | 0.9783 | 0.9812 | 9932 | 0.9416 | 0.9580 | 0.9497 | 286 | 0.9744 | 0.9822 | 0.9783 | 1126 | 0.9883 | 0.9776 | 0.9829 | 1384 | 0.9748 | 0.9806 | 0.9777 | 1183 | 0.9957 | 0.9871 | 0.9914 | 8248 | 0.9824 | 0.9900 | 0.9862 | 903 | 0.9870 | 0.9811 | 0.9840 | 23091 | 0.9802 | 0.8998 | 0.9200 | 23091 | 0.9870 | 0.9811 | 0.9839 | 23091 | 0.9985 |
| 0.0128 | 16.0 | 9200 | 0.0041 | 1.0 | 0.4828 | 0.6512 | 29 | 0.9874 | 0.9854 | 0.9864 | 9932 | 0.9618 | 0.9685 | 0.9652 | 286 | 0.9832 | 0.9885 | 0.9858 | 1126 | 0.9898 | 0.9848 | 0.9873 | 1384 | 0.9789 | 0.9822 | 0.9806 | 1183 | 0.9951 | 0.9928 | 0.9940 | 8248 | 0.9879 | 0.9945 | 0.9912 | 903 | 0.9894 | 0.9875 | 0.9884 | 23091 | 0.9855 | 0.9224 | 0.9427 | 23091 | 0.9894 | 0.9875 | 0.9883 | 23091 | 0.9989 |
| 0.0109 | 17.0 | 9775 | 0.0038 | 1.0 | 0.6552 | 0.7917 | 29 | 0.9880 | 0.9858 | 0.9869 | 9932 | 0.9516 | 0.9615 | 0.9565 | 286 | 0.9876 | 0.9876 | 0.9876 | 1126 | 0.9877 | 0.9892 | 0.9884 | 1384 | 0.9772 | 0.9789 | 0.9780 | 1183 | 0.9947 | 0.9932 | 0.9939 | 8248 | 0.9944 | 0.9911 | 0.9928 | 903 | 0.9896 | 0.9879 | 0.9887 | 23091 | 0.9851 | 0.9428 | 0.9595 | 23091 | 0.9896 | 0.9879 | 0.9887 | 23091 | 0.9989 |
| 0.0104 | 18.0 | 10350 | 0.0033 | 1.0 | 0.6207 | 0.7660 | 29 | 0.9885 | 0.9872 | 0.9879 | 9932 | 0.9561 | 0.9895 | 0.9725 | 286 | 0.9763 | 0.9893 | 0.9828 | 1126 | 0.9899 | 0.9921 | 0.9910 | 1384 | 0.9783 | 0.9890 | 0.9836 | 1183 | 0.9953 | 0.9941 | 0.9947 | 8248 | 0.9945 | 0.9934 | 0.9939 | 903 | 0.9897 | 0.9900 | 0.9898 | 23091 | 0.9849 | 0.9444 | 0.9590 | 23091 | 0.9897 | 0.9900 | 0.9898 | 23091 | 0.9990 |
| 0.009 | 19.0 | 10925 | 0.0026 | 0.9565 | 0.7586 | 0.8462 | 29 | 0.9903 | 0.9898 | 0.9901 | 9932 | 0.9690 | 0.9825 | 0.9757 | 286 | 0.9902 | 0.9876 | 0.9889 | 1126 | 0.9942 | 0.9899 | 0.9920 | 1384 | 0.9890 | 0.9899 | 0.9894 | 1183 | 0.9952 | 0.9954 | 0.9953 | 8248 | 0.9934 | 0.9956 | 0.9945 | 903 | 0.9920 | 0.9916 | 0.9918 | 23091 | 0.9847 | 0.9612 | 0.9715 | 23091 | 0.9920 | 0.9916 | 0.9918 | 23091 | 0.9992 |
| 0.0077 | 20.0 | 11500 | 0.0024 | 1.0 | 0.8966 | 0.9455 | 29 | 0.9913 | 0.9906 | 0.9910 | 9932 | 0.9530 | 0.9930 | 0.9726 | 286 | 0.9885 | 0.9938 | 0.9911 | 1126 | 0.9942 | 0.9949 | 0.9946 | 1384 | 0.9874 | 0.9915 | 0.9895 | 1183 | 0.9976 | 0.9931 | 0.9953 | 8248 | 0.9956 | 0.9956 | 0.9956 | 903 | 0.9931 | 0.9921 | 0.9926 | 23091 | 0.9885 | 0.9811 | 0.9844 | 23091 | 0.9931 | 0.9921 | 0.9926 | 23091 | 0.9993 |
| 0.0068 | 21.0 | 12075 | 0.0023 | 1.0 | 0.8966 | 0.9455 | 29 | 0.9911 | 0.9895 | 0.9903 | 9932 | 0.9823 | 0.9720 | 0.9772 | 286 | 0.9868 | 0.9947 | 0.9907 | 1126 | 0.9957 | 0.9928 | 0.9942 | 1384 | 0.9858 | 0.9941 | 0.9899 | 1183 | 0.9966 | 0.9939 | 0.9953 | 8248 | 0.9967 | 0.9934 | 0.9950 | 903 | 0.9930 | 0.9916 | 0.9923 | 23091 | 0.9919 | 0.9784 | 0.9848 | 23091 | 0.9930 | 0.9916 | 0.9923 | 23091 | 0.9993 |
| 0.0062 | 22.0 | 12650 | 0.0019 | 1.0 | 0.8966 | 0.9455 | 29 | 0.9913 | 0.9914 | 0.9914 | 9932 | 0.9758 | 0.9860 | 0.9809 | 286 | 0.9894 | 0.9911 | 0.9902 | 1126 | 0.9942 | 0.9935 | 0.9939 | 1384 | 0.9890 | 0.9907 | 0.9899 | 1183 | 0.9960 | 0.9956 | 0.9958 | 8248 | 0.9956 | 0.9956 | 0.9956 | 903 | 0.9929 | 0.9930 | 0.9930 | 23091 | 0.9914 | 0.9801 | 0.9854 | 23091 | 0.9929 | 0.9930 | 0.9930 | 23091 | 0.9993 |
| 0.0055 | 23.0 | 13225 | 0.0018 | 1.0 | 0.9310 | 0.9643 | 29 | 0.9923 | 0.9911 | 0.9917 | 9932 | 0.9758 | 0.9860 | 0.9809 | 286 | 0.9902 | 0.9911 | 0.9907 | 1126 | 0.9942 | 0.9949 | 0.9946 | 1384 | 0.9882 | 0.9932 | 0.9907 | 1183 | 0.9967 | 0.9943 | 0.9955 | 8248 | 0.9967 | 0.9945 | 0.9956 | 903 | 0.9937 | 0.9926 | 0.9931 | 23091 | 0.9918 | 0.9845 | 0.9880 | 23091 | 0.9937 | 0.9926 | 0.9931 | 23091 | 0.9994 |
| 0.0053 | 24.0 | 13800 | 0.0015 | 1.0 | 0.9310 | 0.9643 | 29 | 0.9922 | 0.9916 | 0.9919 | 9932 | 0.9860 | 0.9825 | 0.9842 | 286 | 0.9903 | 0.9929 | 0.9916 | 1126 | 0.9942 | 0.9957 | 0.9949 | 1384 | 0.9899 | 0.9924 | 0.9911 | 1183 | 0.9959 | 0.9958 | 0.9958 | 8248 | 0.9967 | 0.9945 | 0.9956 | 903 | 0.9935 | 0.9934 | 0.9935 | 23091 | 0.9931 | 0.9845 | 0.9887 | 23091 | 0.9935 | 0.9934 | 0.9935 | 23091 | 0.9994 |
| 0.0048 | 25.0 | 14375 | 0.0015 | 0.9667 | 1.0 | 0.9831 | 29 | 0.9916 | 0.9915 | 0.9916 | 9932 | 0.9758 | 0.9860 | 0.9809 | 286 | 0.9912 | 0.9956 | 0.9934 | 1126 | 0.9928 | 0.9971 | 0.9950 | 1384 | 0.9891 | 0.9949 | 0.9920 | 1183 | 0.9967 | 0.9941 | 0.9954 | 8248 | 0.9967 | 0.9967 | 0.9967 | 903 | 0.9933 | 0.9933 | 0.9933 | 23091 | 0.9876 | 0.9945 | 0.9910 | 23091 | 0.9933 | 0.9933 | 0.9933 | 23091 | 0.9994 |
| 0.0039 | 26.0 | 14950 | 0.0013 | 0.9667 | 1.0 | 0.9831 | 29 | 0.9920 | 0.9908 | 0.9914 | 9932 | 0.9792 | 0.9895 | 0.9843 | 286 | 0.9912 | 0.9956 | 0.9934 | 1126 | 0.9971 | 0.9928 | 0.9949 | 1384 | 0.9916 | 0.9932 | 0.9924 | 1183 | 0.9966 | 0.9952 | 0.9959 | 8248 | 0.9967 | 0.9967 | 0.9967 | 903 | 0.9939 | 0.9931 | 0.9935 | 23091 | 0.9889 | 0.9942 | 0.9915 | 23091 | 0.9939 | 0.9931 | 0.9935 | 23091 | 0.9994 |
| 0.0039 | 27.0 | 15525 | 0.0013 | 1.0 | 0.9310 | 0.9643 | 29 | 0.9912 | 0.9921 | 0.9917 | 9932 | 0.9726 | 0.9930 | 0.9827 | 286 | 0.9929 | 0.9973 | 0.9951 | 1126 | 0.9978 | 0.9921 | 0.9949 | 1384 | 0.9907 | 0.9949 | 0.9928 | 1183 | 0.9949 | 0.9964 | 0.9956 | 8248 | 0.9923 | 0.9989 | 0.9956 | 903 | 0.9928 | 0.9942 | 0.9935 | 23091 | 0.9916 | 0.9870 | 0.9891 | 23091 | 0.9928 | 0.9942 | 0.9935 | 23091 | 0.9994 |
| 0.0037 | 28.0 | 16100 | 0.0013 | 1.0 | 0.9655 | 0.9825 | 29 | 0.9925 | 0.9915 | 0.9920 | 9932 | 0.9826 | 0.9860 | 0.9843 | 286 | 0.9929 | 0.9956 | 0.9942 | 1126 | 0.9942 | 0.9957 | 0.9949 | 1384 | 0.9924 | 0.9949 | 0.9937 | 1183 | 0.9982 | 0.9936 | 0.9959 | 8248 | 0.9956 | 0.9978 | 0.9967 | 903 | 0.9947 | 0.9930 | 0.9938 | 23091 | 0.9936 | 0.9901 | 0.9918 | 23091 | 0.9947 | 0.9930 | 0.9938 | 23091 | 0.9994 |
| 0.0034 | 29.0 | 16675 | 0.0012 | 1.0 | 0.9655 | 0.9825 | 29 | 0.9918 | 0.9919 | 0.9919 | 9932 | 0.9726 | 0.9930 | 0.9827 | 286 | 0.9964 | 0.9938 | 0.9951 | 1126 | 0.9957 | 0.9942 | 0.9949 | 1384 | 0.9949 | 0.9924 | 0.9937 | 1183 | 0.9965 | 0.9950 | 0.9958 | 8248 | 0.9956 | 0.9978 | 0.9967 | 903 | 0.9940 | 0.9935 | 0.9938 | 23091 | 0.9929 | 0.9905 | 0.9916 | 23091 | 0.9940 | 0.9935 | 0.9938 | 23091 | 0.9994 |
| 0.0031 | 30.0 | 17250 | 0.0011 | 0.9667 | 1.0 | 0.9831 | 29 | 0.9909 | 0.9920 | 0.9915 | 9932 | 0.9792 | 0.9895 | 0.9843 | 286 | 0.9956 | 0.9947 | 0.9951 | 1126 | 0.9942 | 0.9957 | 0.9949 | 1384 | 0.9949 | 0.9932 | 0.9941 | 1183 | 0.9956 | 0.9962 | 0.9959 | 8248 | 0.9989 | 0.9945 | 0.9967 | 903 | 0.9934 | 0.9940 | 0.9937 | 23091 | 0.9895 | 0.9945 | 0.9920 | 23091 | 0.9934 | 0.9940 | 0.9937 | 23091 | 0.9994 |
| 0.0026 | 31.0 | 17825 | 0.0011 | 0.9667 | 1.0 | 0.9831 | 29 | 0.9910 | 0.9918 | 0.9914 | 9932 | 0.9727 | 0.9965 | 0.9845 | 286 | 0.9938 | 0.9964 | 0.9951 | 1126 | 0.9928 | 0.9971 | 0.9950 | 1384 | 0.9932 | 0.9924 | 0.9928 | 1183 | 0.9964 | 0.9953 | 0.9958 | 8248 | 0.9956 | 0.9978 | 0.9967 | 903 | 0.9932 | 0.9939 | 0.9936 | 23091 | 0.9878 | 0.9959 | 0.9918 | 23091 | 0.9932 | 0.9939 | 0.9936 | 23091 | 0.9994 |
| 0.0025 | 32.0 | 18400 | 0.0011 | 1.0 | 0.9655 | 0.9825 | 29 | 0.9914 | 0.9920 | 0.9917 | 9932 | 0.9727 | 0.9965 | 0.9845 | 286 | 0.9973 | 0.9929 | 0.9951 | 1126 | 0.9900 | 1.0 | 0.9950 | 1384 | 0.9932 | 0.9924 | 0.9928 | 1183 | 0.9972 | 0.9948 | 0.9960 | 8248 | 0.9945 | 0.9989 | 0.9967 | 903 | 0.9937 | 0.9939 | 0.9938 | 23091 | 0.9920 | 0.9916 | 0.9918 | 23091 | 0.9937 | 0.9939 | 0.9938 | 23091 | 0.9994 |
| 0.0025 | 33.0 | 18975 | 0.0010 | 1.0 | 0.9655 | 0.9825 | 29 | 0.9918 | 0.9922 | 0.9920 | 9932 | 0.9694 | 0.9965 | 0.9828 | 286 | 0.9973 | 0.9929 | 0.9951 | 1126 | 0.9928 | 0.9971 | 0.9950 | 1384 | 0.9932 | 0.9932 | 0.9932 | 1183 | 0.9966 | 0.9952 | 0.9959 | 8248 | 0.9967 | 0.9967 | 0.9967 | 903 | 0.9939 | 0.9939 | 0.9939 | 23091 | 0.9922 | 0.9912 | 0.9916 | 23091 | 0.9939 | 0.9939 | 0.9939 | 23091 | 0.9994 |
| 0.0023 | 34.0 | 19550 | 0.0010 | 0.9667 | 1.0 | 0.9831 | 29 | 0.9914 | 0.9920 | 0.9917 | 9932 | 0.9662 | 1.0 | 0.9828 | 286 | 0.9947 | 0.9956 | 0.9951 | 1126 | 0.9964 | 0.9935 | 0.9949 | 1384 | 0.9932 | 0.9924 | 0.9928 | 1183 | 0.9964 | 0.9954 | 0.9959 | 8248 | 0.9956 | 0.9978 | 0.9967 | 903 | 0.9935 | 0.9939 | 0.9937 | 23091 | 0.9876 | 0.9958 | 0.9916 | 23091 | 0.9936 | 0.9939 | 0.9937 | 23091 | 0.9994 |
| 0.0023 | 35.0 | 20125 | 0.0010 | 0.9667 | 1.0 | 0.9831 | 29 | 0.9924 | 0.9918 | 0.9921 | 9932 | 0.9662 | 1.0 | 0.9828 | 286 | 0.9929 | 0.9973 | 0.9951 | 1126 | 0.9964 | 0.9935 | 0.9949 | 1384 | 0.9916 | 0.9958 | 0.9937 | 1183 | 0.9970 | 0.9945 | 0.9958 | 8248 | 0.9967 | 0.9967 | 0.9967 | 903 | 0.9941 | 0.9937 | 0.9939 | 23091 | 0.9875 | 0.9962 | 0.9918 | 23091 | 0.9941 | 0.9937 | 0.9939 | 23091 | 0.9994 |
| 0.002 | 36.0 | 20700 | 0.0010 | 0.9667 | 1.0 | 0.9831 | 29 | 0.9923 | 0.9915 | 0.9919 | 9932 | 0.9695 | 1.0 | 0.9845 | 286 | 0.9947 | 0.9956 | 0.9951 | 1126 | 0.9957 | 0.9942 | 0.9949 | 1384 | 0.9924 | 0.9949 | 0.9937 | 1183 | 0.9979 | 0.9941 | 0.9960 | 8248 | 0.9956 | 0.9978 | 0.9967 | 903 | 0.9945 | 0.9933 | 0.9939 | 23091 | 0.9881 | 0.9960 | 0.9920 | 23091 | 0.9945 | 0.9933 | 0.9939 | 23091 | 0.9994 |
| 0.0018 | 37.0 | 21275 | 0.0009 | 0.9667 | 1.0 | 0.9831 | 29 | 0.9915 | 0.9919 | 0.9917 | 9932 | 0.9792 | 0.9895 | 0.9843 | 286 | 0.9938 | 0.9964 | 0.9951 | 1126 | 0.9964 | 0.9935 | 0.9949 | 1384 | 0.9924 | 0.9949 | 0.9937 | 1183 | 0.9965 | 0.9955 | 0.9960 | 8248 | 0.9967 | 0.9967 | 0.9967 | 903 | 0.9938 | 0.9939 | 0.9938 | 23091 | 0.9891 | 0.9948 | 0.9919 | 23091 | 0.9938 | 0.9939 | 0.9938 | 23091 | 0.9994 |
| 0.002 | 38.0 | 21850 | 0.0009 | 0.9667 | 1.0 | 0.9831 | 29 | 0.9920 | 0.9914 | 0.9917 | 9932 | 0.9727 | 0.9965 | 0.9845 | 286 | 0.9956 | 0.9947 | 0.9951 | 1126 | 0.9949 | 0.9949 | 0.9949 | 1384 | 0.9924 | 0.9949 | 0.9937 | 1183 | 0.9979 | 0.9938 | 0.9959 | 8248 | 0.9956 | 0.9978 | 0.9967 | 903 | 0.9944 | 0.9932 | 0.9938 | 23091 | 0.9885 | 0.9955 | 0.9919 | 23091 | 0.9944 | 0.9932 | 0.9938 | 23091 | 0.9994 |
| 0.0018 | 39.0 | 22425 | 0.0009 | 0.9667 | 1.0 | 0.9831 | 29 | 0.9924 | 0.9914 | 0.9919 | 9932 | 0.9695 | 1.0 | 0.9845 | 286 | 0.9947 | 0.9956 | 0.9951 | 1126 | 0.9957 | 0.9942 | 0.9949 | 1384 | 0.9941 | 0.9932 | 0.9937 | 1183 | 0.9978 | 0.9942 | 0.9960 | 8248 | 0.9956 | 0.9978 | 0.9967 | 903 | 0.9945 | 0.9932 | 0.9939 | 23091 | 0.9883 | 0.9958 | 0.9920 | 23091 | 0.9946 | 0.9932 | 0.9939 | 23091 | 0.9994 |
| 0.0017 | 40.0 | 23000 | 0.0009 | 0.9667 | 1.0 | 0.9831 | 29 | 0.9922 | 0.9916 | 0.9919 | 9932 | 0.9695 | 1.0 | 0.9845 | 286 | 0.9938 | 0.9964 | 0.9951 | 1126 | 0.9957 | 0.9942 | 0.9949 | 1384 | 0.9932 | 0.9941 | 0.9937 | 1183 | 0.9978 | 0.9942 | 0.9960 | 8248 | 0.9956 | 0.9978 | 0.9967 | 903 | 0.9944 | 0.9934 | 0.9939 | 23091 | 0.9881 | 0.9960 | 0.9920 | 23091 | 0.9944 | 0.9934 | 0.9939 | 23091 | 0.9994 |
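
The per-tag precision/recall/F1 layout above, with a "Number" support count per tag and an overall accuracy, matches the output format of the seqeval metric used by the standard Transformers token-classification examples. A minimal sketch of that computation, assuming that setup (the tag names here are illustrative, not the model's actual label set):

```python
import evaluate  # pip install evaluate seqeval

seqeval = evaluate.load("seqeval")

# Illustrative IOB2-style tags; the model's real label names are not listed on this card.
references = [["O", "B-ERR", "I-ERR", "O", "B-ERR"]]
predictions = [["O", "B-ERR", "I-ERR", "O", "O"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results["ERR"])               # {'precision': ..., 'recall': ..., 'f1': ..., 'number': 2}
print(results["overall_accuracy"])  # token-level accuracy, as in "Overall Accuracy" above
```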

Framework versions

  • Transformers 4.25.1
  • Pytorch 1.13.1+cu117
  • Datasets 2.9.0
  • Tokenizers 0.13.2