---
library_name: transformers
license: mit
base_model: microsoft/mdeberta-v3-base
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: scenario-non-kd-scr-ner-full-mdeberta_data-univner_full44
  results: []
---

# scenario-non-kd-scr-ner-full-mdeberta_data-univner_full44

This model is a fine-tuned version of [microsoft/mdeberta-v3-base](https://huggingface.co/microsoft/mdeberta-v3-base) on the univner_full dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3159
- Precision: 0.6317
- Recall: 0.6096
- F1: 0.6205
- Accuracy: 0.9636

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 44
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
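For reference, here is a minimal sketch of how these hyperparameters map onto a 🤗 `Trainer` setup. The label list and the `train_dataset`/`eval_dataset` variables are placeholders, since this card does not document the tag scheme or the data loading; everything else mirrors the values listed above, with Adam's betas and epsilon left at the `TrainingArguments` defaults, which match the values reported here.

```python
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Placeholder tag scheme; the actual label set is not documented on this card.
label_list = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC"]

tokenizer = AutoTokenizer.from_pretrained("microsoft/mdeberta-v3-base")
model = AutoModelForTokenClassification.from_pretrained(
    "microsoft/mdeberta-v3-base", num_labels=len(label_list)
)

args = TrainingArguments(
    output_dir="scenario-non-kd-scr-ner-full-mdeberta_data-univner_full44",
    learning_rate=3e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=44,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    eval_strategy="steps",  # the results table logs validation every 500 steps
    eval_steps=500,
    logging_steps=500,
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-08 are the defaults.
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,  # placeholder: tokenized, label-aligned NER data
    eval_dataset=eval_dataset,    # placeholder: the evaluation split
    tokenizer=tokenizer,
)
trainer.train()
```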
### Training results

| Training Loss | Epoch   | Step  | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-------:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.3088        | 0.2910  | 500   | 0.2391          | 0.3299    | 0.2154 | 0.2606 | 0.9344   |
| 0.1992        | 0.5821  | 1000  | 0.1901          | 0.3635    | 0.3792 | 0.3712 | 0.9430   |
| 0.1511        | 0.8731  | 1500  | 0.1697          | 0.4865    | 0.4499 | 0.4675 | 0.9520   |
| 0.1103        | 1.1641  | 2000  | 0.1592          | 0.5109    | 0.5203 | 0.5155 | 0.9553   |
| 0.09          | 1.4552  | 2500  | 0.1498          | 0.5337    | 0.5649 | 0.5488 | 0.9578   |
| 0.0852        | 1.7462  | 3000  | 0.1489          | 0.5792    | 0.5224 | 0.5493 | 0.9592   |
| 0.0739        | 2.0373  | 3500  | 0.1468          | 0.5588    | 0.6070 | 0.5819 | 0.9606   |
| 0.0469        | 2.3283  | 4000  | 0.1673          | 0.5990    | 0.5625 | 0.5802 | 0.9612   |
| 0.0479        | 2.6193  | 4500  | 0.1578          | 0.5783    | 0.6139 | 0.5956 | 0.9612   |
| 0.0472        | 2.9104  | 5000  | 0.1568          | 0.6109    | 0.5914 | 0.6010 | 0.9627   |
| 0.0318        | 3.2014  | 5500  | 0.1806          | 0.6092    | 0.5807 | 0.5946 | 0.9622   |
| 0.0271        | 3.4924  | 6000  | 0.1910          | 0.5860    | 0.5659 | 0.5757 | 0.9605   |
| 0.0289        | 3.7835  | 6500  | 0.1811          | 0.5978    | 0.5990 | 0.5984 | 0.9626   |
| 0.0246        | 4.0745  | 7000  | 0.1958          | 0.6135    | 0.6027 | 0.6080 | 0.9630   |
| 0.0165        | 4.3655  | 7500  | 0.2066          | 0.6118    | 0.5927 | 0.6021 | 0.9621   |
| 0.0177        | 4.6566  | 8000  | 0.2033          | 0.5884    | 0.6018 | 0.5950 | 0.9608   |
| 0.0178        | 4.9476  | 8500  | 0.2051          | 0.5985    | 0.5956 | 0.5970 | 0.9620   |
| 0.0117        | 5.2386  | 9000  | 0.2181          | 0.5962    | 0.6174 | 0.6066 | 0.9619   |
| 0.0107        | 5.5297  | 9500  | 0.2247          | 0.5859    | 0.6119 | 0.5986 | 0.9610   |
| 0.013         | 5.8207  | 10000 | 0.2253          | 0.6049    | 0.5825 | 0.5935 | 0.9618   |
| 0.0106        | 6.1118  | 10500 | 0.2280          | 0.6130    | 0.6068 | 0.6099 | 0.9625   |
| 0.0071        | 6.4028  | 11000 | 0.2321          | 0.6006    | 0.6292 | 0.6146 | 0.9623   |
| 0.0077        | 6.6938  | 11500 | 0.2497          | 0.5847    | 0.6024 | 0.5934 | 0.9612   |
| 0.0085        | 6.9849  | 12000 | 0.2421          | 0.5838    | 0.6058 | 0.5946 | 0.9611   |
| 0.0053        | 7.2759  | 12500 | 0.2562          | 0.6189    | 0.5917 | 0.6050 | 0.9627   |
| 0.0057        | 7.5669  | 13000 | 0.2602          | 0.5918    | 0.6138 | 0.6026 | 0.9613   |
| 0.0065        | 7.8580  | 13500 | 0.2531          | 0.6077    | 0.6148 | 0.6112 | 0.9625   |
| 0.0048        | 8.1490  | 14000 | 0.2634          | 0.6182    | 0.6141 | 0.6161 | 0.9628   |
| 0.004         | 8.4400  | 14500 | 0.2736          | 0.6197    | 0.5969 | 0.6081 | 0.9622   |
| 0.0053        | 8.7311  | 15000 | 0.2579          | 0.6194    | 0.6119 | 0.6156 | 0.9633   |
| 0.0042        | 9.0221  | 15500 | 0.2770          | 0.6223    | 0.6211 | 0.6217 | 0.9633   |
| 0.003         | 9.3132  | 16000 | 0.2835          | 0.6010    | 0.6081 | 0.6046 | 0.9622   |
| 0.0031        | 9.6042  | 16500 | 0.2839          | 0.6408    | 0.5954 | 0.6173 | 0.9632   |
| 0.0035        | 9.8952  | 17000 | 0.2876          | 0.6209    | 0.5996 | 0.6101 | 0.9623   |
| 0.0034        | 10.1863 | 17500 | 0.2839          | 0.6215    | 0.6044 | 0.6128 | 0.9628   |
| 0.0026        | 10.4773 | 18000 | 0.2851          | 0.6053    | 0.6187 | 0.6119 | 0.9626   |
| 0.0032        | 10.7683 | 18500 | 0.2799          | 0.6120    | 0.6068 | 0.6094 | 0.9620   |
| 0.003         | 11.0594 | 19000 | 0.2912          | 0.6110    | 0.6306 | 0.6207 | 0.9629   |
| 0.0019        | 11.3504 | 19500 | 0.2963          | 0.6188    | 0.6201 | 0.6194 | 0.9629   |
| 0.0025        | 11.6414 | 20000 | 0.2876          | 0.6101    | 0.6240 | 0.6170 | 0.9628   |
| 0.0023        | 11.9325 | 20500 | 0.2940          | 0.6392    | 0.6001 | 0.6190 | 0.9631   |
| 0.0017        | 12.2235 | 21000 | 0.3056          | 0.6017    | 0.6177 | 0.6096 | 0.9625   |
| 0.0013        | 12.5146 | 21500 | 0.3127          | 0.6128    | 0.6018 | 0.6072 | 0.9628   |
| 0.002         | 12.8056 | 22000 | 0.3052          | 0.6160    | 0.6151 | 0.6156 | 0.9630   |
| 0.0018        | 13.0966 | 22500 | 0.3115          | 0.6279    | 0.5999 | 0.6136 | 0.9629   |
| 0.0015        | 13.3877 | 23000 | 0.3121          | 0.6125    | 0.6155 | 0.6140 | 0.9628   |
| 0.0013        | 13.6787 | 23500 | 0.3203          | 0.6193    | 0.6185 | 0.6189 | 0.9629   |
| 0.0015        | 13.9697 | 24000 | 0.3290          | 0.6329    | 0.6028 | 0.6175 | 0.9629   |
| 0.0012        | 14.2608 | 24500 | 0.3267          | 0.6123    | 0.6194 | 0.6158 | 0.9625   |
| 0.0011        | 14.5518 | 25000 | 0.3191          | 0.6213    | 0.6165 | 0.6189 | 0.9633   |
| 0.0011        | 14.8428 | 25500 | 0.3159          | 0.6317    | 0.6096 | 0.6205 | 0.9636   |

### Framework versions

- Transformers 4.44.2
- Pytorch 2.1.1+cu121
- Datasets 2.14.5
- Tokenizers 0.19.1
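## How to use

A minimal inference sketch using the 🤗 `pipeline` API. The model id below is assumed from this card's name; the exact Hub namespace is not given here, so adjust it to the actual checkpoint location.

```python
from transformers import pipeline

# Repo id / path assumed from this card's model name; adjust as needed.
ner = pipeline(
    "token-classification",
    model="scenario-non-kd-scr-ner-full-mdeberta_data-univner_full44",
    aggregation_strategy="simple",  # merge sub-word pieces into whole entity spans
)

print(ner("Barack Obama visited Jakarta in 2010."))
```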