
albert-base-v2-finetuned-ner-cadec

This model is a fine-tuned version of albert-base-v2 for named entity recognition on the CADEC corpus of adverse drug event annotations in patient forum posts. It achieves the following results on the evaluation set:

  • Loss: 0.6830
  • Precision: 0.6428
  • Recall: 0.6704
  • F1: 0.6563
  • Accuracy: 0.9147
  • Adr Precision: 0.6180
  • Adr Recall: 0.6767
  • Adr F1: 0.6460
  • Disease Precision: 0.2581
  • Disease Recall: 0.3200
  • Disease F1: 0.2857
  • Drug Precision: 0.9036
  • Drug Recall: 0.9259
  • Drug F1: 0.9146
  • Finding Precision: 0.4054
  • Finding Recall: 0.2174
  • Finding F1: 0.2830
  • Symptom Precision: 0.4333
  • Symptom Recall: 0.4815
  • Symptom F1: 0.4561
  • B-adr Precision: 0.7509
  • B-adr Recall: 0.7735
  • B-adr F1: 0.7620
  • B-disease Precision: 0.2667
  • B-disease Recall: 0.3200
  • B-disease F1: 0.2909
  • B-drug Precision: 0.9563
  • B-drug Recall: 0.9444
  • B-drug F1: 0.9503
  • B-finding Precision: 0.5312
  • B-finding Recall: 0.2576
  • B-finding F1: 0.3469
  • B-symptom Precision: 0.5000
  • B-symptom Recall: 0.5185
  • B-symptom F1: 0.5091
  • I-adr Precision: 0.5959
  • I-adr Recall: 0.6578
  • I-adr F1: 0.6254
  • I-disease Precision: 0.2174
  • I-disease Recall: 0.2381
  • I-disease F1: 0.2273
  • I-drug Precision: 0.9096
  • I-drug Recall: 0.9379
  • I-drug F1: 0.9235
  • I-finding Precision: 0.4000
  • I-finding Recall: 0.1923
  • I-finding F1: 0.2597
  • I-symptom Precision: 0.3750
  • I-symptom Recall: 0.4286
  • I-symptom F1: 0.4000
  • Macro Avg F1: 0.5295
  • Weighted Avg F1: 0.6995
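
The checkpoint can be tried out with the Transformers token-classification pipeline. This is a minimal usage sketch: the aggregation strategy and the example sentence are illustrative assumptions, not part of the original card.

```python
# Minimal inference sketch (assumed usage, not taken from the original card).
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="csNoHug/albert-base-v2-finetuned-ner-cadec",
    aggregation_strategy="simple",  # merge B-/I- word pieces into entity spans
)

print(ner("Took Lipitor for two weeks and developed severe muscle pain."))
```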

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 40
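
These settings map onto the Trainer API roughly as follows. This is a hedged sketch, not the original training script: the label list is read off the per-tag metrics reported above, while the dataset preparation and per-epoch evaluation are assumptions (the results table below does show one evaluation per epoch). The Adam betas and epsilon listed above are the Trainer defaults.

```python
# Hedged reconstruction of the training setup; dataset preparation is omitted
# and the label list / evaluation strategy are assumptions.
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Label set inferred from the per-tag metrics reported in this card.
label_list = [
    "O",
    "B-adr", "I-adr",
    "B-disease", "I-disease",
    "B-drug", "I-drug",
    "B-finding", "I-finding",
    "B-symptom", "I-symptom",
]

tokenizer = AutoTokenizer.from_pretrained("albert-base-v2")
model = AutoModelForTokenClassification.from_pretrained(
    "albert-base-v2",
    num_labels=len(label_list),
    id2label=dict(enumerate(label_list)),
    label2id={label: i for i, label in enumerate(label_list)},
)

args = TrainingArguments(
    output_dir="albert-base-v2-finetuned-ner-cadec",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    num_train_epochs=40,
    seed=42,
    lr_scheduler_type="linear",
    evaluation_strategy="epoch",  # assumption: evaluate once per epoch, as in the results table
)

# trainer = Trainer(model=model, args=args, tokenizer=tokenizer,
#                   train_dataset=train_ds, eval_dataset=eval_ds)  # datasets not shown
# trainer.train()
```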

Training results

Training Loss Epoch Step Validation Loss Precision Recall F1 Accuracy Adr Precision Adr Recall Adr F1 Disease Precision Disease Recall Disease F1 Drug Precision Drug Recall Drug F1 Finding Precision Finding Recall Finding F1 Symptom Precision Symptom Recall Symptom F1 B-adr Precision B-adr Recall B-adr F1 B-disease Precision B-disease Recall B-disease F1 B-drug Precision B-drug Recall B-drug F1 B-finding Precision B-finding Recall B-finding F1 B-symptom Precision B-symptom Recall B-symptom F1 I-adr Precision I-adr Recall I-adr F1 I-disease Precision I-disease Recall I-disease F1 I-drug Precision I-drug Recall I-drug F1 I-finding Precision I-finding Recall I-finding F1 I-symptom Precision I-symptom Recall I-symptom F1 Macro Avg F1 Weighted Avg F1
No log 1.0 125 0.3036 0.4900 0.5855 0.5335 0.8983 0.4564 0.6017 0.5191 0.0877 0.2 0.1220 0.8671 0.9259 0.8955 0.0294 0.0145 0.0194 0.0 0.0 0.0 0.6901 0.7805 0.7326 0.2 0.2 0.2000 0.9112 0.9506 0.9305 0.5 0.0152 0.0294 0.0 0.0 0.0 0.4781 0.5784 0.5235 0.0652 0.1429 0.0896 0.8830 0.9379 0.9096 0.1818 0.1154 0.1412 0.0 0.0 0.0 0.3556 0.6214
No log 2.0 250 0.2758 0.5594 0.6297 0.5924 0.9070 0.5310 0.6717 0.5931 0.1081 0.16 0.1290 0.8421 0.8889 0.8649 0.16 0.0580 0.0851 0.5 0.0370 0.0690 0.7317 0.7965 0.7627 0.2308 0.24 0.2353 0.9804 0.9259 0.9524 0.5385 0.1061 0.1772 1.0 0.0741 0.1379 0.5274 0.6660 0.5887 0.0952 0.0952 0.0952 0.8471 0.8944 0.8701 0.3125 0.0962 0.1471 0.0 0.0 0.0 0.3967 0.6599
No log 3.0 375 0.2840 0.5713 0.6489 0.6076 0.9079 0.5533 0.675 0.6081 0.1667 0.32 0.2192 0.8571 0.9259 0.8902 0.1842 0.1014 0.1308 0.3 0.1111 0.1622 0.7298 0.7982 0.7625 0.2162 0.32 0.2581 0.9281 0.9568 0.9422 0.4583 0.1667 0.2444 0.7 0.2593 0.3784 0.5482 0.6599 0.5989 0.1842 0.3333 0.2373 0.8671 0.9317 0.8982 0.2609 0.1154 0.16 0.0 0.0 0.0 0.4480 0.6744
0.2544 4.0 500 0.2906 0.5760 0.6353 0.6042 0.9104 0.5679 0.6483 0.6054 0.0816 0.16 0.1081 0.8862 0.9136 0.8997 0.2364 0.1884 0.2097 0.3889 0.2593 0.3111 0.7626 0.7788 0.7706 0.1875 0.24 0.2105 0.9379 0.9321 0.9350 0.4474 0.2576 0.3269 0.7333 0.4074 0.5238 0.5647 0.6314 0.5962 0.1579 0.2857 0.2034 0.8916 0.9193 0.9052 0.3143 0.2115 0.2529 0.6667 0.1429 0.2353 0.4960 0.6862
0.2544 5.0 625 0.2948 0.5855 0.6750 0.6270 0.9132 0.5633 0.6967 0.6230 0.0333 0.04 0.0364 0.8786 0.9383 0.9075 0.2549 0.1884 0.2167 0.5455 0.4444 0.4898 0.7312 0.8088 0.7681 0.2083 0.2 0.2041 0.9118 0.9568 0.9337 0.5161 0.2424 0.3299 0.7778 0.5185 0.6222 0.5507 0.6741 0.6062 0.0667 0.0476 0.0556 0.8994 0.9441 0.9212 0.3077 0.2308 0.2637 0.625 0.3571 0.4545 0.5159 0.6920
0.2544 6.0 750 0.3172 0.6092 0.6569 0.6322 0.9130 0.5859 0.665 0.6230 0.2093 0.36 0.2647 0.8922 0.9198 0.9058 0.4074 0.1594 0.2292 0.3529 0.4444 0.3934 0.7616 0.7858 0.7735 0.2778 0.4 0.3279 0.9333 0.9506 0.9419 0.6471 0.1667 0.2651 0.6667 0.5185 0.5833 0.5762 0.6619 0.6161 0.2333 0.3333 0.2745 0.9030 0.9255 0.9141 0.4762 0.1923 0.2740 0.2174 0.3571 0.2703 0.5241 0.6973
0.2544 7.0 875 0.3519 0.5992 0.6670 0.6313 0.9127 0.5663 0.6833 0.6193 0.1471 0.2 0.1695 0.9042 0.9321 0.9179 0.3947 0.2174 0.2804 0.4 0.2963 0.3404 0.7245 0.8283 0.7729 0.2593 0.28 0.2692 0.9448 0.9506 0.9477 0.6538 0.2576 0.3696 0.7857 0.4074 0.5366 0.5613 0.6619 0.6075 0.1818 0.1905 0.1860 0.9146 0.9317 0.9231 0.4138 0.2308 0.2963 0.2308 0.2143 0.2222 0.5131 0.6977
0.0853 8.0 1000 0.3881 0.5996 0.6546 0.6259 0.9138 0.5928 0.655 0.6223 0.0769 0.12 0.0938 0.8817 0.9198 0.9003 0.3284 0.3188 0.3235 0.4231 0.4074 0.4151 0.7560 0.7788 0.7672 0.25 0.32 0.2807 0.9337 0.9568 0.9451 0.5957 0.4242 0.4956 0.6190 0.4815 0.5417 0.5890 0.6334 0.6104 0.0417 0.0476 0.0444 0.8970 0.9193 0.9080 0.3654 0.3654 0.3654 0.4545 0.3571 0.4 0.5358 0.7022
0.0853 9.0 1125 0.4068 0.5875 0.6580 0.6207 0.9109 0.5625 0.66 0.6074 0.1818 0.4 0.2500 0.9146 0.9259 0.9202 0.3421 0.1884 0.2430 0.4286 0.4444 0.4364 0.7138 0.7947 0.7521 0.2340 0.44 0.3056 0.9390 0.9506 0.9448 0.6522 0.2273 0.3371 0.5714 0.4444 0.5 0.5546 0.6314 0.5905 0.1944 0.3333 0.2456 0.9202 0.9317 0.9259 0.3929 0.2115 0.275 0.3158 0.4286 0.3636 0.5240 0.6849
0.0853 10.0 1250 0.4256 0.6337 0.6682 0.6505 0.9140 0.6069 0.6717 0.6377 0.2432 0.36 0.2903 0.9130 0.9074 0.9102 0.4706 0.2319 0.3107 0.4286 0.5556 0.4839 0.7530 0.7717 0.7622 0.3125 0.4 0.3509 0.9497 0.9321 0.9408 0.6667 0.2424 0.3556 0.5152 0.6296 0.5667 0.5930 0.6558 0.6228 0.2609 0.2857 0.2727 0.9245 0.9130 0.9187 0.5769 0.2885 0.3846 0.4211 0.5714 0.4848 0.5660 0.7051
0.0853 11.0 1375 0.4426 0.6211 0.6387 0.6298 0.9093 0.5899 0.645 0.6162 0.1944 0.28 0.2295 0.8916 0.9136 0.9024 0.3929 0.1594 0.2268 0.5 0.4074 0.4490 0.7522 0.7469 0.7496 0.2059 0.28 0.2373 0.9383 0.9383 0.9383 0.5455 0.1818 0.2727 0.6667 0.4444 0.5333 0.5704 0.6191 0.5938 0.2 0.2381 0.2174 0.9080 0.9193 0.9136 0.5714 0.2308 0.3288 0.4286 0.4286 0.4286 0.5213 0.6819
0.0307 12.0 1500 0.4234 0.6211 0.6852 0.6516 0.9144 0.5848 0.6783 0.6281 0.2143 0.24 0.2264 0.9096 0.9321 0.9207 0.5 0.3913 0.4390 0.4667 0.5185 0.4912 0.7471 0.8 0.7726 0.2857 0.24 0.2609 0.9448 0.9506 0.9477 0.5957 0.4242 0.4956 0.6364 0.5185 0.5714 0.5754 0.6680 0.6183 0.2609 0.2857 0.2727 0.9207 0.9379 0.9292 0.5641 0.4231 0.4835 0.2857 0.4286 0.3429 0.5695 0.7156
0.0307 13.0 1625 0.4559 0.6220 0.6784 0.6490 0.9127 0.6059 0.6817 0.6416 0.2188 0.28 0.2456 0.9042 0.9321 0.9179 0.3846 0.2899 0.3306 0.3243 0.4444 0.375 0.7547 0.7841 0.7691 0.2667 0.32 0.2909 0.9448 0.9506 0.9477 0.5476 0.3485 0.4259 0.5161 0.5926 0.5517 0.6 0.6660 0.6313 0.16 0.1905 0.1739 0.9152 0.9379 0.9264 0.4474 0.3269 0.3778 0.1923 0.3571 0.25 0.5345 0.7098
0.0307 14.0 1750 0.4551 0.6031 0.6523 0.6268 0.9127 0.5842 0.6417 0.6116 0.1458 0.28 0.1918 0.9152 0.9321 0.9235 0.2979 0.2029 0.2414 0.5278 0.7037 0.6032 0.7496 0.7735 0.7613 0.2051 0.32 0.25 0.9565 0.9506 0.9536 0.5143 0.2727 0.3564 0.6 0.6667 0.6316 0.5768 0.6273 0.6010 0.1290 0.1905 0.1538 0.9264 0.9379 0.9321 0.3889 0.2692 0.3182 0.3478 0.5714 0.4324 0.5390 0.6960
0.0307 15.0 1875 0.4709 0.6046 0.6806 0.6404 0.9135 0.5872 0.6733 0.6273 0.1961 0.4 0.2632 0.8902 0.9506 0.9194 0.425 0.2464 0.3119 0.3810 0.5926 0.4638 0.7492 0.7929 0.7704 0.2128 0.4 0.2778 0.9286 0.9630 0.9455 0.5806 0.2727 0.3711 0.5758 0.7037 0.6333 0.5761 0.6395 0.6062 0.1892 0.3333 0.2414 0.9 0.9503 0.9245 0.4839 0.2885 0.3614 0.3214 0.6429 0.4286 0.5560 0.7029
0.0136 16.0 2000 0.5190 0.6193 0.6818 0.6491 0.9133 0.6 0.69 0.6419 0.2045 0.36 0.2609 0.8779 0.9321 0.9042 0.3824 0.1884 0.2524 0.4688 0.5556 0.5085 0.7401 0.7965 0.7673 0.2308 0.36 0.2812 0.9509 0.9568 0.9538 0.5357 0.2273 0.3191 0.6667 0.5926 0.6275 0.5875 0.6701 0.6261 0.2333 0.3333 0.2745 0.8895 0.9503 0.9189 0.5238 0.2115 0.3014 0.35 0.5 0.4118 0.5482 0.7043
0.0136 17.0 2125 0.5312 0.6206 0.6818 0.6498 0.9154 0.5974 0.695 0.6425 0.2105 0.32 0.2540 0.9259 0.9259 0.9259 0.3846 0.2174 0.2778 0.3636 0.4444 0.4000 0.7302 0.8 0.7635 0.2857 0.4 0.3333 0.9563 0.9444 0.9503 0.5926 0.2424 0.3441 0.6087 0.5185 0.5600 0.5832 0.6497 0.6146 0.2 0.2381 0.2174 0.9371 0.9255 0.9313 0.4062 0.25 0.3095 0.2727 0.4286 0.3333 0.5357 0.6998
0.0136 18.0 2250 0.5362 0.6235 0.6546 0.6387 0.9127 0.5976 0.6533 0.6242 0.2581 0.32 0.2857 0.9096 0.9321 0.9207 0.4186 0.2609 0.3214 0.2903 0.3333 0.3103 0.7513 0.7646 0.7579 0.2667 0.32 0.2909 0.9627 0.9568 0.9598 0.5556 0.3030 0.3922 0.44 0.4074 0.4231 0.5830 0.6293 0.6053 0.2632 0.2381 0.25 0.9202 0.9317 0.9259 0.4688 0.2885 0.3571 0.2273 0.3571 0.2778 0.5240 0.6959
0.0136 19.0 2375 0.5420 0.6261 0.6297 0.6279 0.9154 0.5917 0.64 0.6149 0.1944 0.28 0.2295 0.9423 0.9074 0.9245 0.2857 0.0870 0.1333 0.4615 0.4444 0.4528 0.7645 0.7469 0.7556 0.2424 0.32 0.2759 0.9744 0.9383 0.9560 0.3889 0.1061 0.1667 0.5833 0.5185 0.5490 0.5704 0.6191 0.5938 0.2083 0.2381 0.2222 0.9484 0.9130 0.9304 0.375 0.1154 0.1765 0.3333 0.4286 0.375 0.5001 0.6786
0.0059 20.0 2500 0.5872 0.6138 0.6874 0.6485 0.9106 0.6011 0.7083 0.6503 0.1628 0.28 0.2059 0.9042 0.9321 0.9179 0.3243 0.1739 0.2264 0.3429 0.4444 0.3871 0.7212 0.8195 0.7672 0.25 0.4 0.3077 0.9394 0.9568 0.9480 0.4828 0.2121 0.2947 0.56 0.5185 0.5385 0.5819 0.6660 0.6211 0.1379 0.1905 0.16 0.9036 0.9317 0.9174 0.3571 0.1923 0.25 0.25 0.4286 0.3158 0.5120 0.6958
0.0059 21.0 2625 0.5642 0.6088 0.6557 0.6314 0.9126 0.5913 0.6533 0.6207 0.2 0.36 0.2571 0.9080 0.9136 0.9108 0.4286 0.2174 0.2885 0.3333 0.5556 0.4167 0.7671 0.7752 0.7711 0.2381 0.4 0.2985 0.9441 0.9383 0.9412 0.6897 0.3030 0.4211 0.4595 0.6296 0.5312 0.5730 0.6314 0.6008 0.1818 0.2857 0.2222 0.9141 0.9255 0.9198 0.4286 0.2308 0.3 0.2308 0.4286 0.3 0.5306 0.6978
0.0059 22.0 2750 0.5869 0.6190 0.6716 0.6442 0.9133 0.6092 0.6833 0.6441 0.1951 0.32 0.2424 0.9030 0.9198 0.9113 0.2895 0.1594 0.2056 0.3659 0.5556 0.4412 0.7564 0.7805 0.7683 0.2308 0.36 0.2812 0.9444 0.9444 0.9444 0.5714 0.2424 0.3404 0.5484 0.6296 0.5862 0.5930 0.6558 0.6228 0.1562 0.2381 0.1887 0.9030 0.9255 0.9141 0.2414 0.1346 0.1728 0.2593 0.5 0.3415 0.5161 0.6964
0.0059 23.0 2875 0.5873 0.6149 0.6546 0.6341 0.9112 0.5964 0.655 0.6243 0.2105 0.32 0.2540 0.9096 0.9321 0.9207 0.4 0.2029 0.2692 0.2857 0.4444 0.3478 0.7597 0.7611 0.7604 0.2222 0.32 0.2623 0.9390 0.9506 0.9448 0.7 0.3182 0.4375 0.4828 0.5185 0.5 0.5680 0.6212 0.5934 0.2222 0.2857 0.25 0.9146 0.9317 0.9231 0.4783 0.2115 0.2933 0.1875 0.4286 0.2609 0.5226 0.6917
0.0024 24.0 3000 0.6273 0.6126 0.6806 0.6448 0.9103 0.5983 0.695 0.6430 0.1842 0.28 0.2222 0.8929 0.9259 0.9091 0.3171 0.1884 0.2364 0.3784 0.5185 0.4375 0.7234 0.8053 0.7621 0.2581 0.32 0.2857 0.9390 0.9506 0.9448 0.5455 0.2727 0.3636 0.5161 0.5926 0.5517 0.5893 0.6721 0.6280 0.1923 0.2381 0.2128 0.9042 0.9379 0.9207 0.375 0.1731 0.2368 0.2857 0.4286 0.3429 0.5249 0.6994
0.0024 25.0 3125 0.6096 0.6133 0.6682 0.6396 0.9102 0.5985 0.6783 0.6359 0.1892 0.28 0.2258 0.8976 0.9198 0.9085 0.2955 0.1884 0.2301 0.4 0.5185 0.4516 0.7534 0.7788 0.7659 0.2286 0.32 0.2667 0.9503 0.9444 0.9474 0.4857 0.2576 0.3366 0.5769 0.5556 0.5660 0.5750 0.6477 0.6092 0.1481 0.1905 0.1667 0.9091 0.9317 0.9202 0.2903 0.1731 0.2169 0.2 0.3571 0.2564 0.5052 0.6919
0.0024 26.0 3250 0.6085 0.6352 0.6784 0.6561 0.9153 0.6150 0.6817 0.6466 0.2162 0.32 0.2581 0.8876 0.9259 0.9063 0.4359 0.2464 0.3148 0.4545 0.5556 0.5 0.7695 0.7858 0.7776 0.2353 0.32 0.2712 0.9277 0.9506 0.9390 0.6111 0.3333 0.4314 0.6154 0.5926 0.6038 0.6011 0.6660 0.6319 0.1923 0.2381 0.2128 0.8994 0.9441 0.9212 0.4231 0.2115 0.2821 0.3158 0.4286 0.3636 0.5434 0.7107
0.0024 27.0 3375 0.6359 0.6191 0.6682 0.6427 0.9123 0.6015 0.6767 0.6369 0.1842 0.28 0.2222 0.9091 0.9259 0.9174 0.3171 0.1884 0.2364 0.4118 0.5185 0.4590 0.7383 0.7841 0.7605 0.2353 0.32 0.2712 0.9387 0.9444 0.9415 0.5625 0.2727 0.3673 0.5357 0.5556 0.5455 0.5849 0.6456 0.6137 0.1538 0.1905 0.1702 0.9085 0.9255 0.9169 0.36 0.1731 0.2338 0.2609 0.4286 0.3243 0.5145 0.6927
0.0012 28.0 3500 0.6558 0.6220 0.6670 0.6437 0.9129 0.6080 0.68 0.6420 0.1707 0.28 0.2121 0.9030 0.9198 0.9113 0.3421 0.1884 0.2430 0.375 0.4444 0.4068 0.7688 0.7770 0.7729 0.2286 0.32 0.2667 0.9387 0.9444 0.9415 0.5758 0.2879 0.3838 0.5185 0.5185 0.5185 0.5814 0.6619 0.6190 0.1613 0.2381 0.1923 0.9085 0.9255 0.9169 0.3636 0.1538 0.2162 0.3 0.4286 0.3529 0.5181 0.6989
0.0012 29.0 3625 0.6434 0.6277 0.6625 0.6446 0.9142 0.6054 0.67 0.6361 0.175 0.28 0.2154 0.9030 0.9198 0.9113 0.4 0.1739 0.2424 0.4545 0.5556 0.5 0.7599 0.7788 0.7692 0.2571 0.36 0.3 0.9444 0.9444 0.9444 0.625 0.2273 0.3333 0.5517 0.5926 0.5714 0.5824 0.6477 0.6133 0.1379 0.1905 0.16 0.9085 0.9255 0.9169 0.4 0.1538 0.2222 0.3333 0.4286 0.375 0.5206 0.6954
0.0012 30.0 3750 0.6456 0.6324 0.6625 0.6471 0.9152 0.6076 0.6683 0.6365 0.1892 0.28 0.2258 0.9036 0.9259 0.9146 0.4 0.2029 0.2692 0.4815 0.4815 0.4815 0.7570 0.7611 0.7590 0.2353 0.32 0.2712 0.9273 0.9444 0.9358 0.5484 0.2576 0.3505 0.6087 0.5185 0.5600 0.5860 0.6456 0.6143 0.1481 0.1905 0.1667 0.9030 0.9255 0.9141 0.4091 0.1731 0.2432 0.3529 0.4286 0.3871 0.5202 0.6918
0.0012 31.0 3875 0.6698 0.6266 0.6670 0.6462 0.9126 0.6048 0.6733 0.6372 0.2 0.28 0.2333 0.8982 0.9259 0.9119 0.3590 0.2029 0.2593 0.4516 0.5185 0.4828 0.7513 0.7805 0.7656 0.2581 0.32 0.2857 0.9333 0.9506 0.9419 0.5152 0.2576 0.3434 0.5357 0.5556 0.5455 0.5839 0.6517 0.6160 0.1667 0.1905 0.1778 0.8982 0.9317 0.9146 0.4348 0.1923 0.2667 0.3529 0.4286 0.3871 0.5244 0.6960
0.0006 32.0 4000 0.6512 0.6253 0.6727 0.6481 0.9143 0.6119 0.6833 0.6457 0.1842 0.28 0.2222 0.9042 0.9321 0.9179 0.3171 0.1884 0.2364 0.3824 0.4815 0.4262 0.7539 0.7752 0.7644 0.2353 0.32 0.2712 0.9387 0.9444 0.9415 0.5152 0.2576 0.3434 0.5357 0.5556 0.5455 0.5851 0.6578 0.6194 0.1481 0.1905 0.1667 0.9036 0.9317 0.9174 0.3704 0.1923 0.2532 0.3 0.4286 0.3529 0.5176 0.6957
0.0006 33.0 4125 0.6629 0.6432 0.6636 0.6533 0.9169 0.6289 0.6667 0.6472 0.2059 0.28 0.2373 0.9036 0.9259 0.9146 0.3721 0.2319 0.2857 0.4062 0.4815 0.4407 0.7822 0.7628 0.7724 0.2581 0.32 0.2857 0.9273 0.9444 0.9358 0.6316 0.3636 0.4615 0.5769 0.5556 0.5660 0.6031 0.6436 0.6227 0.1538 0.1905 0.1702 0.9091 0.9317 0.9202 0.3929 0.2115 0.275 0.2727 0.4286 0.3333 0.5343 0.7054
0.0006 34.0 4250 0.6629 0.6356 0.6716 0.6531 0.9149 0.6138 0.6833 0.6467 0.2059 0.28 0.2373 0.9091 0.9259 0.9174 0.3514 0.1884 0.2453 0.4483 0.4815 0.4643 0.75 0.7752 0.7624 0.2581 0.32 0.2857 0.9503 0.9444 0.9474 0.5152 0.2576 0.3434 0.56 0.5185 0.5385 0.5923 0.6599 0.6243 0.2 0.2381 0.2174 0.9152 0.9379 0.9264 0.3913 0.1731 0.24 0.375 0.4286 0.4000 0.5285 0.6988
0.0006 35.0 4375 0.6643 0.6373 0.6727 0.6545 0.9155 0.6156 0.6833 0.6477 0.2059 0.28 0.2373 0.9036 0.9259 0.9146 0.3514 0.1884 0.2453 0.4828 0.5185 0.5 0.7509 0.7735 0.7620 0.2581 0.32 0.2857 0.9444 0.9444 0.9444 0.5 0.2576 0.3400 0.5769 0.5556 0.5660 0.5905 0.6578 0.6224 0.2 0.2381 0.2174 0.9096 0.9379 0.9235 0.3913 0.1731 0.24 0.4 0.4286 0.4138 0.5315 0.6979
0.0005 36.0 4500 0.6727 0.6329 0.6716 0.6516 0.9140 0.6077 0.6817 0.6426 0.2424 0.32 0.2759 0.9091 0.9259 0.9174 0.3421 0.1884 0.2430 0.4643 0.4815 0.4727 0.7457 0.7735 0.7593 0.2581 0.32 0.2857 0.9563 0.9444 0.9503 0.5 0.2424 0.3265 0.56 0.5185 0.5385 0.5794 0.6538 0.6144 0.2083 0.2381 0.2222 0.9152 0.9379 0.9264 0.4 0.1923 0.2597 0.3529 0.4286 0.3871 0.5270 0.6948
0.0005 37.0 4625 0.6779 0.6390 0.6716 0.6549 0.9142 0.6150 0.6817 0.6466 0.2424 0.32 0.2759 0.9091 0.9259 0.9174 0.3611 0.1884 0.2476 0.4483 0.4815 0.4643 0.7470 0.7735 0.76 0.2581 0.32 0.2857 0.9563 0.9444 0.9503 0.5 0.2273 0.3125 0.5185 0.5185 0.5185 0.5894 0.6578 0.6218 0.2083 0.2381 0.2222 0.9152 0.9379 0.9264 0.4167 0.1923 0.2632 0.375 0.4286 0.4000 0.5261 0.6967
0.0005 38.0 4750 0.6804 0.6429 0.6727 0.6574 0.9147 0.6210 0.68 0.6492 0.25 0.32 0.2807 0.9036 0.9259 0.9146 0.3684 0.2029 0.2617 0.4516 0.5185 0.4828 0.7534 0.7735 0.7633 0.2581 0.32 0.2857 0.9563 0.9444 0.9503 0.5 0.2576 0.3400 0.5172 0.5556 0.5357 0.5967 0.6599 0.6267 0.2174 0.2381 0.2273 0.9096 0.9379 0.9235 0.4 0.1923 0.2597 0.3529 0.4286 0.3871 0.5299 0.7004
0.0005 39.0 4875 0.6818 0.6421 0.6704 0.6560 0.9147 0.6170 0.6767 0.6455 0.2581 0.32 0.2857 0.9036 0.9259 0.9146 0.4054 0.2174 0.2830 0.4333 0.4815 0.4561 0.7522 0.7735 0.7627 0.2667 0.32 0.2909 0.9563 0.9444 0.9503 0.5294 0.2727 0.36 0.5 0.5185 0.5091 0.5938 0.6578 0.6242 0.2174 0.2381 0.2273 0.9096 0.9379 0.9235 0.4 0.1923 0.2597 0.375 0.4286 0.4000 0.5308 0.6999
0.0002 40.0 5000 0.6830 0.6428 0.6704 0.6563 0.9147 0.6180 0.6767 0.6460 0.2581 0.32 0.2857 0.9036 0.9259 0.9146 0.4054 0.2174 0.2830 0.4333 0.4815 0.4561 0.7509 0.7735 0.7620 0.2667 0.32 0.2909 0.9563 0.9444 0.9503 0.5312 0.2576 0.3469 0.5 0.5185 0.5091 0.5959 0.6578 0.6254 0.2174 0.2381 0.2273 0.9096 0.9379 0.9235 0.4 0.1923 0.2597 0.375 0.4286 0.4000 0.5295 0.6995
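
The entity-level columns (Adr, Disease, Drug, Finding, Symptom) and the per-tag columns (B-adr, I-adr, ...) reflect two different views of the same predictions: exact-span entity matching versus flat token-level tag matching. Below is a minimal sketch of how scores of this kind are commonly produced; the tag sequences are made up, and whether this exact tooling was used for this card is an assumption.

```python
# Illustrative only: entity-level scores via seqeval, tag-level scores via a
# flat token-level report. The tag sequences below are made up.
from seqeval.metrics import classification_report as entity_report
from sklearn.metrics import classification_report as tag_report

y_true = [["B-adr", "I-adr", "O", "B-drug"], ["B-symptom", "O"]]
y_pred = [["B-adr", "O", "O", "B-drug"], ["B-symptom", "O"]]

# Entity-level precision/recall/F1 per type; a span counts only if it matches exactly.
print(entity_report(y_true, y_pred, digits=4))

# Tag-level report over the flattened label sequences (one row per B-/I- tag).
flat_true = [tag for seq in y_true for tag in seq]
flat_pred = [tag for seq in y_pred for tag in seq]
print(tag_report(flat_true, flat_pred, digits=4))
```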

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0