PekingU_rtdetr_r50vd_cppe5_jw_1

This model is a fine-tuned version of PekingU/rtdetr_r50vd on an unspecified dataset (the model name and the evaluated classes suggest CPPE-5). It achieves the following results on the evaluation set:

  • Loss: 12.2150
  • Map: 0.3305
  • Map 50: 0.6332
  • Map 75: 0.2924
  • Map Small: 0.1087
  • Map Medium: 0.2627
  • Map Large: 0.5123
  • Mar 1: 0.3018
  • Mar 10: 0.4833
  • Mar 100: 0.5384
  • Mar Small: 0.3203
  • Mar Medium: 0.478
  • Mar Large: 0.7067
  • Map Coverall: 0.5473
  • Mar 100 Coverall: 0.7158
  • Map Face Shield: 0.3566
  • Mar 100 Face Shield: 0.6063
  • Map Gloves: 0.2415
  • Mar 100 Gloves: 0.4621
  • Map Goggles: 0.2167
  • Mar 100 Goggles: 0.4431
  • Map Mask: 0.2905
  • Mar 100 Mask: 0.4649
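
A minimal inference sketch using the Hugging Face Transformers object-detection API. The repo id below is this card's Hub path; the image path and the 0.5 confidence threshold are placeholders:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "jaxnwagner/PekingU_rtdetr_r50vd_cppe5_jw_1"  # this card's Hub repo id
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("example.jpg")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into (score, label, xyxy box) triples above a 0.5 threshold.
results = processor.post_process_object_detection(
    outputs, target_sizes=torch.tensor([image.size[::-1]]), threshold=0.5
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```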

Model description

More information needed. The base checkpoint, PekingU/rtdetr_r50vd, is an RT-DETR (Real-Time DEtection TRansformer) object detector with a ResNet-50-vd backbone; this fine-tune is evaluated on the five PPE classes listed above (coverall, face shield, gloves, goggles, mask).

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
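
The dataset is not documented here, but the five evaluated classes match CPPE-5. A hedged loading sketch with the datasets library, assuming the public cppe-5 Hub dataset id:

```python
from datasets import load_dataset

# Assumption: training used the public CPPE-5 dataset; the card does not confirm this.
cppe5 = load_dataset("cppe-5")

print(cppe5)                          # train/test splits of images with box annotations
print(cppe5["train"][0]["objects"])   # COCO-style bboxes, areas, and category ids
```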

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged TrainingArguments sketch reproducing them follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • num_epochs: 30
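
A hedged TrainingArguments sketch matching the values above; the output directory and any option not listed (weight decay, warmup, gradient clipping, etc.) are assumptions:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="rtdetr_r50vd_cppe5_jw_1",  # hypothetical output directory
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="cosine",
    num_train_epochs=30,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the Trainer's
    # default AdamW settings, so no explicit optimizer arguments are needed here.
)
```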

Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 107 | 13.6921 | 0.2046 | 0.3735 | 0.1947 | 0.068 | 0.1841 | 0.3025 | 0.2454 | 0.4315 | 0.4921 | 0.3014 | 0.4276 | 0.6762 | 0.4274 | 0.6712 | 0.0666 | 0.4848 | 0.116 | 0.3835 | 0.0986 | 0.4277 | 0.3143 | 0.4933 |
| No log | 2.0 | 214 | 12.2598 | 0.2507 | 0.4883 | 0.2181 | 0.1039 | 0.2171 | 0.3973 | 0.2841 | 0.4745 | 0.537 | 0.3358 | 0.4862 | 0.6937 | 0.4926 | 0.6901 | 0.1416 | 0.5722 | 0.1704 | 0.4478 | 0.1357 | 0.4769 | 0.3134 | 0.4982 |
| No log | 3.0 | 321 | 11.9174 | 0.2813 | 0.5306 | 0.2637 | 0.0892 | 0.2413 | 0.4566 | 0.2967 | 0.5005 | 0.5551 | 0.3245 | 0.5093 | 0.7123 | 0.5103 | 0.7081 | 0.214 | 0.6076 | 0.1968 | 0.4679 | 0.1747 | 0.4769 | 0.3108 | 0.5151 |
| No log | 4.0 | 428 | 11.9953 | 0.3027 | 0.5694 | 0.2825 | 0.1123 | 0.2568 | 0.4667 | 0.3125 | 0.5034 | 0.5641 | 0.3481 | 0.5136 | 0.7092 | 0.5386 | 0.7086 | 0.2789 | 0.6316 | 0.2018 | 0.4563 | 0.2069 | 0.5123 | 0.2872 | 0.5116 |
| 20.3574 | 5.0 | 535 | 11.9065 | 0.3094 | 0.6001 | 0.2701 | 0.111 | 0.2743 | 0.4646 | 0.3037 | 0.4914 | 0.5537 | 0.3398 | 0.5075 | 0.7127 | 0.5381 | 0.6973 | 0.2998 | 0.6101 | 0.1896 | 0.4629 | 0.2168 | 0.5077 | 0.3028 | 0.4907 |
| 20.3574 | 6.0 | 642 | 11.7344 | 0.3247 | 0.6261 | 0.2856 | 0.1295 | 0.2678 | 0.511 | 0.3124 | 0.4959 | 0.5571 | 0.3649 | 0.4939 | 0.7194 | 0.5191 | 0.6851 | 0.3164 | 0.6013 | 0.2084 | 0.4808 | 0.2565 | 0.5138 | 0.3234 | 0.5044 |
| 20.3574 | 7.0 | 749 | 11.8164 | 0.3334 | 0.6308 | 0.3036 | 0.1224 | 0.2636 | 0.5347 | 0.3184 | 0.5033 | 0.5614 | 0.3559 | 0.5025 | 0.7285 | 0.5403 | 0.6932 | 0.3151 | 0.6278 | 0.213 | 0.471 | 0.2695 | 0.5062 | 0.3293 | 0.5089 |
| 20.3574 | 8.0 | 856 | 11.7534 | 0.3368 | 0.6435 | 0.3015 | 0.1196 | 0.273 | 0.5312 | 0.3174 | 0.4979 | 0.5609 | 0.3269 | 0.501 | 0.7413 | 0.558 | 0.7104 | 0.3457 | 0.6316 | 0.2289 | 0.4754 | 0.2288 | 0.4908 | 0.3223 | 0.4964 |
| 20.3574 | 9.0 | 963 | 11.8592 | 0.3326 | 0.6485 | 0.2827 | 0.1199 | 0.2747 | 0.5222 | 0.3153 | 0.501 | 0.5546 | 0.3213 | 0.4936 | 0.729 | 0.5518 | 0.6986 | 0.3311 | 0.6 | 0.2148 | 0.4839 | 0.236 | 0.4785 | 0.3295 | 0.512 |
| 12.6823 | 10.0 | 1070 | 11.9904 | 0.333 | 0.6405 | 0.2978 | 0.1135 | 0.2741 | 0.5196 | 0.3091 | 0.4947 | 0.5467 | 0.3229 | 0.4953 | 0.7097 | 0.5403 | 0.7018 | 0.3419 | 0.6215 | 0.2384 | 0.4688 | 0.2325 | 0.4615 | 0.3117 | 0.48 |
| 12.6823 | 11.0 | 1177 | 11.9472 | 0.334 | 0.6479 | 0.2904 | 0.1056 | 0.2845 | 0.5164 | 0.314 | 0.505 | 0.5605 | 0.3419 | 0.4978 | 0.7316 | 0.5548 | 0.7023 | 0.3488 | 0.6089 | 0.229 | 0.5027 | 0.2365 | 0.5 | 0.301 | 0.4889 |
| 12.6823 | 12.0 | 1284 | 12.0204 | 0.336 | 0.6377 | 0.3032 | 0.1228 | 0.2728 | 0.5224 | 0.3162 | 0.5077 | 0.5591 | 0.3479 | 0.5015 | 0.7214 | 0.5353 | 0.6896 | 0.3297 | 0.6025 | 0.2531 | 0.5031 | 0.252 | 0.5031 | 0.3099 | 0.4969 |
| 12.6823 | 13.0 | 1391 | 11.9460 | 0.3373 | 0.6461 | 0.2912 | 0.1245 | 0.2783 | 0.5243 | 0.3113 | 0.502 | 0.5521 | 0.3388 | 0.4936 | 0.7192 | 0.5445 | 0.6986 | 0.3387 | 0.6013 | 0.2517 | 0.4893 | 0.2403 | 0.4846 | 0.3114 | 0.4867 |
| 12.6823 | 14.0 | 1498 | 11.9051 | 0.3523 | 0.6617 | 0.3226 | 0.136 | 0.2944 | 0.5363 | 0.3223 | 0.5036 | 0.5558 | 0.3249 | 0.508 | 0.7127 | 0.57 | 0.709 | 0.3597 | 0.6076 | 0.2491 | 0.5049 | 0.2586 | 0.4662 | 0.3239 | 0.4916 |
| 11.1387 | 15.0 | 1605 | 12.0044 | 0.3399 | 0.6562 | 0.2971 | 0.1349 | 0.2898 | 0.5089 | 0.3074 | 0.5 | 0.556 | 0.3511 | 0.4995 | 0.7189 | 0.5576 | 0.7144 | 0.338 | 0.581 | 0.2422 | 0.4946 | 0.2619 | 0.4985 | 0.3 | 0.4916 |
| 11.1387 | 16.0 | 1712 | 12.2005 | 0.3263 | 0.6241 | 0.2928 | 0.1109 | 0.2512 | 0.5052 | 0.3029 | 0.4896 | 0.5467 | 0.3166 | 0.4913 | 0.704 | 0.5254 | 0.7005 | 0.3614 | 0.6101 | 0.2504 | 0.5049 | 0.2274 | 0.4477 | 0.267 | 0.4702 |
| 11.1387 | 17.0 | 1819 | 12.0237 | 0.3394 | 0.6504 | 0.3072 | 0.1154 | 0.2833 | 0.5157 | 0.3127 | 0.4958 | 0.5516 | 0.3447 | 0.5022 | 0.7017 | 0.5558 | 0.7009 | 0.353 | 0.6089 | 0.2552 | 0.4924 | 0.2226 | 0.4708 | 0.3103 | 0.4849 |
| 11.1387 | 18.0 | 1926 | 12.1186 | 0.3327 | 0.6432 | 0.2865 | 0.1113 | 0.2699 | 0.5206 | 0.3047 | 0.4767 | 0.5402 | 0.3143 | 0.4822 | 0.7094 | 0.5491 | 0.7063 | 0.3614 | 0.6076 | 0.2396 | 0.4799 | 0.2131 | 0.4462 | 0.3002 | 0.4609 |
| 10.0798 | 19.0 | 2033 | 12.0813 | 0.3357 | 0.6531 | 0.2953 | 0.1239 | 0.2789 | 0.5139 | 0.3065 | 0.4932 | 0.5437 | 0.337 | 0.4774 | 0.7108 | 0.5488 | 0.7068 | 0.3669 | 0.6127 | 0.2413 | 0.4661 | 0.23 | 0.4615 | 0.2914 | 0.4716 |
| 10.0798 | 20.0 | 2140 | 12.1951 | 0.3343 | 0.6419 | 0.3053 | 0.1129 | 0.2649 | 0.5183 | 0.3102 | 0.4806 | 0.5394 | 0.3237 | 0.4768 | 0.7028 | 0.5574 | 0.7104 | 0.3622 | 0.6013 | 0.2465 | 0.4768 | 0.2126 | 0.4338 | 0.2927 | 0.4747 |
| 10.0798 | 21.0 | 2247 | 12.2319 | 0.3353 | 0.6449 | 0.2919 | 0.111 | 0.2746 | 0.5182 | 0.3082 | 0.4866 | 0.5404 | 0.3251 | 0.4866 | 0.7057 | 0.5494 | 0.7117 | 0.3628 | 0.6051 | 0.2454 | 0.4777 | 0.2196 | 0.4308 | 0.2992 | 0.4769 |
| 10.0798 | 22.0 | 2354 | 12.2877 | 0.3294 | 0.6321 | 0.2919 | 0.1089 | 0.261 | 0.5186 | 0.3053 | 0.4851 | 0.5447 | 0.3214 | 0.4871 | 0.7145 | 0.5372 | 0.7077 | 0.3567 | 0.6316 | 0.2353 | 0.4723 | 0.2212 | 0.44 | 0.2967 | 0.472 |
| 10.0798 | 23.0 | 2461 | 12.1634 | 0.3319 | 0.6397 | 0.3058 | 0.1157 | 0.2597 | 0.5252 | 0.3122 | 0.4853 | 0.5394 | 0.3303 | 0.483 | 0.707 | 0.5459 | 0.7117 | 0.3409 | 0.5886 | 0.2364 | 0.4737 | 0.2308 | 0.4492 | 0.3055 | 0.4738 |
| 9.3418 | 24.0 | 2568 | 12.1011 | 0.3331 | 0.6406 | 0.2928 | 0.1156 | 0.2575 | 0.5196 | 0.3056 | 0.4807 | 0.539 | 0.3326 | 0.4804 | 0.7006 | 0.556 | 0.7176 | 0.3524 | 0.6038 | 0.2341 | 0.4603 | 0.2233 | 0.4477 | 0.2999 | 0.4658 |
| 9.3418 | 25.0 | 2675 | 12.3668 | 0.3304 | 0.6382 | 0.2966 | 0.1091 | 0.2597 | 0.5184 | 0.305 | 0.4758 | 0.5322 | 0.3251 | 0.4719 | 0.6994 | 0.5502 | 0.7104 | 0.3498 | 0.5949 | 0.2376 | 0.4643 | 0.2223 | 0.4262 | 0.2921 | 0.4653 |
| 9.3418 | 26.0 | 2782 | 12.1546 | 0.3373 | 0.6453 | 0.2976 | 0.1169 | 0.2658 | 0.5251 | 0.3092 | 0.4862 | 0.5425 | 0.3409 | 0.4803 | 0.7113 | 0.5548 | 0.7095 | 0.3569 | 0.6152 | 0.2476 | 0.4612 | 0.2262 | 0.4646 | 0.3009 | 0.4622 |
| 9.3418 | 27.0 | 2889 | 12.2520 | 0.3326 | 0.6367 | 0.2929 | 0.1101 | 0.2617 | 0.5187 | 0.304 | 0.4815 | 0.5366 | 0.3183 | 0.4799 | 0.7078 | 0.5496 | 0.709 | 0.3546 | 0.6076 | 0.2416 | 0.4594 | 0.2189 | 0.4492 | 0.2983 | 0.4578 |
| 9.3418 | 28.0 | 2996 | 12.2222 | 0.3312 | 0.6378 | 0.2931 | 0.1093 | 0.2587 | 0.5152 | 0.3034 | 0.4821 | 0.5379 | 0.3194 | 0.4758 | 0.7086 | 0.5549 | 0.7108 | 0.3513 | 0.6025 | 0.2382 | 0.4612 | 0.2156 | 0.4523 | 0.2962 | 0.4627 |
| 8.908 | 29.0 | 3103 | 12.2006 | 0.3346 | 0.6397 | 0.2931 | 0.1112 | 0.2634 | 0.5211 | 0.3047 | 0.4837 | 0.5396 | 0.329 | 0.4763 | 0.71 | 0.5522 | 0.7144 | 0.3559 | 0.6 | 0.2435 | 0.4647 | 0.2193 | 0.4523 | 0.302 | 0.4667 |
| 8.908 | 30.0 | 3210 | 12.2150 | 0.3305 | 0.6332 | 0.2924 | 0.1087 | 0.2627 | 0.5123 | 0.3018 | 0.4833 | 0.5384 | 0.3203 | 0.478 | 0.7067 | 0.5473 | 0.7158 | 0.3566 | 0.6063 | 0.2415 | 0.4621 | 0.2167 | 0.4431 | 0.2905 | 0.4649 |
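
The Map/Mar columns are COCO-style metrics: mAP averaged over IoU 0.50:0.95, mAP at IoU 0.50 and 0.75, and mAR at 1/10/100 detections, plus size- and class-specific breakdowns. A hedged sketch of computing such values with torchmetrics, a common choice for a Trainer compute_metrics function but not necessarily the evaluation code used here:

```python
import torch
from torchmetrics.detection import MeanAveragePrecision

# Toy single-image example; in practice preds/targets come from the evaluation loop.
preds = [{
    "boxes": torch.tensor([[10.0, 10.0, 50.0, 80.0]]),  # xyxy pixel coordinates
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([0]),                         # e.g. 0 = coverall
}]
targets = [{
    "boxes": torch.tensor([[12.0, 11.0, 48.0, 79.0]]),
    "labels": torch.tensor([0]),
}]

# class_metrics=True also returns the per-class map / mar_100 columns.
metric = MeanAveragePrecision(box_format="xyxy", iou_type="bbox", class_metrics=True)
metric.update(preds, targets)
scores = metric.compute()

print(scores["map"], scores["map_50"], scores["map_75"])     # overall mAP values
print(scores["mar_1"], scores["mar_10"], scores["mar_100"])  # mAR at 1/10/100 detections
```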

Framework versions

  • Transformers 4.44.0
  • PyTorch 2.5.0+cu124
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 42.9M parameters (F32 tensors, Safetensors format)
