chickens-60-epoch-200-images-aug

This model is a fine-tuned version of facebook/detr-resnet-50 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3172
  • mAP: 0.7813
  • mAP@0.50: 0.9672
  • mAP@0.75: 0.9061
  • mAP (small): 0.5029
  • mAP (medium): 0.7534
  • mAP (large): 0.8581
  • mAR@1: 0.2609
  • mAR@10: 0.8237
  • mAR@100: 0.8407
  • mAR (small): 0.5407
  • mAR (medium): 0.8181
  • mAR (large): 0.9033
  • mAP (chicken): 0.7893
  • mAR@100 (chicken): 0.8558
  • mAP (duck): 0.7616
  • mAR@100 (duck): 0.825
  • mAP (plant): 0.7929
  • mAR@100 (plant): 0.8414

Model description

More information needed

Intended uses & limitations

More information needed
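Until the card documents usage, the checkpoint can be exercised with the standard Transformers object-detection API. A minimal, hypothetical sketch: the repo id is taken from the model tree on this page, while the blank test image and the 0.5 confidence threshold are arbitrary choices, not part of the card.

```python
# Hypothetical inference sketch for this checkpoint. The blank RGB image
# stands in for a real photo; with a real image, detections for the
# chicken / duck / plant classes would be printed.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "joe611/chickens-60-epoch-200-images-aug"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.new("RGB", (640, 480))  # stand-in for a real input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into (score, label, box) detections above 0.5.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]
for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```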

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 2
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • num_epochs: 60
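The training script itself is not included in the card; the listed optimizer and schedule can be sketched in plain PyTorch. The tiny linear model and dummy data below are stand-ins for the DETR model and dataset, and the 497 steps per epoch is read off the training-results table (step count at epoch 1.0).

```python
# Sketch of the card's optimizer/schedule settings in plain PyTorch.
# The Linear module is a stand-in for DETR; only the hyperparameters
# (lr, betas, eps, cosine schedule, 60 epochs) come from the card.
import torch

model = torch.nn.Linear(4, 2)  # stand-in for the DETR model
optimizer = torch.optim.Adam(
    model.parameters(), lr=1e-5, betas=(0.9, 0.999), eps=1e-8
)
steps_per_epoch = 497  # from the training-results table
total_steps = 60 * steps_per_epoch  # 29820, the table's final step count
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=total_steps)

for _ in range(3):  # a few dummy steps to show the update loop
    loss = model(torch.randn(8, 4)).pow(2).mean()
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    scheduler.step()

print(scheduler.get_last_lr()[0])  # lr has decayed slightly below 1e-5
```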

Training results

train_loss epoch step val_loss mAP mAP@0.50 mAP@0.75 mAP_small mAP_medium mAP_large mAR@1 mAR@10 mAR@100 mAR_small mAR_medium mAR_large mAP_chicken mAR@100_chicken mAP_duck mAR@100_duck mAP_plant mAR@100_plant
1.39 1.0 497 1.4568 0.1658 0.2544 0.19 0.0007 0.1178 0.572 0.0735 0.2448 0.2951 0.0444 0.2733 0.7627 0.0281 0.15 0.0297 0.0324 0.4395 0.7029
1.0818 2.0 994 1.2408 0.1924 0.2866 0.2221 0.0107 0.1379 0.6468 0.0738 0.2559 0.2789 0.0333 0.2522 0.75 0.038 0.1558 0.0 0.0 0.5391 0.6809
1.0382 3.0 1491 1.1565 0.2389 0.3456 0.2667 0.0175 0.2088 0.6989 0.0965 0.3342 0.3793 0.3111 0.3535 0.7928 0.094 0.4112 0.0 0.0 0.6229 0.7266
0.9389 4.0 1988 1.0233 0.2651 0.3695 0.3088 0.0187 0.2392 0.7309 0.104 0.3987 0.46 0.3519 0.4331 0.8088 0.137 0.6396 0.0 0.0 0.6583 0.7404
0.8606 5.0 2485 0.9402 0.3365 0.502 0.3846 0.0152 0.3165 0.7061 0.1389 0.4708 0.5159 0.1667 0.494 0.7807 0.2694 0.6954 0.0976 0.1345 0.6424 0.7178
0.8217 6.0 2982 0.8550 0.3399 0.497 0.3807 0.0512 0.3161 0.7385 0.1247 0.4612 0.4907 0.1222 0.4637 0.8108 0.304 0.6796 0.0474 0.0554 0.6685 0.7371
0.8555 7.0 3479 0.7985 0.3644 0.512 0.4361 0.0648 0.3433 0.7396 0.1417 0.5009 0.531 0.2296 0.5061 0.801 0.3468 0.7642 0.0697 0.0926 0.6768 0.7363
0.7846 8.0 3976 0.7775 0.4093 0.5643 0.4936 0.0811 0.3834 0.7663 0.1702 0.541 0.5874 0.2926 0.563 0.8176 0.3666 0.7942 0.1645 0.2162 0.6968 0.752
1.1121 9.0 4473 0.7245 0.3964 0.5446 0.4803 0.0597 0.3762 0.759 0.1392 0.5093 0.5425 0.2889 0.5193 0.8111 0.4271 0.8033 0.0679 0.0764 0.6942 0.7479
0.7181 10.0 4970 0.7218 0.3985 0.5517 0.4777 0.1619 0.3747 0.7474 0.14 0.5086 0.5403 0.2852 0.5159 0.8033 0.4358 0.785 0.0785 0.0959 0.6812 0.74
0.7338 11.0 5467 0.7031 0.5215 0.7323 0.6247 0.2342 0.4985 0.7549 0.1927 0.6175 0.6458 0.3667 0.6213 0.8124 0.5003 0.7454 0.3735 0.4466 0.6908 0.7455
0.7107 12.0 5964 0.6559 0.4871 0.6743 0.5933 0.1353 0.4609 0.7674 0.1755 0.5688 0.5983 0.3222 0.5742 0.8131 0.5366 0.7946 0.227 0.2514 0.6979 0.7488
0.6867 13.0 6461 0.6307 0.5689 0.7843 0.6921 0.2219 0.5409 0.7784 0.204 0.6467 0.6719 0.2407 0.6486 0.8265 0.5677 0.775 0.4308 0.4804 0.7081 0.7602
0.6659 14.0 6958 0.6498 0.5874 0.834 0.7233 0.1414 0.564 0.7351 0.2123 0.6699 0.6914 0.337 0.6663 0.8023 0.5821 0.7738 0.5066 0.5655 0.6736 0.7348
0.6979 15.0 7455 0.5984 0.6138 0.874 0.7383 0.1613 0.5832 0.7824 0.2239 0.6946 0.7171 0.2963 0.6913 0.8278 0.5625 0.7554 0.5718 0.6405 0.7071 0.7553
0.5107 16.0 7952 0.5714 0.6356 0.9011 0.7713 0.2066 0.6084 0.789 0.2249 0.72 0.7417 0.3704 0.7177 0.8415 0.5761 0.7579 0.6135 0.6953 0.7172 0.7719
0.6271 17.0 8449 0.5257 0.6636 0.9051 0.7937 0.351 0.6331 0.8044 0.2317 0.7393 0.7635 0.4556 0.7407 0.85 0.6165 0.7917 0.6398 0.7122 0.7346 0.7867
0.5645 18.0 8946 0.5132 0.6835 0.9215 0.8205 0.4803 0.6611 0.792 0.2429 0.7496 0.769 0.5185 0.7498 0.8382 0.6527 0.7871 0.6679 0.7372 0.7299 0.7826
0.6116 19.0 9443 0.5127 0.6574 0.9452 0.7874 0.4883 0.6233 0.7941 0.2285 0.7226 0.7382 0.5111 0.7106 0.8435 0.6334 0.7325 0.62 0.7095 0.7188 0.7727
0.5896 20.0 9940 0.4924 0.6682 0.9411 0.8196 0.5009 0.6364 0.7906 0.2348 0.732 0.747 0.5222 0.7187 0.8438 0.6419 0.7417 0.6441 0.7284 0.7186 0.7709
0.538 21.0 10437 0.4794 0.6745 0.947 0.8269 0.4886 0.6421 0.7972 0.2385 0.7357 0.7528 0.5222 0.7256 0.8536 0.6632 0.7508 0.6336 0.723 0.7266 0.7846
0.581 22.0 10934 0.4552 0.6926 0.9556 0.8348 0.5355 0.662 0.8043 0.2412 0.7491 0.7662 0.5556 0.7429 0.8569 0.6813 0.7629 0.6591 0.7405 0.7374 0.7951
0.4655 23.0 11431 0.4418 0.6984 0.9577 0.8555 0.506 0.6693 0.799 0.2408 0.7511 0.7678 0.5481 0.7446 0.849 0.7097 0.7825 0.6537 0.7345 0.7317 0.7865
0.4435 24.0 11928 0.4251 0.7043 0.9552 0.8358 0.3469 0.6769 0.802 0.2439 0.7612 0.7763 0.3889 0.7496 0.8614 0.7102 0.7892 0.6703 0.7507 0.7325 0.7891
0.521 25.0 12425 0.4064 0.7268 0.962 0.8799 0.3649 0.6998 0.8139 0.25 0.7767 0.7937 0.6222 0.7714 0.8657 0.7058 0.78 0.7239 0.7939 0.7507 0.807
0.5358 26.0 12922 0.4021 0.7158 0.9596 0.8604 0.3895 0.6886 0.8158 0.2497 0.7677 0.7857 0.4148 0.7646 0.866 0.7198 0.8012 0.6743 0.7493 0.7533 0.8066
0.478 27.0 13419 0.4045 0.712 0.9597 0.8763 0.4779 0.6811 0.8136 0.2431 0.7606 0.7788 0.5037 0.7558 0.8611 0.6946 0.7763 0.6954 0.7615 0.746 0.7986
0.4684 28.0 13916 0.3945 0.7246 0.9589 0.8756 0.3809 0.6987 0.8248 0.2436 0.7704 0.7887 0.4481 0.7704 0.868 0.7186 0.7837 0.6913 0.7682 0.7639 0.8141
0.4758 29.0 14413 0.3802 0.7363 0.9607 0.8843 0.4282 0.7047 0.8264 0.2492 0.7806 0.7994 0.5 0.7777 0.8765 0.7271 0.8 0.7199 0.7811 0.762 0.817
0.5724 30.0 14910 0.3773 0.7508 0.9608 0.8825 0.4597 0.7246 0.8255 0.2525 0.7929 0.8103 0.5741 0.787 0.8755 0.7633 0.8275 0.7277 0.7905 0.7612 0.8129
0.5229 31.0 15407 0.3797 0.7341 0.964 0.8791 0.4828 0.705 0.8393 0.2508 0.7812 0.7987 0.5111 0.773 0.8843 0.7341 0.8037 0.7007 0.7757 0.7675 0.8166
0.4247 32.0 15904 0.3720 0.7533 0.9598 0.8998 0.4806 0.7271 0.8303 0.2523 0.7979 0.8164 0.5185 0.7945 0.8758 0.7561 0.8271 0.7396 0.8061 0.7641 0.816
0.4791 33.0 16401 0.3666 0.7556 0.968 0.8978 0.4072 0.725 0.8319 0.2561 0.7978 0.8154 0.4778 0.7944 0.8804 0.7659 0.8292 0.7335 0.7953 0.7676 0.8219
0.4107 34.0 16898 0.3568 0.751 0.9592 0.8941 0.5087 0.722 0.8416 0.2537 0.7945 0.8128 0.5667 0.7901 0.8886 0.7645 0.8263 0.7143 0.7851 0.7742 0.8271
0.5572 35.0 17395 0.3460 0.7607 0.9613 0.9006 0.4575 0.7377 0.838 0.259 0.8017 0.82 0.5111 0.8016 0.8817 0.7648 0.8304 0.7384 0.8 0.779 0.8295
0.4188 36.0 17892 0.3547 0.7522 0.9623 0.8907 0.4372 0.7274 0.8258 0.2572 0.7951 0.8122 0.4852 0.7921 0.8729 0.7627 0.8288 0.7286 0.7919 0.7653 0.816
0.3833 37.0 18389 0.3474 0.7589 0.9619 0.9009 0.484 0.731 0.8389 0.2569 0.8028 0.8208 0.5296 0.799 0.8876 0.7742 0.8404 0.7286 0.7939 0.7739 0.8281
0.4402 38.0 18886 0.3546 0.7577 0.9662 0.9037 0.4664 0.73 0.8281 0.2579 0.7997 0.8166 0.5074 0.7943 0.8791 0.7628 0.8238 0.7449 0.8088 0.7654 0.8174
0.4705 39.0 19383 0.3458 0.7676 0.9625 0.8974 0.556 0.7396 0.8375 0.2606 0.8096 0.8269 0.6111 0.8058 0.884 0.7782 0.8413 0.7503 0.8142 0.7743 0.8254
0.4476 40.0 19880 0.3528 0.7613 0.9603 0.8951 0.4661 0.7346 0.8308 0.2579 0.8062 0.8248 0.5111 0.8027 0.8827 0.7736 0.8408 0.7424 0.8122 0.768 0.8215
0.4515 41.0 20377 0.3540 0.7652 0.9646 0.8982 0.5283 0.7364 0.8366 0.257 0.8087 0.8274 0.5926 0.8047 0.8899 0.7645 0.8296 0.7573 0.8216 0.7738 0.8309
0.5283 42.0 20874 0.3416 0.761 0.964 0.8967 0.4546 0.7305 0.8388 0.2547 0.8077 0.8256 0.5407 0.8031 0.8895 0.7714 0.8379 0.7368 0.8088 0.7748 0.8301
0.4371 43.0 21371 0.3397 0.7609 0.9658 0.9026 0.4562 0.7316 0.8418 0.2555 0.8073 0.8254 0.5185 0.8032 0.8905 0.7686 0.8388 0.736 0.8074 0.778 0.8299
0.3756 44.0 21868 0.3444 0.7645 0.9615 0.9075 0.5114 0.735 0.8403 0.2607 0.8085 0.8255 0.5815 0.8011 0.8925 0.7749 0.8379 0.744 0.8101 0.7746 0.8283
0.4228 45.0 22365 0.3341 0.7744 0.9645 0.8991 0.5075 0.7461 0.8398 0.2617 0.8159 0.8332 0.5593 0.8117 0.8899 0.7876 0.8479 0.7597 0.8216 0.776 0.8301
0.3976 46.0 22862 0.3294 0.7751 0.9648 0.907 0.4606 0.7469 0.8427 0.2638 0.8171 0.8352 0.5222 0.8156 0.8915 0.7821 0.8487 0.761 0.8223 0.7821 0.8346
0.4371 47.0 23359 0.3312 0.7744 0.9665 0.9009 0.4494 0.7465 0.8476 0.2596 0.8157 0.8334 0.5148 0.8127 0.8938 0.783 0.8471 0.7565 0.8182 0.7838 0.8348
0.3589 48.0 23856 0.3307 0.7705 0.9664 0.8963 0.5143 0.742 0.8432 0.2577 0.8134 0.8309 0.5667 0.809 0.8931 0.7732 0.8425 0.7577 0.8176 0.7806 0.8326
0.4881 49.0 24353 0.3237 0.7771 0.9655 0.9058 0.525 0.7485 0.8457 0.2593 0.8207 0.8383 0.5963 0.8163 0.8944 0.7878 0.8558 0.759 0.823 0.7843 0.8361
0.3637 50.0 24850 0.3246 0.7762 0.9647 0.9041 0.5141 0.7451 0.8517 0.2588 0.8173 0.8346 0.5741 0.8102 0.899 0.7824 0.8475 0.761 0.8216 0.7852 0.8348
0.3978 51.0 25347 0.3296 0.7744 0.9654 0.8952 0.5077 0.7459 0.8478 0.259 0.8172 0.8341 0.5519 0.8113 0.8954 0.7835 0.8475 0.7562 0.8223 0.7835 0.8324
0.3942 52.0 25844 0.3241 0.7808 0.9681 0.9075 0.5314 0.753 0.8568 0.2612 0.8214 0.8393 0.5815 0.8172 0.9023 0.7874 0.8533 0.7624 0.8223 0.7925 0.8424
0.4079 53.0 26341 0.3221 0.779 0.9708 0.9091 0.4946 0.7525 0.855 0.2598 0.8206 0.8377 0.5333 0.8162 0.9007 0.7867 0.8542 0.7583 0.8182 0.792 0.8408
0.3715 54.0 26838 0.3191 0.7831 0.969 0.9074 0.5022 0.7551 0.8596 0.2606 0.824 0.8417 0.5407 0.8199 0.9046 0.789 0.855 0.7646 0.8257 0.7957 0.8443
0.4328 55.0 27335 0.3184 0.7799 0.9686 0.9057 0.5077 0.7512 0.8586 0.2598 0.8217 0.8396 0.5481 0.8172 0.9023 0.7878 0.8562 0.7579 0.8209 0.7939 0.8416
0.3676 56.0 27832 0.3192 0.7792 0.9683 0.9059 0.5051 0.7509 0.8561 0.2603 0.8218 0.8394 0.5481 0.8168 0.9007 0.7884 0.8562 0.7581 0.8223 0.791 0.8396
0.3737 57.0 28329 0.3174 0.7814 0.9677 0.9071 0.5052 0.7529 0.8579 0.2606 0.823 0.8406 0.5481 0.8182 0.9023 0.7884 0.8558 0.7629 0.8243 0.7928 0.8416
0.3532 58.0 28826 0.3172 0.7813 0.9674 0.9062 0.5029 0.7531 0.8581 0.2607 0.8236 0.8407 0.5407 0.8179 0.9029 0.7897 0.8562 0.7617 0.825 0.7926 0.8408
0.356 59.0 29323 0.3172 0.7814 0.9672 0.9061 0.5029 0.7534 0.8587 0.2609 0.8236 0.8407 0.5407 0.818 0.9036 0.7887 0.8554 0.7616 0.825 0.794 0.8416
0.4282 60.0 29820 0.3172 0.7813 0.9672 0.9061 0.5029 0.7534 0.8581 0.2609 0.8237 0.8407 0.5407 0.8181 0.9033 0.7893 0.8558 0.7616 0.825 0.7929 0.8414
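All of the mAP/mAR columns above are COCO-style detection metrics built on box IoU; "Map 50", for example, is average precision at an IoU threshold of 0.50. A self-contained illustration of the IoU computation itself, on hand-made boxes unrelated to this run:

```python
# Illustrative only: IoU (intersection over union) for two hand-made
# boxes; the values are not taken from the evaluation run above.
def box_iou(a, b):
    """IoU of two boxes in (x1, y1, x2, y2) format."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

pred = (10, 10, 50, 50)  # 40x40 box, area 1600
gt = (20, 20, 60, 60)    # 40x40 box, area 1600
print(box_iou(pred, gt))  # intersection 30*30=900, union 2300 -> ~0.391
```

At a 0.50 threshold this pair would count as a miss, while at lower thresholds it would count as a match, which is why mAP@0.50 is always at least as high as mAP@0.75 in the table.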

Framework versions

  • Transformers 4.45.2
  • Pytorch 2.4.1+cu121
  • Datasets 2.19.2
  • Tokenizers 0.20.0
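A sketch of reproducing this environment with pip, assuming the versions listed above; the CUDA 12.1 wheel index is an assumption about how the listed `2.4.1+cu121` build was installed.

```shell
# Assumed setup; pin versions to match the card's "Framework versions".
pip install "transformers==4.45.2" "datasets==2.19.2" "tokenizers==0.20.0"
# The +cu121 suffix suggests the CUDA 12.1 wheel index (an assumption):
pip install "torch==2.4.1" --index-url https://download.pytorch.org/whl/cu121
```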
Safetensors

  • Model size: 41.6M params
  • Tensor type: F32

Model tree for joe611/chickens-60-epoch-200-images-aug

  • Base model: facebook/detr-resnet-50