---
license: apache-2.0
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: t5-small_winobias_finetuned
    results: []
---

# t5-small_winobias_finetuned

This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unspecified dataset. It achieves the following results on the evaluation set (see the sketch after the list for how these metrics appear to be defined):

- Loss: 0.2473
- Accuracy: 0.5278
- Tp: 0.5
- Tn: 0.0278
- Fp: 0.4722
- Fn: 0.0
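
The Tp/Tn/Fp/Fn values sum to 1 and accuracy equals Tp + Tn (0.5 + 0.0278 = 0.5278), which suggests they are reported as fractions of the evaluation set rather than raw counts. Below is a minimal sketch of a metric function with exactly that behaviour; the function name `rate_metrics` and the 0/1 label encoding are illustrative assumptions, not taken from the actual training code.

```python
# Hypothetical reconstruction of the reported metrics (assumes binary labels
# encoded as 0/1 and rates normalised by the evaluation-set size).
import numpy as np


def rate_metrics(predictions: np.ndarray, labels: np.ndarray) -> dict:
    """Return accuracy and per-outcome rates over the whole evaluation set."""
    n = len(labels)
    tp = np.sum((predictions == 1) & (labels == 1)) / n
    tn = np.sum((predictions == 0) & (labels == 0)) / n
    fp = np.sum((predictions == 1) & (labels == 0)) / n
    fn = np.sum((predictions == 0) & (labels == 1)) / n
    # With this definition, accuracy = tp + tn and tp + tn + fp + fn = 1,
    # matching the relationships between the numbers reported above.
    return {"accuracy": tp + tn, "tp": tp, "tn": tn, "fp": fp, "fn": fn}


print(rate_metrics(np.array([1, 1, 0, 1]), np.array([1, 0, 0, 0])))
```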

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):

- learning_rate: 0.0003
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
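
As a rough guide to how these settings map onto the Transformers `Trainer` API, here is a minimal, hypothetical reproduction sketch. The model class, output directory, and the omitted dataset are assumptions; the card does not document the fine-tuning head or the data pipeline.

```python
# Hypothetical training setup inferred from the hyperparameters above; not the
# author's actual script. Dataset and preprocessing are unspecified in this card.
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# t5-small's stock seq2seq class is used only as a placeholder; the head
# actually fine-tuned for the accuracy/Tp/Tn/Fp/Fn metrics may differ.
tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

args = TrainingArguments(
    output_dir="t5-small_winobias_finetuned",
    learning_rate=3e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    num_train_epochs=30,
    lr_scheduler_type="linear",
    # Trainer's default AdamW optimizer uses betas=(0.9, 0.999) and
    # epsilon=1e-08, matching the optimizer settings listed above.
)

trainer = Trainer(model=model, args=args, tokenizer=tokenizer)
# trainer.train()  # would additionally need train_dataset/eval_dataset,
#                  # which this card does not identify
```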

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Tp  | Tn     | Fp     | Fn  |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---:|:------:|:------:|:---:|
| 0.6334        | 0.8   | 20   | 0.3622          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.4058        | 1.6   | 40   | 0.3510          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.3923        | 2.4   | 60   | 0.3511          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.376         | 3.2   | 80   | 0.3509          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.3749        | 4.0   | 100  | 0.3502          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.3895        | 4.8   | 120  | 0.3505          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.3624        | 5.6   | 140  | 0.3508          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.3754        | 6.4   | 160  | 0.3501          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.3702        | 7.2   | 180  | 0.3576          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.3748        | 8.0   | 200  | 0.3499          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.3715        | 8.8   | 220  | 0.3482          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.3576        | 9.6   | 240  | 0.3489          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.3659        | 10.4  | 260  | 0.3510          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.3565        | 11.2  | 280  | 0.3464          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.353         | 12.0  | 300  | 0.3474          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.3614        | 12.8  | 320  | 0.3450          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.3625        | 13.6  | 340  | 0.3458          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.36          | 14.4  | 360  | 0.3494          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.3585        | 15.2  | 380  | 0.3435          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.3541        | 16.0  | 400  | 0.3431          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.3564        | 16.8  | 420  | 0.3414          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.3462        | 17.6  | 440  | 0.3413          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.3541        | 18.4  | 460  | 0.3382          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.3579        | 19.2  | 480  | 0.3399          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.3466        | 20.0  | 500  | 0.3317          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.3314        | 20.8  | 520  | 0.3303          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.33          | 21.6  | 540  | 0.3246          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.3279        | 22.4  | 560  | 0.3154          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.3234        | 23.2  | 580  | 0.3050          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.3193        | 24.0  | 600  | 0.2947          | 0.5      | 0.5 | 0.0    | 0.5    | 0.0 |
| 0.3086        | 24.8  | 620  | 0.2849          | 0.5013   | 0.5 | 0.0013 | 0.4987 | 0.0 |
| 0.2912        | 25.6  | 640  | 0.2748          | 0.5013   | 0.5 | 0.0013 | 0.4987 | 0.0 |
| 0.2787        | 26.4  | 660  | 0.2655          | 0.5107   | 0.5 | 0.0107 | 0.4893 | 0.0 |
| 0.2779        | 27.2  | 680  | 0.2581          | 0.5177   | 0.5 | 0.0177 | 0.4823 | 0.0 |
| 0.2697        | 28.0  | 700  | 0.2527          | 0.5170   | 0.5 | 0.0170 | 0.4830 | 0.0 |
| 0.2669        | 28.8  | 720  | 0.2495          | 0.5259   | 0.5 | 0.0259 | 0.4741 | 0.0 |
| 0.2654        | 29.6  | 740  | 0.2473          | 0.5278   | 0.5 | 0.0278 | 0.4722 | 0.0 |

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1
- Datasets 2.10.1
- Tokenizers 0.13.2
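
If you want to confirm that a local environment matches the versions listed above, a small, hypothetical check along these lines can be used (this script is not part of the original card):

```python
# Compare installed library versions against the ones reported in this card.
import datasets
import tokenizers
import torch
import transformers

expected = {
    "transformers": "4.26.1",
    "torch": "1.13.1",
    "datasets": "2.10.1",
    "tokenizers": "0.13.2",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    have = installed[name]
    note = "OK" if have.startswith(want) else f"differs (card reports {want})"
    print(f"{name} {have}: {note}")
```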