henryscheible committed
Commit: e5081b2
Parent(s): a535576

update model card README.md

README.md CHANGED
@@ -2,9 +2,26 @@
 license: apache-2.0
 tags:
 - generated_from_trainer
+datasets:
+- crows_pairs
+metrics:
+- accuracy
 model-index:
 - name: t5-small_crows_pairs_finetuned
-  results: []
+  results:
+  - task:
+      name: Sequence-to-sequence Language Modeling
+      type: text2text-generation
+    dataset:
+      name: crows_pairs
+      type: crows_pairs
+      config: crows_pairs
+      split: test
+      args: crows_pairs
+    metrics:
+    - name: Accuracy
+      type: accuracy
+      value: 0.7218543046357616
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -12,7 +29,14 @@ should probably proofread and complete it, then remove this comment. -->
 
 # t5-small_crows_pairs_finetuned
 
-This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unknown dataset.
+This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on the crows_pairs dataset.
+It achieves the following results on the evaluation set:
+- Loss: 1.0876
+- Accuracy: 0.7219
+- Tp: 0.4901
+- Tn: 0.2318
+- Fp: 0.2649
+- Fn: 0.0132
 
 ## Model description
 
@@ -39,6 +63,69 @@ The following hyperparameters were used during training:
 - lr_scheduler_type: linear
 - num_epochs: 30
 
+### Training results
+
+| Training Loss | Epoch | Step | Validation Loss | Accuracy | Tp | Tn | Fp | Fn |
+|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:------:|:------:|:------:|
+| 0.4776 | 0.53 | 20 | 0.3559 | 0.5033 | 0.5033 | 0.0 | 0.4967 | 0.0 |
+| 0.3708 | 1.05 | 40 | 0.3431 | 0.5265 | 0.5033 | 0.0232 | 0.4735 | 0.0 |
+| 0.3905 | 1.58 | 60 | 0.3414 | 0.6325 | 0.5033 | 0.1291 | 0.3675 | 0.0 |
+| 0.3526 | 2.11 | 80 | 0.3260 | 0.5464 | 0.5033 | 0.0430 | 0.4536 | 0.0 |
+| 0.2734 | 2.63 | 100 | 0.3047 | 0.8179 | 0.4934 | 0.3245 | 0.1722 | 0.0099 |
+| 0.2429 | 3.16 | 120 | 0.2631 | 0.6623 | 0.5033 | 0.1589 | 0.3377 | 0.0 |
+| 0.2161 | 3.68 | 140 | 0.2829 | 0.7152 | 0.4901 | 0.2252 | 0.2715 | 0.0132 |
+| 0.1617 | 4.21 | 160 | 0.2947 | 0.7748 | 0.4901 | 0.2848 | 0.2119 | 0.0132 |
+| 0.1776 | 4.74 | 180 | 0.3258 | 0.7384 | 0.4901 | 0.2483 | 0.2483 | 0.0132 |
+| 0.1113 | 5.26 | 200 | 0.3362 | 0.7815 | 0.4901 | 0.2914 | 0.2053 | 0.0132 |
+| 0.1008 | 5.79 | 220 | 0.4016 | 0.7748 | 0.4868 | 0.2881 | 0.2086 | 0.0166 |
+| 0.0572 | 6.32 | 240 | 0.4486 | 0.7219 | 0.4901 | 0.2318 | 0.2649 | 0.0132 |
+| 0.0701 | 6.84 | 260 | 0.4561 | 0.7384 | 0.4901 | 0.2483 | 0.2483 | 0.0132 |
+| 0.051 | 7.37 | 280 | 0.4813 | 0.7815 | 0.4868 | 0.2947 | 0.2020 | 0.0166 |
+| 0.0408 | 7.89 | 300 | 0.5279 | 0.7815 | 0.4868 | 0.2947 | 0.2020 | 0.0166 |
+| 0.0326 | 8.42 | 320 | 0.6454 | 0.7550 | 0.4934 | 0.2616 | 0.2351 | 0.0099 |
+| 0.0361 | 8.95 | 340 | 0.7559 | 0.7417 | 0.4901 | 0.2517 | 0.2450 | 0.0132 |
+| 0.0304 | 9.47 | 360 | 0.7422 | 0.7318 | 0.4868 | 0.2450 | 0.2517 | 0.0166 |
+| 0.0299 | 10.0 | 380 | 0.7770 | 0.7450 | 0.4868 | 0.2583 | 0.2384 | 0.0166 |
+| 0.0227 | 10.53 | 400 | 0.7033 | 0.7947 | 0.4801 | 0.3146 | 0.1821 | 0.0232 |
+| 0.017 | 11.05 | 420 | 0.7220 | 0.7649 | 0.4801 | 0.2848 | 0.2119 | 0.0232 |
+| 0.0166 | 11.58 | 440 | 0.7674 | 0.7649 | 0.4636 | 0.3013 | 0.1954 | 0.0397 |
+| 0.0123 | 12.11 | 460 | 0.8153 | 0.7616 | 0.4834 | 0.2781 | 0.2185 | 0.0199 |
+| 0.0084 | 12.63 | 480 | 0.8422 | 0.7483 | 0.4834 | 0.2649 | 0.2318 | 0.0199 |
+| 0.0178 | 13.16 | 500 | 0.7960 | 0.7649 | 0.4735 | 0.2914 | 0.2053 | 0.0298 |
+| 0.016 | 13.68 | 520 | 0.8152 | 0.7086 | 0.4834 | 0.2252 | 0.2715 | 0.0199 |
+| 0.007 | 14.21 | 540 | 0.8518 | 0.6589 | 0.4901 | 0.1689 | 0.3278 | 0.0132 |
+| 0.0019 | 14.74 | 560 | 0.8647 | 0.7020 | 0.4834 | 0.2185 | 0.2781 | 0.0199 |
+| 0.0159 | 15.26 | 580 | 0.8817 | 0.7086 | 0.4834 | 0.2252 | 0.2715 | 0.0199 |
+| 0.0016 | 15.79 | 600 | 0.9105 | 0.6689 | 0.4834 | 0.1854 | 0.3113 | 0.0199 |
+| 0.0079 | 16.32 | 620 | 0.9311 | 0.7053 | 0.4834 | 0.2219 | 0.2748 | 0.0199 |
+| 0.0045 | 16.84 | 640 | 0.9586 | 0.7086 | 0.4834 | 0.2252 | 0.2715 | 0.0199 |
+| 0.0033 | 17.37 | 660 | 0.9765 | 0.7252 | 0.4834 | 0.2417 | 0.2550 | 0.0199 |
+| 0.0078 | 17.89 | 680 | 1.0263 | 0.7086 | 0.4834 | 0.2252 | 0.2715 | 0.0199 |
+| 0.0047 | 18.42 | 700 | 0.9929 | 0.7351 | 0.4768 | 0.2583 | 0.2384 | 0.0265 |
+| 0.0082 | 18.95 | 720 | 1.0001 | 0.7185 | 0.4801 | 0.2384 | 0.2583 | 0.0232 |
+| 0.0022 | 19.47 | 740 | 1.0150 | 0.7086 | 0.4801 | 0.2285 | 0.2682 | 0.0232 |
+| 0.0027 | 20.0 | 760 | 1.0638 | 0.6887 | 0.4901 | 0.1987 | 0.2980 | 0.0132 |
+| 0.0025 | 20.53 | 780 | 1.0124 | 0.7020 | 0.4834 | 0.2185 | 0.2781 | 0.0199 |
+| 0.007 | 21.05 | 800 | 1.0082 | 0.6987 | 0.4834 | 0.2152 | 0.2815 | 0.0199 |
+| 0.0119 | 21.58 | 820 | 1.0225 | 0.7119 | 0.4801 | 0.2318 | 0.2649 | 0.0232 |
+| 0.0016 | 22.11 | 840 | 1.0494 | 0.7053 | 0.4901 | 0.2152 | 0.2815 | 0.0132 |
+| 0.0007 | 22.63 | 860 | 1.0515 | 0.7152 | 0.4868 | 0.2285 | 0.2682 | 0.0166 |
+| 0.0014 | 23.16 | 880 | 1.0492 | 0.7119 | 0.4868 | 0.2252 | 0.2715 | 0.0166 |
+| 0.002 | 23.68 | 900 | 1.0970 | 0.7020 | 0.4934 | 0.2086 | 0.2881 | 0.0099 |
+| 0.0003 | 24.21 | 920 | 1.0429 | 0.7185 | 0.4834 | 0.2351 | 0.2616 | 0.0199 |
+| 0.0002 | 24.74 | 940 | 1.0772 | 0.7053 | 0.4868 | 0.2185 | 0.2781 | 0.0166 |
+| 0.0008 | 25.26 | 960 | 1.0766 | 0.7119 | 0.4934 | 0.2185 | 0.2781 | 0.0099 |
+| 0.001 | 25.79 | 980 | 1.0720 | 0.7185 | 0.4934 | 0.2252 | 0.2715 | 0.0099 |
+| 0.0002 | 26.32 | 1000 | 1.0763 | 0.7152 | 0.4901 | 0.2252 | 0.2715 | 0.0132 |
+| 0.0002 | 26.84 | 1020 | 1.0675 | 0.7185 | 0.4901 | 0.2285 | 0.2682 | 0.0132 |
+| 0.0011 | 27.37 | 1040 | 1.0745 | 0.7185 | 0.4834 | 0.2351 | 0.2616 | 0.0199 |
+| 0.0007 | 27.89 | 1060 | 1.0792 | 0.7185 | 0.4901 | 0.2285 | 0.2682 | 0.0132 |
+| 0.0007 | 28.42 | 1080 | 1.0880 | 0.7152 | 0.4934 | 0.2219 | 0.2748 | 0.0099 |
+| 0.0005 | 28.95 | 1100 | 1.0903 | 0.7185 | 0.4934 | 0.2252 | 0.2715 | 0.0099 |
+| 0.0025 | 29.47 | 1120 | 1.0877 | 0.7185 | 0.4901 | 0.2285 | 0.2682 | 0.0132 |
+| 0.0004 | 30.0 | 1140 | 1.0876 | 0.7219 | 0.4901 | 0.2318 | 0.2649 | 0.0132 |
+
+
 ### Framework versions
 
 - Transformers 4.26.1
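The updated card describes the checkpoint but not how to call it. Below is a minimal usage sketch against the `transformers` release listed under Framework versions; the hub repository id is inferred from the committer name and model name and may differ, and the example input is purely illustrative, since the card does not document the prompt or label format used for crows_pairs.

```python
# Minimal sketch: load the checkpoint and run text2text generation.
# Assumptions: the repo id is inferred from the commit author and model name,
# and the input string is illustrative -- the card does not document the
# expected prompt/label format.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

repo_id = "henryscheible/t5-small_crows_pairs_finetuned"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSeq2SeqLM.from_pretrained(repo_id)

inputs = tokenizer("example crows_pairs sentence pair goes here", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=8)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```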
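A note on the reported metrics: in each row of the training-results table the Tp, Tn, Fp and Fn columns sum to 1.0, so they appear to be fractions of the evaluation set rather than raw counts. Under that reading, the reported Accuracy is simply Tp + Tn (up to rounding), which the final-epoch numbers confirm:

```python
# Quick check, assuming Tp/Tn/Fp/Fn are fractions of the evaluation set
# (they sum to 1.0 in each table row) rather than raw counts.
tp, tn, fp, fn = 0.4901, 0.2318, 0.2649, 0.0132  # final-epoch row of the table

assert abs((tp + tn + fp + fn) - 1.0) < 1e-3  # the four rates cover the whole eval set
accuracy = tp + tn                            # fraction of correct predictions
print(round(accuracy, 4))                     # 0.7219, matching the reported Accuracy
```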
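The last hunk's context lines show only two of the training hyperparameters (`lr_scheduler_type: linear`, `num_epochs: 30`); the rest of the training-procedure section is unchanged and therefore not visible in this diff. As a rough illustration only, those two settings would map onto `Seq2SeqTrainingArguments` as sketched below; every other field is a placeholder, not the configuration actually used for this checkpoint.

```python
# Illustration only: the two hyperparameters visible in this diff mapped onto
# Seq2SeqTrainingArguments. All other values are placeholders, not the settings
# actually used to train this checkpoint.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-small_crows_pairs_finetuned",
    lr_scheduler_type="linear",      # from the card: lr_scheduler_type: linear
    num_train_epochs=30,             # from the card: num_epochs: 30
    evaluation_strategy="steps",     # placeholder; the table reports eval every 20 steps
    eval_steps=20,                   # placeholder inferred from the table's step column
    per_device_train_batch_size=8,   # placeholder
    learning_rate=5e-5,              # placeholder
)
```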