---
license: apache-2.0
base_model: albert-base-v2
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: best_model-yelp_polarity-32-13
  results: []
---

# best_model-yelp_polarity-32-13

This model is a fine-tuned version of [albert-base-v2](https://huggingface.co/albert-base-v2) on an unknown dataset (the model name suggests a small few-shot subset of Yelp Polarity). It achieves the following results on the evaluation set:

- Loss: 0.5144
- Accuracy: 0.9219
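
For quick use, here is a minimal inference sketch with the `transformers` pipeline API. The repository id `simonycl/best_model-yelp_polarity-32-13` is an assumption inferred from the uploader and model name:

```python
from transformers import pipeline

# Hypothetical repo id, inferred from the uploader and model name.
classifier = pipeline(
    "text-classification",
    model="simonycl/best_model-yelp_polarity-32-13",
)

# Yelp Polarity is binary sentiment, so the model emits two labels
# (LABEL_0 / LABEL_1 by default unless id2label was customized).
print(classifier("The food was fantastic and the staff were friendly."))
```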

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 150
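
The sketch below shows roughly how these settings map onto a `transformers` `Trainer` run. It is a reconstruction under stated assumptions, not the original training script: the exact data subset, `max_length`, and logging/evaluation cadence are inferred (the table below shows 2 optimizer steps per epoch, one evaluation per epoch, and a training-loss log every 10 steps), and the Adam betas and epsilon match the `TrainingArguments` defaults, so they are not set explicitly:

```python
from datasets import load_dataset
from transformers import (
    AlbertForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("albert-base-v2")

# Assumption: a 64-example few-shot training split (2 optimizer steps per
# epoch at batch size 32, matching the table below); the exact subset used
# for this checkpoint is not documented in the card.
raw = load_dataset("yelp_polarity")
train_ds = raw["train"].shuffle(seed=42).select(range(64))
eval_ds = raw["test"].shuffle(seed=42).select(range(64))

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

train_ds = train_ds.map(tokenize, batched=True)
eval_ds = eval_ds.map(tokenize, batched=True)

model = AlbertForSequenceClassification.from_pretrained(
    "albert-base-v2", num_labels=2  # binary sentiment task
)

# Mirrors the hyperparameters listed above.
args = TrainingArguments(
    output_dir="best_model-yelp_polarity-32-13",
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=150,
    evaluation_strategy="epoch",  # the table reports one eval per epoch
    logging_steps=10,             # training loss appears every 10 steps
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    eval_dataset=eval_ds,
    tokenizer=tokenizer,  # enables dynamic padding via the default collator
)
trainer.train()
```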

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 2 | 0.6082 | 0.8906 |
| No log | 2.0 | 4 | 0.6076 | 0.8906 |
| No log | 3.0 | 6 | 0.6029 | 0.9062 |
| No log | 4.0 | 8 | 0.6007 | 0.9062 |
| 0.5399 | 5.0 | 10 | 0.5942 | 0.9062 |
| 0.5399 | 6.0 | 12 | 0.5899 | 0.9062 |
| 0.5399 | 7.0 | 14 | 0.5812 | 0.9062 |
| 0.5399 | 8.0 | 16 | 0.5718 | 0.9062 |
| 0.5399 | 9.0 | 18 | 0.5613 | 0.9062 |
| 0.4539 | 10.0 | 20 | 0.5576 | 0.9219 |
| 0.4539 | 11.0 | 22 | 0.5573 | 0.9219 |
| 0.4539 | 12.0 | 24 | 0.5612 | 0.9219 |
| 0.4539 | 13.0 | 26 | 0.5724 | 0.9062 |
| 0.4539 | 14.0 | 28 | 0.6101 | 0.8906 |
| 0.3252 | 15.0 | 30 | 0.6515 | 0.8906 |
| 0.3252 | 16.0 | 32 | 0.6612 | 0.8906 |
| 0.3252 | 17.0 | 34 | 0.6215 | 0.8906 |
| 0.3252 | 18.0 | 36 | 0.5622 | 0.9219 |
| 0.3252 | 19.0 | 38 | 0.5454 | 0.9219 |
| 0.2192 | 20.0 | 40 | 0.5331 | 0.9219 |
| 0.2192 | 21.0 | 42 | 0.5137 | 0.9219 |
| 0.2192 | 22.0 | 44 | 0.5021 | 0.9219 |
| 0.2192 | 23.0 | 46 | 0.5023 | 0.9219 |
| 0.2192 | 24.0 | 48 | 0.5072 | 0.9219 |
| 0.1347 | 25.0 | 50 | 0.5089 | 0.9219 |
| 0.1347 | 26.0 | 52 | 0.5062 | 0.9219 |
| 0.1347 | 27.0 | 54 | 0.5079 | 0.9062 |
| 0.1347 | 28.0 | 56 | 0.5042 | 0.9062 |
| 0.1347 | 29.0 | 58 | 0.4885 | 0.9062 |
| 0.0984 | 30.0 | 60 | 0.4719 | 0.9062 |
| 0.0984 | 31.0 | 62 | 0.4657 | 0.9062 |
| 0.0984 | 32.0 | 64 | 0.4671 | 0.9062 |
| 0.0984 | 33.0 | 66 | 0.4626 | 0.9062 |
| 0.0984 | 34.0 | 68 | 0.4623 | 0.9062 |
| 0.0679 | 35.0 | 70 | 0.4629 | 0.9062 |
| 0.0679 | 36.0 | 72 | 0.4639 | 0.9062 |
| 0.0679 | 37.0 | 74 | 0.4669 | 0.9062 |
| 0.0679 | 38.0 | 76 | 0.4707 | 0.9062 |
| 0.0679 | 39.0 | 78 | 0.4729 | 0.9062 |
| 0.0447 | 40.0 | 80 | 0.4741 | 0.9062 |
| 0.0447 | 41.0 | 82 | 0.4789 | 0.9062 |
| 0.0447 | 42.0 | 84 | 0.4829 | 0.9062 |
| 0.0447 | 43.0 | 86 | 0.4858 | 0.9062 |
| 0.0447 | 44.0 | 88 | 0.4855 | 0.9062 |
| 0.0337 | 45.0 | 90 | 0.4863 | 0.9062 |
| 0.0337 | 46.0 | 92 | 0.4884 | 0.9062 |
| 0.0337 | 47.0 | 94 | 0.4888 | 0.9062 |
| 0.0337 | 48.0 | 96 | 0.4901 | 0.9062 |
| 0.0337 | 49.0 | 98 | 0.4937 | 0.9062 |
| 0.0241 | 50.0 | 100 | 0.5010 | 0.9062 |
| 0.0241 | 51.0 | 102 | 0.5028 | 0.9062 |
| 0.0241 | 52.0 | 104 | 0.4960 | 0.9062 |
| 0.0241 | 53.0 | 106 | 0.5056 | 0.9062 |
| 0.0241 | 54.0 | 108 | 0.5088 | 0.9062 |
| 0.0139 | 55.0 | 110 | 0.4949 | 0.9062 |
| 0.0139 | 56.0 | 112 | 0.4853 | 0.9062 |
| 0.0139 | 57.0 | 114 | 0.4616 | 0.9062 |
| 0.0139 | 58.0 | 116 | 0.4451 | 0.9219 |
| 0.0139 | 59.0 | 118 | 0.4400 | 0.9219 |
| 0.0064 | 60.0 | 120 | 0.4371 | 0.9219 |
| 0.0064 | 61.0 | 122 | 0.4255 | 0.9375 |
| 0.0064 | 62.0 | 124 | 0.4178 | 0.9375 |
| 0.0064 | 63.0 | 126 | 0.4154 | 0.9375 |
| 0.0064 | 64.0 | 128 | 0.4194 | 0.9375 |
| 0.0023 | 65.0 | 130 | 0.4217 | 0.9375 |
| 0.0023 | 66.0 | 132 | 0.4193 | 0.9375 |
| 0.0023 | 67.0 | 134 | 0.4165 | 0.9375 |
| 0.0023 | 68.0 | 136 | 0.4159 | 0.9375 |
| 0.0023 | 69.0 | 138 | 0.4167 | 0.9375 |
| 0.0009 | 70.0 | 140 | 0.4178 | 0.9375 |
| 0.0009 | 71.0 | 142 | 0.4197 | 0.9375 |
| 0.0009 | 72.0 | 144 | 0.4218 | 0.9375 |
| 0.0009 | 73.0 | 146 | 0.4239 | 0.9375 |
| 0.0009 | 74.0 | 148 | 0.4260 | 0.9375 |
| 0.0005 | 75.0 | 150 | 0.4281 | 0.9375 |
| 0.0005 | 76.0 | 152 | 0.4300 | 0.9375 |
| 0.0005 | 77.0 | 154 | 0.4318 | 0.9375 |
| 0.0005 | 78.0 | 156 | 0.4336 | 0.9375 |
| 0.0005 | 79.0 | 158 | 0.4353 | 0.9375 |
| 0.0003 | 80.0 | 160 | 0.4369 | 0.9375 |
| 0.0003 | 81.0 | 162 | 0.4384 | 0.9375 |
| 0.0003 | 82.0 | 164 | 0.4400 | 0.9375 |
| 0.0003 | 83.0 | 166 | 0.4414 | 0.9375 |
| 0.0003 | 84.0 | 168 | 0.4428 | 0.9375 |
| 0.0003 | 85.0 | 170 | 0.4441 | 0.9375 |
| 0.0003 | 86.0 | 172 | 0.4454 | 0.9375 |
| 0.0003 | 87.0 | 174 | 0.4466 | 0.9375 |
| 0.0003 | 88.0 | 176 | 0.4479 | 0.9375 |
| 0.0003 | 89.0 | 178 | 0.4491 | 0.9375 |
| 0.0002 | 90.0 | 180 | 0.4503 | 0.9375 |
| 0.0002 | 91.0 | 182 | 0.4515 | 0.9375 |
| 0.0002 | 92.0 | 184 | 0.4527 | 0.9375 |
| 0.0002 | 93.0 | 186 | 0.4540 | 0.9375 |
| 0.0002 | 94.0 | 188 | 0.4552 | 0.9375 |
| 0.0002 | 95.0 | 190 | 0.4565 | 0.9375 |
| 0.0002 | 96.0 | 192 | 0.4577 | 0.9375 |
| 0.0002 | 97.0 | 194 | 0.4592 | 0.9375 |
| 0.0002 | 98.0 | 196 | 0.4605 | 0.9375 |
| 0.0002 | 99.0 | 198 | 0.4619 | 0.9375 |
| 0.0002 | 100.0 | 200 | 0.4631 | 0.9375 |
| 0.0002 | 101.0 | 202 | 0.4645 | 0.9219 |
| 0.0002 | 102.0 | 204 | 0.4659 | 0.9219 |
| 0.0002 | 103.0 | 206 | 0.4671 | 0.9219 |
| 0.0002 | 104.0 | 208 | 0.4683 | 0.9219 |
| 0.0002 | 105.0 | 210 | 0.4696 | 0.9219 |
| 0.0002 | 106.0 | 212 | 0.4710 | 0.9219 |
| 0.0002 | 107.0 | 214 | 0.4723 | 0.9219 |
| 0.0002 | 108.0 | 216 | 0.4736 | 0.9219 |
| 0.0002 | 109.0 | 218 | 0.4748 | 0.9219 |
| 0.0002 | 110.0 | 220 | 0.4760 | 0.9219 |
| 0.0002 | 111.0 | 222 | 0.4773 | 0.9219 |
| 0.0002 | 112.0 | 224 | 0.4785 | 0.9219 |
| 0.0002 | 113.0 | 226 | 0.4798 | 0.9219 |
| 0.0002 | 114.0 | 228 | 0.4809 | 0.9219 |
| 0.0002 | 115.0 | 230 | 0.4820 | 0.9219 |
| 0.0002 | 116.0 | 232 | 0.4830 | 0.9219 |
| 0.0002 | 117.0 | 234 | 0.4839 | 0.9219 |
| 0.0002 | 118.0 | 236 | 0.4847 | 0.9219 |
| 0.0002 | 119.0 | 238 | 0.4853 | 0.9219 |
| 0.0001 | 120.0 | 240 | 0.4861 | 0.9219 |
| 0.0001 | 121.0 | 242 | 0.4868 | 0.9219 |
| 0.0001 | 122.0 | 244 | 0.4876 | 0.9219 |
| 0.0001 | 123.0 | 246 | 0.4883 | 0.9219 |
| 0.0001 | 124.0 | 248 | 0.4892 | 0.9219 |
| 0.0001 | 125.0 | 250 | 0.4901 | 0.9219 |
| 0.0001 | 126.0 | 252 | 0.4912 | 0.9219 |
| 0.0001 | 127.0 | 254 | 0.4921 | 0.9219 |
| 0.0001 | 128.0 | 256 | 0.4932 | 0.9219 |
| 0.0001 | 129.0 | 258 | 0.4944 | 0.9219 |
| 0.0001 | 130.0 | 260 | 0.4956 | 0.9219 |
| 0.0001 | 131.0 | 262 | 0.4967 | 0.9219 |
| 0.0001 | 132.0 | 264 | 0.4980 | 0.9219 |
| 0.0001 | 133.0 | 266 | 0.4993 | 0.9219 |
| 0.0001 | 134.0 | 268 | 0.5003 | 0.9219 |
| 0.0001 | 135.0 | 270 | 0.5014 | 0.9219 |
| 0.0001 | 136.0 | 272 | 0.5025 | 0.9219 |
| 0.0001 | 137.0 | 274 | 0.5034 | 0.9219 |
| 0.0001 | 138.0 | 276 | 0.5042 | 0.9219 |
| 0.0001 | 139.0 | 278 | 0.5048 | 0.9219 |
| 0.0001 | 140.0 | 280 | 0.5058 | 0.9219 |
| 0.0001 | 141.0 | 282 | 0.5064 | 0.9219 |
| 0.0001 | 142.0 | 284 | 0.5070 | 0.9219 |
| 0.0001 | 143.0 | 286 | 0.5079 | 0.9219 |
| 0.0001 | 144.0 | 288 | 0.5088 | 0.9219 |
| 0.0001 | 145.0 | 290 | 0.5096 | 0.9219 |
| 0.0001 | 146.0 | 292 | 0.5107 | 0.9219 |
| 0.0001 | 147.0 | 294 | 0.5118 | 0.9219 |
| 0.0001 | 148.0 | 296 | 0.5127 | 0.9219 |
| 0.0001 | 149.0 | 298 | 0.5136 | 0.9219 |
| 0.0001 | 150.0 | 300 | 0.5144 | 0.9219 |
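
The accuracy values above are all multiples of 1/64, which suggests an evaluation set of 64 examples. To sanity-check the checkpoint yourself, a minimal evaluation sketch follows; the repo id is assumed from the uploader and model name, and the public yelp_polarity test split stands in for the unpublished original eval set, so numbers will differ:

```python
import torch
from datasets import load_dataset
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Hypothetical repo id, inferred from the uploader and model name.
name = "simonycl/best_model-yelp_polarity-32-13"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name).eval()

# Stand-in eval data: the first 512 examples of the public test split.
data = load_dataset("yelp_polarity", split="test[:512]")

correct = 0
for start in range(0, len(data), 32):
    batch = data[start : start + 32]  # dict of lists: "text", "label"
    enc = tokenizer(
        batch["text"], truncation=True, max_length=128,
        padding=True, return_tensors="pt",
    )
    with torch.no_grad():
        preds = model(**enc).logits.argmax(dim=-1)
    correct += (preds == torch.tensor(batch["label"])).sum().item()

print(f"accuracy: {correct / len(data):.4f}")
```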

### Framework versions

- Transformers 4.32.0.dev0
- PyTorch 2.0.1+cu118
- Datasets 2.4.0
- Tokenizers 0.13.3