rel-cl-lstm-2
This model is a fine-tuned version of an unspecified base model on an unspecified dataset (these fields were not filled in). It achieves the following results on the evaluation set (a rough perplexity conversion is sketched below the list):
- Loss: 3.9785
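Assuming the reported loss is the mean cross-entropy in nats per token (the usual convention for language-model training), it corresponds to a perplexity of roughly exp(3.9785) ≈ 53.4. A minimal sketch of the conversion:

```python
import math

# Reported evaluation loss (assumed: mean cross-entropy in nats per token).
eval_loss = 3.9785

# Perplexity is the exponential of the mean cross-entropy.
perplexity = math.exp(eval_loss)
print(f"Perplexity: {perplexity:.2f}")  # ~53.44
```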
Model description
More information needed
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training (a sketch of the corresponding optimizer and scheduler setup follows the list):
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 2
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 3052726
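The snippet below is a minimal sketch of how this optimizer and scheduler configuration could be reproduced with PyTorch and Transformers. It is not the original training script: the `model` here is a placeholder LSTM module, and the number of warmup steps is assumed to be zero since it is not listed above.

```python
import torch
from transformers import get_linear_schedule_with_warmup

# Placeholder module; the actual model architecture is not described in this card.
model = torch.nn.LSTM(input_size=128, hidden_size=128)

torch.manual_seed(2)  # seed: 2

# Adam with the learning rate, betas, and epsilon listed above.
optimizer = torch.optim.Adam(
    model.parameters(),
    lr=5e-5,
    betas=(0.9, 0.999),
    eps=1e-8,
)

# Linear learning-rate schedule over the full training run.
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=0,            # assumption: warmup not specified in the card
    num_training_steps=3_052_726,  # training_steps from the list above
)
```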
Training results
Training Loss | Epoch | Step | Validation Loss |
---|---|---|---|
4.8008 | 0.03 | 76319 | 4.7627 |
4.5151 | 0.03 | 152638 | 4.4816 |
4.374 | 0.03 | 228957 | 4.3478 |
4.2862 | 1.03 | 305276 | 4.2645 |
4.2226 | 0.03 | 381595 | 4.2082 |
4.1691 | 1.03 | 457914 | 4.1669 |
4.1333 | 0.03 | 534233 | 4.1361 |
4.1034 | 1.03 | 610552 | 4.1115 |
4.0735 | 0.03 | 686871 | 4.0929 |
4.051 | 1.03 | 763190 | 4.0768 |
4.0334 | 0.03 | 839509 | 4.0648 |
4.0153 | 1.03 | 915828 | 4.0532 |
3.9936 | 0.03 | 992147 | 4.0440 |
3.9834 | 0.03 | 1068466 | 4.0364 |
3.9681 | 1.03 | 1144785 | 4.0294 |
3.9586 | 0.03 | 1221105 | 4.0229 |
3.9442 | 1.03 | 1297425 | 4.0173 |
3.9351 | 0.03 | 1373745 | 4.0124 |
3.9238 | 1.03 | 1450065 | 4.0085 |
3.9209 | 0.03 | 1526385 | 4.0051 |
3.9142 | 1.03 | 1602705 | 4.0024 |
3.9116 | 0.03 | 1679025 | 3.9996 |
3.9073 | 1.03 | 1755345 | 3.9973 |
3.9009 | 0.03 | 1831665 | 3.9954 |
3.8922 | 1.03 | 1907985 | 3.9933 |
3.8829 | 0.03 | 1984305 | 3.9910 |
3.8762 | 1.03 | 2060625 | 3.9890 |
3.8746 | 0.03 | 2136945 | 3.9878 |
3.8673 | 1.03 | 2213265 | 3.9862 |
3.8607 | 0.03 | 2289585 | 3.9850 |
3.8607 | 0.03 | 2365905 | 3.9843 |
3.8592 | 0.03 | 2442225 | 3.9831 |
3.8521 | 1.03 | 2518545 | 3.9822 |
3.8487 | 0.03 | 2594865 | 3.9816 |
3.8455 | 1.03 | 2671185 | 3.9811 |
3.846 | 0.03 | 2747505 | 3.9803 |
3.846 | 1.03 | 2823825 | 3.9796 |
3.846 | 0.03 | 2900145 | 3.9794 |
3.8496 | 0.03 | 2976465 | 3.9789 |
3.8456 | 1.02 | 3052726 | 3.9785 |
Framework versions
- Transformers 4.33.3
- Pytorch 2.0.1
- Datasets 2.12.0
- Tokenizers 0.13.3