
kasrahabib/roberta-base-finetuned-iso29148-promise-km-labels-all-cls

This model is a fine-tuned version of FacebookAI/roberta-base on an unknown dataset. It achieves the following results after the final training epoch (validation loss is measured on the evaluation set):

  • Train Loss: 0.0093
  • Validation Loss: 0.1119
  • Epoch: 29
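
The following is a minimal inference sketch, assuming the checkpoint is published as a TensorFlow model under this repository id (consistent with the TensorFlow version listed under "Framework versions"). The example input sentence is purely illustrative, and the label names come from `model.config.id2label`, whose contents are not documented in this card.

```python
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification
import tensorflow as tf

repo_id = "kasrahabib/roberta-base-finetuned-iso29148-promise-km-labels-all-cls"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = TFAutoModelForSequenceClassification.from_pretrained(repo_id)

# Illustrative input only; the actual intended input domain is not documented here.
text = "The system shall respond to user queries within 2 seconds."

inputs = tokenizer(text, return_tensors="tf", truncation=True)
logits = model(**inputs).logits
pred_id = int(tf.argmax(logits, axis=-1)[0])

# Falls back to the raw class index if no label mapping is stored in the config.
print(model.config.id2label.get(pred_id, pred_id))
```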

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • optimizer: Adam with a PolynomialDecay learning-rate schedule
      • learning_rate schedule: PolynomialDecay(initial_learning_rate=2e-05, decay_steps=2370, end_learning_rate=0.0, power=1.0, cycle=False)
      • beta_1: 0.9, beta_2: 0.999, epsilon: 1e-08, amsgrad: False
      • weight_decay: None, clipnorm: None, clipvalue: None, global_clipnorm: None
      • use_ema: False, ema_momentum: 0.99, ema_overwrite_frequency: None
      • jit_compile: True, is_legacy_optimizer: False
  • training_precision: float32
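
The optimizer entry above is a serialized Keras configuration. A sketch of the equivalent construction in TensorFlow 2.15, assuming the standard Keras Adam optimizer and PolynomialDecay schedule named in that config, would look like this:

```python
import tensorflow as tf

# Linear decay from 2e-05 to 0 over 2370 steps (power=1.0 makes PolynomialDecay linear).
lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=2e-05,
    decay_steps=2370,
    end_learning_rate=0.0,
    power=1.0,
    cycle=False,
)

optimizer = tf.keras.optimizers.Adam(
    learning_rate=lr_schedule,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-08,
    amsgrad=False,
    jit_compile=True,
)

# model.compile(optimizer=optimizer, ...)  # the actual fine-tuning script is not shown in this card
```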

Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 2.2048     | 1.6598          | 0     |
| 1.1216     | 0.5631          | 1     |
| 0.3896     | 0.2574          | 2     |
| 0.1978     | 0.1997          | 3     |
| 0.1204     | 0.1526          | 4     |
| 0.0676     | 0.1887          | 5     |
| 0.0435     | 0.1289          | 6     |
| 0.0338     | 0.1219          | 7     |
| 0.0291     | 0.1140          | 8     |
| 0.0372     | 0.1829          | 9     |
| 0.0655     | 0.2036          | 10    |
| 0.0654     | 0.3368          | 11    |
| 0.1950     | 0.3786          | 12    |
| 0.0544     | 0.1708          | 13    |
| 0.0195     | 0.1446          | 14    |
| 0.0166     | 0.1364          | 15    |
| 0.0154     | 0.1302          | 16    |
| 0.0136     | 0.1272          | 17    |
| 0.0127     | 0.1251          | 18    |
| 0.0119     | 0.1248          | 19    |
| 0.0115     | 0.1231          | 20    |
| 0.0112     | 0.1214          | 21    |
| 0.0107     | 0.1190          | 22    |
| 0.0104     | 0.1166          | 23    |
| 0.0100     | 0.1157          | 24    |
| 0.0095     | 0.1131          | 25    |
| 0.0096     | 0.1126          | 26    |
| 0.0092     | 0.1120          | 27    |
| 0.0094     | 0.1119          | 28    |
| 0.0093     | 0.1119          | 29    |

Framework versions

  • Transformers 4.42.3
  • TensorFlow 2.15.0
  • Datasets 2.19.1
  • Tokenizers 0.19.1