---
license: mit
base_model: gpt2
tags:
- generated_from_trainer
model-index:
- name: gpt2-evy
  results: []
---

# gpt2-evy

This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.5637
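Since this is a standard GPT-2 checkpoint, it can be loaded with the usual `transformers` auto classes. A minimal sketch — the repo id `joshcarp/gpt2-evy` is assumed from the page context and may need adjusting:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id assumed from the page context; adjust if the model lives elsewhere.
model_id = "joshcarp/gpt2-evy"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Greedy generation of a short continuation from a prompt.
inputs = tokenizer("Hello,", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20, do_sample=False)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```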

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log        | 1.0   | 31   | 1.1270          |
| No log        | 2.0   | 62   | 0.9170          |
| No log        | 3.0   | 93   | 0.8410          |
| 1.3727        | 4.0   | 124  | 0.7949          |
| 1.3727        | 5.0   | 155  | 0.7361          |
| 1.3727        | 6.0   | 186  | 0.7021          |
| 0.9077        | 7.0   | 217  | 0.6789          |
| 0.9077        | 8.0   | 248  | 0.6490          |
| 0.9077        | 9.0   | 279  | 0.6346          |
| 0.7106        | 10.0  | 310  | 0.6219          |
| 0.7106        | 11.0  | 341  | 0.5986          |
| 0.7106        | 12.0  | 372  | 0.5797          |
| 0.5814        | 13.0  | 403  | 0.5835          |
| 0.5814        | 14.0  | 434  | 0.5872          |
| 0.5814        | 15.0  | 465  | 0.5741          |
| 0.5814        | 16.0  | 496  | 0.5749          |
| 0.4916        | 17.0  | 527  | 0.5662          |
| 0.4916        | 18.0  | 558  | 0.5529          |
| 0.4916        | 19.0  | 589  | 0.5672          |
| 0.4005        | 20.0  | 620  | 0.5646          |
| 0.4005        | 21.0  | 651  | 0.5701          |
| 0.4005        | 22.0  | 682  | 0.5650          |
| 0.357         | 23.0  | 713  | 0.5637          |

### Framework versions

- Transformers 4.41.0
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1