# pegasus-xsum-clara-med
This model is a fine-tuned version of [google/pegasus-xsum](https://huggingface.co/google/pegasus-xsum) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.9013
- Rouge1: 43.7595
- Rouge2: 25.7022
- Rougel: 39.6153
- Rougelsum: 39.7151
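
Below is a minimal inference sketch using the `transformers` summarization pipeline. The model id `"pegasus-xsum-clara-med"` is a placeholder; replace it with the actual Hub path (or local directory) of this checkpoint.

```python
from transformers import pipeline

# Placeholder model id; point this at the real Hub repo or local checkpoint folder.
summarizer = pipeline("summarization", model="pegasus-xsum-clara-med")

text = "Paste the document you want summarized here."
result = summarizer(text, max_length=64)
print(result[0]["summary_text"])
```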
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5.6e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
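
The sketch below reproduces these hyperparameters with `Seq2SeqTrainingArguments`; the training script itself is not part of this card, so the dataset objects, `evaluation_strategy`, and `predict_with_generate` settings are assumptions. The Trainer's default AdamW optimizer already uses betas=(0.9, 0.999) and epsilon=1e-08, matching the list above.

```python
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

base_model = "google/pegasus-xsum"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSeq2SeqLM.from_pretrained(base_model)

training_args = Seq2SeqTrainingArguments(
    output_dir="pegasus-xsum-clara-med",
    learning_rate=5.6e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    evaluation_strategy="epoch",  # assumed from the per-epoch results table below
    predict_with_generate=True,   # assumed; needed to score generated summaries with ROUGE
)

trainer = Seq2SeqTrainer(
    model=model,
    args=training_args,
    train_dataset=None,  # replace with the tokenized training split (not provided here)
    eval_dataset=None,   # replace with the tokenized evaluation split (not provided here)
    tokenizer=tokenizer,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
# trainer.train()  # uncomment once real datasets are supplied
```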
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum |
|---|---|---|---|---|---|---|---|
| No log | 1.0 | 190 | 2.5468 | 41.6125 | 24.1264 | 37.7704 | 37.8615 |
| No log | 2.0 | 380 | 2.3603 | 41.9598 | 24.315 | 38.1087 | 38.217 |
| 2.7787 | 3.0 | 570 | 2.2604 | 42.0463 | 24.5067 | 38.1632 | 38.2716 |
| 2.7787 | 4.0 | 760 | 2.1846 | 42.1471 | 24.639 | 38.3677 | 38.471 |
| 2.2691 | 5.0 | 950 | 2.1361 | 42.4562 | 24.8962 | 38.6107 | 38.7065 |
| 2.2691 | 6.0 | 1140 | 2.0887 | 42.6005 | 24.947 | 38.7049 | 38.805 |
| 2.2691 | 7.0 | 1330 | 2.0617 | 42.7946 | 24.9509 | 38.9123 | 39.0003 |
| 2.0313 | 8.0 | 1520 | 2.0222 | 43.0201 | 25.3552 | 39.151 | 39.266 |
| 2.0313 | 9.0 | 1710 | 2.0049 | 43.2293 | 25.4719 | 39.4239 | 39.4944 |
| 1.872 | 10.0 | 1900 | 1.9899 | 43.2629 | 25.5285 | 39.4124 | 39.4591 |
| 1.872 | 11.0 | 2090 | 1.9772 | 43.4294 | 25.8006 | 39.5863 | 39.6726 |
| 1.872 | 12.0 | 2280 | 1.9630 | 43.63 | 25.7259 | 39.5521 | 39.6888 |
| 1.7497 | 13.0 | 2470 | 1.9513 | 43.4053 | 25.5567 | 39.4567 | 39.5918 |
| 1.7497 | 14.0 | 2660 | 1.9336 | 43.2584 | 25.4554 | 39.2917 | 39.3944 |
| 1.6609 | 15.0 | 2850 | 1.9345 | 43.2644 | 25.5958 | 39.3474 | 39.4645 |
| 1.6609 | 16.0 | 3040 | 1.9152 | 43.4404 | 25.6127 | 39.4472 | 39.5418 |
| 1.6609 | 17.0 | 3230 | 1.9106 | 43.2751 | 25.3213 | 39.2723 | 39.3871 |
| 1.5809 | 18.0 | 3420 | 1.9125 | 43.2335 | 25.341 | 39.2705 | 39.3577 |
| 1.5809 | 19.0 | 3610 | 1.9086 | 43.1679 | 25.3275 | 39.1858 | 39.303 |
| 1.5221 | 20.0 | 3800 | 1.9030 | 43.2794 | 25.4126 | 39.2902 | 39.4092 |
| 1.5221 | 21.0 | 3990 | 1.8996 | 43.1731 | 25.3819 | 39.1873 | 39.3172 |
| 1.5221 | 22.0 | 4180 | 1.9006 | 43.4949 | 25.4485 | 39.3092 | 39.4516 |
| 1.4714 | 23.0 | 4370 | 1.8977 | 43.5657 | 25.5974 | 39.4489 | 39.5257 |
| 1.4714 | 24.0 | 4560 | 1.9035 | 43.6444 | 25.6794 | 39.5809 | 39.683 |
| 1.4421 | 25.0 | 4750 | 1.9000 | 43.4825 | 25.5898 | 39.4319 | 39.4973 |
| 1.4421 | 26.0 | 4940 | 1.9030 | 43.4623 | 25.5726 | 39.461 | 39.6009 |
| 1.4421 | 27.0 | 5130 | 1.8993 | 43.3357 | 25.5518 | 39.3897 | 39.4672 |
| 1.4139 | 28.0 | 5320 | 1.9009 | 43.5834 | 25.7211 | 39.584 | 39.6725 |
| 1.4139 | 29.0 | 5510 | 1.9002 | 43.7115 | 25.6997 | 39.6603 | 39.7621 |
| 1.4016 | 30.0 | 5700 | 1.9013 | 43.7595 | 25.7022 | 39.6153 | 39.7151 |
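
The ROUGE values above appear to be F-measures scaled to 0-100, as the Trainer reports them. A minimal sketch of recomputing such scores with the `evaluate` library follows; the `predictions` and `references` lists are placeholders, since the evaluation data is not shipped with this card.

```python
import evaluate

# Placeholder lists of generated and reference summaries (not provided with this card).
predictions = ["generated summary one", "generated summary two"]
references = ["reference summary one", "reference summary two"]

rouge = evaluate.load("rouge")
scores = rouge.compute(predictions=predictions, references=references)

# Scale to 0-100 to match the table above.
print({name: round(value * 100, 4) for name, value in scores.items()})
```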
### Framework versions
- Transformers 4.25.1
- PyTorch 1.13.0
- Datasets 2.8.0
- Tokenizers 0.12.1