pegasus-large-cnn-dailymail
This model (sabre-code/pegasus-large-cnn-dailymail) is a fine-tuned version of google/pegasus-large on the cnn_dailymail dataset.
Model description
PEGASUS-large is a Transformer encoder-decoder pretrained for abstractive summarization; this checkpoint adapts it to news summarization by fine-tuning on a subset of the CNN/DailyMail dataset.
Intended uses & limitations
The model is intended for abstractive summarization of English news articles similar in style to CNN/DailyMail content. Because it was fine-tuned on only 10,000 samples for a single epoch, summary quality may fall short of fully trained CNN/DailyMail checkpoints, and outputs may reflect biases present in the source news data.
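A minimal inference sketch with the Transformers library is shown below; the generation settings (beam count, summary length) are illustrative assumptions, not values documented in this card:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "sabre-code/pegasus-large-cnn-dailymail"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

article = "Replace this with an English news article to summarize."

# PEGASUS-large accepts inputs up to 1024 tokens; longer articles are truncated.
inputs = tokenizer(article, truncation=True, max_length=1024, return_tensors="pt")

# Beam search settings here are illustrative, not the author's documented choice.
summary_ids = model.generate(**inputs, num_beams=4, max_new_tokens=128)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```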
Training and evaluation data
For fine-tuning this PEGASUS-large checkpoint, 10,000 samples were taken from the CNN/DailyMail dataset. The evaluation split is not documented.
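A sketch of loading such a subset with the Datasets library, assuming the standard "3.0.0" configuration and a simple head slice (the card does not document how the 10k samples were selected):

```python
from datasets import load_dataset

# Assumption: the first 10,000 training examples of the "3.0.0" config.
train_data = load_dataset("cnn_dailymail", "3.0.0", split="train[:10000]")

# Each example pairs a news article with reference highlights.
print(train_data[0]["article"][:200])
print(train_data[0]["highlights"])
```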
Training procedure
Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
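As a sketch, these hyperparameters map onto Seq2SeqTrainingArguments roughly as follows; the output directory and predict_with_generate flag are assumptions, and the Adam settings listed above are the Transformers defaults, so they need no explicit arguments:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="pegasus-large-cnn-dailymail",  # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=1,
    predict_with_generate=True,  # assumption: typical for summarization fine-tuning
)
```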
Framework versions
- Transformers 4.34.1
- Pytorch 2.1.0+cu118
- Datasets 2.14.6
- Tokenizers 0.14.1