---
license: mit
tags:
- generated_from_trainer
metrics:
- bleu
- rouge
model-index:
- name: mbart-large-50-Biomedical_Dataset
results: []
---
# mbart-large-50-Biomedical_Dataset
This model is a fine-tuned version of [facebook/mbart-large-50](https://huggingface.co/facebook/mbart-large-50).
It achieves the following results on the evaluation set:
- Training Loss: 1.0165
- Epoch: 1.0
- Step: 2636
- Validation Loss: 0.9425
- Bleu: 38.9893
- Rouge Metrics:
  - Rouge1: 0.6826259612196924
  - Rouge2: 0.473675987811788
  - RougeL: 0.6586445010303293
  - RougeLsum: 0.6585487473231793
- Meteor: 0.6299677745833094
- Prediction Lengths: 24.362727392855568
## Model description
More information needed
## Intended uses & limitations
More information needed
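A minimal inference sketch follows; the Hub repo id, the example input, and the generation settings are assumptions, since the card does not document the exact task or usage:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Assumed Hub repo id, inferred from the card's author and model name.
model_id = "DunnBC22/mbart-large-50-Biomedical_Dataset"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Placeholder biomedical input; the card does not document the task
# (e.g. summarization vs. simplification), so treat this as illustrative.
text = "The patient presented with acute myocardial infarction."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

# Illustrative decoding settings, not the card's evaluation settings.
# For multilingual use, mBART-50 normally also needs src_lang/tgt_lang
# set on the tokenizer.
output_ids = model.generate(**inputs, num_beams=4, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```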
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
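A `Seq2SeqTrainingArguments` sketch mirroring these values, assuming the standard `transformers` `Trainer`/`Seq2SeqTrainer` API was used; `output_dir` and anything not listed above are assumptions left at library defaults:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="mbart-large-50-Biomedical_Dataset",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,     # Adam betas and epsilon as reported above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=1,
)
```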
### Training results [^1]
| Training Loss | Epoch | Step | Validation Loss | Bleu | Rouge1 | Rouge2 | RougeL | RougeLsum | Meteor | Prediction Lengths |
| :-------------: | :-------------: | :-------------: | :-------------: | :-------------: | :-------------: | :-------------: | :-------------: | :-------------: | :-------------: | :-------------: |
| 1.0165 | 1.0 | 2636 | 0.9425 | 38.9893 | 0.6826 | 0.4737 | 0.6586 | 0.6585 | 0.6300 | 24.3627 |
<br />
Footnotes:
[^1]: All results in this table are rounded to four decimal places.
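A sketch of how the metrics in this table can be recomputed with the `evaluate` library; the predictions and references below are placeholders, and the use of sacrebleu's 0-100 scale for the reported BLEU is an assumption:

```python
import evaluate

# Placeholder outputs; in practice these come from model.generate(...)
# over the evaluation split.
predictions = ["the patient received antibiotic treatment"]
references = ["the patient was treated with antibiotics"]

bleu = evaluate.load("sacrebleu")  # 0-100 scale, assumed to match the card
rouge = evaluate.load("rouge")
meteor = evaluate.load("meteor")

print(bleu.compute(predictions=predictions,
                   references=[[r] for r in references])["score"])
print(rouge.compute(predictions=predictions, references=references))
print(meteor.compute(predictions=predictions, references=references))
```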
### Framework versions
- Transformers 4.26.1
- Pytorch 2.0.1
- Datasets 2.13.1
- Tokenizers 0.13.3