Update README.md
README.md CHANGED
@@ -1,7 +1,7 @@
 ---
 license: apache-2.0
 datasets:
-- pszemraj/scientific_lay_summarisation-
+- pszemraj/scientific_lay_summarisation-elife-norm
 language:
 - en
 widget:
@@ -197,7 +197,8 @@ library_name: transformers
 
 # long-t5-tglobal-base-sci-simplify-elife
 
-This model is a fine-tuned version of [google/long-t5-tglobal-base](https://huggingface.co/google/long-t5-tglobal-base) on the
+This model is a fine-tuned version of [google/long-t5-tglobal-base](https://huggingface.co/google/long-t5-tglobal-base) on the `pszemraj/scientific_lay_summarisation-elife-norm` dataset.
+
 It achieves the following results on the evaluation set:
 - Loss: 1.9990
 - Rouge1: 38.5587
@@ -216,7 +217,7 @@ More information needed
 
 ## Training and evaluation data
 
-
+The `elife` subset of the lay summaries dataset. Refer to `pszemraj/scientific_lay_summarisation-elife-norm`.
 
 ## Training procedure
 
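For context, a minimal sketch of trying the resulting checkpoint on the dataset named in this change. Only the dataset id `pszemraj/scientific_lay_summarisation-elife-norm` comes from the card; the model repo id `pszemraj/long-t5-tglobal-base-sci-simplify-elife`, the `test` split, and the `article` column are assumptions inferred from the card title and dataset owner, not stated in this diff.

```python
# Sketch: summarize one eLife article with the fine-tuned LongT5 checkpoint.
# Assumed names (not confirmed by this diff): model repo id, split name, column name.
from datasets import load_dataset
from transformers import pipeline

# Dataset referenced in the model card (normalized eLife lay-summarization subset).
ds = load_dataset("pszemraj/scientific_lay_summarisation-elife-norm", split="test")

summarizer = pipeline(
    "summarization",
    model="pszemraj/long-t5-tglobal-base-sci-simplify-elife",  # assumed repo id
)

article = ds[0]["article"]  # assumed column name
result = summarizer(
    article,
    max_length=512,          # lay summaries are paragraph-length
    no_repeat_ngram_size=3,  # curb verbatim repetition in long-form generation
    truncation=True,
)
print(result[0]["summary_text"])
```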