Update README.md

README.md (changed):

````diff
@@ -227,7 +227,7 @@ This model is a fine-tuned version of [google/long-t5-tglobal-base](https://hugg

 ## Usage

-It's recommended to
+It's recommended to use this model with [beam search decoding](https://huggingface.co/docs/transformers/generation_strategies#beamsearch-decoding). If interested, you can also use the `textsum` util repo to have most of this abstracted out for you:


 ```bash
@@ -250,7 +250,7 @@ print(summary)

 ## Training and evaluation data

-The `elife` subset of the
+The `elife` subset of the lay summaries dataset. Refer to `pszemraj/scientific_lay_summarisation-elife-norm`

 ## Training procedure

````
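The added Usage text recommends beam search decoding for this model. As a self-contained illustration of what beam search does (a toy sketch over a fixed probability table, not the actual `transformers` implementation or this model's checkpoint):

```python
import math

def beam_search(step_log_probs, beam_width=2):
    """Toy beam search over a fixed table of per-step log-probabilities.

    step_log_probs: list of dicts mapping token -> log-probability at that step.
    Returns the highest-scoring token sequence found.
    """
    beams = [([], 0.0)]  # list of (sequence, cumulative log-prob)
    for dist in step_log_probs:
        # extend every surviving beam with every candidate token
        candidates = [
            (seq + [tok], score + lp)
            for seq, score in beams
            for tok, lp in dist.items()
        ]
        # keep only the beam_width highest-scoring partial sequences
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    return max(beams, key=lambda b: b[1])[0]

steps = [
    {"the": math.log(0.6), "a": math.log(0.4)},
    {"cell": math.log(0.7), "study": math.log(0.3)},
]
print(beam_search(steps, beam_width=2))  # → ['the', 'cell']
```

In `transformers`, the same idea is enabled by passing `num_beams` greater than 1 to `generate()` (or a summarization pipeline); the keep-top-k step above is what distinguishes it from greedy decoding.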