gonglinyuan committed 15e4a2d (1 parent: 4a5ca58): Update README.md
README.md CHANGED
@@ -119,7 +119,7 @@ Official repository: https://github.com/gonglinyuan/metro_t0
 
 # METRO-T0
 
-Paper: Model-Generated Pretraining Signals Improves Zero-Shot Generalization of Text-to-Text Transformers
+Paper: [Model-Generated Pretraining Signals Improves Zero-Shot Generalization of Text-to-Text Transformers](https://arxiv.org/abs/2305.12567) (ACL 2023)
 
 METRO-T0 is a T5-style text-to-text Transformer pretrained using model-generated pretraining signals, prompt-finetuned on a family of public NLP tasks proposed in [T0](https://arxiv.org/abs/2110.08207).
 METRO-T0 is highly parameter efficient. For example, METRO-T0-Large++ (775M parameters) outperforms GPT-3 (175B parameters) and T0-3B (3B parameters) on a wide range of NLP tasks.
@@ -165,5 +165,13 @@ print(tokenizer.decode(outputs[0], skip_special_tokens=True)) # expected: posit
 If you find the code and models useful for your research, please cite the following paper:
 
 ```
-
+@misc{gong2023modelgenerated,
+    title={Model-Generated Pretraining Signals Improves Zero-Shot Generalization of Text-to-Text Transformers},
+    author={Linyuan Gong and Chenyan Xiong and Xiaodong Liu and Payal Bajaj and Yiqing Xie and Alvin Cheung and Jianfeng Gao and Xia Song},
+    year={2023},
+    eprint={2305.12567},
+    archivePrefix={arXiv},
+    primaryClass={cs.CL},
+    url={https://arxiv.org/abs/2305.12567}
+}
 ```
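For context beyond this diff: the second hunk's header shows the README ends its usage example with `print(tokenizer.decode(outputs[0], skip_special_tokens=True))`, i.e. prompted zero-shot inference with Hugging Face `transformers`. Below is a minimal sketch of that kind of usage; the model id `gonglinyuan/metro_t0`, the prompt text, and the generation settings are assumptions for illustration and may differ from what the README actually uses.

```python
# Sketch of prompted zero-shot inference with a T5-style seq2seq model.
# NOTE: the model id below is a hypothetical placeholder; substitute the
# checkpoint name given in the repository's README.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "gonglinyuan/metro_t0"  # hypothetical Hub id
# trust_remote_code may be required if the checkpoint ships custom modeling code.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id, trust_remote_code=True)

# A T0-style natural-language prompt; the model answers via text generation.
prompt = "Is this review positive or negative? Review: this is the best cast iron skillet you will ever buy."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))  # expected: positive
```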