#106 opened by zederer

README.md CHANGED
@@ -71,17 +71,24 @@ For more instructions, including how to run CLI and web demos, and model quantiz
 
 ## Citation
 
-
-
-If you find our work helpful, please consider citing the following paper.
+If you find our work helpful, please consider citing the following papers:
 
 ```
-@
-
-
-
-
-
-
+@inproceedings{
+zeng2023glm-130b,
+title={{GLM}-130B: An Open Bilingual Pre-trained Model},
+author={Aohan Zeng and Xiao Liu and Zhengxiao Du and Zihan Wang and Hanyu Lai and Ming Ding and Zhuoyi Yang and Yifan Xu and Wendi Zheng and Xiao Xia and Weng Lam Tam and Zixuan Ma and Yufei Xue and Jidong Zhai and Wenguang Chen and Zhiyuan Liu and Peng Zhang and Yuxiao Dong and Jie Tang},
+booktitle={The Eleventh International Conference on Learning Representations (ICLR)},
+year={2023},
+url={https://openreview.net/forum?id=-Aw0rrrPUF}
+}
+```
+```
+@inproceedings{du2022glm,
+title={GLM: General Language Model Pretraining with Autoregressive Blank Infilling},
+author={Du, Zhengxiao and Qian, Yujie and Liu, Xiao and Ding, Ming and Qiu, Jiezhong and Yang, Zhilin and Tang, Jie},
+booktitle={Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)},
+pages={320--335},
+year={2022}
 }
 ```