Chinese BERT, RoBERTa, MacBERT, LERT series
Collection · 9 items
LERT is a linguistically-motivated pre-trained language model.
Further information: https://github.com/ymcui/LERT/blob/main/README_EN.md
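
A minimal sketch of loading one of these checkpoints with the Hugging Face transformers library; the checkpoint name "hfl/chinese-lert-base" is assumed to be one of the items in this collection (substitute another size if needed):

```python
# Minimal sketch: load an assumed LERT checkpoint ("hfl/chinese-lert-base")
# with Hugging Face transformers and get contextual token embeddings.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("hfl/chinese-lert-base")
model = AutoModel.from_pretrained("hfl/chinese-lert-base")

# Encode a Chinese sentence and run a forward pass.
inputs = tokenizer("哈工大讯飞联合实验室发布了LERT模型。", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```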