Commit 5dc3ff4 (parent eb661f0) by system (HF staff): Update README.md
---
inference: false
---

<!-- ---
language: pt
license: mit
tags:
- t5
- pytorch
- tensorflow
datasets:
- brWaC
widget:
- text: "Texto de exemplo em português"
--- -->

<!-- inference: false -->
20
+
21
+ # Portuguese T5 (aka "PTT5")
22
+
23
+ ## Introduction
24
+ PTT5 is a T5 model pretrained in the BrWac corpus, a large collection of web pages in Portuguese, improving T5's performance on Portuguese sentence similarity and entailment tasks. It's available in three sizes (small, base and large) and two vocabularies (Google's T5 original and ours, trained on Portuguese Wikipedia).
25
+
26
+ For further information or requests, please go to [PTT5 repository](https://github.com/unicamp-dl/PTT5).
27
+
28
+ ## Available models
29
+ | Model | Size | #Params | Vocabulary |
30
+ | :-: | :-: | :-: | :-: |
31
+ | [unicamp-dl/ptt5-small-t5-vocab](https://huggingface.co/unicamp-dl/ptt5-small-t5-vocab) | small | 60M | Google's T5 |
32
+ | [unicamp-dl/ptt5-base-t5-vocab](https://huggingface.co/unicamp-dl/ptt5-base-t5-vocab) | base | 220M | Google's T5 |
33
+ | [unicamp-dl/ptt5-large-t5-vocab](https://huggingface.co/unicamp-dl/ptt5-large-t5-vocab) | large | 740M | Google's T5 |
34
+ | [unicamp-dl/ptt5-small-portuguese-vocab](https://huggingface.co/unicamp-dl/ptt5-small-portuguese-vocab) | small | 60M | Portuguese |
35
+ | **[unicamp-dl/ptt5-base-portuguese-vocab](https://huggingface.co/unicamp-dl/ptt5-base-portuguese-vocab)** **(Recommended)** | **base** | **220M** | **Portuguese** |
36
+ | [unicamp-dl/ptt5-large-portuguese-vocab](https://huggingface.co/unicamp-dl/ptt5-large-portuguese-vocab) | large | 740M | Portuguese |
37
+

## Usage
```python
# Tokenizer
from transformers import T5Tokenizer  # or AutoTokenizer

# PyTorch (bare model, bare model + language modeling head)
from transformers import T5Model, T5ForConditionalGeneration

# TensorFlow (bare model, bare model + language modeling head)
from transformers import TFT5Model, TFT5ForConditionalGeneration

model_name = 'unicamp-dl/ptt5-base-portuguese-vocab'

tokenizer = T5Tokenizer.from_pretrained(model_name)

# PyTorch
model_pt = T5ForConditionalGeneration.from_pretrained(model_name)

# TensorFlow
model_tf = TFT5ForConditionalGeneration.from_pretrained(model_name)
```

## Citation
We are preparing an arXiv submission and will provide an updated citation soon. For now, if you need to cite this work, use:
```bibtex
@misc{ptt5_2020,
  author = {Carmo, Diedre and Piau, Marcos and Campiotti, Israel and Nogueira, Rodrigo and Lotufo, Roberto},
  title = {PTT5: Pre-training and validating the T5 transformer in Brazilian Portuguese data},
  year = {2020},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/unicamp-dl/PTT5}}
}
```