yoshitomo-matsubara committed
Commit 832b0e3
Parent(s): 42cd75b
initial commit
Browse files
- README.md +18 -0
- config.json +26 -0
- pytorch_model.bin +3 -0
- special_tokens_map.json +1 -0
- tokenizer.json +0 -0
- tokenizer_config.json +1 -0
- training.log +60 -0
- vocab.txt +0 -0
README.md
ADDED
@@ -0,0 +1,18 @@
+---
+language: en
+tags:
+- bert
+- mrpc
+- glue
+- torchdistill
+license: apache-2.0
+datasets:
+- mrpc
+metrics:
+- f1
+- accuracy
+---
+
+`bert-base-uncased` fine-tuned on MRPC dataset, using [***torchdistill***](https://github.com/yoshitomo-matsubara/torchdistill) and [Google Colab](https://colab.research.google.com/github/yoshitomo-matsubara/torchdistill/blob/master/demo/glue_finetuning_and_submission.ipynb).
+The hyperparameters are the same as those in Hugging Face's example and/or the paper of BERT, and the training configuration (including hyperparameters) is available [here](https://github.com/yoshitomo-matsubara/torchdistill/blob/main/configs/sample/glue/mrpc/ce/bert_base_uncased.yaml).
+I submitted prediction files to [the GLUE leaderboard](https://gluebenchmark.com/leaderboard), and the overall GLUE score was **80.2**.
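The model card lists `f1` and `accuracy` as the MRPC metrics. As a rough illustration only (plain Python with made-up toy labels, not the `datasets`/GLUE metric implementation used in training), these two numbers are computed as follows, with F1 taken over the positive "paraphrase" class:

```python
# Toy labels, invented for this sketch; 1 = paraphrase, 0 = not a paraphrase.
y_true = [1, 1, 0, 1, 0, 0, 1, 1]
y_pred = [1, 0, 0, 1, 1, 0, 1, 1]

# Confusion-matrix counts for the positive class.
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

accuracy = sum(1 for t, p in zip(y_true, y_pred) if t == p) / len(y_true)
precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)

print(f"accuracy = {accuracy:.4f}, f1 = {f1:.4f}")
```

MRPC is class-imbalanced (roughly two thirds positive), which is why the leaderboard reports F1 alongside accuracy.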
config.json
ADDED
@@ -0,0 +1,26 @@
+{
+  "_name_or_path": "bert-base-uncased",
+  "architectures": [
+    "BertForSequenceClassification"
+  ],
+  "attention_probs_dropout_prob": 0.1,
+  "finetuning_task": "mrpc",
+  "gradient_checkpointing": false,
+  "hidden_act": "gelu",
+  "hidden_dropout_prob": 0.1,
+  "hidden_size": 768,
+  "initializer_range": 0.02,
+  "intermediate_size": 3072,
+  "layer_norm_eps": 1e-12,
+  "max_position_embeddings": 512,
+  "model_type": "bert",
+  "num_attention_heads": 12,
+  "num_hidden_layers": 12,
+  "pad_token_id": 0,
+  "position_embedding_type": "absolute",
+  "problem_type": "single_label_classification",
+  "transformers_version": "4.6.1",
+  "type_vocab_size": 2,
+  "use_cache": true,
+  "vocab_size": 30522
+}
pytorch_model.bin
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:a6894aad4927594992cf88237f337b9a48eb4be3f0ddcea6cf5b980f90b21405
+size 438024457
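As a sanity check (a sketch, not part of the repo), the Git LFS size above lines up with the parameter count implied by config.json: `BertForSequenceClassification` with BERT-base dimensions and a 2-way head has about 109.5M parameters, i.e. roughly 438 MB in float32, with the small remainder being serialization overhead:

```python
# Dimensions taken from config.json above.
V, P, T = 30522, 512, 2   # vocab_size, max_position_embeddings, type_vocab_size
H, I, L = 768, 3072, 12   # hidden_size, intermediate_size, num_hidden_layers
num_labels = 2            # MRPC is binary (paraphrase / not)

embeddings = V * H + P * H + T * H + 2 * H   # word/pos/type tables + LayerNorm
per_layer = (
    4 * (H * H + H)       # Q, K, V and attention output projections (+ biases)
    + 2 * 2 * H           # the two LayerNorms in each encoder layer
    + (H * I + I)         # feed-forward up-projection
    + (I * H + H)         # feed-forward down-projection
)
pooler = H * H + H
classifier = H * num_labels + num_labels

total = embeddings + L * per_layer + pooler + classifier
print(total, total * 4)   # parameter count, float32 bytes
```

The result is 109,483,778 parameters, so 437,935,112 bytes of weights versus the 438,024,457-byte checkpoint.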
special_tokens_map.json
ADDED
@@ -0,0 +1 @@
+{"unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]"}
tokenizer.json
ADDED
The diff for this file is too large to render. See raw diff.
tokenizer_config.json
ADDED
@@ -0,0 +1 @@
+{"do_lower_case": true, "unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]", "tokenize_chinese_chars": true, "strip_accents": null, "do_lower": true, "model_max_length": 512, "special_tokens_map_file": null, "name_or_path": "bert-base-uncased"}
training.log
ADDED
@@ -0,0 +1,60 @@
+2021-05-27 22:43:02,432 INFO __main__ Namespace(adjust_lr=False, config='torchdistill/configs/sample/glue/mrpc/ce/bert_base_uncased.yaml', log='log/glue/mrpc/ce/bert_base_uncased.txt', private_output='leaderboard/glue/standard/bert_base_uncased/', seed=None, student_only=False, task_name='mrpc', test_only=False, world_size=1)
+2021-05-27 22:43:02,460 INFO __main__ Distributed environment: NO
+Num processes: 1
+Process index: 0
+Local process index: 0
+Device: cuda
+Use FP16 precision: True
+
+2021-05-27 22:43:07,596 WARNING datasets.builder Reusing dataset glue (/root/.cache/huggingface/datasets/glue/mrpc/1.0.0/dacbe3125aa31d7f70367a07a8a9e72a5a0bfeb5fc42e75c9db75b96da6053ad)
+2021-05-27 22:43:09,157 INFO __main__ Start training
+2021-05-27 22:43:09,157 INFO torchdistill.models.util [student model]
+2021-05-27 22:43:09,158 INFO torchdistill.models.util Using the original student model
+2021-05-27 22:43:09,158 INFO torchdistill.core.training Loss = 1.0 * OrgLoss
+2021-05-27 22:43:11,724 INFO torchdistill.misc.log Epoch: [0] [ 0/230] eta: 0:00:38 lr: 4.9956521739130436e-05 sample/s: 24.35787334127007 loss: 0.6288 (0.6288) time: 0.1682 data: 0.0040 max mem: 1883
+2021-05-27 22:43:19,858 INFO torchdistill.misc.log Epoch: [0] [ 50/230] eta: 0:00:29 lr: 4.7782608695652175e-05 sample/s: 25.151060327587707 loss: 0.5948 (0.6191) time: 0.1625 data: 0.0024 max mem: 2791
+2021-05-27 22:43:27,829 INFO torchdistill.misc.log Epoch: [0] [100/230] eta: 0:00:20 lr: 4.5608695652173914e-05 sample/s: 26.44901099282227 loss: 0.5307 (0.5978) time: 0.1588 data: 0.0023 max mem: 2880
+2021-05-27 22:43:35,885 INFO torchdistill.misc.log Epoch: [0] [150/230] eta: 0:00:12 lr: 4.3434782608695654e-05 sample/s: 27.124775269999354 loss: 0.5620 (0.5827) time: 0.1594 data: 0.0023 max mem: 2880
+2021-05-27 22:43:43,849 INFO torchdistill.misc.log Epoch: [0] [200/230] eta: 0:00:04 lr: 4.126086956521739e-05 sample/s: 25.07550204164599 loss: 0.5060 (0.5631) time: 0.1580 data: 0.0023 max mem: 2880
+2021-05-27 22:43:48,455 INFO torchdistill.misc.log Epoch: [0] Total time: 0:00:36
+2021-05-27 22:43:49,705 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/mrpc/default_experiment-1-0.arrow
+2021-05-27 22:43:49,706 INFO __main__ Validation: accuracy = 0.8112745098039216, f1 = 0.8701517706576728
+2021-05-27 22:43:49,706 INFO __main__ Updating ckpt at ./resource/ckpt/glue/mrpc/ce/mrpc-bert-base-uncased
+2021-05-27 22:43:50,994 INFO torchdistill.misc.log Epoch: [1] [ 0/230] eta: 0:00:35 lr: 3.995652173913044e-05 sample/s: 27.032513683654216 loss: 0.5486 (0.5486) time: 0.1525 data: 0.0045 max mem: 2880
+2021-05-27 22:43:58,967 INFO torchdistill.misc.log Epoch: [1] [ 50/230] eta: 0:00:28 lr: 3.7782608695652176e-05 sample/s: 29.77206923598369 loss: 0.3705 (0.3844) time: 0.1589 data: 0.0023 max mem: 2880
+2021-05-27 22:44:06,974 INFO torchdistill.misc.log Epoch: [1] [100/230] eta: 0:00:20 lr: 3.5608695652173915e-05 sample/s: 25.122813742872204 loss: 0.3969 (0.3978) time: 0.1616 data: 0.0023 max mem: 2880
+2021-05-27 22:44:15,046 INFO torchdistill.misc.log Epoch: [1] [150/230] eta: 0:00:12 lr: 3.3434782608695655e-05 sample/s: 23.426238389660583 loss: 0.3003 (0.3903) time: 0.1623 data: 0.0024 max mem: 2880
+2021-05-27 22:44:23,128 INFO torchdistill.misc.log Epoch: [1] [200/230] eta: 0:00:04 lr: 3.1260869565217394e-05 sample/s: 25.108113002264297 loss: 0.2929 (0.3789) time: 0.1637 data: 0.0024 max mem: 2880
+2021-05-27 22:44:27,685 INFO torchdistill.misc.log Epoch: [1] Total time: 0:00:36
+2021-05-27 22:44:28,945 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/mrpc/default_experiment-1-0.arrow
+2021-05-27 22:44:28,946 INFO __main__ Validation: accuracy = 0.8725490196078431, f1 = 0.9068100358422939
+2021-05-27 22:44:28,946 INFO __main__ Updating ckpt at ./resource/ckpt/glue/mrpc/ce/mrpc-bert-base-uncased
+2021-05-27 22:44:30,399 INFO torchdistill.misc.log Epoch: [2] [ 0/230] eta: 0:00:38 lr: 2.9956521739130438e-05 sample/s: 24.807797861266206 loss: 0.0967 (0.0967) time: 0.1670 data: 0.0058 max mem: 2880
+2021-05-27 22:44:38,497 INFO torchdistill.misc.log Epoch: [2] [ 50/230] eta: 0:00:29 lr: 2.7782608695652174e-05 sample/s: 25.094029980226573 loss: 0.0729 (0.1408) time: 0.1629 data: 0.0023 max mem: 2880
+2021-05-27 22:44:46,653 INFO torchdistill.misc.log Epoch: [2] [100/230] eta: 0:00:21 lr: 2.5608695652173913e-05 sample/s: 25.213161711112615 loss: 0.0524 (0.1608) time: 0.1631 data: 0.0024 max mem: 2880
+2021-05-27 22:44:54,619 INFO torchdistill.misc.log Epoch: [2] [150/230] eta: 0:00:12 lr: 2.3434782608695656e-05 sample/s: 25.193095525906795 loss: 0.0695 (0.1629) time: 0.1553 data: 0.0023 max mem: 2880
+2021-05-27 22:45:02,620 INFO torchdistill.misc.log Epoch: [2] [200/230] eta: 0:00:04 lr: 2.126086956521739e-05 sample/s: 27.233573952721294 loss: 0.1384 (0.1721) time: 0.1623 data: 0.0023 max mem: 2880
+2021-05-27 22:45:07,172 INFO torchdistill.misc.log Epoch: [2] Total time: 0:00:36
+2021-05-27 22:45:08,428 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/mrpc/default_experiment-1-0.arrow
+2021-05-27 22:45:08,428 INFO __main__ Validation: accuracy = 0.8553921568627451, f1 = 0.8984509466437176
+2021-05-27 22:45:08,590 INFO torchdistill.misc.log Epoch: [3] [ 0/230] eta: 0:00:37 lr: 1.9956521739130435e-05 sample/s: 25.15935804498244 loss: 0.0007 (0.0007) time: 0.1614 data: 0.0024 max mem: 2880
+2021-05-27 22:45:16,673 INFO torchdistill.misc.log Epoch: [3] [ 50/230] eta: 0:00:29 lr: 1.7782608695652174e-05 sample/s: 25.001066968676422 loss: 0.0001 (0.0465) time: 0.1639 data: 0.0023 max mem: 2880
+2021-05-27 22:45:24,669 INFO torchdistill.misc.log Epoch: [3] [100/230] eta: 0:00:20 lr: 1.5608695652173914e-05 sample/s: 27.17050458396156 loss: 0.0000 (0.0732) time: 0.1560 data: 0.0023 max mem: 2880
+2021-05-27 22:45:32,573 INFO torchdistill.misc.log Epoch: [3] [150/230] eta: 0:00:12 lr: 1.3434782608695653e-05 sample/s: 25.425263387607142 loss: 0.0000 (0.0976) time: 0.1571 data: 0.0023 max mem: 2880
+2021-05-27 22:45:40,580 INFO torchdistill.misc.log Epoch: [3] [200/230] eta: 0:00:04 lr: 1.126086956521739e-05 sample/s: 22.02089838648496 loss: 0.0000 (0.1179) time: 0.1596 data: 0.0024 max mem: 2880
+2021-05-27 22:45:45,185 INFO torchdistill.misc.log Epoch: [3] Total time: 0:00:36
+2021-05-27 22:45:46,434 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/mrpc/default_experiment-1-0.arrow
+2021-05-27 22:45:46,434 INFO __main__ Validation: accuracy = 0.8112745098039216, f1 = 0.8499025341130605
+2021-05-27 22:45:46,608 INFO torchdistill.misc.log Epoch: [4] [ 0/230] eta: 0:00:39 lr: 9.956521739130436e-06 sample/s: 23.514462112027584 loss: 0.3838 (0.3838) time: 0.1726 data: 0.0025 max mem: 2880
+2021-05-27 22:45:54,606 INFO torchdistill.misc.log Epoch: [4] [ 50/230] eta: 0:00:28 lr: 7.782608695652174e-06 sample/s: 27.12218811178524 loss: 0.0000 (0.0474) time: 0.1598 data: 0.0023 max mem: 2880
+2021-05-27 22:46:02,496 INFO torchdistill.misc.log Epoch: [4] [100/230] eta: 0:00:20 lr: 5.608695652173914e-06 sample/s: 27.583716001703305 loss: 0.0000 (0.0583) time: 0.1606 data: 0.0023 max mem: 2880
+2021-05-27 22:46:10,378 INFO torchdistill.misc.log Epoch: [4] [150/230] eta: 0:00:12 lr: 3.4347826086956526e-06 sample/s: 22.05644916380617 loss: 0.0000 (0.0455) time: 0.1558 data: 0.0024 max mem: 2880
+2021-05-27 22:46:18,525 INFO torchdistill.misc.log Epoch: [4] [200/230] eta: 0:00:04 lr: 1.2608695652173913e-06 sample/s: 23.465589513420824 loss: 0.0000 (0.0659) time: 0.1637 data: 0.0023 max mem: 2880
+2021-05-27 22:46:22,985 INFO torchdistill.misc.log Epoch: [4] Total time: 0:00:36
+2021-05-27 22:46:24,235 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/mrpc/default_experiment-1-0.arrow
+2021-05-27 22:46:24,235 INFO __main__ Validation: accuracy = 0.8480392156862745, f1 = 0.8938356164383561
+2021-05-27 22:46:27,993 INFO __main__ [Student: bert-base-uncased]
+2021-05-27 22:46:29,258 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/mrpc/default_experiment-1-0.arrow
+2021-05-27 22:46:29,259 INFO __main__ Test: accuracy = 0.8725490196078431, f1 = 0.9068100358422939
+2021-05-27 22:46:29,259 INFO __main__ Start prediction for private dataset(s)
+2021-05-27 22:46:29,260 INFO __main__ mrpc/test: 1725 samples
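The log shows the checkpoint being updated only when validation F1 improves (epochs 0 and 1), which is consistent with the final test numbers matching the epoch-1 validation scores. A small sketch of pulling those scores out of a log like this one; the three validation lines are inlined here for self-containment, but in practice you would read them from training.log:

```python
import re

# Validation lines copied verbatim from the log above.
log = """\
2021-05-27 22:43:49,706 INFO __main__ Validation: accuracy = 0.8112745098039216, f1 = 0.8701517706576728
2021-05-27 22:44:28,946 INFO __main__ Validation: accuracy = 0.8725490196078431, f1 = 0.9068100358422939
2021-05-27 22:45:08,428 INFO __main__ Validation: accuracy = 0.8553921568627451, f1 = 0.8984509466437176
"""

pattern = re.compile(r"Validation: accuracy = ([\d.]+), f1 = ([\d.]+)")
scores = [(float(a), float(f)) for a, f in pattern.findall(log)]

# Pick the epoch with the best F1, mirroring the checkpoint-update rule.
best_epoch, (best_acc, best_f1) = max(enumerate(scores), key=lambda e: e[1][1])
print(f"best epoch: {best_epoch}, accuracy = {best_acc}, f1 = {best_f1}")
```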
vocab.txt
ADDED
The diff for this file is too large to render. See raw diff.