yoshitomo-matsubara committed
Commit 7d0cf61
Parent(s): 0fb9641
initial commit

Files changed:
- README.md +17 -0
- config.json +26 -0
- pytorch_model.bin +3 -0
- special_tokens_map.json +1 -0
- tokenizer.json +0 -0
- tokenizer_config.json +1 -0
- training.log +56 -0
- vocab.txt +0 -0
README.md
ADDED
@@ -0,0 +1,17 @@
+---
+language: en
+tags:
+- bert
+- sst2
+- glue
+- torchdistill
+license: apache-2.0
+datasets:
+- sst2
+metrics:
+- accuracy
+---
+
+`bert-base-uncased` fine-tuned on the SST-2 dataset, using [***torchdistill***](https://github.com/yoshitomo-matsubara/torchdistill) and [Google Colab](https://colab.research.google.com/github/yoshitomo-matsubara/torchdistill/blob/master/demo/glue_finetuning_and_submission.ipynb).
+The hyperparameters are the same as those in Hugging Face's example and/or the BERT paper, and the training configuration (including the hyperparameters) is available [here](https://github.com/yoshitomo-matsubara/torchdistill/blob/main/configs/sample/glue/sst2/ce/bert_base_uncased.yaml).
+I submitted prediction files to [the GLUE leaderboard](https://gluebenchmark.com/leaderboard), and the overall GLUE score was **77.9**.
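For context, a minimal inference sketch using this checkpoint. The repository id below and the label order (GLUE SST-2 convention: 0 = negative, 1 = positive) are assumptions, since the config committed here does not pin an `id2label` mapping.

```python
# Minimal inference sketch. The repo id and label order are assumptions:
# config.json defines no id2label, so index meaning follows the GLUE
# SST-2 convention (0 = negative, 1 = positive).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "yoshitomo-matsubara/bert-base-uncased-sst2"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

inputs = tokenizer("the movie was great", return_tensors="pt")
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1)
print(probs)  # [[p_negative, p_positive]] under the assumed label order
```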
config.json
ADDED
@@ -0,0 +1,26 @@
+{
+  "_name_or_path": "bert-base-uncased",
+  "architectures": [
+    "BertForSequenceClassification"
+  ],
+  "attention_probs_dropout_prob": 0.1,
+  "finetuning_task": "sst2",
+  "gradient_checkpointing": false,
+  "hidden_act": "gelu",
+  "hidden_dropout_prob": 0.1,
+  "hidden_size": 768,
+  "initializer_range": 0.02,
+  "intermediate_size": 3072,
+  "layer_norm_eps": 1e-12,
+  "max_position_embeddings": 512,
+  "model_type": "bert",
+  "num_attention_heads": 12,
+  "num_hidden_layers": 12,
+  "pad_token_id": 0,
+  "position_embedding_type": "absolute",
+  "problem_type": "single_label_classification",
+  "transformers_version": "4.6.1",
+  "type_vocab_size": 2,
+  "use_cache": true,
+  "vocab_size": 30522
+}
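As a quick sanity check, this config can be instantiated on its own; a sketch (the local `config.json` path is an assumption) confirming the architecture is consistent with the ~438 MB fp32 checkpoint below:

```python
# Sketch: build the architecture from config.json alone (random weights)
# and check the parameter count against the 438,024,457-byte checkpoint
# (~109.5M fp32 parameters x 4 bytes, plus serialization overhead).
from transformers import BertConfig, BertForSequenceClassification

config = BertConfig.from_json_file("config.json")  # local path assumed
model = BertForSequenceClassification(config)
print(f"{sum(p.numel() for p in model.parameters()):,} parameters")
```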
pytorch_model.bin
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:7b62912abee9dd3577d677e442c5f45daabbbd9681ea6ece286ce6c2f3de5f27
+size 438024457
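These three lines are a Git LFS pointer, not the weights themselves; the actual file is fetched at checkout (e.g. via `git lfs pull`). A small sketch to verify a downloaded copy against the pointer's digest (the local filename is an assumption):

```python
# Sketch: verify a downloaded pytorch_model.bin against the sha256
# digest recorded in the LFS pointer above.
import hashlib

EXPECTED = "7b62912abee9dd3577d677e442c5f45daabbbd9681ea6ece286ce6c2f3de5f27"
h = hashlib.sha256()
with open("pytorch_model.bin", "rb") as f:  # local path assumed
    for chunk in iter(lambda: f.read(1 << 20), b""):
        h.update(chunk)
assert h.hexdigest() == EXPECTED, "checksum mismatch"
```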
special_tokens_map.json
ADDED
@@ -0,0 +1 @@
+{"unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]"}
tokenizer.json
ADDED
The diff for this file is too large to render.
See raw diff
tokenizer_config.json
ADDED
@@ -0,0 +1 @@
+{"do_lower_case": true, "unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]", "tokenize_chinese_chars": true, "strip_accents": null, "do_lower": true, "model_max_length": 512, "special_tokens_map_file": null, "name_or_path": "bert-base-uncased"}
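Together with `tokenizer.json` and `vocab.txt`, the files above reproduce standard `bert-base-uncased` tokenization; a sketch (loading from a local directory containing these files is an assumption):

```python
# Sketch: load the committed tokenizer files and show the lowercasing
# and special-token behavior they configure.
from transformers import BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained(".")  # local dir assumed
ids = tokenizer("The Movie Was Great")["input_ids"]
print(tokenizer.convert_ids_to_tokens(ids))
# expected: ['[CLS]', 'the', 'movie', 'was', 'great', '[SEP]']
```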
training.log
ADDED
@@ -0,0 +1,56 @@
+2021-05-27 21:46:40,498 INFO __main__ Namespace(adjust_lr=False, config='torchdistill/configs/sample/glue/sst2/ce/bert_base_uncased.yaml', log='log/glue/sst2/ce/bert_base_uncased.txt', private_output='leaderboard/glue/standard/bert_base_uncased/', seed=None, student_only=False, task_name='sst2', test_only=False, world_size=1)
+2021-05-27 21:46:40,527 INFO __main__ Distributed environment: NO
+Num processes: 1
+Process index: 0
+Local process index: 0
+Device: cuda
+Use FP16 precision: True
+
+2021-05-27 21:46:45,657 WARNING datasets.builder Reusing dataset glue (/root/.cache/huggingface/datasets/glue/sst2/1.0.0/dacbe3125aa31d7f70367a07a8a9e72a5a0bfeb5fc42e75c9db75b96da6053ad)
+2021-05-27 21:46:49,521 INFO __main__ Start training
+2021-05-27 21:46:49,521 INFO torchdistill.models.util [student model]
+2021-05-27 21:46:49,521 INFO torchdistill.models.util Using the original student model
+2021-05-27 21:46:49,522 INFO torchdistill.core.training Loss = 1.0 * OrgLoss
+2021-05-27 21:46:52,024 INFO torchdistill.misc.log Epoch: [0] [ 0/4210] eta: 0:08:32 lr: 2.9997624703087888e-05 sample/s: 34.45268233380702 loss: 0.6865 (0.6865) time: 0.1218 data: 0.0057 max mem: 1860
+2021-05-27 21:47:42,462 INFO torchdistill.misc.log Epoch: [0] [ 500/4210] eta: 0:06:14 lr: 2.880997624703088e-05 sample/s: 49.37030933659777 loss: 0.2803 (0.3598) time: 0.0981 data: 0.0017 max mem: 2451
+2021-05-27 21:48:32,727 INFO torchdistill.misc.log Epoch: [0] [1000/4210] eta: 0:05:23 lr: 2.7622327790973873e-05 sample/s: 39.871515416533974 loss: 0.1317 (0.3051) time: 0.0976 data: 0.0017 max mem: 2451
+2021-05-27 21:49:23,257 INFO torchdistill.misc.log Epoch: [0] [1500/4210] eta: 0:04:33 lr: 2.6434679334916863e-05 sample/s: 35.84561181605885 loss: 0.1454 (0.2805) time: 0.0978 data: 0.0016 max mem: 2519
+2021-05-27 21:50:13,560 INFO torchdistill.misc.log Epoch: [0] [2000/4210] eta: 0:03:42 lr: 2.5247030878859857e-05 sample/s: 44.91050137859036 loss: 0.0586 (0.2644) time: 0.0993 data: 0.0017 max mem: 2519
+2021-05-27 21:51:03,797 INFO torchdistill.misc.log Epoch: [0] [2500/4210] eta: 0:02:52 lr: 2.405938242280285e-05 sample/s: 39.80218023083402 loss: 0.1723 (0.2589) time: 0.1047 data: 0.0017 max mem: 2524
+2021-05-27 21:51:54,269 INFO torchdistill.misc.log Epoch: [0] [3000/4210] eta: 0:02:01 lr: 2.2871733966745845e-05 sample/s: 44.8911543329293 loss: 0.0482 (0.2504) time: 0.0993 data: 0.0016 max mem: 2524
+2021-05-27 21:52:44,685 INFO torchdistill.misc.log Epoch: [0] [3500/4210] eta: 0:01:11 lr: 2.1684085510688836e-05 sample/s: 44.72767310855592 loss: 0.0169 (0.2495) time: 0.0991 data: 0.0017 max mem: 2524
+2021-05-27 21:53:34,615 INFO torchdistill.misc.log Epoch: [0] [4000/4210] eta: 0:00:21 lr: 2.049643705463183e-05 sample/s: 44.74425402311726 loss: 0.0601 (0.2467) time: 0.0963 data: 0.0017 max mem: 2524
+2021-05-27 21:53:55,646 INFO torchdistill.misc.log Epoch: [0] Total time: 0:07:03
+2021-05-27 21:53:57,373 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/sst2/default_experiment-1-0.arrow
+2021-05-27 21:53:57,374 INFO __main__ Validation: accuracy = 0.9197247706422018
+2021-05-27 21:53:57,374 INFO __main__ Updating ckpt at ./resource/ckpt/glue/sst2/ce/sst2-bert-base-uncased
+2021-05-27 21:53:58,421 INFO torchdistill.misc.log Epoch: [1] [ 0/4210] eta: 0:08:17 lr: 1.9997624703087886e-05 sample/s: 35.521468859567406 loss: 0.0474 (0.0474) time: 0.1181 data: 0.0055 max mem: 2524
+2021-05-27 21:54:48,646 INFO torchdistill.misc.log Epoch: [1] [ 500/4210] eta: 0:06:12 lr: 1.880997624703088e-05 sample/s: 36.44645848042144 loss: 0.0000 (0.2791) time: 0.1028 data: 0.0017 max mem: 2524
+2021-05-27 21:55:38,247 INFO torchdistill.misc.log Epoch: [1] [1000/4210] eta: 0:05:20 lr: 1.7622327790973874e-05 sample/s: 35.710640471680044 loss: 0.0000 (0.3181) time: 0.1011 data: 0.0018 max mem: 2524
+2021-05-27 21:56:28,855 INFO torchdistill.misc.log Epoch: [1] [1500/4210] eta: 0:04:31 lr: 1.6434679334916864e-05 sample/s: 39.87085213729446 loss: 0.0165 (0.3264) time: 0.0996 data: 0.0017 max mem: 2524
+2021-05-27 21:57:18,830 INFO torchdistill.misc.log Epoch: [1] [2000/4210] eta: 0:03:41 lr: 1.5247030878859858e-05 sample/s: 45.886779241949334 loss: 0.0044 (0.3258) time: 0.0958 data: 0.0017 max mem: 2524
+2021-05-27 21:58:09,046 INFO torchdistill.misc.log Epoch: [1] [2500/4210] eta: 0:02:51 lr: 1.405938242280285e-05 sample/s: 39.92987550188378 loss: 0.1085 (0.3327) time: 0.1015 data: 0.0017 max mem: 2524
+2021-05-27 21:58:58,696 INFO torchdistill.misc.log Epoch: [1] [3000/4210] eta: 0:02:01 lr: 1.2871733966745844e-05 sample/s: 45.834253539904 loss: 0.0002 (0.3333) time: 0.0970 data: 0.0017 max mem: 2524
+2021-05-27 21:59:48,373 INFO torchdistill.misc.log Epoch: [1] [3500/4210] eta: 0:01:10 lr: 1.1684085510688835e-05 sample/s: 44.44754822352508 loss: 0.5874 (0.3352) time: 0.0967 data: 0.0017 max mem: 2524
+2021-05-27 22:00:38,264 INFO torchdistill.misc.log Epoch: [1] [4000/4210] eta: 0:00:20 lr: 1.0496437054631829e-05 sample/s: 39.99126625063763 loss: 0.0001 (0.3336) time: 0.1006 data: 0.0017 max mem: 2524
+2021-05-27 22:00:59,160 INFO torchdistill.misc.log Epoch: [1] Total time: 0:07:00
+2021-05-27 22:01:00,882 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/sst2/default_experiment-1-0.arrow
+2021-05-27 22:01:00,882 INFO __main__ Validation: accuracy = 0.9185779816513762
+2021-05-27 22:01:00,987 INFO torchdistill.misc.log Epoch: [2] [ 0/4210] eta: 0:07:20 lr: 9.997624703087887e-06 sample/s: 39.848786999254195 loss: 0.6077 (0.6077) time: 0.1046 data: 0.0042 max mem: 2524
+2021-05-27 22:01:50,624 INFO torchdistill.misc.log Epoch: [2] [ 500/4210] eta: 0:06:08 lr: 8.809976247030879e-06 sample/s: 36.41133843639655 loss: 0.0000 (0.1890) time: 0.0982 data: 0.0018 max mem: 2524
+2021-05-27 22:02:40,203 INFO torchdistill.misc.log Epoch: [2] [1000/4210] eta: 0:05:18 lr: 7.622327790973873e-06 sample/s: 49.212313919892054 loss: 0.0000 (0.1807) time: 0.1030 data: 0.0017 max mem: 2524
+2021-05-27 22:03:30,223 INFO torchdistill.misc.log Epoch: [2] [1500/4210] eta: 0:04:29 lr: 6.434679334916865e-06 sample/s: 40.599501012736035 loss: 0.0000 (0.1792) time: 0.1018 data: 0.0017 max mem: 2524
+2021-05-27 22:04:19,806 INFO torchdistill.misc.log Epoch: [2] [2000/4210] eta: 0:03:39 lr: 5.247030878859857e-06 sample/s: 50.41261065270825 loss: 0.0000 (0.1790) time: 0.0993 data: 0.0018 max mem: 2524
+2021-05-27 22:05:09,793 INFO torchdistill.misc.log Epoch: [2] [2500/4210] eta: 0:02:50 lr: 4.0593824228028505e-06 sample/s: 33.1207489137237 loss: 0.0000 (0.1839) time: 0.1056 data: 0.0017 max mem: 2524
+2021-05-27 22:05:59,813 INFO torchdistill.misc.log Epoch: [2] [3000/4210] eta: 0:02:00 lr: 2.8717339667458436e-06 sample/s: 40.70687400369286 loss: 0.0000 (0.1804) time: 0.1029 data: 0.0017 max mem: 2524
+2021-05-27 22:06:49,594 INFO torchdistill.misc.log Epoch: [2] [3500/4210] eta: 0:01:10 lr: 1.684085510688836e-06 sample/s: 35.86921598433294 loss: 0.0000 (0.1842) time: 0.1023 data: 0.0017 max mem: 2524
+2021-05-27 22:07:39,232 INFO torchdistill.misc.log Epoch: [2] [4000/4210] eta: 0:00:20 lr: 4.96437054631829e-07 sample/s: 39.971353080901814 loss: 0.0000 (0.1840) time: 0.0981 data: 0.0018 max mem: 2524
+2021-05-27 22:07:59,740 INFO torchdistill.misc.log Epoch: [2] Total time: 0:06:58
+2021-05-27 22:08:01,462 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/sst2/default_experiment-1-0.arrow
+2021-05-27 22:08:01,462 INFO __main__ Validation: accuracy = 0.9254587155963303
+2021-05-27 22:08:01,462 INFO __main__ Updating ckpt at ./resource/ckpt/glue/sst2/ce/sst2-bert-base-uncased
+2021-05-27 22:08:06,304 INFO __main__ [Student: bert-base-uncased]
+2021-05-27 22:08:08,039 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/sst2/default_experiment-1-0.arrow
+2021-05-27 22:08:08,039 INFO __main__ Test: accuracy = 0.9254587155963303
+2021-05-27 22:08:08,040 INFO __main__ Start prediction for private dataset(s)
+2021-05-27 22:08:08,040 INFO __main__ sst2/test: 1821 samples
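The `lr` values in the log are consistent with a linear decay from 3e-5 to 0 over 3 epochs x 4210 steps (12630 updates); a sketch recomputing them (the off-by-one convention is inferred from the logged values, not from the training script):

```python
# Sketch: reproduce the lr values printed in training.log under the
# assumption of a linear decay schedule; the "-1" offset is inferred
# from the logged numbers (lr is printed after the step's update).
BASE_LR, STEPS_PER_EPOCH = 3e-5, 4210
TOTAL = 3 * STEPS_PER_EPOCH  # 12630 updates over 3 epochs

def logged_lr(epoch: int, step: int) -> float:
    g = epoch * STEPS_PER_EPOCH + step  # global step at the log line
    return BASE_LR * (TOTAL - g - 1) / TOTAL

print(logged_lr(0, 0))     # 2.9997624703087888e-05, as in the log
print(logged_lr(2, 4000))  # 4.96437054631829e-07, as in the log
```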
vocab.txt
ADDED
The diff for this file is too large to render.
See raw diff