add the results on JSTS v1.1
README.md:

```diff
@@ -28,14 +28,15 @@ The experimental results evaluated on the dev set of
 | Model                  | MARC-ja   | JSTS                | JNLI      | JCommonsenseQA |
 | ---------------------- | --------- | ------------------- | --------- | -------------- |
 |                        | acc       | Pearson/Spearman    | acc       | acc            |
-| **LUKE Japanese base** | **0.965** | **0.
+| **LUKE Japanese base** | **0.965** | **0.916**/**0.877** | **0.912** | **0.842**      |
 | _Baselines:_           |           |                     |           |                |
-| Tohoku BERT base       | 0.958     | 0.
+| Tohoku BERT base       | 0.958     | 0.909/0.868         | 0.899     | 0.808          |
-| NICT BERT base         | 0.958     | 0.
+| NICT BERT base         | 0.958     | 0.910/0.871         | 0.902     | 0.823          |
-| Waseda RoBERTa base    | 0.962     | 0.
+| Waseda RoBERTa base    | 0.962     | 0.913/0.873         | 0.895     | 0.840          |
-| XLM RoBERTa base       | 0.961     | 0.
+| XLM RoBERTa base       | 0.961     | 0.877/0.831         | 0.893     | 0.687          |
 
-The baseline scores are obtained from
+The baseline scores are obtained from
+[here](https://github.com/yahoojapan/JGLUE/blob/a6832af23895d6faec8ecf39ec925f1a91601d62/README.md).
 
 ### Citation
```