For a robust evaluation, we adopt CircularEval as our evaluation strategy.
Under this setting, a question is considered correctly answered only when the model consistently predicts the correct answer in each of the N iterations, with N corresponding to the number of choices.
In each iteration, a circular shift is applied to both the choices and the answer to form a new query for the model.
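
For intuition, here is a minimal sketch of this protocol in Python. It is not the official evaluation script (that is linked below); `model_predict` is a placeholder for your own inference call and is assumed to return the index of the predicted choice.

```python
# Minimal CircularEval sketch (illustration only, not the official eval_crpe.py).
# A question with N choices is asked N times; each time the choice list is
# rotated by one position (and the answer index rotates with it), and the
# question counts as correct only if all N rotated queries are answered correctly.

def circular_queries(choices, answer_idx):
    """Yield (rotated_choices, rotated_answer_idx) for every circular shift."""
    n = len(choices)
    for shift in range(n):
        rotated = choices[shift:] + choices[:shift]
        yield rotated, (answer_idx - shift) % n

def circular_eval(model_predict, question, choices, answer_idx):
    """Return True only if the model is correct on all N shifted queries."""
    return all(
        model_predict(question, rotated) == rotated_answer
        for rotated, rotated_answer in circular_queries(choices, answer_idx)
    )
```

With four answer choices, for example, each question yields four queries, and a single wrong prediction among them marks the whole question as incorrect.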

CRPE contains the following files:

- `crpe_exist.jsonl`: the evaluation data of the **Existence** split.
- `crpe_exist_meta.jsonl`: the evaluation data of the **Existence** split without CircularEval.
- `crpe_relation.jsonl`: the evaluation data of the **Subject**, **Predicate**, and **Object** splits.
- `crpe_relation_meta.jsonl`: the evaluation data of the **Subject**, **Predicate**, and **Object** splits without CircularEval.

**NOTE**: You should use `crpe_exist.jsonl` and `crpe_relation.jsonl` for evaluation. The evaluation script is available [here](https://github.com/OpenGVLab/all-seeing/blob/main/all-seeing-v2/llava/eval/eval_crpe.py).
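
Since the files are JSON Lines, they can be inspected directly before running the evaluation script. The sketch below only loads the records; it does not assume any particular field names, so check the schema yourself.

```python
import json

def load_jsonl(path):
    """Read a JSON-Lines file into a list of dicts (one record per line)."""
    with open(path, "r", encoding="utf-8") as f:
        return [json.loads(line) for line in f if line.strip()]

exist = load_jsonl("crpe_exist.jsonl")        # Existence split (with CircularEval)
relation = load_jsonl("crpe_relation.jsonl")  # Subject/Predicate/Object splits (with CircularEval)

print(len(exist), len(relation))
print(exist[0].keys())  # inspect the available fields before writing an evaluation loop
```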

See our [project](https://github.com/OpenGVLab/all-seeing/tree/main/all-seeing-v2) for more details!

# Citation

If you find our work useful in your research, please consider citing:

```BibTeX
@article{wang2023allseeing,
  title={The All-Seeing Project: Towards Panoptic Visual Recognition and Understanding of the Open World},
  author={Wang, Weiyun and Shi, Min and Li, Qingyun and Wang, Wenhai and Huang, Zhenhang and Xing, Linjie and Chen, Zhe and Li, Hao and Zhu, Xizhou and Cao, Zhiguo and others},
  journal={arXiv preprint arXiv:2308.01907},
  year={2023}
}
@article{wang2024allseeing_v2,
  title={The All-Seeing Project V2: Towards General Relation Comprehension of the Open World},
  author={Wang, Weiyun and Ren, Yiming and Luo, Haowen and Li, Tiantong and Yan, Chenxiang and Chen, Zhe and Wang, Wenhai and Li, Qingyun and Lu, Lewei and Zhu, Xizhou and others},
  journal={arXiv preprint arXiv:2402.19474},
  year={2024}
}
```