ragarwal committed
Commit 8744fb9
1 Parent(s): e5b351c

Update README.md

Files changed (1)
  1. README.md +22 -3
README.md CHANGED
@@ -1,3 +1,22 @@
- ---
- license: apache-2.0
- ---
+ ---
+ license: apache-2.0
+ ---
+
+ ### Results
+
+ | Method | Rank | Mean Win Rate (%) | Average AUC |
+ |-------------------------|------|-------------------|-------------|
+ | **Llama-3-8B (FT) (Ours)** | **1** | **78.11** | **78.037** |
+ | **Flan-T5-L (FT) (Ours)** | **2** | **76.43** | **78.663** |
+ | MiniCheck-T5-L | 3 | 72.39 | 76.674 |
+ | gpt-3.5-turbo | 4 | 69.36 | 77.007 |
+ | Flan-T5-B (FT) (Ours) | 5 | 66.00 | 76.126 |
+ | AlignScore-L | 6 | 53.19 | 73.074 |
+ | Llama-3-8B | 7 | 53.20 | 75.085 |
+ | AlignScore-B | 8 | 39.39 | 71.319 |
+ | QuestEval | 9 | 37.37 | 66.089 |
+ | BARTScore | 10 | 26.94 | 62.637 |
+ | BERTScore | 11 | 20.88 | 61.263 |
+ | ROUGE-L | 12 | 6.73 | 54.678 |
+
+ *Comparison of different factuality evaluation methods across all test datasets. The methods are ranked by Mean Win Rate, which measures overall performance on factuality tasks. The Average AUC column is the average of all individual AUC-ROC scores.*
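
The caption defines Average AUC as the mean of per-dataset AUC-ROC scores and ranks methods by Mean Win Rate. Below is a minimal sketch of how these two summaries could be computed from per-dataset AUC-ROC values; the win-rate definition (fraction of the other methods a method beats on each dataset, averaged over datasets) and the example `auc` numbers are assumptions for illustration, not the repository's actual evaluation code.

```python
# Sketch only: assumed win-rate definition and illustrative AUC values.
import numpy as np

# auc[method][dataset] -> AUC-ROC of that method on that dataset (made-up numbers)
auc = {
    "method_a": {"ds1": 0.81, "ds2": 0.74},
    "method_b": {"ds1": 0.78, "ds2": 0.77},
    "method_c": {"ds1": 0.66, "ds2": 0.70},
}
datasets = ["ds1", "ds2"]

def average_auc(method: str) -> float:
    """Average of the per-dataset AUC-ROC scores."""
    return float(np.mean([auc[method][d] for d in datasets]))

def mean_win_rate(method: str) -> float:
    """Per dataset, fraction of other methods this method outperforms; averaged over datasets."""
    rates = []
    for d in datasets:
        others = [m for m in auc if m != method]
        wins = sum(auc[method][d] > auc[o][d] for o in others)
        rates.append(wins / len(others))
    return float(np.mean(rates))

for m in auc:
    print(f"{m}: mean win rate={mean_win_rate(m):.2%}, average AUC={average_auc(m):.3f}")
```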