Update README.md
README.md
CHANGED
@@ -1,66 +1,199 @@
---
base_model: FreedomIntelligence/Apollo-7B
language:
- en
library_name: transformers
license: apache-2.0
---

## About

static quants of https://huggingface.co/FreedomIntelligence/Apollo-7B

<!-- provided-files -->
weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.

## Usage

If you are unsure how to use GGUF files, refer to one of [TheBloke's READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for more details, including on how to concatenate multi-part files.

## Provided Quants

(sorted by size, not necessarily quality. IQ-quants are often preferable over similar-sized non-IQ quants)

| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Apollo-7B-GGUF/resolve/main/Apollo-7B.Q2_K.gguf) | Q2_K | 3.6 | |
| [GGUF](https://huggingface.co/mradermacher/Apollo-7B-GGUF/resolve/main/Apollo-7B.IQ3_XS.gguf) | IQ3_XS | 3.9 | |
| [GGUF](https://huggingface.co/mradermacher/Apollo-7B-GGUF/resolve/main/Apollo-7B.IQ3_S.gguf) | IQ3_S | 4.1 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/Apollo-7B-GGUF/resolve/main/Apollo-7B.Q3_K_S.gguf) | Q3_K_S | 4.1 | |
| [GGUF](https://huggingface.co/mradermacher/Apollo-7B-GGUF/resolve/main/Apollo-7B.IQ3_M.gguf) | IQ3_M | 4.2 | |
| [GGUF](https://huggingface.co/mradermacher/Apollo-7B-GGUF/resolve/main/Apollo-7B.Q3_K_M.gguf) | Q3_K_M | 4.5 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Apollo-7B-GGUF/resolve/main/Apollo-7B.Q3_K_L.gguf) | Q3_K_L | 4.8 | |
| [GGUF](https://huggingface.co/mradermacher/Apollo-7B-GGUF/resolve/main/Apollo-7B.IQ4_XS.gguf) | IQ4_XS | 4.9 | |
| [GGUF](https://huggingface.co/mradermacher/Apollo-7B-GGUF/resolve/main/Apollo-7B.Q4_K_S.gguf) | Q4_K_S | 5.1 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Apollo-7B-GGUF/resolve/main/Apollo-7B.Q4_K_M.gguf) | Q4_K_M | 5.4 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Apollo-7B-GGUF/resolve/main/Apollo-7B.Q5_K_S.gguf) | Q5_K_S | 6.1 | |
| [GGUF](https://huggingface.co/mradermacher/Apollo-7B-GGUF/resolve/main/Apollo-7B.Q5_K_M.gguf) | Q5_K_M | 6.2 | |
| [GGUF](https://huggingface.co/mradermacher/Apollo-7B-GGUF/resolve/main/Apollo-7B.Q6_K.gguf) | Q6_K | 7.1 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/Apollo-7B-GGUF/resolve/main/Apollo-7B.Q8_0.gguf) | Q8_0 | 9.2 | fast, best quality |

Here is a handy graph comparing some lower-quality quant types (lower is better):

![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9

See https://huggingface.co/mradermacher/model_requests for some answers to questions you might have and/or if you want some other model quantized.

## Thanks

I thank my company nethype GmbH for letting me use its servers and providing upgrades to my workstation to enable this work in my free time.

---
license: apache-2.0
---

# Multilingual Medicine: Model, Dataset, Benchmark, Code

Covering English, Chinese, French, Hindi, Spanish, and Arabic so far

<p align="center">
   👨🏻‍💻 <a href="https://github.com/FreedomIntelligence/Apollo" target="_blank">Github</a> • 📃 <a href="https://arxiv.org/abs/2403.03640" target="_blank">Paper</a> • 🌐 <a href="https://apollo.llmzoo.com/" target="_blank">Demo</a> • 🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/ApolloCorpus" target="_blank">ApolloCorpus</a> • 🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/XMedbench" target="_blank">XMedBench</a>
   <br> <a href="./README_zh.md"> 中文 </a> | <a href="./README.md"> English </a>
</p>

![Apollo](assets/apollo_medium_final.png)

## 🌈 Update

We sincerely thank [mradermacher](https://huggingface.co/mradermacher/Apollo-7B-GGUF) for providing multiple GGUF versions of this model.

* **[2024.04.28]** Multiple GGUF versions of the Apollo-7B model are now available.
* **[2024.03.07]** [Paper](https://arxiv.org/abs/2403.03640) released.
* **[2024.02.12]** <a href="https://huggingface.co/datasets/FreedomIntelligence/ApolloCorpus" target="_blank">ApolloCorpus</a> and <a href="https://huggingface.co/datasets/FreedomIntelligence/XMedbench" target="_blank">XMedBench</a> are published! 🎉
* **[2024.01.23]** Apollo repo is published! 🎉

## Overview

| Type | Size/GB | Notes |
|:-----|--------:|:------|
| [Q2_K](https://huggingface.co/FreedomIntelligence/Apollo-7B-GGUF/resolve/main/Apollo-7B.Q2_K.gguf) | 3.6 | |
| [IQ3_XS](https://huggingface.co/FreedomIntelligence/Apollo-7B-GGUF/resolve/main/Apollo-7B.IQ3_XS.gguf) | 3.9 | |
| [IQ3_S](https://huggingface.co/FreedomIntelligence/Apollo-7B-GGUF/resolve/main/Apollo-7B.IQ3_S.gguf) | 4.1 | beats Q3_K* |
| [Q3_K_S](https://huggingface.co/FreedomIntelligence/Apollo-7B-GGUF/resolve/main/Apollo-7B.Q3_K_S.gguf) | 4.1 | |
| [IQ3_M](https://huggingface.co/FreedomIntelligence/Apollo-7B-GGUF/resolve/main/Apollo-7B.IQ3_M.gguf) | 4.2 | |
| [Q3_K_M](https://huggingface.co/FreedomIntelligence/Apollo-7B-GGUF/resolve/main/Apollo-7B.Q3_K_M.gguf) | 4.5 | lower quality |
| [Q3_K_L](https://huggingface.co/FreedomIntelligence/Apollo-7B-GGUF/resolve/main/Apollo-7B.Q3_K_L.gguf) | 4.8 | |
| [IQ4_XS](https://huggingface.co/FreedomIntelligence/Apollo-7B-GGUF/resolve/main/Apollo-7B.IQ4_XS.gguf) | 4.9 | |
| [Q4_K_S](https://huggingface.co/FreedomIntelligence/Apollo-7B-GGUF/resolve/main/Apollo-7B.Q4_K_S.gguf) | 5.1 | fast, recommended |
| [Q4_K_M](https://huggingface.co/FreedomIntelligence/Apollo-7B-GGUF/resolve/main/Apollo-7B.Q4_K_M.gguf) | 5.4 | fast, recommended |
| [Q5_K_S](https://huggingface.co/FreedomIntelligence/Apollo-7B-GGUF/resolve/main/Apollo-7B.Q5_K_S.gguf) | 6.1 | |
| [Q5_K_M](https://huggingface.co/FreedomIntelligence/Apollo-7B-GGUF/resolve/main/Apollo-7B.Q5_K_M.gguf) | 6.2 | |
| [Q6_K](https://huggingface.co/FreedomIntelligence/Apollo-7B-GGUF/resolve/main/Apollo-7B.Q6_K.gguf) | 7.1 | very good quality |
| [Q8_0](https://huggingface.co/FreedomIntelligence/Apollo-7B-GGUF/resolve/main/Apollo-7B.Q8_0.gguf) | 9.2 | fast, best quality, but very large |
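
The sketch below is one minimal way to fetch a quant from this table and run it locally with `llama-cpp-python`; the chosen file (Q4_K_M) and the prompt are illustrative examples, not a recommendation from the authors.

```
# Sketch: download one of the quants above and run it with llama-cpp-python.
# Assumes `pip install huggingface_hub llama-cpp-python`; the file choice and
# prompt below are illustrative only.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

gguf_path = hf_hub_download(
    repo_id="FreedomIntelligence/Apollo-7B-GGUF",
    filename="Apollo-7B.Q4_K_M.gguf",
)

llm = Llama(model_path=gguf_path, n_ctx=2048)
out = llm(
    "Question: What are common symptoms of iron deficiency?\nAnswer:",
    max_tokens=256,
    temperature=0.2,
)
print(out["choices"][0]["text"])
```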

## Results

🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-0.5B" target="_blank">Apollo-0.5B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-1.8B" target="_blank">Apollo-1.8B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-2B" target="_blank">Apollo-2B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-6B" target="_blank">Apollo-6B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-7B" target="_blank">Apollo-7B</a>

🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-0.5B-GGUF" target="_blank">Apollo-0.5B-GGUF</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-2B-GGUF" target="_blank">Apollo-2B-GGUF</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-6B-GGUF" target="_blank">Apollo-6B-GGUF</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-7B-GGUF" target="_blank">Apollo-7B-GGUF</a>

![Apollo](assets/result.png)
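
A minimal sketch of loading one of the full-precision checkpoints above with 🤗 Transformers; the checkpoint choice, dtype, and prompt are illustrative, not prescribed by the authors.

```
# Sketch: load a full-precision Apollo checkpoint with Transformers.
# Assumes `pip install transformers torch accelerate`; Apollo-7B is just one of
# the checkpoints listed above, and the settings below are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "FreedomIntelligence/Apollo-7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

inputs = tokenizer(
    "What are the first-line treatments for hypertension?", return_tensors="pt"
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```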

## Dataset & Evaluation

- Dataset
  🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/ApolloCorpus" target="_blank">ApolloCorpus</a>

   <details><summary>Click to expand</summary>

   ![Apollo](assets/dataset.png)

   - [Zip File](https://huggingface.co/datasets/FreedomIntelligence/ApolloCorpus/blob/main/ApolloCorpus.zip)
   - [Data category](https://huggingface.co/datasets/FreedomIntelligence/ApolloCorpus/tree/main/train)
     - Pretrain:
       - data item:
         - json_name: {data_source}_{language}_{data_type}.json
         - data_type: medicalBook, medicalGuideline, medicalPaper, medicalWeb (from online forums), medicalWiki
         - language: en (English), zh (Chinese), es (Spanish), fr (French), hi (Hindi)
         - data_type: qa (QA pairs generated from text)
         - data_type==text: list of strings

           ```
           [
             "string1",
             "string2",
             ...
           ]
           ```

         - data_type==qa: list of QA pairs (each a flat list of strings)

           ```
           [
             [
               "q1",
               "a1",
               "q2",
               "a2",
               ...
             ],
             ...
           ]
           ```

     - SFT:
       - json_name: {data_source}_{language}.json
       - data_type: code, general, math, medicalExam, medicalPatient
       - data item: list of QA pairs (each a flat list of strings; see the loader sketch after this section)

         ```
         [
           [
             "q1",
             "a1",
             "q2",
             "a2",
             ...
           ],
           ...
         ]
         ```

   </details>
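
The loader sketch below is a minimal illustration of the two JSON layouts documented above; the file names (`medicalBook_en_text.json`, `medicalExam_en.json`) are hypothetical instances of the naming patterns, not files guaranteed to exist in the corpus.

```
# Sketch: read ApolloCorpus-style JSON files using the layouts described above.
# The file names below are hypothetical examples of the documented patterns
# {data_source}_{language}_{data_type}.json and {data_source}_{language}.json.
import json

# Pretrain, data_type == text: a flat list of strings.
with open("medicalBook_en_text.json", encoding="utf-8") as f:
    texts = json.load(f)  # ["string1", "string2", ...]

# Pretrain qa / SFT: a list of QA pairs, each pair a flat [q1, a1, q2, a2, ...] list.
with open("medicalExam_en.json", encoding="utf-8") as f:
    qa_lists = json.load(f)

for qa in qa_lists:
    # Re-group the flat list into (question, answer) tuples.
    for question, answer in zip(qa[0::2], qa[1::2]):
        ...  # e.g. format into a chat template for SFT
```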

- Evaluation
  🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/XMedbench" target="_blank">XMedBench</a>

   <details><summary>Click to expand</summary>

   - EN:
     - [MedQA-USMLE](https://huggingface.co/datasets/GBaker/MedQA-USMLE-4-options)
     - [MedMCQA](https://huggingface.co/datasets/medmcqa/viewer/default/test)
     - [PubMedQA](https://huggingface.co/datasets/pubmed_qa): Because the results fluctuated too much, it was not used in the paper.
     - [MMLU-Medical](https://huggingface.co/datasets/cais/mmlu)
       - Clinical knowledge, Medical genetics, Anatomy, Professional medicine, College biology, College medicine
   - ZH:
     - [MedQA-MCMLE](https://huggingface.co/datasets/bigbio/med_qa/viewer/med_qa_zh_4options_bigbio_qa/test)
     - [CMB-single](https://huggingface.co/datasets/FreedomIntelligence/CMB): Not used in the paper
       - Randomly sampled 2,000 single-answer multiple-choice questions
     - [CMMLU-Medical](https://huggingface.co/datasets/haonan-li/cmmlu)
       - Anatomy, Clinical_knowledge, College_medicine, Genetics, Nutrition, Traditional_chinese_medicine, Virology
     - [CMExam](https://github.com/williamliujl/CMExam): Not used in the paper
       - Randomly sampled 2,000 multiple-choice questions
   - ES: [Head_qa](https://huggingface.co/datasets/head_qa)
   - FR: [Frenchmedmcqa](https://github.com/qanastek/FrenchMedMCQA)
   - HI: [MMLU_HI](https://huggingface.co/datasets/FreedomIntelligence/MMLU_Hindi)
     - Clinical knowledge, Medical genetics, Anatomy, Professional medicine, College biology, College medicine
   - AR: [MMLU_Ara](https://huggingface.co/datasets/FreedomIntelligence/MMLU_Arabic)
     - Clinical knowledge, Medical genetics, Anatomy, Professional medicine, College biology, College medicine

   </details>
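
For the MMLU-style subsets listed above, a minimal sketch of loading one subject with the `datasets` library is shown below; `clinical_knowledge` is one of the listed subjects, and the split and column names are assumed to follow the standard `cais/mmlu` layout.

```
# Sketch: load one MMLU-Medical subject used in the evaluation above.
# Assumes `pip install datasets`; "clinical_knowledge" is one of the listed
# subjects, and the split/column names follow the cais/mmlu dataset card.
from datasets import load_dataset

mmlu_clinical = load_dataset("cais/mmlu", "clinical_knowledge", split="test")

sample = mmlu_clinical[0]
print(sample["question"])
print(sample["choices"])  # list of four answer options
print(sample["answer"])   # index of the correct option
```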

## Results reproduction

<details><summary>Click to expand</summary>

**Waiting for Update**

</details>

## Acknowledgment

We sincerely thank [mradermacher](https://huggingface.co/mradermacher/Apollo-7B-GGUF) for providing multiple GGUF versions of the Apollo-7B model.

## Citation

Please use the following citation if you intend to use our dataset for training or evaluation:

```
@misc{wang2024apollo,
   title={Apollo: Lightweight Multilingual Medical LLMs towards Democratizing Medical AI to 6B People},
   author={Xidong Wang and Nuo Chen and Junyin Chen and Yan Hu and Yidong Wang and Xiangbo Wu and Anningzhe Gao and Xiang Wan and Haizhou Li and Benyou Wang},
   year={2024},
   eprint={2403.03640},
   archivePrefix={arXiv},
   primaryClass={cs.CL}
}
```