Quantization made by Richard Erkhov.

[Github](https://github.com/RichardErkhov)

[Discord](https://discord.gg/pvy7H8DZMG)

[Request more models](https://github.com/RichardErkhov/quant_request)

Apollo-2B - bnb 8bits

- Model creator: https://huggingface.co/FreedomIntelligence/
- Original model: https://huggingface.co/FreedomIntelligence/Apollo-2B/

Original model description:

---
license: apache-2.0
---

# Multilingual Medicine: Model, Dataset, Benchmark, Code

Covering English, Chinese, French, Hindi, Spanish, and Arabic so far

πŸ‘¨πŸ»β€πŸ’»Github β€’πŸ“ƒ Paper β€’ 🌐 Demo β€’ πŸ€— ApolloCorpus β€’ πŸ€— XMedBench
δΈ­ζ–‡ | English

![Apollo](assets/apollo_medium_final.png)

## 🌈 Update

* **[2024.03.07]** [Paper](https://arxiv.org/abs/2403.03640) released.
* **[2024.02.12]** ApolloCorpus and XMedBench are published!🎉
* **[2024.01.23]** Apollo repo is published!🎉

## Results

🤗 Apollo-0.5B • 🤗 Apollo-1.8B • 🤗 Apollo-2B • 🤗 Apollo-6B • 🤗 Apollo-7B

🤗 Apollo-0.5B-GGUF • 🤗 Apollo-2B-GGUF • 🤗 Apollo-6B-GGUF • 🤗 Apollo-7B-GGUF

![Apollo](assets/result.png)

## Usage Format

Prompts follow a plain User/Assistant template, terminated by the end-of-text token: `User:{query}\nAssistant:{response}<|endoftext|>`
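For concreteness, here is a minimal sketch (not from the original card) of loading an 8-bit bitsandbytes quantization like this one with 🤗 Transformers and querying it in the format above. The repo id below points at the original checkpoint and the example query is made up; substitute this repository's id to use the pre-quantized weights.

```python
# Minimal sketch: load Apollo-2B in 8-bit via bitsandbytes and query it
# with the documented User/Assistant prompt format.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "FreedomIntelligence/Apollo-2B"  # assumption: original repo; swap in this repo's id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),
    device_map="auto",
)

query = "What are common symptoms of iron-deficiency anemia?"  # hypothetical query
prompt = f"User:{query}\nAssistant:"  # template from the Usage Format section

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```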
## Dataset & Evaluation

- Dataset: 🤗 ApolloCorpus (a file-parsing sketch follows this section)

  ![Apollo](assets/dataset.png)

  - [Zip File](https://huggingface.co/datasets/FreedomIntelligence/ApolloCorpus/blob/main/ApolloCorpus.zip)
  - [Data category](https://huggingface.co/datasets/FreedomIntelligence/ApolloCorpus/tree/main/train)
  - Pretrain:
    - data item:
      - json_name: {data_source}_{language}_{data_type}.json
      - data_type: medicalBook, medicalGuideline, medicalPaper, medicalWeb (from online forums), medicalWiki
      - language: en (English), zh (Chinese), es (Spanish), fr (French), hi (Hindi)
      - data_type: qa (QA pairs generated from the texts)
      - data_type==text: list of strings

        ```
        [
          "string1",
          "string2",
          ...
        ]
        ```

      - data_type==qa: list of QA pairs (each a list of strings)

        ```
        [
          [
            "q1",
            "a1",
            "q2",
            "a2",
            ...
          ],
          ...
        ]
        ```

  - SFT:
    - json_name: {data_source}_{language}.json
    - data_type: code, general, math, medicalExam, medicalPatient
    - data item: list of QA pairs (each a list of strings)

      ```
      [
        [
          "q1",
          "a1",
          "q2",
          "a2",
          ...
        ],
        ...
      ]
      ```
- Evaluation: 🤗 XMedBench (a benchmark-loading sketch follows this section)
  - EN:
    - [MedQA-USMLE](https://huggingface.co/datasets/GBaker/MedQA-USMLE-4-options)
    - [MedMCQA](https://huggingface.co/datasets/medmcqa/viewer/default/test)
    - [PubMedQA](https://huggingface.co/datasets/pubmed_qa): Not used in the paper because its results fluctuated too much.
    - [MMLU-Medical](https://huggingface.co/datasets/cais/mmlu)
      - Clinical knowledge, Medical genetics, Anatomy, Professional medicine, College biology, College medicine
  - ZH:
    - [MedQA-MCMLE](https://huggingface.co/datasets/bigbio/med_qa/viewer/med_qa_zh_4options_bigbio_qa/test)
    - [CMB-single](https://huggingface.co/datasets/FreedomIntelligence/CMB): Not used in the paper
      - Randomly sampled 2,000 single-answer multiple-choice questions.
    - [CMMLU-Medical](https://huggingface.co/datasets/haonan-li/cmmlu)
      - Anatomy, Clinical_knowledge, College_medicine, Genetics, Nutrition, Traditional_chinese_medicine, Virology
    - [CMExam](https://github.com/williamliujl/CMExam): Not used in the paper
      - Randomly sampled 2,000 multiple-choice questions.
  - ES: [Head_qa](https://huggingface.co/datasets/head_qa)
  - FR: [FrenchMedMCQA](https://github.com/qanastek/FrenchMedMCQA)
  - HI: [MMLU_HI](https://huggingface.co/datasets/FreedomIntelligence/MMLU_Hindi)
    - Clinical knowledge, Medical genetics, Anatomy, Professional medicine, College biology, College medicine
  - AR: [MMLU_Ara](https://huggingface.co/datasets/FreedomIntelligence/MMLU_Arabic)
    - Clinical knowledge, Medical genetics, Anatomy, Professional medicine, College biology, College medicine
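To make the corpus layout concrete, here is a minimal sketch (not from the original card) of reading one ApolloCorpus file under the `{data_source}_{language}_{data_type}.json` naming scheme documented above; the file name `medicalPaper_en_qa.json` is hypothetical.

```python
# Minimal sketch: read a qa-type ApolloCorpus file. Each record is a flat
# list alternating question/answer strings: [q1, a1, q2, a2, ...].
import json

with open("medicalPaper_en_qa.json", encoding="utf-8") as f:
    records = json.load(f)

for record in records:
    questions, answers = record[0::2], record[1::2]
    for q, a in zip(questions, answers):
        print(f"Q: {q}\nA: {a}\n")
```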
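Likewise, the listed benchmarks can be pulled with the Hugging Face `datasets` library. A minimal sketch for one of them; the split name is taken from the linked dataset card and is an assumption that may need adjusting per benchmark:

```python
# Minimal sketch: load the English MedQA-USMLE (4-option) benchmark.
from datasets import load_dataset

medqa = load_dataset("GBaker/MedQA-USMLE-4-options", split="test")
print(medqa[0])  # inspect one question record
```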
## Results reproduction
**Waiting for Update**
## Citation

Please use the following citation if you intend to use our dataset for training or evaluation:

```
@misc{wang2024apollo,
   title={Apollo: Lightweight Multilingual Medical LLMs towards Democratizing Medical AI to 6B People},
   author={Xidong Wang and Nuo Chen and Junyin Chen and Yan Hu and Yidong Wang and Xiangbo Wu and Anningzhe Gao and Xiang Wan and Haizhou Li and Benyou Wang},
   year={2024},
   eprint={2403.03640},
   archivePrefix={arXiv},
   primaryClass={cs.CL}
}
```