---
library_name: transformers
pipeline_tag: text-generation
language:
- multilingual
tags:
- generation
- question answering
- instruction tuning
datasets:
- MBZUAI/Bactrian-X
license: cc-by-nc-4.0
---
### Model Description

This HF repository hosts an instruction fine-tuned multilingual BLOOM model trained on the parallel instruction dataset [Bactrian-X](https://huggingface.co/datasets/MBZUAI/Bactrian-X), which covers 52 languages.
We add one language at a time during instruction fine-tuning, training 52 models in total, and then evaluate those models on three multilingual benchmarks.
Please refer to our paper for more details.
#### Instruction tuning details
* Base model: [BLOOM 7B1](https://huggingface.co/bigscience/bloom-7b1)
* Instruction languages: English, Chinese
* Instruction language codes: en, zh
* Training method: full-parameter fine-tuning
#### Usage
The model checkpoint should be loaded with the `transformers` library.
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("MaLA-LM/lucky52-bloom-7b1-no-2")
model = AutoModelForCausalLM.from_pretrained("MaLA-LM/lucky52-bloom-7b1-no-2")
```
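
Once loaded, the model can be prompted like any causal LM. Below is a minimal generation sketch: the prompt template and decoding parameters are illustrative assumptions, not the exact settings used during fine-tuning or evaluation.
```python
import torch

# Move the model to GPU if one is available.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)

# Illustrative instruction-style prompt; adjust to the template used in fine-tuning.
prompt = "### Instruction:\nWhat is the capital of Finland?\n\n### Response:\n"

inputs = tokenizer(prompt, return_tensors="pt").to(device)
outputs = model.generate(
    **inputs,
    max_new_tokens=100,  # assumed decoding settings, for illustration only
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```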
#### Citation
```
@article{
}
```