Commit 072597e by jisx (1 parent: 20ea957)

Upload README.md with huggingface_hub

Files changed (1): README.md added (+39 -0)
---
language:
- es
tags:
- generation
- question answering
- instruction tuning
license: cc-by-nc-4.0
---

### Model Description

This HF repository contains base LLMs instruction-tuned (SFT) with full-parameter fine-tuning, used to study whether monolingual or multilingual instruction tuning is more favourable.
* [GitHub](https://github.com/hplt-project/monolingual-multilingual-instruction-tuning/tree/main)
* [Paper](https://arxiv.org/abs/2309.08958)

#### Instruction tuning details
* Base model: [pythia-6.9b-deduped](https://huggingface.co/EleutherAI/pythia-6.9b-deduped)
* Instruction tuning language: Spanish
* Training method: full-parameter fine-tuning.
* Best checkpoint: best cross-entropy on a validation set, trained for 3 epochs (see the sketch after this list).
* Dataset: machine-translated from [yahma/alpaca-cleaned](https://huggingface.co/datasets/yahma/alpaca-cleaned). You can download our data [HERE](https://github.com/hplt-project/monolingual-multilingual-instruction-tuning/tree/main/training-data).
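
The checkpoint-selection recipe above maps naturally onto `transformers` training configuration. The sketch below is a hypothetical illustration of that mapping, not the authors' actual setup (their training code is in the GitHub repository above); the output directory, batch size, and all other unstated values are made-up placeholders.

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="sft-es-checkpoints",    # placeholder path
    num_train_epochs=3,                 # trained for 3 epochs
    eval_strategy="epoch",              # "evaluation_strategy" in older transformers
    save_strategy="epoch",              # checkpoint once per epoch
    load_best_model_at_end=True,        # restore the best checkpoint after training
    metric_for_best_model="eval_loss",  # validation cross-entropy
    greater_is_better=False,            # lower loss is better
    per_device_train_batch_size=4,      # placeholder value
)
```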
#### Usage

The model checkpoint should be loaded with the `transformers` library, for example as sketched below.
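
A minimal loading and generation sketch, assuming the placeholder below is replaced with this model's actual Hugging Face repository id; the Spanish prompt is purely illustrative.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "<this-repository-id>"  # placeholder: substitute the real repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Illustrative Spanish instruction prompt.
prompt = "Explica brevemente qué es el ajuste por instrucciones."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```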
Please refer to our GitHub repository [HERE](https://github.com/hplt-project/monolingual-multilingual-instruction-tuning/tree/main/fpft) for inference and training instructions.

#### Citation
```
@inproceedings{chen-etal-2024-monolingual,
  title     = "Monolingual or multilingual instruction tuning: Which makes a better {Alpaca}",
  author    = "Pinzhen Chen and Shaoxiong Ji and Nikolay Bogoychev and Andrey Kutuzov and Barry Haddow and Kenneth Heafield",
  year      = "2024",
  booktitle = "Findings of the Association for Computational Linguistics: EACL 2024",
}
```