avans06 committed 9f74291 (parent 50b6fb6): Update README.md
---
license: mit
tags:
- ctranslate2
- quantization
- int8
- float16
- text-generation
- ALMA
- llama
---

# ALMA-7B model for CTranslate2

This model is an int8_float16 quantized version of [haoranxu/ALMA-7B](https://huggingface.co/haoranxu/ALMA-7B) for use with [CTranslate2](https://github.com/OpenNMT/CTranslate2).

**ALMA** (**A**dvanced **L**anguage **M**odel-based tr**A**nslator) is an LLM-based translation model, which adopts a new translation model paradigm: it begins with fine-tuning on monolingual data and is further optimized using high-quality parallel data. This two-step fine-tuning process ensures strong translation performance.

- Model creator: [Haoran Xu](https://huggingface.co/haoranxu)
- Original model: [ALMA 7B](https://huggingface.co/haoranxu/ALMA-7B)


## Conversion details

The original model was converted in December 2023 with the following command:

```sh
ct2-transformers-converter --model haoranxu/ALMA-7B --quantization int8_float16 --output_dir ALMA-7B-ct2_int8_float16 \
    --copy_files generation_config.json special_tokens_map.json tokenizer.model tokenizer_config.json
```
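
After conversion, the output directory should contain the CTranslate2 weights and config alongside the tokenizer files passed to `--copy_files`. A minimal sanity-check sketch (the `model.bin`/`config.json` names are the usual CTranslate2 outputs, and the helper name is illustrative, not part of any API):

```python
from pathlib import Path

# Files expected in the converted directory: the usual CTranslate2 outputs
# (model.bin, config.json) plus the files passed to --copy_files above.
EXPECTED_FILES = [
    "model.bin",
    "config.json",
    "generation_config.json",
    "special_tokens_map.json",
    "tokenizer.model",
    "tokenizer_config.json",
]

def missing_files(model_dir):
    """Return the expected files that are absent from model_dir."""
    root = Path(model_dir)
    return [name for name in EXPECTED_FILES if not (root / name).is_file()]
```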


## Prompt template: ALMA

```
Translate this from English to Chinese:
English: {prompt}
Chinese:
```
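
The template above can be filled programmatically for any supported direction; a small sketch (the function name is illustrative, not part of the model or library):

```python
def build_alma_prompt(text, source="English", target="Chinese"):
    """Fill the ALMA prompt template for a given translation direction."""
    return (
        f"Translate this from {source} to {target}:\n"
        f"{source}: {text}\n"
        f"{target}:"
    )

print(build_alma_prompt("Who is Alan Turing?"))
# Translate this from English to Chinese:
# English: Who is Alan Turing?
# Chinese:
```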


## Example

This example code is adapted from the [CTranslate2 Transformers guide](https://opennmt.net/CTranslate2/guides/transformers.html#mpt).
More detailed information about the `generate_batch` method can be found in the [CTranslate2 `Generator.generate_batch` documentation](https://opennmt.net/CTranslate2/python/ctranslate2.Generator.html#ctranslate2.Generator.generate_batch).

```python
import ctranslate2
import transformers

# Load the quantized CTranslate2 model and the original tokenizer.
generator = ctranslate2.Generator("avans06/ALMA-7B-ct2_int8_float16")
tokenizer = transformers.AutoTokenizer.from_pretrained("haoranxu/ALMA-7B")

# Build the ALMA prompt and tokenize it into subword tokens.
text = "Who is Alan Turing?"
prompt = f"Translate this from English to Chinese:\nEnglish: {text}\nChinese:"
tokens = tokenizer.convert_ids_to_tokens(tokenizer.encode(prompt))

results = generator.generate_batch(
    [tokens],
    max_length=256,
    sampling_temperature=0.7,
    sampling_topp=0.9,
    repetition_penalty=1.1,
    include_prompt_in_result=False,
)

# Decode the generated token ids back to text.
output = tokenizer.decode(results[0].sequences_ids[0])
```


## FAQ

The following explanation is excerpted from the [FAQ section of the author's GitHub README](https://github.com/fe1ixxu/ALMA#what-language-directions-do-alma-support).

- **What language directions do ALMA support?**
  Currently, ALMA supports 10 directions: English↔German, English↔Czech, English↔Icelandic, English↔Chinese, English↔Russian. However, it may surprise us in other directions :)
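
The ten directions pair English with each of the five listed languages, in both directions. A small sketch enumerating them (names are illustrative, not part of any API):

```python
# Languages ALMA pairs with English, per the FAQ above.
PAIRED_LANGUAGES = ["German", "Czech", "Icelandic", "Chinese", "Russian"]

def supported_directions():
    """List the 10 (source, target) direction pairs ALMA officially supports."""
    directions = []
    for lang in PAIRED_LANGUAGES:
        directions.append(("English", lang))
        directions.append((lang, "English"))
    return directions
```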


## More information

For more information about the original model, see its [GitHub repository](https://github.com/fe1ixxu/ALMA).