YC-Chen committed on
Commit f95fec8
1 Parent(s): 28fe02a

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -8,7 +8,7 @@ Breeze-7B-Instruct-v0.1 is a 7-billion-parameter language model built from Mistr
  This model expands the TC vocabulary (extra 30k TC tokens) based on the original Mistral-7B to better adapt to TC and improve inference speed,
  resulting in a doubling of the original tokenizer's inference speed.
  To the best of our knowledge, this is the first work on vocabulary expansion in TC.
- This model uses 250GB of TC data for continued pre-training and further uses over 1M instances for fine-tuning.
+ This model uses 250GB of TC data for continued pre-training and uses over 1M instances for further supervised fine-tuning.
  Breeze-7B-Instruct-v0.1 performs well on both EN and TC benchmarks.
  This model outperforms Taiwan-LLM-7B-v2.1-chat, Taiwan-LLM-13B-v2.0-chat, and Yi-6B-Chat on all TC benchmarks
  and is comparable with Mistral-7B-Instruct-v0.1 on MMLU and MT-Bench in English.