# Model Card for Breeze-7B-Instruct-v0.1
Breeze-7B-Instruct-v0.1 is a 7-billion-parameter language model built on Mistral-7B and tailored for Traditional Chinese (TC).

This model extends the tokenizer with an additional 30,000 TC vocabulary tokens to better adapt to TC, roughly doubling inference speed on TC text compared with the original tokenizer.

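The speed-up above comes from tokenization. A toy sketch (not the actual Breeze tokenizer; the byte-fallback behavior is assumed here purely for illustration) of why dedicated TC vocabulary entries shorten sequences:

```python
# Toy illustration: a tokenizer with no dedicated Traditional Chinese
# entries may fall back to one token per UTF-8 byte (3 bytes per CJK
# character), while a vocabulary containing TC entries can emit one
# token per character -- roughly 3x fewer tokens, hence faster decoding.
text = "繁體中文"

byte_fallback_tokens = list(text.encode("utf-8"))  # no TC entries in vocab
dedicated_tokens = list(text)                      # one token per TC character

print(len(byte_fallback_tokens))  # 12 (3 bytes x 4 characters)
print(len(dedicated_tokens))      # 4
```

Fewer tokens per sentence means fewer forward passes during generation, which is where the claimed throughput gain comes from.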
Breeze-7B-Instruct-v0.1 performs well on both English (EN) and TC benchmarks.

This model outperforms Taiwan-LLM-7B-v2.1-chat, Taiwan-LLM-13B-v2.0-chat, and Yi-6B-Chat on all the TC benchmarks we tested, and is comparable to Mistral-7B-Instruct on the Open LLM Leaderboard.

## Model Details
- **Finetuned from:** [MediaTek-Research/Breeze-7B-Base-v0.1](https://huggingface.co/MediaTek-Research/Breeze-7B-Base-v0.1)