jianqing666 committed on
Commit
e75a07a
1 Parent(s): 4cb711c

Update README.md

Files changed (1)
  1. README.md +5 -5
README.md CHANGED
@@ -5,22 +5,22 @@ language:
 ---
 # <b>AceGPT</b>
 AceGPT is a fully fine-tuned generative text model collection based on LlaMA2, particularly in the
- Arabic language domain. This is the repository for the 7B-chat pretrained model.
+ Arabic language domain. This is the repository for the 7B-chat pre-trained model.
 
 ---
 ## Model Details
 We have released the AceGPT family of large language models, which is a collection of fully fine-tuned generative text models based on LlaMA2, ranging from 7B to 13B parameters. Our models include two main categories: AceGPT and AceGPT-chat. AceGPT-chat is an optimized version specifically designed for dialogue applications. It is worth mentioning that our models have demonstrated superior performance compared to all currently available open-source Arabic dialogue models in multiple benchmark tests. Furthermore, in our human evaluations, our models have shown comparable satisfaction levels to some closed-source models, such as ChatGPT, in the Arabic language.
 ## Model Developers
- We are from the School of Data Science, the Chinese University of Hong Kong, Shenzhen (CUHKSZ), and the Shenzhen Research Institute of Big Data (SRIBD).
+ We are from the School of Data Science, the Chinese University of Hong Kong, Shenzhen (CUHKSZ), the Shenzhen Research Institute of Big Data (SRIBD), and the King Abdullah University of Science and Technology (KAUST).
 ## Variations
- AceGPT famils comes in a range of parameter sizes —— 7B and 13B, each size of model has a base categorie and a -chat categorie.
+ The AceGPT family comes in two parameter sizes, 7B and 13B; each size has a base variant and a -chat variant.
 ## Input
 Models input text only.
 ## Output
 Models output text only.
 ## Model Evaluation Results
 
- Experiments on Arabic Vicuna-80, Arabic AlpacaEval. Numbers are the average perfor-mance ratio of ChatGPT over three runs. We do not report results of raw Llama-2 models since they cannot properly generate Arabic texts.
+ Experiments on Arabic Vicuna-80 and Arabic AlpacaEval. Numbers are the average performance ratio relative to ChatGPT over three runs. We do not report results for the raw Llama-2 models, since they cannot properly generate Arabic text.
 | | Arabic Vicuna-80 | Arabic AlpacaEval |
 |------------------------------|--------------------|---------------------|
 | Phoenix Chen et al. (2023a) | 71.92% ± 0.2% | 65.62% ± 0.3% |
@@ -70,4 +70,4 @@ Experiments on Arabic Vicuna-80, Arabic AlpacaEval. Numbers are the average perf
 16. تعلم كيفية تحديد الأولويات: تعلم كيفية تحديد الأولويات والتركيز على المهام الأكثر أهمية أولاً.
 17. استخدم تقنية الترتيب الثلاثي: تقنية تتطلب منك ترتيب المهام حسب الأهمية والعاجلة، ثم تعمل على المهمة الأعلى أولاً.
 18. تعلم كيفية تحسين التركيز: تعلم"
- # You can get more detail at https://github.com/FreedomIntelligence/AceGPT/tree/main
+ # You can get more details at https://github.com/FreedomIntelligence/AceGPT/tree/main
 
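For readers who want to try the 7B-chat model this README describes, below is a minimal, illustrative sketch of text-in/text-out inference with Hugging Face `transformers`. The repository id, prompt, and generation settings are assumptions made for illustration and are not taken from this commit; the linked GitHub repository documents the exact prompt template the -chat models expect.

```python
# Minimal sketch: load an AceGPT chat checkpoint and generate a reply.
# NOTE: the repository id below is an assumed placeholder; check the actual
# Hugging Face repo name and the prompt template documented on GitHub.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "FreedomIntelligence/AceGPT-7B-chat"  # assumed id, adjust as needed

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # half precision so the 7B model fits on one GPU
    device_map="auto",           # place weights automatically (requires accelerate)
)

# Text in (an Arabic question: "How can I better organize my time?")
prompt = "كيف أنظم وقتي بشكل أفضل؟"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Text out: sample up to 256 new tokens and decode only the generated part.
output_ids = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```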