riotu-lab committed
Commit b7256cb
1 Parent(s): 1d8d479

Update README.md

Files changed (1):
  1. README.md +3 -5
README.md CHANGED
@@ -11,11 +11,11 @@ tags:
 
 ArabianGPT is a custom-trained version of the GPT-2 base model, specifically tailored for the Arabic language. It is designed to understand and generate Arabic text, making it suitable for various natural language processing tasks in Arabic.
 
-* Model Name: ArabianGPT
-* Architecture: GPT-2
 
 | Specification | Value |
 |-----------------------|----------|
+| Model Name | ArabianGPT |
+| Architecture | GPT-2 |
 | Layers | 12 |
 | MAL (Model Attention Layers) | 12 |
 | Model Size | 134M |
@@ -36,7 +36,6 @@ ArabianGPT is a custom-trained version of the GPT-2 base model, specifically tai
 | ArabianGPT-base | NVIDIA A100 | 7.5M | 512 | 313.5K | 3 | 3.97 |
 
 
-> [!NOTE]
 > The model was trained on the Abu Elkhiar dataset, a comprehensive Arabic text corpus encompassing a wide range of topics. The training process focused on adapting the model to understand the nuances and complexities of the Arabic language.
 
 # Tokenizer
@@ -66,8 +65,7 @@ pipe.predict(text)
 ```
 
 # Limitations
-> [
-!TIP]
+
 > As with any language model, ArabianGPT may have limitations in understanding context or generating text in certain scenarios. Users should be aware of these limitations and use the model accordingly.
 
 # Ethical Considerations
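As a quick sanity check on the new specification table, the tabulated values can be read back from the published checkpoint. The sketch below assumes the model is hosted on the Hugging Face Hub as `riotu-lab/ArabianGPT-base`; that repo id is inferred from the model name and is not stated in this diff.

```python
# Minimal sketch, assuming the checkpoint is available on the Hugging Face Hub
# as "riotu-lab/ArabianGPT-base" (repo id assumed, not confirmed by this commit).
from transformers import AutoConfig, AutoModelForCausalLM

config = AutoConfig.from_pretrained("riotu-lab/ArabianGPT-base")
print(config.n_layer)  # transformer blocks; should match the "Layers" row (12)
print(config.n_head)   # attention heads per block (12 in a GPT-2 base config)

# "Model Size" row (~134M): count the checkpoint's parameters directly.
model = AutoModelForCausalLM.from_pretrained("riotu-lab/ArabianGPT-base")
print(f"{sum(p.numel() for p in model.parameters()):,} parameters")
```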
 
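The `pipe.predict(text)` context in the last hunk points at the README's usage snippet, which this diff does not show in full. A minimal sketch of equivalent usage with the standard transformers text-generation pipeline follows; the repo id and generation parameters are assumptions, and the README's `pipe` object may be a custom wrapper that exposes `.predict`.

```python
# Minimal sketch of Arabic text generation with the standard transformers
# pipeline; repo id and generation settings are illustrative assumptions.
from transformers import pipeline

pipe = pipeline("text-generation", model="riotu-lab/ArabianGPT-base")

text = "أعلنت الجامعة عن"  # example Arabic prompt: "The university announced"
outputs = pipe(text, max_new_tokens=50, do_sample=True, top_p=0.95)
print(outputs[0]["generated_text"])
```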