Agnuxo committed
Commit f572884
1 Parent(s): ea0140d

Upload README.md with huggingface_hub

Files changed (1): README.md +1 -157
README.md CHANGED
```diff
@@ -1,6 +1,6 @@
 
 ---
-base_model: Agnuxo/tiny-llama_32bit
+base_model: Meta/tiny-llama
 language: ['en', 'es']
 license: apache-2.0
 tags: ['text-generation-inference', 'transformers', 'unsloth', 'mistral', 'gguf']
@@ -9,34 +9,8 @@ library_name: adapter-transformers
 metrics:
 - accuracy
 - bertscore
-- bleu
-- bleurt
-- brier_score
-- cer
-- character
-- charcut_mt
-- chrf
-- code_eval
-- comet
-- competition_math
-- confusion_matrix
-- coval
-- cuad
-- exact_match
-- f1
-- frugalscore
 - glue
-- google_bleu
-- indic_glue
-- mauve
-- meteor
 - perplexity
-- rouge
-- sacrebleu
-- sari
-- squad
-- squad_v2
-- ter
 ---
 
 # Uploaded model
@@ -65,146 +39,16 @@ This model has been fine-tuned for various tasks and evaluated on the following
 
 ![bertscore Bertscore](./bertscore_bertscore.png)
 
-### bleu
-**Bleu:** Not Available
-
-![bleu Bleu](./bleu_bleu.png)
-
-### bleurt
-**Bleurt:** Not Available
-
-![bleurt Bleurt](./bleurt_bleurt.png)
-
-### brier_score
-**Brier_score:** Not Available
-
-![brier_score Brier_score](./brier_score_brier_score.png)
-
-### cer
-**Cer:** Not Available
-
-![cer Cer](./cer_cer.png)
-
-### character
-**Character:** Not Available
-
-![character Character](./character_character.png)
-
-### charcut_mt
-**Charcut_mt:** Not Available
-
-![charcut_mt Charcut_mt](./charcut_mt_charcut_mt.png)
-
-### chrf
-**Chrf:** Not Available
-
-![chrf Chrf](./chrf_chrf.png)
-
-### code_eval
-**Code_eval:** Not Available
-
-![code_eval Code_eval](./code_eval_code_eval.png)
-
-### comet
-**Comet:** Not Available
-
-![comet Comet](./comet_comet.png)
-
-### competition_math
-**Competition_math:** Not Available
-
-![competition_math Competition_math](./competition_math_competition_math.png)
-
-### confusion_matrix
-**Confusion_matrix:** Not Available
-
-![confusion_matrix Confusion_matrix](./confusion_matrix_confusion_matrix.png)
-
-### coval
-**Coval:** Not Available
-
-![coval Coval](./coval_coval.png)
-
-### cuad
-**Cuad:** Not Available
-
-![cuad Cuad](./cuad_cuad.png)
-
-### exact_match
-**Exact_match:** Not Available
-
-![exact_match Exact_match](./exact_match_exact_match.png)
-
-### f1
-**F1:** Not Available
-
-![f1 F1](./f1_f1.png)
-
-### frugalscore
-**Frugalscore:** Not Available
-
-![frugalscore Frugalscore](./frugalscore_frugalscore.png)
-
 ### glue
 **Glue:** Not Available
 
 ![glue Glue](./glue_glue.png)
 
-### google_bleu
-**Google_bleu:** Not Available
-
-![google_bleu Google_bleu](./google_bleu_google_bleu.png)
-
-### indic_glue
-**Indic_glue:** Not Available
-
-![indic_glue Indic_glue](./indic_glue_indic_glue.png)
-
-### mauve
-**Mauve:** Not Available
-
-![mauve Mauve](./mauve_mauve.png)
-
-### meteor
-**Meteor:** Not Available
-
-![meteor Meteor](./meteor_meteor.png)
-
 ### perplexity
 **Perplexity:** Not Available
 
 ![perplexity Perplexity](./perplexity_perplexity.png)
 
-### rouge
-**Rouge:** Not Available
-
-![rouge Rouge](./rouge_rouge.png)
-
-### sacrebleu
-**Sacrebleu:** Not Available
-
-![sacrebleu Sacrebleu](./sacrebleu_sacrebleu.png)
-
-### sari
-**Sari:** Not Available
-
-![sari Sari](./sari_sari.png)
-
-### squad
-**Squad:** Not Available
-
-![squad Squad](./squad_squad.png)
-
-### squad_v2
-**Squad_v2:** Not Available
-
-![squad_v2 Squad_v2](./squad_v2_squad_v2.png)
-
-### ter
-**Ter:** Not Available
-
-![ter Ter](./ter_ter.png)
-
 
 Model Size: 4,124,864 parameters
 Required Memory: 0.02 GB
```
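The card's "Required Memory" figure is consistent with the parameter count under a simple assumption. A quick sanity check, assuming 4 bytes per parameter (fp32 weights only, ignoring activations and optimizer state):

```python
# Figures taken from the model card above.
PARAMS = 4_124_864
BYTES_PER_PARAM = 4  # assumption: 32-bit float weights

# Weight memory in GiB (the card labels this "GB").
gib = PARAMS * BYTES_PER_PARAM / 1024**3
print(f"{gib:.2f} GB")  # ≈ 0.02 GB, matching the card
```

The raw value is about 0.015 GiB, which rounds to the 0.02 GB reported.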