sujithatz committed
Commit fa5219a
1 Parent(s): 0c6183a

sujithatz/finbot-transofrmer-based-phi3.5_adapter

Files changed (4)
  1. README.md +200 -101
  2. adapter_config.json +7 -4
  3. adapter_model.safetensors +2 -2
  4. training_args.bin +1 -1
README.md CHANGED
@@ -18,7 +18,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [microsoft/Phi-3.5-mini-instruct](https://huggingface.co/microsoft/Phi-3.5-mini-instruct) on an unknown dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.5655
+ - Loss: 0.5253
 
 ## Model description
 
@@ -37,117 +37,216 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
- - learning_rate: 0.0002
+ - learning_rate: 0.0001
 - train_batch_size: 4
 - eval_batch_size: 4
 - seed: 0
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: cosine
 - lr_scheduler_warmup_ratio: 0.01
- - num_epochs: 4
+ - num_epochs: 8
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss |
 |:-------------:|:------:|:----:|:---------------:|
- | 1.6693 | 0.0405 | 3 | 1.6168 |
- | 1.3656 | 0.0811 | 6 | 1.2375 |
- | 1.0964 | 0.1216 | 9 | 1.0552 |
- | 0.8795 | 0.1622 | 12 | 0.9069 |
- | 0.7104 | 0.2027 | 15 | 0.8512 |
- | 0.7107 | 0.2432 | 18 | 0.7911 |
- | 0.6494 | 0.2838 | 21 | 0.7575 |
- | 0.6856 | 0.3243 | 24 | 0.7386 |
- | 0.7626 | 0.3649 | 27 | 0.7219 |
- | 0.6758 | 0.4054 | 30 | 0.6914 |
- | 0.6051 | 0.4459 | 33 | 0.6597 |
- | 0.6694 | 0.4865 | 36 | 0.6429 |
- | 0.4273 | 0.5270 | 39 | 0.6305 |
- | 0.6037 | 0.5676 | 42 | 0.6197 |
- | 0.579 | 0.6081 | 45 | 0.6080 |
- | 0.4144 | 0.6486 | 48 | 0.5822 |
- | 0.5971 | 0.6892 | 51 | 0.5576 |
- | 0.589 | 0.7297 | 54 | 0.5488 |
- | 0.6413 | 0.7703 | 57 | 0.5471 |
- | 0.5992 | 0.8108 | 60 | 0.5386 |
- | 0.5468 | 0.8514 | 63 | 0.5232 |
- | 0.7074 | 0.8919 | 66 | 0.5172 |
- | 0.4391 | 0.9324 | 69 | 0.5085 |
- | 0.6243 | 0.9730 | 72 | 0.4970 |
- | 0.3784 | 1.0135 | 75 | 0.4958 |
- | 0.3136 | 1.0541 | 78 | 0.4939 |
- | 0.3832 | 1.0946 | 81 | 0.5047 |
- | 0.3463 | 1.1351 | 84 | 0.5085 |
- | 0.4151 | 1.1757 | 87 | 0.5182 |
- | 0.3072 | 1.2162 | 90 | 0.5147 |
- | 0.2954 | 1.2568 | 93 | 0.5210 |
- | 0.3114 | 1.2973 | 96 | 0.5145 |
- | 0.2628 | 1.3378 | 99 | 0.5141 |
- | 0.3768 | 1.3784 | 102 | 0.5129 |
- | 0.2737 | 1.4189 | 105 | 0.5058 |
- | 0.3409 | 1.4595 | 108 | 0.5049 |
- | 0.4281 | 1.5 | 111 | 0.4975 |
- | 0.4059 | 1.5405 | 114 | 0.4848 |
- | 0.4375 | 1.5811 | 117 | 0.4880 |
- | 0.2794 | 1.6216 | 120 | 0.4849 |
- | 0.3575 | 1.6622 | 123 | 0.4802 |
- | 0.5313 | 1.7027 | 126 | 0.4834 |
- | 0.2644 | 1.7432 | 129 | 0.4811 |
- | 0.3878 | 1.7838 | 132 | 0.4746 |
- | 0.3286 | 1.8243 | 135 | 0.4641 |
- | 0.3327 | 1.8649 | 138 | 0.4564 |
- | 0.2176 | 1.9054 | 141 | 0.4547 |
- | 0.4059 | 1.9459 | 144 | 0.4528 |
- | 0.2943 | 1.9865 | 147 | 0.4540 |
- | 0.2527 | 2.0270 | 150 | 0.4554 |
- | 0.4193 | 2.0676 | 153 | 0.4742 |
- | 0.2857 | 2.1081 | 156 | 0.5054 |
- | 0.1813 | 2.1486 | 159 | 0.5248 |
- | 0.1805 | 2.1892 | 162 | 0.5251 |
- | 0.0996 | 2.2297 | 165 | 0.5196 |
- | 0.181 | 2.2703 | 168 | 0.5206 |
- | 0.2093 | 2.3108 | 171 | 0.5134 |
- | 0.1637 | 2.3514 | 174 | 0.5138 |
- | 0.1239 | 2.3919 | 177 | 0.5120 |
- | 0.2012 | 2.4324 | 180 | 0.5150 |
- | 0.2687 | 2.4730 | 183 | 0.5152 |
- | 0.1168 | 2.5135 | 186 | 0.5202 |
- | 0.2365 | 2.5541 | 189 | 0.5221 |
- | 0.289 | 2.5946 | 192 | 0.5174 |
- | 0.1814 | 2.6351 | 195 | 0.5128 |
- | 0.1923 | 2.6757 | 198 | 0.5078 |
- | 0.1834 | 2.7162 | 201 | 0.5016 |
- | 0.1661 | 2.7568 | 204 | 0.4995 |
- | 0.1359 | 2.7973 | 207 | 0.4983 |
- | 0.1327 | 2.8378 | 210 | 0.5041 |
- | 0.2567 | 2.8784 | 213 | 0.5087 |
- | 0.2326 | 2.9189 | 216 | 0.5074 |
- | 0.2025 | 2.9595 | 219 | 0.5016 |
- | 0.146 | 3.0 | 222 | 0.4954 |
- | 0.1048 | 3.0405 | 225 | 0.4967 |
- | 0.1218 | 3.0811 | 228 | 0.5027 |
- | 0.3124 | 3.1216 | 231 | 0.5094 |
- | 0.1518 | 3.1622 | 234 | 0.5186 |
- | 0.2543 | 3.2027 | 237 | 0.5275 |
- | 0.0982 | 3.2432 | 240 | 0.5364 |
- | 0.1014 | 3.2838 | 243 | 0.5447 |
- | 0.1216 | 3.3243 | 246 | 0.5548 |
- | 0.0768 | 3.3649 | 249 | 0.5589 |
- | 0.086 | 3.4054 | 252 | 0.5633 |
- | 0.1213 | 3.4459 | 255 | 0.5654 |
- | 0.1437 | 3.4865 | 258 | 0.5668 |
- | 0.1341 | 3.5270 | 261 | 0.5670 |
- | 0.1122 | 3.5676 | 264 | 0.5659 |
- | 0.0832 | 3.6081 | 267 | 0.5662 |
- | 0.0668 | 3.6486 | 270 | 0.5637 |
- | 0.2023 | 3.6892 | 273 | 0.5662 |
- | 0.104 | 3.7297 | 276 | 0.5640 |
- | 0.1181 | 3.7703 | 279 | 0.5650 |
- | 0.1242 | 3.8108 | 282 | 0.5636 |
- | 0.0874 | 3.8514 | 285 | 0.5644 |
- | 0.1133 | 3.8919 | 288 | 0.5637 |
- | 0.073 | 3.9324 | 291 | 0.5640 |
- | 0.0724 | 3.9730 | 294 | 0.5655 |
+ | 1.7134 | 0.0405 | 3 | 1.8221 |
+ | 1.7619 | 0.0811 | 6 | 1.6961 |
+ | 1.5449 | 0.1216 | 9 | 1.5453 |
+ | 1.3625 | 0.1622 | 12 | 1.3982 |
+ | 1.1389 | 0.2027 | 15 | 1.2786 |
+ | 1.1091 | 0.2432 | 18 | 1.1889 |
+ | 1.0605 | 0.2838 | 21 | 1.1050 |
+ | 0.9908 | 0.3243 | 24 | 1.0395 |
+ | 0.9653 | 0.3649 | 27 | 0.9886 |
+ | 0.9258 | 0.4054 | 30 | 0.9401 |
+ | 0.8964 | 0.4459 | 33 | 0.8945 |
+ | 0.8189 | 0.4865 | 36 | 0.8615 |
+ | 0.7202 | 0.5270 | 39 | 0.8325 |
+ | 0.7553 | 0.5676 | 42 | 0.8109 |
+ | 0.7415 | 0.6081 | 45 | 0.7911 |
+ | 0.6421 | 0.6486 | 48 | 0.7730 |
+ | 0.7638 | 0.6892 | 51 | 0.7411 |
+ | 0.7495 | 0.7297 | 54 | 0.7208 |
+ | 0.7678 | 0.7703 | 57 | 0.7102 |
+ | 0.7027 | 0.8108 | 60 | 0.7002 |
+ | 0.7106 | 0.8514 | 63 | 0.6892 |
+ | 0.8461 | 0.8919 | 66 | 0.6852 |
+ | 0.5863 | 0.9324 | 69 | 0.6826 |
+ | 0.7466 | 0.9730 | 72 | 0.6802 |
+ | 0.5847 | 1.0135 | 75 | 0.6696 |
+ | 0.5349 | 1.0541 | 78 | 0.6590 |
+ | 0.5991 | 1.0946 | 81 | 0.6560 |
+ | 0.5777 | 1.1351 | 84 | 0.6526 |
+ | 0.6342 | 1.1757 | 87 | 0.6488 |
+ | 0.5053 | 1.2162 | 90 | 0.6494 |
+ | 0.4909 | 1.2568 | 93 | 0.6485 |
+ | 0.5154 | 1.2973 | 96 | 0.6458 |
+ | 0.4728 | 1.3378 | 99 | 0.6375 |
+ | 0.5648 | 1.3784 | 102 | 0.6327 |
+ | 0.4878 | 1.4189 | 105 | 0.6260 |
+ | 0.5677 | 1.4595 | 108 | 0.6165 |
+ | 0.6598 | 1.5 | 111 | 0.6059 |
+ | 0.5811 | 1.5405 | 114 | 0.6021 |
+ | 0.5984 | 1.5811 | 117 | 0.6018 |
+ | 0.4477 | 1.6216 | 120 | 0.6010 |
+ | 0.5762 | 1.6622 | 123 | 0.5944 |
+ | 0.7896 | 1.7027 | 126 | 0.5924 |
+ | 0.449 | 1.7432 | 129 | 0.5849 |
+ | 0.6014 | 1.7838 | 132 | 0.5793 |
+ | 0.4798 | 1.8243 | 135 | 0.5744 |
+ | 0.4943 | 1.8649 | 138 | 0.5715 |
+ | 0.3907 | 1.9054 | 141 | 0.5692 |
+ | 0.6352 | 1.9459 | 144 | 0.5631 |
+ | 0.469 | 1.9865 | 147 | 0.5633 |
+ | 0.4819 | 2.0270 | 150 | 0.5623 |
+ | 0.7567 | 2.0676 | 153 | 0.5610 |
+ | 0.533 | 2.1081 | 156 | 0.5641 |
+ | 0.4195 | 2.1486 | 159 | 0.5615 |
+ | 0.4015 | 2.1892 | 162 | 0.5609 |
+ | 0.2958 | 2.2297 | 165 | 0.5642 |
+ | 0.4477 | 2.2703 | 168 | 0.5602 |
+ | 0.4111 | 2.3108 | 171 | 0.5530 |
+ | 0.3958 | 2.3514 | 174 | 0.5495 |
+ | 0.3053 | 2.3919 | 177 | 0.5437 |
+ | 0.4952 | 2.4324 | 180 | 0.5400 |
+ | 0.5617 | 2.4730 | 183 | 0.5322 |
+ | 0.298 | 2.5135 | 186 | 0.5273 |
+ | 0.5439 | 2.5541 | 189 | 0.5256 |
+ | 0.5791 | 2.5946 | 192 | 0.5215 |
+ | 0.4429 | 2.6351 | 195 | 0.5205 |
+ | 0.4454 | 2.6757 | 198 | 0.5251 |
+ | 0.4071 | 2.7162 | 201 | 0.5267 |
+ | 0.3948 | 2.7568 | 204 | 0.5327 |
+ | 0.3196 | 2.7973 | 207 | 0.5342 |
+ | 0.3567 | 2.8378 | 210 | 0.5344 |
+ | 0.5284 | 2.8784 | 213 | 0.5292 |
+ | 0.491 | 2.9189 | 216 | 0.5182 |
+ | 0.4267 | 2.9595 | 219 | 0.5137 |
+ | 0.3587 | 3.0 | 222 | 0.5098 |
+ | 0.3587 | 3.0405 | 225 | 0.5131 |
+ | 0.377 | 3.0811 | 228 | 0.5200 |
+ | 0.6423 | 3.1216 | 231 | 0.5214 |
+ | 0.4839 | 3.1622 | 234 | 0.5139 |
+ | 0.566 | 3.2027 | 237 | 0.5123 |
+ | 0.38 | 3.2432 | 240 | 0.5172 |
+ | 0.3995 | 3.2838 | 243 | 0.5207 |
+ | 0.3486 | 3.3243 | 246 | 0.5148 |
+ | 0.2418 | 3.3649 | 249 | 0.5104 |
+ | 0.3178 | 3.4054 | 252 | 0.5086 |
+ | 0.4065 | 3.4459 | 255 | 0.5031 |
+ | 0.3472 | 3.4865 | 258 | 0.5050 |
+ | 0.4543 | 3.5270 | 261 | 0.5046 |
+ | 0.4066 | 3.5676 | 264 | 0.5020 |
+ | 0.2606 | 3.6081 | 267 | 0.5010 |
+ | 0.2332 | 3.6486 | 270 | 0.5007 |
+ | 0.5026 | 3.6892 | 273 | 0.5003 |
+ | 0.3901 | 3.7297 | 276 | 0.5057 |
+ | 0.3552 | 3.7703 | 279 | 0.5126 |
+ | 0.3921 | 3.8108 | 282 | 0.5179 |
+ | 0.3366 | 3.8514 | 285 | 0.5092 |
+ | 0.3706 | 3.8919 | 288 | 0.5008 |
+ | 0.2791 | 3.9324 | 291 | 0.4961 |
+ | 0.2247 | 3.9730 | 294 | 0.4968 |
+ | 0.2879 | 4.0135 | 297 | 0.4971 |
+ | 0.3355 | 4.0541 | 300 | 0.5036 |
+ | 0.3928 | 4.0946 | 303 | 0.5023 |
+ | 0.2399 | 4.1351 | 306 | 0.5056 |
+ | 0.3396 | 4.1757 | 309 | 0.5089 |
+ | 0.2602 | 4.2162 | 312 | 0.5091 |
+ | 0.2565 | 4.2568 | 315 | 0.5110 |
+ | 0.24 | 4.2973 | 318 | 0.5156 |
+ | 0.2364 | 4.3378 | 321 | 0.5216 |
+ | 0.3694 | 4.3784 | 324 | 0.5224 |
+ | 0.2185 | 4.4189 | 327 | 0.5183 |
+ | 0.337 | 4.4595 | 330 | 0.5119 |
+ | 0.3404 | 4.5 | 333 | 0.5084 |
+ | 0.3049 | 4.5405 | 336 | 0.5071 |
+ | 0.4811 | 4.5811 | 339 | 0.5098 |
+ | 0.338 | 4.6216 | 342 | 0.5092 |
+ | 0.305 | 4.6622 | 345 | 0.5090 |
+ | 0.5273 | 4.7027 | 348 | 0.5079 |
+ | 0.3122 | 4.7432 | 351 | 0.5044 |
+ | 0.2995 | 4.7838 | 354 | 0.4991 |
+ | 0.2654 | 4.8243 | 357 | 0.4935 |
+ | 0.3992 | 4.8649 | 360 | 0.4946 |
+ | 0.2272 | 4.9054 | 363 | 0.5003 |
+ | 0.3094 | 4.9459 | 366 | 0.5026 |
+ | 0.2773 | 4.9865 | 369 | 0.5021 |
+ | 0.3934 | 5.0270 | 372 | 0.4993 |
+ | 0.271 | 5.0676 | 375 | 0.5015 |
+ | 0.3928 | 5.1081 | 378 | 0.5040 |
+ | 0.2105 | 5.1486 | 381 | 0.5134 |
+ | 0.2548 | 5.1892 | 384 | 0.5182 |
+ | 0.2424 | 5.2297 | 387 | 0.5104 |
+ | 0.4469 | 5.2703 | 390 | 0.5122 |
+ | 0.2866 | 5.3108 | 393 | 0.5112 |
+ | 0.2958 | 5.3514 | 396 | 0.5090 |
+ | 0.2034 | 5.3919 | 399 | 0.5051 |
+ | 0.4091 | 5.4324 | 402 | 0.5023 |
+ | 0.1415 | 5.4730 | 405 | 0.5059 |
+ | 0.4137 | 5.5135 | 408 | 0.5098 |
+ | 0.2784 | 5.5541 | 411 | 0.5134 |
+ | 0.158 | 5.5946 | 414 | 0.5160 |
+ | 0.4701 | 5.6351 | 417 | 0.5183 |
+ | 0.2256 | 5.6757 | 420 | 0.5168 |
+ | 0.1868 | 5.7162 | 423 | 0.5147 |
+ | 0.2868 | 5.7568 | 426 | 0.5130 |
+ | 0.2142 | 5.7973 | 429 | 0.5147 |
+ | 0.2693 | 5.8378 | 432 | 0.5130 |
+ | 0.2882 | 5.8784 | 435 | 0.5108 |
+ | 0.3243 | 5.9189 | 438 | 0.5098 |
+ | 0.343 | 5.9595 | 441 | 0.5067 |
+ | 0.2602 | 6.0 | 444 | 0.5002 |
+ | 0.2237 | 6.0405 | 447 | 0.5001 |
+ | 0.3727 | 6.0811 | 450 | 0.5039 |
+ | 0.2471 | 6.1216 | 453 | 0.5076 |
+ | 0.4095 | 6.1622 | 456 | 0.5145 |
+ | 0.2445 | 6.2027 | 459 | 0.5188 |
+ | 0.2387 | 6.2432 | 462 | 0.5231 |
+ | 0.2322 | 6.2838 | 465 | 0.5258 |
+ | 0.2998 | 6.3243 | 468 | 0.5270 |
+ | 0.2463 | 6.3649 | 471 | 0.5251 |
+ | 0.1931 | 6.4054 | 474 | 0.5237 |
+ | 0.2254 | 6.4459 | 477 | 0.5187 |
+ | 0.278 | 6.4865 | 480 | 0.5177 |
+ | 0.3654 | 6.5270 | 483 | 0.5162 |
+ | 0.2886 | 6.5676 | 486 | 0.5130 |
+ | 0.229 | 6.6081 | 489 | 0.5150 |
+ | 0.2361 | 6.6486 | 492 | 0.5158 |
+ | 0.1497 | 6.6892 | 495 | 0.5165 |
+ | 0.2926 | 6.7297 | 498 | 0.5179 |
+ | 0.2979 | 6.7703 | 501 | 0.5211 |
+ | 0.244 | 6.8108 | 504 | 0.5200 |
+ | 0.2846 | 6.8514 | 507 | 0.5197 |
+ | 0.1897 | 6.8919 | 510 | 0.5200 |
+ | 0.2106 | 6.9324 | 513 | 0.5210 |
+ | 0.3168 | 6.9730 | 516 | 0.5210 |
+ | 0.2002 | 7.0135 | 519 | 0.5192 |
+ | 0.3515 | 7.0541 | 522 | 0.5202 |
+ | 0.1807 | 7.0946 | 525 | 0.5214 |
+ | 0.2331 | 7.1351 | 528 | 0.5212 |
+ | 0.1571 | 7.1757 | 531 | 0.5215 |
+ | 0.186 | 7.2162 | 534 | 0.5194 |
+ | 0.2281 | 7.2568 | 537 | 0.5207 |
+ | 0.2534 | 7.2973 | 540 | 0.5219 |
+ | 0.3643 | 7.3378 | 543 | 0.5212 |
+ | 0.4516 | 7.3784 | 546 | 0.5203 |
+ | 0.181 | 7.4189 | 549 | 0.5226 |
+ | 0.256 | 7.4595 | 552 | 0.5214 |
+ | 0.2802 | 7.5 | 555 | 0.5212 |
+ | 0.1913 | 7.5405 | 558 | 0.5196 |
+ | 0.2293 | 7.5811 | 561 | 0.5207 |
+ | 0.2282 | 7.6216 | 564 | 0.5213 |
+ | 0.1954 | 7.6622 | 567 | 0.5225 |
+ | 0.3199 | 7.7027 | 570 | 0.5216 |
+ | 0.2687 | 7.7432 | 573 | 0.5231 |
+ | 0.2122 | 7.7838 | 576 | 0.5218 |
+ | 0.3616 | 7.8243 | 579 | 0.5228 |
+ | 0.1206 | 7.8649 | 582 | 0.5212 |
+ | 0.148 | 7.9054 | 585 | 0.5216 |
+ | 0.3779 | 7.9459 | 588 | 0.5224 |
+ | 0.272 | 7.9865 | 591 | 0.5253 |
 
 
 ### Framework versions
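
The substance of this commit is the halved learning rate (0.0002 to 0.0001) and doubled epoch count (4 to 8), which trades the old run's final eval loss of 0.5655 for 0.5253. As a reference, here is a minimal sketch of how the card's hyperparameters could map onto `transformers.TrainingArguments`. Only the values listed in the README are grounded; the `output_dir` and the step-based evaluation cadence (the table logs and evaluates every 3 steps) are inferred, not read from the repo.

```python
# Hypothetical reconstruction of the run's TrainingArguments; values not in
# the model card are placeholders.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="finbot-phi3.5-adapter",  # placeholder path, not from the repo
    learning_rate=1e-4,                  # 0.0002 -> 0.0001 in this commit
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=0,
    adam_beta1=0.9,                      # "Adam with betas=(0.9,0.999)"
    adam_beta2=0.999,
    adam_epsilon=1e-8,                   # "epsilon=1e-08"
    lr_scheduler_type="cosine",
    warmup_ratio=0.01,
    num_train_epochs=8,                  # 4 -> 8 in this commit
    eval_strategy="steps",               # inferred: the table evaluates every 3 steps
    eval_steps=3,
    logging_steps=3,
)
```
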
adapter_config.json CHANGED
@@ -20,10 +20,13 @@
 "rank_pattern": {},
 "revision": null,
 "target_modules": [
- "gate_up_proj",
- "o_proj",
- "qkv_proj",
- "down_proj"
+ "v_proj",
+ "gate_proj",
+ "down_proj",
+ "q_proj",
+ "k_proj",
+ "up_proj",
+ "o_proj"
 ],
 "task_type": "CAUSAL_LM",
 "use_dora": false,
adapter_model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:d085116f5270d42fbd3f522f7e5401ba43b2cf120ac015700e4ac0a39d893626
- size 100697728
+ oid sha256:a6384479f1901fab07ee93beefabcc99df61642bc4f24b3c1910077549d41057
+ size 35668592
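
The adapter weights shrink from roughly 100 MB to 35 MB, consistent with the reworked target-module list and possibly a lower LoRA rank. A minimal usage sketch for loading this commit's adapter onto the base model with `peft`; the dtype choice is an assumption.

```python
# Minimal sketch of loading the adapter from this repo on top of the base model.
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base = AutoModelForCausalLM.from_pretrained(
    "microsoft/Phi-3.5-mini-instruct",
    torch_dtype=torch.bfloat16,  # assumption; use what your hardware supports
)
tokenizer = AutoTokenizer.from_pretrained("microsoft/Phi-3.5-mini-instruct")

# Pulls adapter_config.json and adapter_model.safetensors from the Hub repo.
model = PeftModel.from_pretrained(
    base, "sujithatz/finbot-transofrmer-based-phi3.5_adapter"
)
```
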
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:b38ec32d6537ab3d4e404fffac63b5d9a8c23e446f15fd22f45b1947ba118c42
+ oid sha256:128f09edee5ebec935e582046b4486a1b9e6c63ac546255911b25d1112ea524c
 size 5432
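
`training_args.bin` is a pickled `TrainingArguments` object, so its hash changes whenever any argument does, even though the size stays at 5432 bytes here. One way to see exactly what changed is to load the file from each revision and compare fields; a sketch, with the caveat that unpickling executes code, so only do this for files you trust.

```python
# Sketch: inspect the serialized TrainingArguments from a local checkout.
# Unpickling runs arbitrary code; only load training_args.bin you trust.
import torch

args = torch.load("training_args.bin", weights_only=False)
print(args.learning_rate, args.num_train_epochs, args.lr_scheduler_type)
```
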