---
library_name: peft
base_model: peiyi9979/math-shepherd-mistral-7b-prm
tags:
  - generated_from_trainer
metrics:
  - accuracy
  - precision
  - recall
  - f1
model-index:
  - name: v3b_mistral_lora
    results: []
---

# v3b_mistral_lora

This model is a fine-tuned version of [peiyi9979/math-shepherd-mistral-7b-prm](https://huggingface.co/peiyi9979/math-shepherd-mistral-7b-prm) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.0126
- Accuracy: 0.9936
- Precision: 0.8793
- Recall: 0.9808
- F1: 0.9273
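
Because this is a PEFT (LoRA) adapter on top of `peiyi9979/math-shepherd-mistral-7b-prm`, it is loaded by attaching the adapter to the base model. The snippet below is a minimal sketch, not a confirmed usage recipe: the adapter repo id `mtzig/v3b_mistral_lora` and the causal-LM head are assumptions inferred from this card.

```python
# Minimal sketch: attach the LoRA adapter to the base process reward model.
# Assumptions: adapter repo id "mtzig/v3b_mistral_lora" and a causal-LM head
# (the base model is a Mistral-7B causal LM used as a PRM).
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "peiyi9979/math-shepherd-mistral-7b-prm"
adapter_id = "mtzig/v3b_mistral_lora"  # assumed adapter repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto", device_map="auto")
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()

# Optionally fold the adapter weights into the base model for faster inference:
# model = model.merge_and_unload()
```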

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an illustrative `TrainingArguments` sketch follows the list):

- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 765837
- distributed_type: multi-GPU
- num_devices: 4
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- total_eval_batch_size: 32
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999), epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1
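
For reference, the settings above correspond roughly to the `TrainingArguments` below. This is an illustrative reconstruction, not the actual training script: the output directory and the mixed-precision flag are assumptions.

```python
# Hedged reconstruction of the reported hyperparameters as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="v3b_mistral_lora",      # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=8,      # 8 per device * 4 GPUs * 2 accumulation = 64 effective
    per_device_eval_batch_size=8,       # 8 per device * 4 GPUs = 32 effective
    gradient_accumulation_steps=2,
    num_train_epochs=1,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    seed=765837,
    bf16=True,                          # assumption: not stated on the card
)
```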

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:--:|
| No log | 0 | 0 | 0.3911 | 0.9098 | 0.2222 | 0.4615 | 0.3 |
| 0.7904 | 0.0095 | 20 | 0.3855 | 0.9163 | 0.24 | 0.4615 | 0.3158 |
| 0.6496 | 0.0190 | 40 | 0.3578 | 0.9300 | 0.2603 | 0.3654 | 0.304 |
| 0.5209 | 0.0284 | 60 | 0.3011 | 0.9469 | 0.2941 | 0.1923 | 0.2326 |
| 0.482 | 0.0379 | 80 | 0.2597 | 0.9525 | 0.3478 | 0.1538 | 0.2133 |
| 0.4165 | 0.0474 | 100 | 0.2602 | 0.9549 | 0.4524 | 0.3654 | 0.4043 |
| 0.5055 | 0.0569 | 120 | 0.2183 | 0.9614 | 0.5345 | 0.5962 | 0.5636 |
| 0.2795 | 0.0664 | 140 | 0.1609 | 0.9541 | 0.4699 | 0.75 | 0.5778 |
| 0.3314 | 0.0758 | 160 | 0.1574 | 0.9316 | 0.3659 | 0.8654 | 0.5143 |
| 0.257 | 0.0853 | 180 | 0.1001 | 0.9597 | 0.5119 | 0.8269 | 0.6324 |
| 0.3089 | 0.0948 | 200 | 0.0747 | 0.9750 | 0.6567 | 0.8462 | 0.7395 |
| 0.329 | 0.1043 | 220 | 0.0722 | 0.9742 | 0.6471 | 0.8462 | 0.7333 |
| 0.4088 | 0.1138 | 240 | 0.0581 | 0.9815 | 0.7458 | 0.8462 | 0.7928 |
| 0.2598 | 0.1233 | 260 | 0.0531 | 0.9815 | 0.7458 | 0.8462 | 0.7928 |
| 0.1978 | 0.1327 | 280 | 0.0428 | 0.9863 | 0.8070 | 0.8846 | 0.8440 |
| 0.1661 | 0.1422 | 300 | 0.0452 | 0.9839 | 0.75 | 0.9231 | 0.8276 |
| 0.2944 | 0.1517 | 320 | 0.0511 | 0.9815 | 0.7101 | 0.9423 | 0.8099 |
| 0.2463 | 0.1612 | 340 | 0.0379 | 0.9847 | 0.7463 | 0.9615 | 0.8403 |
| 0.1767 | 0.1707 | 360 | 0.0379 | 0.9871 | 0.7812 | 0.9615 | 0.8621 |
| 0.1581 | 0.1801 | 380 | 0.0315 | 0.9903 | 0.8571 | 0.9231 | 0.8889 |
| 0.1162 | 0.1896 | 400 | 0.0378 | 0.9863 | 0.7692 | 0.9615 | 0.8547 |
| 0.1442 | 0.1991 | 420 | 0.0395 | 0.9855 | 0.7656 | 0.9423 | 0.8448 |
| 0.1957 | 0.2086 | 440 | 0.0343 | 0.9903 | 0.8333 | 0.9615 | 0.8929 |
| 0.2593 | 0.2181 | 460 | 0.0308 | 0.9919 | 0.875 | 0.9423 | 0.9074 |
| 0.1137 | 0.2275 | 480 | 0.0236 | 0.9936 | 0.8929 | 0.9615 | 0.9259 |
| 0.2213 | 0.2370 | 500 | 0.0264 | 0.9879 | 0.7846 | 0.9808 | 0.8718 |
| 0.2092 | 0.2465 | 520 | 0.0336 | 0.9895 | 0.8197 | 0.9615 | 0.8850 |
| 0.2247 | 0.2560 | 540 | 0.0287 | 0.9928 | 0.8909 | 0.9423 | 0.9159 |
| 0.2684 | 0.2655 | 560 | 0.0286 | 0.9919 | 0.8621 | 0.9615 | 0.9091 |
| 0.1485 | 0.2749 | 580 | 0.0354 | 0.9895 | 0.8197 | 0.9615 | 0.8850 |
| 0.1168 | 0.2844 | 600 | 0.0274 | 0.9928 | 0.8909 | 0.9423 | 0.9159 |
| 0.1899 | 0.2939 | 620 | 0.0262 | 0.9911 | 0.8475 | 0.9615 | 0.9009 |
| 0.1626 | 0.3034 | 640 | 0.0274 | 0.9895 | 0.8197 | 0.9615 | 0.8850 |
| 0.1919 | 0.3129 | 660 | 0.0435 | 0.9847 | 0.7324 | 1.0 | 0.8455 |
| 0.1953 | 0.3224 | 680 | 0.0215 | 0.9936 | 0.8929 | 0.9615 | 0.9259 |
| 0.1978 | 0.3318 | 700 | 0.0260 | 0.9903 | 0.8333 | 0.9615 | 0.8929 |
| 0.311 | 0.3413 | 720 | 0.0168 | 0.9944 | 0.9091 | 0.9615 | 0.9346 |
| 0.226 | 0.3508 | 740 | 0.0166 | 0.9928 | 0.8525 | 1.0 | 0.9204 |
| 0.1394 | 0.3603 | 760 | 0.0195 | 0.9919 | 0.85 | 0.9808 | 0.9107 |
| 0.1702 | 0.3698 | 780 | 0.0358 | 0.9855 | 0.75 | 0.9808 | 0.85 |
| 0.1413 | 0.3792 | 800 | 0.0269 | 0.9911 | 0.8361 | 0.9808 | 0.9027 |
| 0.2015 | 0.3887 | 820 | 0.0238 | 0.9911 | 0.8361 | 0.9808 | 0.9027 |
| 0.1752 | 0.3982 | 840 | 0.0253 | 0.9895 | 0.8 | 1.0 | 0.8889 |
| 0.212 | 0.4077 | 860 | 0.0250 | 0.9911 | 0.8475 | 0.9615 | 0.9009 |
| 0.1556 | 0.4172 | 880 | 0.0247 | 0.9928 | 0.8909 | 0.9423 | 0.9159 |
| 0.1495 | 0.4266 | 900 | 0.0141 | 0.9919 | 0.85 | 0.9808 | 0.9107 |
| 0.1666 | 0.4361 | 920 | 0.0187 | 0.9936 | 0.9074 | 0.9423 | 0.9245 |
| 0.1475 | 0.4456 | 940 | 0.0199 | 0.9911 | 0.8361 | 0.9808 | 0.9027 |
| 0.1459 | 0.4551 | 960 | 0.0166 | 0.9919 | 0.8387 | 1.0 | 0.9123 |
| 0.1264 | 0.4646 | 980 | 0.0155 | 0.9936 | 0.8793 | 0.9808 | 0.9273 |
| 0.1389 | 0.4740 | 1000 | 0.0152 | 0.9944 | 0.9091 | 0.9615 | 0.9346 |
| 0.1433 | 0.4835 | 1020 | 0.0147 | 0.9944 | 0.9091 | 0.9615 | 0.9346 |
| 0.1389 | 0.4930 | 1040 | 0.0126 | 0.9968 | 0.9444 | 0.9808 | 0.9623 |
| 0.1452 | 0.5025 | 1060 | 0.0230 | 0.9903 | 0.8125 | 1.0 | 0.8966 |
| 0.1623 | 0.5120 | 1080 | 0.0128 | 0.9952 | 0.9259 | 0.9615 | 0.9434 |
| 0.1179 | 0.5215 | 1100 | 0.0156 | 0.9952 | 0.8966 | 1.0 | 0.9455 |
| 0.1256 | 0.5309 | 1120 | 0.0175 | 0.9952 | 0.9107 | 0.9808 | 0.9444 |
| 0.1536 | 0.5404 | 1140 | 0.0221 | 0.9919 | 0.85 | 0.9808 | 0.9107 |
| 0.1873 | 0.5499 | 1160 | 0.0234 | 0.9903 | 0.8125 | 1.0 | 0.8966 |
| 0.1234 | 0.5594 | 1180 | 0.0174 | 0.9944 | 0.9091 | 0.9615 | 0.9346 |
| 0.205 | 0.5689 | 1200 | 0.0162 | 0.9944 | 0.8814 | 1.0 | 0.9369 |
| 0.1975 | 0.5783 | 1220 | 0.0155 | 0.9944 | 0.9091 | 0.9615 | 0.9346 |
| 0.1302 | 0.5878 | 1240 | 0.0131 | 0.9960 | 0.9123 | 1.0 | 0.9541 |
| 0.2996 | 0.5973 | 1260 | 0.0172 | 0.9944 | 0.8814 | 1.0 | 0.9369 |
| 0.1381 | 0.6068 | 1280 | 0.0145 | 0.9936 | 0.8929 | 0.9615 | 0.9259 |
| 0.1559 | 0.6163 | 1300 | 0.0142 | 0.9936 | 0.8929 | 0.9615 | 0.9259 |
| 0.1588 | 0.6257 | 1320 | 0.0194 | 0.9911 | 0.8361 | 0.9808 | 0.9027 |
| 0.1101 | 0.6352 | 1340 | 0.0149 | 0.9936 | 0.8929 | 0.9615 | 0.9259 |
| 0.1533 | 0.6447 | 1360 | 0.0152 | 0.9936 | 0.8667 | 1.0 | 0.9286 |
| 0.1567 | 0.6542 | 1380 | 0.0139 | 0.9928 | 0.8772 | 0.9615 | 0.9174 |
| 0.1771 | 0.6637 | 1400 | 0.0176 | 0.9936 | 0.8793 | 0.9808 | 0.9273 |
| 0.1531 | 0.6731 | 1420 | 0.0142 | 0.9928 | 0.8772 | 0.9615 | 0.9174 |
| 0.1366 | 0.6826 | 1440 | 0.0157 | 0.9928 | 0.8525 | 1.0 | 0.9204 |
| 0.1597 | 0.6921 | 1460 | 0.0149 | 0.9944 | 0.8814 | 1.0 | 0.9369 |
| 0.1527 | 0.7016 | 1480 | 0.0156 | 0.9919 | 0.8387 | 1.0 | 0.9123 |
| 0.2199 | 0.7111 | 1500 | 0.0137 | 0.9944 | 0.8814 | 1.0 | 0.9369 |
| 0.2215 | 0.7205 | 1520 | 0.0140 | 0.9944 | 0.8814 | 1.0 | 0.9369 |
| 0.199 | 0.7300 | 1540 | 0.0167 | 0.9936 | 0.8667 | 1.0 | 0.9286 |
| 0.1505 | 0.7395 | 1560 | 0.0106 | 0.9960 | 0.9273 | 0.9808 | 0.9533 |
| 0.2082 | 0.7490 | 1580 | 0.0146 | 0.9928 | 0.8525 | 1.0 | 0.9204 |
| 0.1431 | 0.7585 | 1600 | 0.0108 | 0.9944 | 0.8947 | 0.9808 | 0.9358 |
| 0.1388 | 0.7680 | 1620 | 0.0125 | 0.9952 | 0.8966 | 1.0 | 0.9455 |
| 0.1221 | 0.7774 | 1640 | 0.0100 | 0.9960 | 0.9273 | 0.9808 | 0.9533 |
| 0.1571 | 0.7869 | 1660 | 0.0108 | 0.9960 | 0.9273 | 0.9808 | 0.9533 |
| 0.1472 | 0.7964 | 1680 | 0.0115 | 0.9944 | 0.8947 | 0.9808 | 0.9358 |
| 0.1236 | 0.8059 | 1700 | 0.0121 | 0.9936 | 0.8793 | 0.9808 | 0.9273 |
| 0.1477 | 0.8154 | 1720 | 0.0113 | 0.9952 | 0.9107 | 0.9808 | 0.9444 |
| 0.1781 | 0.8248 | 1740 | 0.0139 | 0.9936 | 0.8793 | 0.9808 | 0.9273 |
| 0.1517 | 0.8343 | 1760 | 0.0129 | 0.9936 | 0.8793 | 0.9808 | 0.9273 |
| 0.1811 | 0.8438 | 1780 | 0.0123 | 0.9944 | 0.8947 | 0.9808 | 0.9358 |
| 0.1592 | 0.8533 | 1800 | 0.0136 | 0.9944 | 0.8814 | 1.0 | 0.9369 |
| 0.0799 | 0.8628 | 1820 | 0.0149 | 0.9944 | 0.8814 | 1.0 | 0.9369 |
| 0.1294 | 0.8722 | 1840 | 0.0130 | 0.9944 | 0.8814 | 1.0 | 0.9369 |
| 0.2076 | 0.8817 | 1860 | 0.0121 | 0.9936 | 0.8793 | 0.9808 | 0.9273 |
| 0.1628 | 0.8912 | 1880 | 0.0121 | 0.9936 | 0.8793 | 0.9808 | 0.9273 |
| 0.156 | 0.9007 | 1900 | 0.0128 | 0.9936 | 0.8793 | 0.9808 | 0.9273 |
| 0.0762 | 0.9102 | 1920 | 0.0128 | 0.9936 | 0.8793 | 0.9808 | 0.9273 |
| 0.1372 | 0.9196 | 1940 | 0.0124 | 0.9936 | 0.8793 | 0.9808 | 0.9273 |
| 0.0837 | 0.9291 | 1960 | 0.0123 | 0.9936 | 0.8793 | 0.9808 | 0.9273 |
| 0.1353 | 0.9386 | 1980 | 0.0126 | 0.9936 | 0.8793 | 0.9808 | 0.9273 |
| 0.1914 | 0.9481 | 2000 | 0.0127 | 0.9936 | 0.8793 | 0.9808 | 0.9273 |
| 0.1072 | 0.9576 | 2020 | 0.0127 | 0.9936 | 0.8793 | 0.9808 | 0.9273 |
| 0.2165 | 0.9671 | 2040 | 0.0129 | 0.9936 | 0.8793 | 0.9808 | 0.9273 |
| 0.1417 | 0.9765 | 2060 | 0.0127 | 0.9936 | 0.8793 | 0.9808 | 0.9273 |
| 0.101 | 0.9860 | 2080 | 0.0126 | 0.9936 | 0.8793 | 0.9808 | 0.9273 |
| 0.2136 | 0.9955 | 2100 | 0.0126 | 0.9936 | 0.8793 | 0.9808 | 0.9273 |

### Framework versions

- PEFT 0.13.2
- Transformers 4.46.0
- PyTorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.20.3
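
To verify that a local environment matches the versions above, a quick sanity check could look like the sketch below (package names only; the exact CUDA build of PyTorch depends on the platform).

```python
# Quick sanity check against the framework versions reported above.
import datasets
import peft
import tokenizers
import torch
import transformers

expected = {
    "peft": "0.13.2",
    "transformers": "4.46.0",
    "torch": "2.5.1+cu124",
    "datasets": "3.1.0",
    "tokenizers": "0.20.3",
}
installed = {
    "peft": peft.__version__,
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    print(f"{name}: installed {installed[name]} (card reports {want})")
```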