---
license: apache-2.0
library_name: peft
tags:
- trl
- dpo
- generated_from_trainer
base_model: TheBloke/OpenHermes-2-Mistral-7B-GPTQ
model-index:
- name: mistral-dpo
  results: []
---

# mistral-dpo

This model is a fine-tuned version of [TheBloke/OpenHermes-2-Mistral-7B-GPTQ](https://huggingface.co/TheBloke/OpenHermes-2-Mistral-7B-GPTQ) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7572
- Rewards/chosen: 1.2167
- Rewards/rejected: 1.0623
- Rewards/accuracies: 0.5096
- Rewards/margins: 0.1544
- Logps/rejected: -175.8600
- Logps/chosen: -185.2850
- Logits/rejected: -2.4403
- Logits/chosen: -2.5187

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2
- training_steps: 250
- mixed_precision_training: Native AMP
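The training script itself is not part of this card. The snippet below is a minimal, untested sketch of how a comparable run could be set up with TRL's `DPOTrainer`, a LoRA adapter, and the GPTQ base model, using the hyperparameters listed above. It assumes a TRL release from the same era as the listed framework versions (0.7.x); the preference dataset, LoRA settings, `beta`, and sequence lengths are not documented and appear only as placeholders.

```python
import torch
from datasets import load_dataset
from peft import LoraConfig
from transformers import AutoModelForCausalLM, AutoTokenizer, GPTQConfig, TrainingArguments
from trl import DPOTrainer

base_model_id = "TheBloke/OpenHermes-2-Mistral-7B-GPTQ"

tokenizer = AutoTokenizer.from_pretrained(base_model_id)
tokenizer.pad_token = tokenizer.eos_token

# Load the 4-bit GPTQ base model; exllama kernels are typically disabled for training.
model = AutoModelForCausalLM.from_pretrained(
    base_model_id,
    quantization_config=GPTQConfig(bits=4, use_exllama=False),
    torch_dtype=torch.float16,
    device_map="auto",
)
model.config.use_cache = False

# Illustrative LoRA settings; the adapter's actual rank and target modules are not documented.
peft_config = LoraConfig(
    r=16,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)

# Placeholder preference dataset with "prompt", "chosen", and "rejected" columns;
# the data actually used for this model is not documented.
dataset = load_dataset("your-org/preference-dataset")

# Mirrors the hyperparameters listed above (the Adam betas/epsilon are the optimizer defaults).
training_args = TrainingArguments(
    output_dir="mistral-dpo",
    learning_rate=2e-4,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=2,
    max_steps=250,
    fp16=True,                     # Native AMP mixed precision
    evaluation_strategy="steps",
    eval_steps=10,                 # matches the evaluation cadence in the results table
    logging_steps=10,
)

trainer = DPOTrainer(
    model=model,
    ref_model=None,                # with a PEFT adapter, TRL uses the frozen base as the reference
    args=training_args,
    beta=0.1,                      # assumed TRL default; the value actually used is not documented
    train_dataset=dataset["train"],
    eval_dataset=dataset["test"],
    tokenizer=tokenizer,
    peft_config=peft_config,
    max_prompt_length=512,         # illustrative sequence limits
    max_length=1024,
)
trainer.train()
```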
### Training results

| Training Loss | Epoch | Step | Validation Loss | Rewards/chosen | Rewards/rejected | Rewards/accuracies | Rewards/margins | Logps/rejected | Logps/chosen | Logits/rejected | Logits/chosen |
|:-------------:|:-----:|:----:|:---------------:|:--------------:|:----------------:|:------------------:|:---------------:|:--------------:|:------------:|:---------------:|:-------------:|
| 0.7021 | 0.0 | 10 | 0.6890 | 0.1146 | 0.0999 | 0.5192 | 0.0147 | -185.4845 | -196.3061 | -2.3759 | -2.4612 |
| 0.6841 | 0.0 | 20 | 0.7001 | 0.0160 | 0.0098 | 0.5 | 0.0062 | -186.3854 | -197.2926 | -2.3872 | -2.4717 |
| 0.8876 | 0.0 | 30 | 0.7302 | -0.1955 | -0.1884 | 0.5385 | -0.0071 | -188.3676 | -199.4075 | -2.4070 | -2.4917 |
| 0.8433 | 0.0 | 40 | 0.7276 | -0.0970 | -0.0927 | 0.5288 | -0.0043 | -187.4101 | -198.4224 | -2.4156 | -2.4975 |
| 0.8355 | 0.0 | 50 | 0.7405 | 0.3519 | 0.3619 | 0.4808 | -0.0100 | -182.8643 | -193.9333 | -2.4256 | -2.5051 |
| 0.7391 | 0.0 | 60 | 0.7472 | 0.5734 | 0.5592 | 0.5481 | 0.0142 | -180.8914 | -191.7185 | -2.4426 | -2.5190 |
| 0.5922 | 0.01 | 70 | 0.7534 | 0.8179 | 0.7898 | 0.5192 | 0.0281 | -178.5854 | -189.2730 | -2.4396 | -2.5167 |
| 0.6762 | 0.01 | 80 | 0.7436 | 0.7843 | 0.7179 | 0.5385 | 0.0664 | -179.3046 | -189.6097 | -2.4306 | -2.5082 |
| 0.5934 | 0.01 | 90 | 0.7474 | 0.8646 | 0.7935 | 0.5192 | 0.0711 | -178.5482 | -188.8059 | -2.4360 | -2.5117 |
| 0.5773 | 0.01 | 100 | 0.7527 | 0.8864 | 0.8060 | 0.5481 | 0.0804 | -178.4233 | -188.5887 | -2.4363 | -2.5114 |
| 1.159 | 0.01 | 110 | 0.7513 | 0.7767 | 0.6900 | 0.5385 | 0.0867 | -179.5837 | -189.6852 | -2.4325 | -2.5061 |
| 0.5871 | 0.01 | 120 | 0.7514 | 0.6924 | 0.6190 | 0.5385 | 0.0733 | -180.2931 | -190.5286 | -2.4307 | -2.5040 |
| 0.6655 | 0.01 | 130 | 0.7515 | 0.5617 | 0.4862 | 0.5385 | 0.0755 | -181.6214 | -191.8357 | -2.4293 | -2.5029 |
| 0.5963 | 0.01 | 140 | 0.7489 | 0.4748 | 0.3917 | 0.5481 | 0.0831 | -182.5665 | -192.7043 | -2.4326 | -2.5072 |
| 0.7817 | 0.01 | 150 | 0.7466 | 0.5389 | 0.4527 | 0.5288 | 0.0862 | -181.9568 | -192.0635 | -2.4306 | -2.5059 |
| 0.7836 | 0.01 | 160 | 0.7399 | 0.5166 | 0.4148 | 0.5288 | 0.1017 | -182.3349 | -192.2867 | -2.4256 | -2.5014 |
| 0.6246 | 0.01 | 170 | 0.7478 | 0.9222 | 0.8063 | 0.5 | 0.1159 | -178.4202 | -188.2300 | -2.4449 | -2.5218 |
| 0.6159 | 0.01 | 180 | 0.7637 | 1.1539 | 1.0352 | 0.5 | 0.1187 | -176.1314 | -185.9132 | -2.4491 | -2.5259 |
| 0.9218 | 0.02 | 190 | 0.7670 | 1.1914 | 1.0684 | 0.4808 | 0.1230 | -175.7993 | -185.5382 | -2.4471 | -2.5233 |
| 0.8469 | 0.02 | 200 | 0.7670 | 1.2246 | 1.0991 | 0.5096 | 0.1255 | -175.4922 | -185.2060 | -2.4455 | -2.5220 |
| 0.5824 | 0.02 | 210 | 0.7601 | 1.2119 | 1.0773 | 0.5096 | 0.1346 | -175.7105 | -185.3338 | -2.4418 | -2.5188 |
| 0.5718 | 0.02 | 220 | 0.7590 | 1.2120 | 1.0736 | 0.5096 | 0.1384 | -175.7473 | -185.3322 | -2.4392 | -2.5168 |
| 0.7219 | 0.02 | 230 | 0.7578 | 1.2033 | 1.0583 | 0.4904 | 0.1450 | -175.9007 | -185.4198 | -2.4385 | -2.5165 |
| 2.6464 | 0.02 | 240 | 0.7570 | 1.2096 | 1.0575 | 0.5 | 0.1520 | -175.9079 | -185.3567 | -2.4392 | -2.5175 |
| 0.8964 | 0.02 | 250 | 0.7572 | 1.2167 | 1.0623 | 0.5096 | 0.1544 | -175.8600 | -185.2850 | -2.4403 | -2.5187 |

### Framework versions

- PEFT 0.8.2
- Transformers 4.37.0
- Pytorch 2.0.1+cu117
- Datasets 2.15.0
- Tokenizers 0.15.1
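Because the intended uses are not documented, the following is only a minimal, untested loading sketch: it attaches this PEFT adapter to the GPTQ base model for inference. The adapter repository id is a placeholder, and the ChatML-style prompt follows the convention of the OpenHermes-2 base model.

```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_model_id = "TheBloke/OpenHermes-2-Mistral-7B-GPTQ"
adapter_id = "your-org/mistral-dpo"  # placeholder: substitute the actual adapter repo or local path

tokenizer = AutoTokenizer.from_pretrained(base_model_id)
model = AutoModelForCausalLM.from_pretrained(
    base_model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)
model = PeftModel.from_pretrained(model, adapter_id)

# OpenHermes-2 models are prompted in ChatML format.
prompt = "<|im_start|>user\nWrite a haiku about autumn.<|im_end|>\n<|im_start|>assistant\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```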