base_model_name: meta-llama/Llama-2-7b-hf
batch_size: 4
cot: true
dataset_name: BENBENBENb/ARC1000COT
epochs: 20
eval_strategy: epoch
learning_rate: 0.0001
logging_steps: 1
output_dir: brettbbb/llama_finetune_arc_20_cot
seed: 42
warmup_steps: 5
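Below is a minimal Python sketch of how a config like this might be loaded and mapped onto Hugging Face TrainingArguments. It is not the repository's actual training script; the file name "config.yaml" and the field-to-argument mapping are assumptions for illustration.

# Hypothetical loader: reads the YAML above and builds TrainingArguments.
import yaml
from transformers import TrainingArguments

with open("config.yaml") as f:
    cfg = yaml.safe_load(f)

training_args = TrainingArguments(
    output_dir=cfg["output_dir"],
    num_train_epochs=cfg["epochs"],
    per_device_train_batch_size=cfg["batch_size"],
    learning_rate=cfg["learning_rate"],
    logging_steps=cfg["logging_steps"],
    warmup_steps=cfg["warmup_steps"],
    evaluation_strategy=cfg["eval_strategy"],
    seed=cfg["seed"],
)

print(training_args)

Fields such as base_model_name, dataset_name, and cot would be consumed elsewhere in the training pipeline (model and dataset loading, chain-of-thought prompt formatting), which is not shown here.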