llama-3.2-1B-instruct-dpo / adapter_config.json

Commit History

Uploading merged DPO-trained model
4238367
verified

anshikaagarwal committed on