Qwen2-7B-Instruct-DPO-math-beta0.5 / adapter_model.safetensors

Commit History

Upload adapter_model.safetensors with huggingface_hub
79c0c2e · verified

XiaoY1 committed on
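
The commit message indicates the adapter weights were pushed with the huggingface_hub client. A minimal sketch of how such an upload could be performed is shown below; the repo id "XiaoY1/Qwen2-7B-Instruct-DPO-math-beta0.5" and the local file path are assumptions inferred from the page header and committer name, not confirmed by the commit itself.

```python
# Minimal sketch: uploading a PEFT adapter file with huggingface_hub.
# The repo_id and local path below are assumptions based on the page header.
from huggingface_hub import HfApi

api = HfApi()
api.upload_file(
    path_or_fileobj="adapter_model.safetensors",           # local adapter weights
    path_in_repo="adapter_model.safetensors",              # destination path in the repo
    repo_id="XiaoY1/Qwen2-7B-Instruct-DPO-math-beta0.5",   # assumed repo id
    repo_type="model",
    commit_message="Upload adapter_model.safetensors with huggingface_hub",
)
```

Since the file is named adapter_model.safetensors, it is presumably a PEFT adapter; a typical way to consume such a repo would be PeftModel.from_pretrained on top of the corresponding base model, though the exact base checkpoint is not stated here.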