Kukedlc/NeutriX7000-7b-DPO
Tags: Merge · mergekit · #dpo · MaximeLabonne · #mergeofmerge
License: apache-2.0
Commit History (branch: main)
Update README.md · 950bcbc (verified) · Kukedlc committed on Feb 14
Update README.md · f722b0c (verified) · Kukedlc committed on Feb 11
Update README.md · a07eb60 (verified) · Kukedlc committed on Feb 11
initial commit · 37bdf66 (verified) · Kukedlc committed on Feb 11