Hellisotherpeople/toxic-dpo-v2-mistral7b-lora
Likes: 3
Libraries: PEFT · Safetensors
Dataset: unalignment/toxic-dpo-v0.2
Language: English
Tags: Not-For-All-Audiences
License: MIT
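The tags above mark this repository as a PEFT (LoRA) adapter for a Mistral 7B base, trained with the unalignment/toxic-dpo-v0.2 dataset. Below is a minimal loading sketch, assuming the repository exposes the usual adapter_config.json and adapter weight files (the partial file listing further down does not show them) and that the base checkpoint is mistralai/Mistral-7B-v0.1, which the card does not confirm:

```python
# Sketch only: base_id is an assumption; the card does not name the exact
# base checkpoint, and the adapter files may live at a different revision.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-v0.1"                        # assumed base model
adapter_id = "Hellisotherpeople/toxic-dpo-v2-mistral7b-lora"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")

# Attach the LoRA adapter weights on top of the frozen base model.
model = PeftModel.from_pretrained(base_model, adapter_id)

inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```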
Files and versions
1 contributor · History: 3 commits
Latest commit 6d75ab7 (verified) by Hellisotherpeople: "Upload rng_state.pth", 9 months ago
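The files listed below can also be fetched programmatically, pinned to the commit shown above. A short sketch using huggingface_hub (the returned paths point into the local Hub cache):

```python
from huggingface_hub import hf_hub_download, list_repo_files

repo_id = "Hellisotherpeople/toxic-dpo-v2-mistral7b-lora"
revision = "6d75ab7"  # the verified commit listed above

# Enumerate everything in the repository at that revision.
for name in list_repo_files(repo_id, revision=revision):
    print(name)

# Download a single file; the return value is its path in the local cache.
local_path = hf_hub_download(repo_id, "trainer_state.json", revision=revision)
print(local_path)
```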
.gitattributes        Safe     1.52 kB    initial commit               9 months ago
README.md             Safe     21 Bytes   initial commit               9 months ago
rng_state.pth         pickle   14.2 kB    LFS · Upload rng_state.pth   9 months ago
trainer_state.json    Safe     1.39 kB    Upload trainer_state.json    9 months ago

rng_state.pth is a raw pickle file; the Hub's scanner detected 7 pickle imports:
"torch.ByteStorage", "collections.OrderedDict", "torch._utils._rebuild_tensor_v2",
"numpy.dtype", "_codecs.encode", "numpy.ndarray", "numpy.core.multiarray._reconstruct"
(an inspection sketch follows the listing).
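Because rng_state.pth is a pickle rather than a safetensors file, it can be worth reproducing that import list locally before loading it. A minimal sketch, assuming the file has already been downloaded and was written by torch.save's default zip container with pickle protocol 2 (protocol 4+ pickles use STACK_GLOBAL rather than GLOBAL and would need extra handling):

```python
import pickletools
import zipfile

# Hypothetical local path: assumes rng_state.pth is in the working directory.
path = "rng_state.pth"

# torch.save() has used a zip container since PyTorch 1.6; the pickled
# payload sits in an archive member named "<prefix>/data.pkl".
with zipfile.ZipFile(path) as zf:
    pkl_member = next(n for n in zf.namelist() if n.endswith("data.pkl"))
    payload = zf.read(pkl_member)

# Walk the opcode stream without executing the pickle and collect the
# classes referenced by GLOBAL opcodes (what the Hub's scanner reports).
imports = set()
for opcode, arg, _pos in pickletools.genops(payload):
    if opcode.name == "GLOBAL":
        module, name = arg.split(" ", 1)
        imports.add(f"{module}.{name}")

print(sorted(imports))
```

Recent PyTorch versions also offer torch.load(path, weights_only=True), which refuses pickles whose globals fall outside an allowlist; whether this particular file passes that check depends on the installed PyTorch version.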