Add extra gated access information
This PR adds an extra snippet of text regarding the conditions to access this model. Happy to expand / iterate on it as needed!
I believe adding restrictions, even reasonable ones, to open source software (thus placing it outside that category for users who download it through Hugging Face) is generally not a good idea at all.
Also, since it is an agreement, has it been reviewed by a lawyer?
Sure, would the following stricter, but more precise, statement be OK?
By clicking on “Access repository” below, you also agree to not use the model to conduct experiments that involve human subjects.
That would exclude, for example, evaluating the non-automated part of TruthfulQA, since that involves human raters. Also, restricting it to humans who "know" it's a bot would rule out various forms of Turing tests. We could formulate it as "non-expecting" humans. Alternatively, I think we could just find a precise wording of "harm", such as "physical or mental damage", and keep the original statement.