Update README.md
Safe-CLIP was introduced in the paper **Safe-CLIP: Removing NSFW Concepts from Vision-and-Language Models**.
Based on the CLIP model, Safe-CLIP is fine-tuned to sever the association between unsafe linguistic and visual concepts, ensuring **safer outputs** in text-to-image and image-to-text retrieval and generation tasks.
## NSFW Definition
In our work, with inspiration taken from this [paper](https://arxiv.org/abs/2211.05105), we define NSFW as a finite and fixed set of concepts that are considered inappropriate, offensive, or harmful to individuals. These concepts are divided into seven categories: _hate, harassment, violence, self-harm, sexual, shocking and illegal activities_.
#### Use with Transformers
See the snippet below for usage with Transformers:
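A minimal sketch of such a snippet, assuming the checkpoint name `aimagelab/safeclip_vit-l_14` and the standard `CLIPModel`/`CLIPProcessor` classes from Transformers (the exact model ID and example image URL are assumptions, not taken from this card):

```python
import requests
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

# Assumed checkpoint name; substitute the actual Safe-CLIP model ID.
model_id = "aimagelab/safeclip_vit-l_14"

model = CLIPModel.from_pretrained(model_id)
processor = CLIPProcessor.from_pretrained(model_id)

# Example image from the COCO validation set (any PIL image works).
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

texts = ["a photo of a cat", "a photo of a dog"]
inputs = processor(text=texts, images=image, return_tensors="pt", padding=True)

outputs = model(**inputs)
# Image-text similarity scores, one column per caption.
probs = outputs.logits_per_image.softmax(dim=1)
print(probs)
```

Because Safe-CLIP keeps the CLIP architecture, the resulting embeddings can be dropped into any existing CLIP-based retrieval or generation pipeline.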