support for longer prompt and weighting using custom_pipeline
#29
by skytnt - opened
Now we can input prompts without the 77-token limit and adjust their weighting by using custom_pipeline="waifu-research-department/long-prompt-weighting-pipeline".
It requires diffusers>=0.4.0.
Check out waifu-research-department/long-prompt-weighting-pipeline for details.
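A minimal usage sketch of loading the custom pipeline via diffusers' `custom_pipeline` argument. The base checkpoint ("hakurei/waifu-diffusion"), the weighting syntax shown in the prompt, and the generation arguments are assumptions for illustration; check the pipeline repo for the exact supported options.

```python
# Sketch only: model id, prompt weighting syntax, and call arguments are assumed,
# not taken from the pipeline's documentation.
import torch
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(
    "hakurei/waifu-diffusion",  # assumed base model; any compatible SD checkpoint should work
    custom_pipeline="waifu-research-department/long-prompt-weighting-pipeline",
    torch_dtype=torch.float16,
).to("cuda")

# Prompts longer than 77 tokens are accepted; per-token weighting is adjusted
# with syntax like (word:1.2) in community long-prompt-weighting pipelines
# (assumed to apply here as well).
prompt = ("masterpiece, best quality, (1girl:1.2), silver hair, looking at viewer, "
          "detailed background, cherry blossoms, " * 5)
image = pipe(prompt, guidance_scale=7.5, num_inference_steps=50).images[0]
image.save("long_prompt.png")
```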