What is the difference between t5-v1_1-xxl-encoder-f16.gguf and t5xxl_fp16.safetensors?
#10 opened 4 days ago by Geralt28
How to use the model?
1 reply · #8 opened 3 months ago by AIer0107
How do I use this? Can't load the T5 GGUF with the CLIP-L safetensor
12 replies · #6 opened 3 months ago by MANOFAi94
For the fastest inference on 12GB VRAM, are the following GGUF models appropriate to use?
3 replies · #4 opened 3 months ago by ViratX
Comparisons to FP8 e4m3fn?
1 reply · #3 opened 3 months ago by NielsGx
Where do I put it? Which folder, please?
2 replies · #2 opened 3 months ago by Ashkacha
How do I load t5-v1_1-xxl-encoder-gguf?
11 replies · #1 opened 3 months ago by YuFeiLiu