Add new SentenceTransformer model.
{
  "tokenizer_class": "sentence_transformers.models.tokenizer.WhitespaceTokenizer.WhitespaceTokenizer",
  "update_embeddings": false,
  "max_seq_length": 1000000
}
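A config like this is read back when a saved SentenceTransformer model is reloaded: the dotted `tokenizer_class` path identifies the tokenizer implementation, and the remaining keys are its settings. Below is a minimal, stdlib-only sketch of parsing such a file; the loading mechanics shown are illustrative assumptions, not the library's exact code.

```python
import json

# The tokenizer config shown above, inlined for a self-contained example.
CONFIG_JSON = """
{
  "tokenizer_class": "sentence_transformers.models.tokenizer.WhitespaceTokenizer.WhitespaceTokenizer",
  "update_embeddings": false,
  "max_seq_length": 1000000
}
"""

config = json.loads(CONFIG_JSON)

# Split the dotted path into a module path and a class name. A loader
# would import the module and pass the remaining keys to the class
# constructor (an assumption about usage, for illustration only).
module_path, class_name = config["tokenizer_class"].rsplit(".", 1)

print(module_path)  # sentence_transformers.models.tokenizer.WhitespaceTokenizer
print(class_name)   # WhitespaceTokenizer
print(config["max_seq_length"])  # 1000000
```

Note that `max_seq_length` is effectively unbounded here (one million whitespace tokens), and `update_embeddings: false` indicates the embedding weights tied to this tokenizer's vocabulary are kept frozen.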