Upload ONNX weights exported via optimum with `library='sentence-transformers'`
#64 opened by Xenova (HF staff)
Command run:

```shell
optimum-cli export onnx --model sentence-transformers/all-MiniLM-L6-v2 ./sbert/
```
Output:

```
Validating ONNX model sbert/model.onnx...
	-[✓] ONNX model output names match reference model (sentence_embedding, token_embeddings)
	- Validating ONNX Model output "token_embeddings":
		-[✓] (2, 16, 384) matches (2, 16, 384)
		-[✓] all values close (atol: 1e-05)
	- Validating ONNX Model output "sentence_embedding":
		-[✓] (2, 384) matches (2, 384)
		-[✓] all values close (atol: 1e-05)
```
Note: This is slightly different from https://huggingface.co/Xenova/all-MiniLM-L6-v2. That model has the `last_hidden_state` output name, whereas this model has `token_embeddings` and `sentence_embedding`.
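The relationship between the two outputs can be sketched with plain numpy: for this model, `sentence_embedding` should correspond to attention-mask-weighted mean pooling of `token_embeddings`. A minimal sketch under that assumption, using random stand-in data with the shapes from the validation log above (batch 2, sequence length 16, hidden size 384):

```python
import numpy as np

# Stand-in data matching the validation log shapes; in practice these
# come from the exported ONNX model's outputs and the tokenizer.
rng = np.random.default_rng(0)
token_embeddings = rng.standard_normal((2, 16, 384)).astype(np.float32)
attention_mask = np.ones((2, 16), dtype=np.int64)
attention_mask[1, 10:] = 0  # pretend the second sentence is shorter (padded)

# Masked mean pooling: average token embeddings over real (non-padding) tokens.
mask = attention_mask[:, :, None].astype(np.float32)
sentence_embedding = (token_embeddings * mask).sum(axis=1) / mask.sum(axis=1)

print(sentence_embedding.shape)  # (2, 384)
```

This is how sentence-transformers' mean-pooling layer works in general; the ONNX export here simply bakes that pooling step into the graph so consumers get both outputs directly.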
By the way, I reverted the newly-exported tokenizer and config files to keep the diff small, and just in case there are backwards-compatibility issues (there shouldn't be, though).
@Xenova does `optimum-cli export onnx` also work with fine-tuned SetFit models?
When I use the Hugging Face setfit library to export to ONNX, I see:
What are `token_embeddings` and `sentence_embedding`? Where is the predicted label, in `token_embeddings` or `sentence_embedding`?
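For the SetFit case, a hypothetical sketch of where the label would come from: in SetFit, the transformer body produces `sentence_embedding`, and a separate classification head (trained by the setfit library) maps that embedding to a label, so neither ONNX output above contains the label itself. The head parameters below are random stand-ins, not a real trained head:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the ONNX model's sentence_embedding output: shape (batch, 384).
sentence_embedding = rng.standard_normal((2, 384)).astype(np.float32)

# Hypothetical linear classification head; in practice its weights come from
# the trained SetFit head, which is exported separately from the body.
num_labels = 3
W = rng.standard_normal((384, num_labels)).astype(np.float32)
b = np.zeros(num_labels, dtype=np.float32)

logits = sentence_embedding @ W + b        # (2, 3)
predicted_labels = logits.argmax(axis=-1)  # one label per input sentence
print(predicted_labels.shape)  # (2,)
```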
tomaarsen changed pull request status to merged