DiTy committed on
Commit 0927c23
Parent: 761ee1c

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -70,7 +70,7 @@ print(rank_result)
 Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.
 
 ```python
-# import torch
+import torch
 from transformers import AutoTokenizer, AutoModelForSequenceClassification
 
 model = AutoModelForSequenceClassification.from_pretrained('DiTy/cross-encoder-russian-msmarco')
```
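The change above uncomments the `torch` import so the README's raw-`transformers` snippet can run end to end. A minimal sketch of that usage follows; the query/passage pairs are illustrative placeholders, and it assumes (as is typical for MS MARCO cross-encoders) that the model emits a single relevance logit per pair:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = 'DiTy/cross-encoder-russian-msmarco'
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
model.eval()

# Illustrative (query, passage) pairs -- any Russian text works here.
queries = ['как дела?', 'как дела?']
passages = ['всё хорошо, спасибо', 'погода сегодня солнечная']

# Cross-encoders score each query-passage pair jointly in one forward pass.
features = tokenizer(queries, passages, padding=True, truncation=True,
                     return_tensors='pt')
with torch.no_grad():
    # Assumes a single-logit head; sigmoid maps it to a [0, 1] relevance score.
    scores = torch.sigmoid(model(**features).logits).squeeze(-1)

print(scores)  # one relevance score per pair
```

Unlike a bi-encoder, there is no pooling of per-pair embeddings to manage here: the classification head already reduces each pair to one score, which is why this path skips the sentence-transformers wrapper entirely.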