Update README.md
README.md (CHANGED)

@@ -70,7 +70,7 @@ print(rank_result)
 Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling operation on top of the contextualized word embeddings.
 
 ```python
-
+import torch
 from transformers import AutoTokenizer, AutoModelForSequenceClassification
 
 model = AutoModelForSequenceClassification.from_pretrained('DiTy/cross-encoder-russian-msmarco')
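The hunk ends right after the model is loaded, so the rest of the raw-`transformers` usage is not visible here. As a minimal sketch of how such a snippet is typically completed for a cross-encoder like this (the query/passage strings, variable names, and the single-logit assumption below are illustrative, not taken from the README):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = 'DiTy/cross-encoder-russian-msmarco'

# Load the cross-encoder and its tokenizer (assumed to follow the usual
# sequence-classification layout: one relevance logit per query-passage pair).
model = AutoModelForSequenceClassification.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
model.eval()

# Illustrative inputs: the same query paired with two candidate passages.
queries = ["your query", "your query"]
passages = ["a relevant passage", "an unrelated passage"]

features = tokenizer(queries, passages, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    # Assuming a single relevance logit per pair; higher means more relevant.
    scores = model(**features).logits.squeeze(-1)

print(scores)
```

The added `import torch` line in the diff is what makes the `torch.no_grad()` inference step above possible without an extra dependency on sentence-transformers.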