Update README.md
README.md CHANGED
@@ -8,7 +8,7 @@ tags:
 ---
 
 # msmarco-MiniLM-L6-cos-v5
-This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a
+This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 384 dimensional dense vector space and was designed for **semantic search**. It has been trained on 500k (query, answer) pairs from the [MS MARCO Passages dataset](https://github.com/microsoft/MSMARCO-Passage-Ranking). For an introduction to semantic search, have a look at: [SBERT.net - Semantic Search](https://www.sbert.net/examples/applications/semantic-search/README.html)
 
 
 ## Usage (Sentence-Transformers)
@@ -112,7 +112,7 @@ In the following some technical details how this model must be used:
 
 | Setting | Value |
 | --- | :---: |
-| Dimensions |
+| Dimensions | 384 |
 | Produces normalized embeddings | Yes |
 | Pooling-Method | Mean pooling |
 | Suitable score functions | dot-product (`util.dot_score`), cosine-similarity (`util.cos_sim`), or euclidean distance |
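As a quick illustration of the settings documented in the table above, here is a minimal semantic-search sketch with the sentence-transformers library. The model id `sentence-transformers/msmarco-MiniLM-L6-cos-v5` and the example texts are assumptions for illustration; the `util.dot_score` / `util.cos_sim` score functions come from the table itself.

```python
from sentence_transformers import SentenceTransformer, util

# Assumed model id, derived from the card title; adjust if the repo path differs.
model = SentenceTransformer("sentence-transformers/msmarco-MiniLM-L6-cos-v5")

# Encode one query and a few candidate passages (example texts are made up).
query_emb = model.encode("How many people live in London?")
doc_emb = model.encode([
    "Around 9 million people live in London.",
    "London is known for its financial district.",
])

# The card says the embeddings are normalized, so dot-product and
# cosine-similarity give the same ranking; dot-product is simply cheaper.
scores = util.dot_score(query_emb, doc_emb)[0].tolist()
print(scores)
```

Because the embeddings are already normalized (see the settings table), `util.dot_score` and `util.cos_sim` rank passages identically; the dot-product variant just skips the extra normalization step.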