yeliu918 committed
Commit 2390634
1 Parent(s): 236ead4

Update README.md

Files changed (1)
  1. README.md +12 -2
README.md CHANGED
@@ -3277,7 +3277,7 @@ license: cc-by-nc-4.0
 
 **SFR-Embedding by Salesforce Research.**
 
-The model is trained on top of [E5-mistral-7b-instruct](https://huggingface.co/intfloat/e5-mistral-7b-instruct) and [Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1). The model has 32 layers and the embedding size is 4096.
+The model is trained on top of [E5-mistral-7b-instruct](https://huggingface.co/intfloat/e5-mistral-7b-instruct) and [Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1). The model has 32 layers and the embedding size is 4096.
 
 More technical details will be updated later.
 
@@ -3330,7 +3330,8 @@ embeddings = F.normalize(embeddings, p=2, dim=1)
 scores = (embeddings[:2] @ embeddings[2:].T) * 100
 print(scores.tolist())
 ```
-More technical details will be updated later.
+
+Full MTEB evaluation will be added soon.
 
 SFR-Embedding Team
 
@@ -3339,3 +3340,12 @@ SFR-Embedding Team
 * Semih Yavuz
 * Yingbo Zhou
 * Caiming Xiong
+
+This project is for research purposes only. Third-party datasets may be subject to additional terms and conditions under their associated licenses. Please refer to specific papers for more details:
+- [E5 Dataset](https://arxiv.org/abs/2212.03533)
+- [MTEB Dataset](https://arxiv.org/abs/2210.07316)
+- [Mistral Dataset](https://arxiv.org/abs/2310.06825)
+
+
+
+
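For context on the snippet the second hunk touches (`F.normalize` in the hunk header and the `scores` matrix product), the sketch below shows one way those lines typically fit together end to end. It is a minimal sketch only: the repo id `Salesforce/SFR-Embedding-Mistral`, the E5-mistral-style instruction prompt, and last-token pooling inherited from E5-mistral-7b-instruct are all assumptions not confirmed by this diff.

```python
# Minimal sketch, not the official README example. Assumed (not shown in this diff):
# repo id "Salesforce/SFR-Embedding-Mistral", E5-mistral-style instruction prompts,
# and last-token pooling inherited from E5-mistral-7b-instruct.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer


def last_token_pool(last_hidden_states: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    """Return the hidden state of each sequence's final non-padding token."""
    left_padding = (attention_mask[:, -1].sum() == attention_mask.shape[0])
    if left_padding:
        return last_hidden_states[:, -1]
    sequence_lengths = attention_mask.sum(dim=1) - 1
    batch_indices = torch.arange(last_hidden_states.shape[0], device=last_hidden_states.device)
    return last_hidden_states[batch_indices, sequence_lengths]


model_id = "Salesforce/SFR-Embedding-Mistral"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token  # Mistral tokenizers often ship without a pad token
model = AutoModel.from_pretrained(model_id)
model.eval()

task = "Given a web search query, retrieve relevant passages that answer the query"
input_texts = [
    # Two queries followed by two passages, matching the slicing in the diff snippet.
    f"Instruct: {task}\nQuery: how to bake bread",
    f"Instruct: {task}\nQuery: what is self-attention",
    "Knead the dough, let it rise, then bake it at 230 C for about 30 minutes.",
    "Self-attention lets every token in a sequence attend to every other token.",
]

batch = tokenizer(input_texts, padding=True, truncation=True, max_length=4096, return_tensors="pt")
with torch.no_grad():
    outputs = model(**batch)

embeddings = last_token_pool(outputs.last_hidden_state, batch["attention_mask"])
embeddings = F.normalize(embeddings, p=2, dim=1)  # the line quoted in the hunk header

# Query-passage similarity: with unit-normalized vectors this is cosine similarity x 100.
scores = (embeddings[:2] @ embeddings[2:].T) * 100
print(scores.tolist())
```

Because the embeddings are L2-normalized, the final matrix product is just cosine similarity scaled by 100, which is why the README snippet can print query-passage scores directly from the dot product.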