malenia1 committed
Commit b62ee0b
1 Parent(s): 3e65550

Upload README.md with huggingface_hub

Files changed (1)
  1. README.md +23 -3
README.md CHANGED
@@ -7750,11 +7750,31 @@ tags:
  ---
  # ternary-weight-embedding

- A ternary-weight text embedding model based on xiaobu-embedding-v2 [1], fine-tuned on the nli-zh [2] dataset and part of t2ranking [3] (about 37 MB in total). The weights of every Linear layer take the values 1, 0, or -1. Inference time and storage drop to roughly 0.37x and 0.13x of the full-precision model.

- Sorry, the model cannot be used directly yet; the code will be released later.

  ## Reference
  1. https://huggingface.co/lier007/xiaobu-embedding-v2
  2. https://huggingface.co/datasets/shibing624/nli-zh-all/tree/main/sampled_data
- 3. https://huggingface.co/datasets/sentence-transformers/t2ranking/tree/main/triplet
 
 
  ---
  # ternary-weight-embedding

+ A ternary-weight text embedding model based on xiaobu-embedding-v2 [1], fine-tuned on text from the nli-zh [2] and t2ranking [3] datasets. The weights of every Linear layer take the values 1, 0, or -1. Inference time and storage drop to roughly 0.37x (measured on an A800) and 0.13x of the full-precision model.
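+
+ The model card does not include the quantization code, so the snippet below is only a minimal sketch of one common ternarization scheme (per-tensor absmean scaling); the `ternarize` helper and its scaling rule are illustrative assumptions, not this model's actual implementation.
+ ```
+ import torch
+
+ def ternarize(weight: torch.Tensor):
+     """Illustrative absmean ternarization: map a full-precision weight
+     matrix to values in {-1, 0, +1} plus a single per-tensor scale."""
+     scale = weight.abs().mean()                      # per-tensor scale (assumption)
+     w = (weight / (scale + 1e-8)).round().clamp_(-1, 1)
+     return w.to(torch.int8), scale
+
+ # At inference, y = (x @ w.float().T) * scale stands in for x @ weight.T
+ W = torch.randn(1024, 1024)
+ w_t, s = ternarize(W)
+ print(w_t.unique(), s)
+ ```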

+
+ To use the model, install BitBLAS [4]:
+ ```
+ pip install bitblas
+ ```
+ The first run may take some time.
+
+
+ Test with Sentence-Transformers and MTEB:
+ ```
+ pip install -U sentence-transformers mteb
+ ```
+
+ ```
+ import mteb
+ from sentence_transformers import SentenceTransformer
+
+ model = SentenceTransformer('malenia1/ternary-weight-embedding', trust_remote_code=True)
+ print(model)
+
+ # Evaluate on the OnlineShopping task of the MTEB benchmark
+ tasks = mteb.get_tasks(tasks=["OnlineShopping"])
+ evaluation = mteb.MTEB(tasks=tasks)
+ results = evaluation.run(model, output_folder="results")
+ ```
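+
+ For a quick sanity check outside MTEB, the model can also be used with the plain Sentence-Transformers encode API. The sentences below are made-up examples and `util.cos_sim` is the standard Sentence-Transformers cosine-similarity helper, so treat this as a usage sketch rather than an official example.
+ ```
+ from sentence_transformers import SentenceTransformer, util
+
+ model = SentenceTransformer('malenia1/ternary-weight-embedding', trust_remote_code=True)
+
+ # Toy Chinese sentences (illustrative only)
+ sentences = ["这件衣服质量很好", "衣服的做工不错", "今天天气怎么样"]
+ embeddings = model.encode(sentences, normalize_embeddings=True)
+
+ # Cosine similarity between the first sentence and the other two
+ print(util.cos_sim(embeddings[0], embeddings[1:]))
+ ```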

  ## Reference
  1. https://huggingface.co/lier007/xiaobu-embedding-v2
  2. https://huggingface.co/datasets/shibing624/nli-zh-all/tree/main/sampled_data
+ 3. https://huggingface.co/datasets/sentence-transformers/t2ranking/tree/main/triplet
+ 4. https://github.com/microsoft/BitBLAS