SeanLee97/bellm-llama-7b-nli
This repository provides the pretrained weights for BeLLM: Backward Dependency Enhanced Large Language Model for Sentence Embeddings (NAACL 2024).
For usage instructions, please refer to: https://github.com/4AI/BeLLM
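Below is a minimal sketch of how one might extract sentence embeddings from these weights with Hugging Face transformers. This is not the official BeLLM usage (the linked repository documents that); loading this repo ID directly, the padding setup, and the last-token pooling strategy are all assumptions made for illustration.

```python
# Hypothetical sketch: encode sentences with these weights via transformers.
# The official BeLLM pipeline (https://github.com/4AI/BeLLM) may differ.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "SeanLee97/bellm-llama-7b-nli"        # this repository
base_id = "NousResearch/Llama-2-7b-hf"           # base model listed below

tokenizer = AutoTokenizer.from_pretrained(base_id)
tokenizer.pad_token = tokenizer.eos_token         # LLaMA has no pad token by default

# Assumes the checkpoint loads as a causal LM (install `peft` if it is an adapter).
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)
model.eval()

@torch.no_grad()
def encode(sentences):
    inputs = tokenizer(sentences, padding=True, return_tensors="pt").to(model.device)
    outputs = model(**inputs, output_hidden_states=True)
    last_hidden = outputs.hidden_states[-1]              # (batch, seq_len, dim)
    # Pool the hidden state of the last non-padding token of each sentence.
    last_idx = inputs["attention_mask"].sum(dim=1) - 1   # index of last real token
    return last_hidden[torch.arange(last_hidden.size(0)), last_idx]

embs = encode(["A cat sits on the mat.", "A feline rests on the rug."])
sim = torch.nn.functional.cosine_similarity(embs[0], embs[1], dim=0)
print(f"cosine similarity: {sim.item():.4f}")
```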
Citation
@inproceedings{li2024bellm,
    title = "BeLLM: Backward Dependency Enhanced Large Language Model for Sentence Embeddings",
    author = "Li, Xianming and Li, Jing",
    booktitle = "Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics",
    year = "2024",
    publisher = "Association for Computational Linguistics"
}
Base model: NousResearch/Llama-2-7b-hf