BEIR Exp. NFCORPUS with colbertv2.0
#7
by tsatsu - opened
I ran the BEIR experiments in this repo, and they reproduced the numbers in the paper with the checkpoint file "msmarco.psg.l2/checkpoints/colbert-300000.dnn".
Then I saved a .dnn file for colbertv2.0 with the code below:
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "colbert-ir/colbertv2.0"

# Load the colbertv2.0 weights and tokenizer from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Wrap the state dict in the same .dnn checkpoint layout used by the repo.
checkpoint = {
    'epoch': 0,
    'batch': 0,
    'model_state_dict': model.state_dict(),
    # 'optimizer_state_dict': optimizer.state_dict(),
    # 'arguments': {}
}
torch.save(checkpoint, "colbertv2.dnn")
It can still run, but the results are completely broken:
NDCG@10 = 0.009
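For reference, here is a minimal sketch (paths as above, loaded on CPU) that compares the model_state_dict keys of the working checkpoint and the converted one; a key mismatch between the two would be one possible cause of the broken scores:

import torch

# Paths to the two checkpoints mentioned above.
working_path = "msmarco.psg.l2/checkpoints/colbert-300000.dnn"
converted_path = "colbertv2.dnn"

working = torch.load(working_path, map_location="cpu")
converted = torch.load(converted_path, map_location="cpu")

working_keys = set(working["model_state_dict"].keys())
converted_keys = set(converted["model_state_dict"].keys())

# Keys present in only one of the two state dicts point to a mismatch
# between what was saved and what the loader expects.
print("only in working:  ", sorted(working_keys - converted_keys)[:10])
print("only in converted:", sorted(converted_keys - working_keys)[:10])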
Did I miss something?
Thanks.