
# Question-Answer to Statement Converter

Converts a question-answer pair into a declarative statement, from https://github.com/jifan-chen/QA-Verification-Via-NLI.

See:

```bibtex
@article{chen2021can,
  title={Can NLI Models Verify QA Systems' Predictions?},
  author={Chen, Jifan and Choi, Eunsol and Durrett, Greg},
  journal={EMNLP Findings},
  year={2021}
}
```

Note: I am not the maintainer or original author; I am hosting the model here so that the Hugging Face APIs can be used to produce statements from question-answer pairs in downstream applications.

**TL;DR:**

We fine-tune a seq2seq model, T5-3B (Raffel et al., 2020), using the (a, q, d) pairs annotated by Demszky et al. (2018),

where a is the answer, q is the question, and d is the declarative sentence (i.e., a statement).

See Appendix B.2 of Chen et al. (2021) for more.

## Usage

The prompt should be `{question} {separator} {answer}`, where the separator is `</s>`.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained('domenicrosati/question_converter-3b')
model = AutoModelForSeq2SeqLM.from_pretrained('domenicrosati/question_converter-3b')

question = "Where in the world is Carmen Sandiego?"
answer = "She is in Abruzzo"

# Join the question and answer with the </s> separator expected by the model.
prompt = f'{question} </s> {answer}'
input_ids = tokenizer(prompt, return_tensors='pt').input_ids
output_ids = model.generate(input_ids)
responses = tokenizer.batch_decode(output_ids, skip_special_tokens=True)
print(responses)
# ['Carmen Sandiego is in Abruzzo.']
```
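
If you need to convert many question-answer pairs at once, a minimal batched sketch is below. The `</s>` prompt format follows the usage above; the extra example pair and the generation settings (`max_length`, `num_beams`) are illustrative assumptions, not values from the paper.

```python
# Minimal batched sketch: several (question, answer) pairs in one forward pass.
# The second pair and the max_length/num_beams values are illustrative assumptions.
pairs = [
    ("Where in the world is Carmen Sandiego?", "She is in Abruzzo"),
    ("Who wrote Hamlet?", "William Shakespeare"),
]
prompts = [f'{q} </s> {a}' for q, a in pairs]
inputs = tokenizer(prompts, return_tensors='pt', padding=True)
output_ids = model.generate(**inputs, max_length=64, num_beams=4)
statements = tokenizer.batch_decode(output_ids, skip_special_tokens=True)
print(statements)  # one declarative statement per input pair
```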

