Arabic Question Generation Model
This model is ready to use for the question generation task: given a context passage and an answer, it generates a question for which that answer is the response. It is a fine-tuned version of the AraT5-Base model.
Live Demo
Get a question from a given context and answer: Arabic QG Model
Model in Action 🚀
# Requirements: !pip install transformers
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model = AutoModelForSeq2SeqLM.from_pretrained("MIIB-NLP/Arabic-question-generation")
tokenizer = AutoTokenizer.from_pretrained("MIIB-NLP/Arabic-question-generation")

def get_question(context, answer):
    # Build the input in the "context: ... answer: ..." format the model was fine-tuned on.
    text = "context: " + context + " " + "answer: " + answer + " </s>"
    text_encoding = tokenizer.encode_plus(text, return_tensors="pt")

    # Generate the question with beam search.
    model.eval()
    generated_ids = model.generate(
        input_ids=text_encoding['input_ids'],
        attention_mask=text_encoding['attention_mask'],
        max_length=64,
        num_beams=5,
        num_return_sequences=1
    )

    # Decode and strip the "question: " prefix from the output.
    return tokenizer.decode(generated_ids[0], skip_special_tokens=True,
                            clean_up_tokenization_spaces=True).replace('question: ', ' ')
# Example (context gloss: the Algerian Revolution broke out on 1 November 1954 against the French colonizer and lasted 7 and a half years; more than 1.5 million Algerians were martyred in it).
context = "الثورة الجزائرية أو ثورة المليون شهيد، اندلعت في 1 نوفمبر 1954 ضد المستعمر الفرنسي ودامت 7 سنوات ونصف. استشهد فيها أكثر من مليون ونصف مليون جزائري"
answer = " 7 سنوات ونصف"  # "7 and a half years"

get_question(context, answer)
# Output: question = "كم استمرت الثورة الجزائرية؟"  ("How long did the Algerian Revolution last?")
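For scoring many (context, answer) pairs at once, the same model can be run in batches, optionally on a GPU. Below is a minimal sketch of such batched inference, assuming standard transformers/PyTorch usage; the get_questions helper, batch size, and device handling are illustrative additions and not part of the original example.

import torch

# Batched-inference sketch (assumed usage, not part of the original card).
device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)

def get_questions(contexts, answers, batch_size=8):
    questions = []
    for i in range(0, len(contexts), batch_size):
        # Build one "context: ... answer: ..." input per pair in the batch.
        texts = [
            "context: " + c + " " + "answer: " + a + " </s>"
            for c, a in zip(contexts[i:i + batch_size], answers[i:i + batch_size])
        ]
        enc = tokenizer(texts, padding=True, truncation=True, return_tensors="pt").to(device)
        with torch.no_grad():
            generated = model.generate(
                input_ids=enc["input_ids"],
                attention_mask=enc["attention_mask"],
                max_length=64,
                num_beams=5,
            )
        questions += [
            tokenizer.decode(g, skip_special_tokens=True,
                             clean_up_tokenization_spaces=True).replace("question: ", " ")
            for g in generated
        ]
    return questions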
Details of AraT5
The AraT5 model was presented in AraT5: Text-to-Text Transformers for Arabic Language Generation by El Moatez Billah Nagoudi, AbdelRahim Elmadany, and Muhammad Abdul-Mageed.
Contacts
Mihoubi Akram Fawzi: LinkedIn | GitHub | mihhakram@gmail.com
Ibrir Adel: LinkedIn | GitHub | adelibrir2015@gmail.com
Evaluation results (self-reported)
- BLEU-1: 37.62
- BLEU-2: 27.80
- BLEU-3: 20.89
- BLEU-4: 15.87
- METEOR: 33.19
- ROUGE-L: 43.37
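The card does not state how these scores were computed. One way to compute comparable BLEU, METEOR, and ROUGE-L scores on your own test set is the Hugging Face evaluate library, sketched below; the library choice and the toy prediction/reference strings are assumptions, not taken from the original evaluation.

# Sketch: scoring generated questions with the `evaluate` library (assumed setup,
# not necessarily how the self-reported scores above were produced).
# pip install evaluate nltk rouge_score
import evaluate

predictions = ["كم استمرت الثورة الجزائرية؟"]   # model-generated questions (toy example)
references = [["كم دامت الثورة الجزائرية؟"]]    # one list of gold references per prediction

bleu = evaluate.load("bleu")      # 'precisions' in the result holds the 1- to 4-gram precisions
meteor = evaluate.load("meteor")
rouge = evaluate.load("rouge")    # result includes 'rougeL'

print(bleu.compute(predictions=predictions, references=references))
print(meteor.compute(predictions=predictions, references=references))
print(rouge.compute(predictions=predictions, references=references))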