mt5-small for Turkish Question Generation
Automated question generation and question answering using text-to-text transformers by OBSS AI.
Citation 📜
@article{akyon2022questgen,
  author  = {Akyon, Fatih Cagatay and Cavusoglu, Ali Devrim Ekin and Cengiz, Cemil and Altinuc, Sinan Onur and Temizel, Alptekin},
  doi     = {10.3906/elk-1300-0632.3914},
  journal = {Turkish Journal of Electrical Engineering and Computer Sciences},
  title   = {{Automated question generation and question answering from Turkish texts}},
  url     = {https://journals.tubitak.gov.tr/elektrik/vol30/iss5/17/},
  year    = {2022}
}
Overview ✔️
Language model: mt5-small
Language: Turkish
Downstream-task: Extractive QA/QG, Answer Extraction
Training data: TQuADv2-train
Code: https://github.com/obss/turkish-question-generation
Paper: https://journals.tubitak.gov.tr/elektrik/vol30/iss5/17/
Hyperparameters
batch_size = 256
n_epochs = 15
base_LM_model = "mt5-small"
max_source_length = 512
max_target_length = 64
learning_rate = 1.0e-3
task_list = ["qa", "qg", "ans_ext"]
qg_format = "prepend"
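For reference, the hyperparameters above can be gathered into a single training configuration. The sketch below is illustrative only: the key names mirror the list in this card and are not guaranteed to match the exact argument names used by the training script in the linked repository.
# Illustrative training configuration assembled from the hyperparameters above.
# NOTE: key names mirror this card's list; they may differ from the actual
# argument names in the obss/turkish-question-generation training script.
training_config = {
    "base_model": "mt5-small",            # base language model
    "train_dataset": "TQuADv2-train",     # training split
    "batch_size": 256,
    "n_epochs": 15,
    "max_source_length": 512,
    "max_target_length": 64,
    "learning_rate": 1.0e-3,
    "task_list": ["qa", "qg", "ans_ext"], # multi-task: QA, QG, answer extraction
    "qg_format": "prepend",               # answer is prepended to the context for QG
}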
Performance
Evaluation results are reported in the paper linked above.
Usage 🔥
from core.api import GenerationAPI
generation_api = GenerationAPI('mt5-small-3task-prepend-tquad2', qg_format='prepend')
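# Turkish context passage used as input for all three tasks below.
# (Roughly: the model was trained on Turkish QA data; with the presented method,
# questions and answers can be generated automatically from Turkish texts.)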
context = """
Bu modelin eğitiminde, Türkçe soru cevap verileri kullanılmıştır.
Çalışmada sunulan yöntemle, Türkçe metinlerden otomatik olarak soru ve cevap
üretilebilir. Bu proje ile paylaşılan kaynak kodu ile Türkçe Soru Üretme
/ Soru Cevaplama konularında yeni akademik çalışmalar yapılabilir.
Projenin detaylarına paylaşılan Github ve Arxiv linklerinden ulaşılabilir.
"""
# a) Fully Automated Question Generation
generation_api(task='question-generation', context=context)
# b) Question Answering
question = "Bu model ne işe yarar?"
generation_api(task='question-answering', context=context, question=question)
# c) Answer Extraction
generation_api(task='answer-extraction', context=context)
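The tasks can also be chained: generate questions from a context, then answer each one with the same API instance. The loop below is a minimal sketch; it assumes the question-generation call returns an iterable of question strings, which may differ from the actual return structure of GenerationAPI.
# d) Chaining question generation and question answering (illustrative sketch;
#    assumes the QG call returns an iterable of question strings)
generated_questions = generation_api(task='question-generation', context=context)
for question in generated_questions:
    answer = generation_api(task='question-answering', context=context, question=question)
    print(question, '->', answer)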