Downstream Transformer Generation of Question-Answer Pairs with Preprocessing and Postprocessing Pipelines

EasyChair Preprint 8739, 8 pages. Date: August 29, 2022

Abstract: We present a method to perform a downstream transformer task: generating question-answer pairs (QAPs) from a given article. We first finetune pretrained transformers on QAP datasets. We then use a preprocessing pipeline to select appropriate answers from the article, and feed each answer with its relevant context to the finetuned transformer to generate a candidate QAP. Finally, we use a postprocessing pipeline to filter out inadequate QAPs. In particular, using pretrained T5 models as transformers and the SQuAD dataset as the finetuning dataset, we obtain a finetuned T5 model that outperforms previous models on standard performance measures over the SQuAD dataset. We then show that our method, based on this finetuned model, generates a satisfactory number of high-quality QAPs on the Gaokao-EN dataset, as assessed by human judges.

Keyphrases: Information Extraction, Natural Language Generation, Natural Language Processing, neural networks, question generation
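The three-stage pipeline described in the abstract (answer selection, QAP generation, and filtering) can be sketched roughly as follows. This is an illustrative skeleton, not the paper's implementation: the answer-selection heuristic, the stub in place of the finetuned T5 model, and the filtering rule are all simplified assumptions.

```python
import re

def select_answers(article):
    # Preprocessing stage: pick candidate answer phrases. A crude
    # stand-in (capitalized words) for the paper's selection rules.
    return sorted(set(re.findall(r"\b[A-Z][a-z]+\b", article)))

def generate_qap(answer, context, model=None):
    # The paper feeds each answer and its context to a finetuned T5
    # model; here a stub question is produced unless a model callable
    # (e.g. a wrapped transformer) is supplied.
    if model is not None:
        return model(answer, context)
    return (f"What does the text say about {answer}?", answer)

def filter_qaps(qaps, context):
    # Postprocessing stage: a toy filter that keeps only QAPs whose
    # answer actually occurs in the context.
    return [(q, a) for q, a in qaps if a in context]

def pipeline(article):
    answers = select_answers(article)
    qaps = [generate_qap(a, article) for a in answers]
    return filter_qaps(qaps, article)
```

In practice the `model` argument would wrap a T5 checkpoint finetuned on SQuAD, and both the selection and filtering stages would apply the linguistic criteria the paper develops rather than these placeholders.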