Information
Paper title: Question Generation for Reading Comprehension of Language Learning Test
Authors: 単駿杰, 西原陽子, 山西良典, 前田亮
Abstract: In this paper, we propose a method that uses a seq2seq approach with the Transformer model to generate questions for the reading comprehension sections of language learning tests. First, the method uses an attention mechanism to extract sentences from a given article and pairs the extracted sentences with the corresponding question. Second, it trains a seq2seq model on these sentence-question pairs; once trained, the model generates a question when sentences are input. Evaluation results showed that more than 50% of the generated questions were appropriate for use in reading comprehension tests. Through an analysis of the generated questions, we identified two types: questions directly related to the given article (DR questions) and common questions (CM questions). We discussed why the model tends to generate more CM questions than DR questions, and through this discussion identified a future direction for improving the method so that it generates more DR questions. (An illustrative code sketch of this pipeline appears below, after the publication details.)
Publication: 2019 International Conference on Technologies and Applications of Artificial Intelligence (TAAI) (pp. 1-6). IEEE.
Presentation type: International conference paper
Presentation date: November 22, 2019
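The pipeline described in the abstract (pair extracted sentences with their question, train a seq2seq Transformer on the pairs, then generate a question from new sentences) can be sketched as follows. This is a minimal illustration under assumptions, not the authors' implementation: it uses the Hugging Face transformers library with a pretrained T5 checkpoint rather than the paper's own Transformer model, and the sentence-question pair shown is a hypothetical placeholder.

```python
# Illustrative sketch of a sentences->question seq2seq setup (not the paper's code).
import torch
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# Hypothetical sentence-question pair: sentences extracted from an article,
# paired with the reading-comprehension question they support.
pairs = [
    ("The company moved its headquarters to Osaka in 1990.",
     "When did the company move its headquarters to Osaka?"),
]

optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)
model.train()
for sentences, question in pairs:
    inputs = tokenizer("generate question: " + sentences, return_tensors="pt")
    labels = tokenizer(question, return_tensors="pt").input_ids
    # Teacher-forced seq2seq loss: the model learns to emit the question
    # conditioned on the extracted sentences.
    loss = model(input_ids=inputs.input_ids,
                 attention_mask=inputs.attention_mask,
                 labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

# Inference: generate a question from unseen sentences.
model.eval()
test = tokenizer("generate question: The museum opens at nine on weekdays.",
                 return_tensors="pt")
out = model.generate(test.input_ids, max_length=32, num_beams=4)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```

In this sketch the extraction step is assumed to have already produced the input sentences; the abstract's attention-based sentence extraction would sit upstream of this training loop.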