Knowledge-Enhanced Neural Text Generation
thesis
posted on 2021-12-01, 00:00, authored by Ye Liu

Text generation is one of the most important yet challenging tasks in natural language processing. Although various neural generation models have been proposed to generate output text from input text, the input text alone usually provides limited knowledge for producing the desired output, so generation quality remains far from satisfactory in many real-world scenarios. To make machines express themselves like humans, I incorporate various forms of internal and external knowledge beyond the input text into the generation models to address this issue.
The first task is question refinement, which incorporates deep reinforcement learning that considers word-level rewards as an immediate signal and question-level rewards as a long-term signal. For the second task, in light of the fact that a knowledge graph can provide relational information to enhance reasoning capacity and supply adjunct words for each concept, I propose a novel knowledge-graph-augmented framework for generative commonsense reasoning. In the third task, I incorporate the explicit syntactic and semantic structures of language into a non-autoregressive Transformer for neural machine translation. In the fourth task, I introduce a Transformer-based pre-trained model with multi-granularity sparse attention for long-text summarization.
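As a rough illustration of the first task's reward design, below is a minimal Python sketch of how immediate word-level rewards might be blended with a delayed question-level reward via discounted returns. The function name blended_returns, the interpolation weight alpha, and the discounting scheme are illustrative assumptions, not the dissertation's actual formulation.

import numpy as np

def blended_returns(word_rewards, question_reward, gamma=0.99, alpha=0.5):
    """Hypothetical sketch: combine per-token (word-level) rewards with a
    single delayed (question-level) reward via discounted returns.

    word_rewards    : immediate rewards, one per generated token
    question_reward : scalar reward for the whole refined question
    gamma           : discount factor used when propagating the
                      question-level reward back to earlier timesteps
    alpha           : interpolation weight between the two signals
    """
    T = len(word_rewards)
    returns = np.zeros(T)
    running = question_reward  # terminal, sequence-level signal
    for t in reversed(range(T)):
        # Fold the question-level reward into every timestep's return.
        running = word_rewards[t] + gamma * running
        returns[t] = alpha * word_rewards[t] + (1.0 - alpha) * running
    return returns

# Example: three generated tokens with small immediate rewards and a
# larger reward for the final refined question.
print(blended_returns([0.1, 0.0, 0.2], question_reward=1.0))

Interpolating the two signals this way lets every decoding step receive both an immediate, token-level correction and a share of the long-term, question-level outcome.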
History

Advisor: Yu, Philip S.
Chair: Yu, Philip S.
Department: Computer Science
Degree Grantor: University of Illinois at Chicago
Degree Level: Doctoral
Degree name: PhD, Doctor of Philosophy
Committee Member: Zhang, Xinhua; Parde, Natalie; Zheleva, Elena; He, Lifang
Submitted date: December 2021
Thesis type: application/pdf
Language: en