Lifelong Representation Learning for NLP Applications
Thesis

Posted on 2020-05-01, authored by Hu Xu

Representation learning lies at the heart of deep learning for natural language processing (NLP).
Traditional representation learning (e.g., softmax-based classification, pre-trained word embeddings, language models, and graph representations) focuses on learning general or static representations in the hope of helping any end task. As the world keeps evolving, emerging knowledge (such as new tasks, domains, entities, or relations) typically comes with only small amounts of data under shifted distributions, which challenges the effectiveness of existing representations. As a result, how to effectively learn representations for new knowledge becomes crucial. Lifelong learning is a machine learning paradigm that aims to build an AI agent that keeps learning from the evolving world, much as humans do. This dissertation focuses on improving representations for different types of new knowledge (classification, word-level, contextual-level, and knowledge-graph) across a range of NLP end tasks, from text classification, sentiment analysis, entity recognition, and question answering to more complex dialog systems. With lifelong representation learning, model performance on these tasks improves substantially over existing general representation learning.
History

Advisor: Yu, Philip S.; Liu, Bing
Chair: Yu, Philip S.
Department: Computer Science
Degree Grantor: University of Illinois at Chicago
Degree Level: Doctoral
Degree name: PhD, Doctor of Philosophy
Committee Members: Gmytrasiewicz, Piotr; Parde, Natalie; Xie, Sihong
Submitted date: May 2020
Thesis type: application/pdf
Language: en