University of Washington

Li, Dianqi. Deep Generative Models for Natural Language Generation.

Degree: PhD, 2021, University of Washington

Natural language generation plays an important role in language intelligence, an essential topic in artificial intelligence in recent years. Recent advances in generative models combined with deep neural networks have achieved tremendous success in many natural language generation tasks. Establishing suitable and effective generative models is the key challenge for researchers seeking to fulfill different language generation purposes under varied application scenarios. This thesis focuses on investigating and providing better deep generative models for various natural language generation tasks. It consists of two parts. The first part explores a ranking-based generative adversarial network for generating text. We first examine the limitations of the commonly used Generative Adversarial Networks (GANs) on text generation tasks, and propose a novel ranking-based generative adversarial network, RankGAN, for generating high-quality language descriptions. Rather than training the discriminator to learn and assign an absolute binary label to an individual data sample, the proposed RankGAN analyzes and ranks a collection of human-written and machine-written sentences against a reference group. Concretely, by viewing a set of data samples collectively and evaluating their quality through relative ranking scores, the discriminator makes better assessments, which in turn helps learn a better generator for text generation tasks. We then take a step further and apply RankGAN to image captioning. We explore how to generate captions that are not only accurate in describing an image but also diverse across different images. By ranking human-written captions above image-mismatched captions within the image-caption joint space, the corresponding caption generator effectively exploits the inherent characteristics of human language and generates more diverse captions.
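The abstract's central idea in this part is that the discriminator scores each sentence relative to a comparison set rather than with an absolute real/fake label. A minimal sketch of that relative-ranking computation, assuming sentence embeddings are already available (the `cosine`/`rank_scores` names and the toy softmax temperature `gamma` are illustrative, not the thesis's actual code):

```python
import math

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    return dot / (nu * nv)

def rank_scores(candidates, reference, gamma=1.0):
    """Relative ranking scores for a set of candidate sentence embeddings.

    Each candidate's relevance to the reference embedding is softmax-
    normalized over the whole comparison set, so a sentence is judged
    relative to its peers rather than given an absolute binary label.
    """
    relevance = [cosine(c, reference) for c in candidates]
    exps = [math.exp(gamma * r) for r in relevance]
    z = sum(exps)
    return [e / z for e in exps]
```

In this toy form, a candidate closely aligned with the reference group receives a higher share of the probability mass than its peers, which is the signal the abstract describes the generator learning from.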
In the second part, we focus on how to effectively edit inputs to generate new texts for specific natural language generation tasks, e.g., text style transfer and textual adversarial example generation. For text style transfer, we first examine the limitations and drawbacks of current generative models when data is limited. We then develop domain-adaptive text style transfer models that leverage massively available data from other domains to address data scarcity in the target domain. For textual adversarial example generation, whereas previous rule-based editing methods are agnostic to the input context, we propose a contextualized perturbation approach that generates fluent, grammatical adversaries with better textual similarity. We further investigate three different perturbations to construct a richer range of generation strategies, resulting in a higher attack success rate for the generated adversaries.

Advisors/Committee Members: Sun, Ming-Ting (advisor).
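The abstract mentions "three different perturbations" used to edit an input sentence. A plausible reading (common in contextualized-attack work) is token-level replace, insert, and merge operations, with candidate tokens proposed by a masked language model; the schematic below shows only the edit operations themselves, with all names and the assumption about the three operation types being illustrative rather than taken from the thesis:

```python
def replace_token(tokens, i, new):
    """Replace: substitute the token at position i with a candidate."""
    return tokens[:i] + [new] + tokens[i + 1:]

def insert_token(tokens, i, new):
    """Insert: add a candidate token before position i."""
    return tokens[:i] + [new] + tokens[i:]

def merge_tokens(tokens, i, new):
    """Merge: collapse the bigram at positions i and i+1 into one token."""
    return tokens[:i] + [new] + tokens[i + 2:]
```

In a contextualized attack, the candidate `new` for each edit position would be proposed and scored by a pretrained masked language model (omitted here), which is what keeps the resulting adversaries fluent and grammatical rather than context-agnostic.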

Subjects/Keywords: Deep Generative Model; Deep Learning; Generation; Natural Language Processing; Electrical engineering; Computer science



APA (6th Edition):

Li, D. (2021). Deep Generative Models for Natural Language Generation. (Doctoral Dissertation). University of Washington. Retrieved from http://hdl.handle.net/1773/46787

Chicago Manual of Style (16th Edition):

Li, Dianqi. “Deep Generative Models for Natural Language Generation.” 2021. Doctoral Dissertation, University of Washington. Accessed April 22, 2021. http://hdl.handle.net/1773/46787.

MLA Handbook (7th Edition):

Li, Dianqi. “Deep Generative Models for Natural Language Generation.” 2021. Web. 22 Apr 2021.

Vancouver:

Li D. Deep Generative Models for Natural Language Generation. [Internet] [Doctoral dissertation]. University of Washington; 2021. [cited 2021 Apr 22]. Available from: http://hdl.handle.net/1773/46787.

Council of Science Editors:

Li D. Deep Generative Models for Natural Language Generation. [Doctoral Dissertation]. University of Washington; 2021. Available from: http://hdl.handle.net/1773/46787
