Stylized Dialogue Generation

Shih Wen Ke, Wei Liang Chen

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Dialogue systems such as intelligent online customer services, online chatbots, and smart kiosks are becoming increasingly popular. However, current dialogue systems lack personality and the ability to respond according to context. In this study, we propose an approach that transfers text into multiple styles when generating dialogue responses. Building a stylized dialogue system is especially challenging because it combines supervised and unsupervised tasks: in practice, dialogue data are usually paired, i.e., each query is paired with a response, whereas styled text is not. We therefore propose using lightweight deep neural network models to bridge the dialogue response generation model and the style transfer model. This structure allows the model to generate responses of different styles to the same query. Our approach will be evaluated against selected state-of-the-art dialogue generation and style transfer techniques.
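The abstract's key idea is a lightweight bridging network between a pre-trained response generator and a style transfer model. The sketch below is a minimal, hypothetical illustration of that idea in PyTorch; the module names, dimensions, and the MLP-based bridge design are assumptions for illustration only, not the architecture reported in the paper.

```python
# Hypothetical sketch: a lightweight "bridge" network maps the latent state of a
# response generation model into the latent space expected by a style-transfer
# decoder, conditioned on a style label. All names and sizes are illustrative.
import torch
import torch.nn as nn

class StyleBridge(nn.Module):
    """Maps a generator latent vector plus a style id to a style-decoder latent."""
    def __init__(self, gen_dim=512, style_dim=64, dec_dim=512, num_styles=3):
        super().__init__()
        self.style_emb = nn.Embedding(num_styles, style_dim)
        # Lightweight two-layer MLP acting as the bridging network.
        self.mlp = nn.Sequential(
            nn.Linear(gen_dim + style_dim, dec_dim),
            nn.ReLU(),
            nn.Linear(dec_dim, dec_dim),
        )

    def forward(self, gen_latent, style_id):
        s = self.style_emb(style_id)              # (batch, style_dim)
        z = torch.cat([gen_latent, s], dim=-1)    # condition generator latent on style
        return self.mlp(z)                        # latent fed to the style decoder

# Toy usage: one query latent, rendered in three different target styles.
bridge = StyleBridge()
query_latent = torch.randn(1, 512)               # e.g. from a seq2seq response generator
for style in range(3):
    dec_latent = bridge(query_latent, torch.tensor([style]))
    print(style, dec_latent.shape)                # torch.Size([1, 512])
```

Under this reading, the generator and style-transfer models can be trained on their respective paired and unpaired data, while only the small bridge is learned to connect them.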

Original language: English
Title of host publication: 2021 IEEE International Conference on Industrial Engineering and Engineering Management, IEEM 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1456-1460
Number of pages: 5
ISBN (Electronic): 9781665437714
DOIs
State: Published - 2021
Event: 2021 IEEE International Conference on Industrial Engineering and Engineering Management, IEEM 2021 - Virtual, Online, Singapore
Duration: 13 Dec 2021 - 16 Dec 2021

Publication series

Name: 2021 IEEE International Conference on Industrial Engineering and Engineering Management, IEEM 2021

Conference

Conference: 2021 IEEE International Conference on Industrial Engineering and Engineering Management, IEEM 2021
Country/Territory: Singapore
City: Virtual, Online
Period: 13/12/21 - 16/12/21

Keywords

  • Deep learning
  • Dialogue generation
  • Sequence-to-sequence
  • Style transfer
