Citation: Xiang Zhang and Qiang Yang. Transfer Hierarchical Attention Network for Generative Dialog System. International Journal of Automation and Computing, vol. 16, no. 6, pp. 720-736, 2019. DOI: 10.1007/s11633-019-1200-0

Transfer Hierarchical Attention Network for Generative Dialog System

Abstract: In generative dialog systems, learning representations for the dialog context is a crucial step in generating high-quality responses. A dialog system must capture useful and compact information from mutually dependent sentences so that the generation process can effectively attend to the central semantics. Unfortunately, existing methods may not effectively identify the importance distribution over lower-level positions when computing an upper-level feature, which can lead to the loss of information critical to the final context representation. To address this issue, we propose a transfer learning based method named the transfer hierarchical attention network (THAN). The THAN model leverages useful prior knowledge from two related auxiliary tasks, i.e., keyword extraction and sentence entailment, to facilitate dialog representation learning for the main dialog generation task. During the transfer process, syntactic structure and semantic relationships from the auxiliary tasks are distilled to enhance both the word-level and sentence-level attention mechanisms of the dialog system. Empirically, extensive experiments on the Twitter Dialog Corpus and the PERSONA-CHAT dataset demonstrate the effectiveness of the proposed THAN model compared with state-of-the-art methods.
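The abstract describes a hierarchical attention encoder in which word-level attention pools word states into utterance vectors and sentence-level attention pools utterance states into the dialog-context representation, with both attention modules enhanced by knowledge transferred from the keyword extraction and sentence entailment tasks. The following is a minimal sketch of the generic hierarchical attention backbone only, not the authors' implementation: the GRU encoders, additive attention form, module names, and dimensions are illustrative assumptions, and the transfer components of THAN are omitted.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AdditiveAttention(nn.Module):
    # Additive attention pooling: scores each position, softmax-normalizes them
    # into an importance distribution, and returns the weighted sum of states.
    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, dim)
        self.query = nn.Linear(dim, 1, bias=False)

    def forward(self, states):                               # states: (batch, seq_len, dim)
        scores = self.query(torch.tanh(self.proj(states)))   # (batch, seq_len, 1)
        weights = F.softmax(scores, dim=1)                    # importance distribution
        return (weights * states).sum(dim=1), weights         # pooled: (batch, dim)

class HierarchicalContextEncoder(nn.Module):
    # Word-level attention builds utterance vectors; sentence-level attention
    # builds the dialog-context vector that a response decoder would attend to.
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.word_rnn = nn.GRU(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        self.sent_rnn = nn.GRU(2 * hid_dim, hid_dim, batch_first=True, bidirectional=True)
        self.word_attn = AdditiveAttention(2 * hid_dim)   # in THAN, enhanced via keyword extraction
        self.sent_attn = AdditiveAttention(2 * hid_dim)   # in THAN, enhanced via sentence entailment

    def forward(self, dialog):                             # dialog: (batch, n_sents, n_words) token ids
        b, n_sents, n_words = dialog.shape
        words = self.embed(dialog.view(b * n_sents, n_words))
        word_states, _ = self.word_rnn(words)
        sent_vecs, _ = self.word_attn(word_states)          # one vector per utterance
        sent_vecs = sent_vecs.view(b, n_sents, -1)
        sent_states, _ = self.sent_rnn(sent_vecs)
        context, _ = self.sent_attn(sent_states)            # (batch, 2*hid_dim) context representation
        return context

For example, HierarchicalContextEncoder(vocab_size=10000)(torch.randint(0, 10000, (2, 4, 12))) returns a (2, 512) dialog-context tensor for a batch of two dialogs with four utterances of twelve tokens each.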
