Continual Learning for Natural Language Generation in Task-oriented Dialog Systems

Fei Mi, Liangwei Chen, Mengjie Zhao, Minlie Huang, Boi Faltings


Abstract
Natural language generation (NLG) is an essential component of task-oriented dialog systems. Despite the recent success of neural approaches for NLG, they are typically developed offline for particular domains. To better fit real-life applications where new data arrive in a stream, we study NLG in a “continual learning” setting, expanding the model's knowledge to new domains or functionalities incrementally. The major challenge towards this goal is catastrophic forgetting: a continually trained model tends to forget the knowledge it has learned before. To this end, we propose a method called ARPER (Adaptively Regularized Prioritized Exemplar Replay), which replays prioritized historical exemplars together with an adaptive regularization technique based on Elastic Weight Consolidation. Extensive experiments on continually learning new domains and intents are conducted on MultiWoZ-2.0 to benchmark ARPER against a wide range of techniques. Empirical results demonstrate that ARPER significantly outperforms other methods by effectively mitigating catastrophic forgetting.
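The abstract combines two standard continual-learning ingredients: replaying a small set of stored exemplars from earlier tasks, and an Elastic Weight Consolidation (EWC) penalty that anchors important parameters near their previous values. The sketch below illustrates the general form of such an objective in plain Python; all function names are hypothetical, and this is not the paper's actual ARPER implementation (e.g. it omits the adaptive weighting and exemplar prioritization the paper contributes).

```python
def ewc_penalty(params, old_params, fisher, lam):
    """Quadratic penalty anchoring each parameter to its value after the
    previous task, weighted by a per-parameter Fisher importance estimate.
    Illustrative only: params are flat lists of floats, not model tensors."""
    return lam * sum(f * (p - q) ** 2
                     for p, q, f in zip(params, old_params, fisher))


def continual_loss(new_task_loss, replay_loss, params, old_params, fisher, lam):
    """Generic replay-plus-regularization objective: loss on the new task's
    data, plus loss on replayed historical exemplars, plus the EWC term."""
    return new_task_loss + replay_loss + ewc_penalty(params, old_params, fisher, lam)
```

For example, with a single parameter that moved from 0.0 to 1.0, Fisher importance 2.0, and lam = 0.5, the penalty is 0.5 * 2.0 * (1.0 - 0.0)^2 = 1.0; parameters the Fisher estimate deems unimportant (f near 0) are free to move for the new task.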
Anthology ID:
2020.findings-emnlp.310
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Editors:
Trevor Cohn, Yulan He, Yang Liu
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3461–3474
URL:
https://aclanthology.org/2020.findings-emnlp.310
DOI:
10.18653/v1/2020.findings-emnlp.310
Cite (ACL):
Fei Mi, Liangwei Chen, Mengjie Zhao, Minlie Huang, and Boi Faltings. 2020. Continual Learning for Natural Language Generation in Task-oriented Dialog Systems. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 3461–3474, Online. Association for Computational Linguistics.
Cite (Informal):
Continual Learning for Natural Language Generation in Task-oriented Dialog Systems (Mi et al., Findings 2020)
PDF:
https://aclanthology.org/2020.findings-emnlp.310.pdf