Journal of Shandong University (Natural Science), 2019, 54(7): 68-76. DOI: 10.6040/j.issn.1671-9352.1.2018.120
Chang-ying HAO1,2, Yan-yan LAN1,2,*, Hai-nan ZHANG1,2, Jia-feng GUO1,2, Jun XU1,2, Liang PANG1,2, Xue-qi CHENG1,2
Abstract:
In a conversation, people usually reply according to the keywords of the other party's previous utterance. To generate responses that are relevant to the meaning of these keywords, a dialogue generation model with an expanded-keyword attention mechanism is proposed. First, keywords are extracted from the input utterance; then, words related to each keyword are found by the cosine similarity of their word embeddings, forming an expanded keyword set. The word vectors of this set are incorporated into the decoding process through an attention mechanism to influence response generation. Experiments on a Chinese Weibo dataset and an English Twitter dataset show that the model outperforms the compared models in both the relevance and the diversity of the generated responses.
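To make the procedure described in the abstract concrete, the sketch below illustrates the two steps it outlines: expanding the extracted keywords with their nearest neighbours under word-embedding cosine similarity, and attending over the expanded keyword vectors at each decoding step. This is a minimal illustration under stated assumptions, not the authors' published implementation; all names (expand_keywords, KeywordAttentionDecoderCell, top_k) and the additive attention form are hypothetical choices.

```python
# Minimal sketch (not the authors' code) of expanded-keyword attention decoding.
# Assumes a pretrained embedding matrix (e.g. word2vec vectors) and batch size 1.
import torch
import torch.nn as nn
import torch.nn.functional as F


def expand_keywords(keyword_ids, embedding_matrix, top_k=5):
    """Return the union of each keyword and its top_k cosine-similar words."""
    normed = F.normalize(embedding_matrix, dim=1)        # unit-length rows: (vocab, dim)
    expanded = set(keyword_ids)
    for wid in keyword_ids:
        sims = normed @ normed[wid]                      # cosine similarity to every word
        sims[wid] = -1.0                                 # exclude the keyword itself
        expanded.update(sims.topk(top_k).indices.tolist())
    return sorted(expanded)


class KeywordAttentionDecoderCell(nn.Module):
    """One GRU decoding step that also attends over the expanded-keyword vectors."""

    def __init__(self, emb_dim, hidden_dim, vocab_size):
        super().__init__()
        self.gru = nn.GRUCell(2 * emb_dim, hidden_dim)   # input word emb + keyword context
        self.attn = nn.Linear(hidden_dim + emb_dim, 1)   # additive attention score
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, prev_word_emb, hidden, keyword_embs):
        # prev_word_emb: (1, emb_dim); hidden: (1, hidden_dim); keyword_embs: (n, emb_dim)
        n = keyword_embs.size(0)
        scores = self.attn(torch.cat([hidden.expand(n, -1), keyword_embs], dim=1))
        alpha = F.softmax(scores, dim=0)                               # weights over keywords
        context = (alpha * keyword_embs).sum(dim=0, keepdim=True)     # (1, emb_dim)
        hidden = self.gru(torch.cat([prev_word_emb, context], dim=1), hidden)
        return self.out(hidden), hidden                                # vocab logits, new state
```

In a full seq2seq model this cell would take the place of the plain decoder step; keyword extraction from the input utterance and model training are omitted here.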