
Journal of Shandong University (Natural Science) ›› 2023, Vol. 58 ›› Issue (1): 40-47. doi: 10.6040/j.issn.1671-9352.1.2021.048


A similar case matching method combining segment encoding and an affine mechanism

LAI Hua1,2, ZHANG Heng-tao1,2, XIAN Yan-tuan1,2*, HUANG Yu-xin1,2

  1. Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650500, Yunnan, China;
    2. Yunnan Key Laboratory of Artificial Intelligence, Kunming University of Science and Technology, Kunming 650500, Yunnan, China
  • Published: 2023-02-12
  • About the authors: LAI Hua (1966- ), male, M.S., associate professor; his research interests include intelligent information processing and the modeling and control of complex processes. E-mail: 405904235@qq.com. *Corresponding author: XIAN Yan-tuan (1981- ), male, Ph.D. candidate, associate professor; his research interests include natural language processing, information extraction, and machine translation. E-mail: xianyantuan@qq.com
  • Supported by:
    National Natural Science Foundation of China (61966020); National Key Research and Development Program of China (2018YFC0830104, 2018YFC0830105, 2018YFC0830100); Yunnan Provincial Basic Research Program (202001AT070046)

Abstract: Similar case matching (SCM) aims to judge whether the cases described in two judgment documents are similar. It is usually treated as a text matching problem over judgment documents and has important applications in judicial trials. Most existing deep learning models encode the long text of a case into a single vector representation, which makes it hard for a model to learn the subtle differences between judgment documents from long texts. Considering that the content of each part of a case document is relatively fixed, this paper proposes to split the long case text into multiple segments and encode them separately, so that fine-grained features of the different parts can be captured. At the same time, a learnable affine transformation is used to improve the similarity scoring module, so the model learns more subtle differences and case matching performance is further improved. Experimental results on the CAIL2019-SCM dataset show that the proposed method improves accuracy by 1.89% over existing methods.

Key words: similar case matching, text matching, legal intelligence, convolution, affine transformation
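
To make the two ideas in the abstract concrete, the following PyTorch sketch (an illustration written for this page, not the authors' released model) encodes each part of a judgment document separately with a shared convolutional encoder and scores a document pair with a learnable affine transformation, here a bilinear term plus a linear term. The segment count, the encoder choice, and all dimensions are assumptions.

```python
# Illustrative sketch only: per-segment encoding plus a learnable affine
# (bilinear + linear) similarity score, as described in the abstract.
import torch
import torch.nn as nn


class SegmentAffineMatcher(nn.Module):
    def __init__(self, vocab_size=8000, emb_dim=128, hid_dim=128, n_segments=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # One convolutional encoder shared by all segments.
        self.conv = nn.Conv1d(emb_dim, hid_dim, kernel_size=3, padding=1)
        doc_dim = hid_dim * n_segments
        # Learnable affine scoring: score = a^T W b + V [a; b] + bias.
        self.bilinear = nn.Bilinear(doc_dim, doc_dim, 1)
        self.linear = nn.Linear(2 * doc_dim, 1)

    def encode(self, segments):
        """segments: list of (batch, seq_len) token-id tensors, one per document part."""
        parts = []
        for seg in segments:
            x = self.embed(seg).transpose(1, 2)              # (batch, emb_dim, seq_len)
            h = torch.relu(self.conv(x)).max(dim=2).values   # max-pool over positions
            parts.append(h)
        return torch.cat(parts, dim=1)                       # (batch, hid_dim * n_segments)

    def forward(self, doc_a_segments, doc_b_segments):
        a = self.encode(doc_a_segments)
        b = self.encode(doc_b_segments)
        score = self.bilinear(a, b) + self.linear(torch.cat([a, b], dim=1))
        return torch.sigmoid(score.squeeze(-1))              # similarity in (0, 1)


if __name__ == "__main__":
    matcher = SegmentAffineMatcher()
    # Two documents, each split into 3 segments of 50 already-tokenized ids.
    doc_a = [torch.randint(1, 8000, (2, 50)) for _ in range(3)]
    doc_b = [torch.randint(1, 8000, (2, 50)) for _ in range(3)]
    print(matcher(doc_a, doc_b))  # tensor of 2 pairwise similarity scores
```

The point the sketch illustrates is that the bilinear term can weight interactions between individual segment features, so small differences that are localized in one part of a case are not washed out the way they would be with a single document vector and a plain cosine score.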

CLC number: 

  • TP391
[1] HALL P A V, DOWLING G R. Approximate string matching[J]. ACM Computing Surveys(CSUR), 1980, 12(4):381-402.
[2] SALTON G, BUCKLEY C. Term-weighting approaches in automatic text retrieval[J]. Information Processing & Management, 1988, 24(5):513-523.
[3] HUANG C H, YIN J, HOU F. A text similarity measurement combining word semantic information with TF-IDF method[J]. Chinese Journal of Computers, 2011, 34(5):856-864.
[4] NIRAULA N, BANJADE R, ȘTEFĂNESCU D, et al. Experiments with semantic similarity measures based on LDA and LSA[C] //Proceedings of the First International Conference on Statistical Language and Speech Processing. Berlin: Springer, 2013: 188-199.
[5] WANG Z Z, HE M, DU Y P. Text similarity computing based on topic model LDA[J]. Computer Science, 2013, 40(12):229-232.
[6] MIKOLOV T, CHEN K, CORRADO G, et al. Efficient estimation of word representations in vector space[EB/OL].(2013-09-07)[2021-07-01]. https://arxiv.org/abs/1301.3781.pdf.
[7] LE Q, MIKOLOV T. Distributed representations of sentences and documents[C] //Proceedings of the 31st International Conference on Machine Learning. Beijing: JMLR, 2014: 1188-1196.
[8] MUELLER J, THYAGARAJAN A. Siamese recurrent architectures for learning sentence similarity[C] //Proceedings of the AAAI Conference on Artificial Intelligence. Phoenix: AAAI Press, 2016, 30(1):2786-2792.
[9] REIMERS N, GUREVYCH I. Sentence-BERT: sentence embeddings using Siamese BERT-networks[C] //Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing(EMNLP-IJCNLP). Hong Kong: Association for Computational Linguistics, 2019: 3980-3990.
[10] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[C] //Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1(Long and Short Papers). Minneapolis: Association for Computational Linguistics, 2019: 4171-4186.
[11] WANG Z, HAMZA W, FLORIAN R. Bilateral multi-perspective matching for natural language sentences[C] //Proceedings of Twenty-sixth International Joint Conference on Artificial Intelligence. Melbourne: IJCAI, 2017: 4144-4150.
[12] CHEN Z, ZHANG H, ZHANG X, et al. Quora question pairs[EB/OL].(2018-05-25)[2021-07-01]. http://static.hongbozhang.me/doc/STAT_441_Report.pdf.
[13] YANG Y, YIH W, MEEK C. WikiQA: a challenge dataset for open-domain question answering[C] //Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. Lisbon: The Association for Computational Linguistics, 2015: 2013-2018.
[14] XIAO C, ZHONG H, GUO Z. CAIL2019-SCM: a dataset of similar case matching in legal domain[EB/OL].(2019-09-25)[2021-07-01]. https://arxiv.org/abs/1911.08962.pdf.
[15] HUANG P S, HE X, GAO J, et al. Learning deep structured semantic models for web search using clickthrough data[C] //Proceedings of the 22nd ACM International Conference on Information & Knowledge Management. San Francisco: ACM, 2013: 2333-2338.
[16] CHOPRA S, HADSELL R, LECUN Y. Learning a similarity metric discriminatively, with application to face verification[C] //Proceedings of 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition(CVPR'05). San Diego: IEEE, 2005: 539-546.
[17] SHEN Y, HE X, GAO J, et al. A latent semantic model with convolutional-pooling structure for information retrieval[C] //Proceedings of the 23rd ACM International Conference on Conference on Information and Knowledge Management. Shanghai: CIKM, 2014: 101-110.
[18] CHEN Q, ZHU X, LING Z, et al. Enhanced LSTM for natural language inference[C] //Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics. Vancouver: ACL, 2017: 1657-1668.
[19] ROCKTÄSCHEL T, GREFENSTETTE E, HERMANN K M, et al. Reasoning about entailment with neural attention[C] //Proceedings of 2016 International Conference on Learning Representations. San Juan: ICLR, 2016.
[20] SHAO Y, MAO J, LIU Y, et al. BERT-PLI: modeling paragraph-level interactions for legal case retrieval[C] //Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, IJCAI-20. Yokohama: IJCAI, 2020: 3501-3507.
[21] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C] //Proceedings of the 31st International Conference on Neural Information Processing Systems. Long Beach: NIPS, 2017: 5998-6008.
[22] DING S, SHANG J, WANG S, et al. ERNIE-Doc: a retrospective long-document modeling transformer[J/OL]. arXiv, 2020. https://arxiv.org/abs/2012.15688.pdf.
[23] HONG Z, ZHOU Q, ZHANG R, et al. Legal feature enhanced semantic matching network for similar case matching[C] //Proceedings of 2020 International Joint Conference on Neural Networks(IJCNN). Glasgow: IEEE, 2020: 1-8.