《山东大学学报(理学版)》 (Journal of Shandong University, Natural Science) ›› 2021, Vol. 56 ›› Issue (11): 24-30. doi: 10.6040/j.issn.1671-9352.1.2020.043
TANG Guang-yuan1,2, GUO Jun-jun1,2, YU Zheng-tao1,2, ZHANG Ya-fei1,2, GAO Sheng-xiang1,2
Abstract: To address the insufficient use of legal knowledge in traditional law-article recommendation methods, this paper proposes a knowledge-driven law-article recommendation method for the judicial domain built on a pre-trained BERT model. First, the law-article knowledge and the case description are each represented with pre-trained BERT, and a bidirectional LSTM extracts features from the case-description text. An attention mechanism then extracts case-description features fused with law-article knowledge, from which the final article recommendation is made. On the public CAIL (法研杯) dataset, the method achieves an F1 score of 0.88 for law-article recommendation. The results show that the knowledge-fused BERT model significantly improves law-article recommendation and can effectively resolve the recommendation of easily confused law articles.
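The core fusion step described above can be illustrated in isolation: each token feature from the case-description encoder attends over the law-article representations, and the attended knowledge context is concatenated back onto the token features. This is a minimal numpy sketch under assumed dimensions; the paper's actual model produces `case_feats` with BERT + BiLSTM and `law_feats` with BERT, and the function name and shapes here are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def fuse_law_knowledge(case_feats, law_feats):
    """Attend from each case-description token to the law-article
    representations and concatenate the attended context.

    case_feats: (T, d) token features of the case description
    law_feats:  (K, d) representations of K candidate law articles
    returns:    (T, 2d) knowledge-fused token features
    """
    scores = case_feats @ law_feats.T       # (T, K) dot-product similarity
    weights = softmax(scores, axis=-1)      # attention over law articles
    context = weights @ law_feats           # (T, d) attended knowledge
    return np.concatenate([case_feats, context], axis=-1)

rng = np.random.default_rng(0)
fused = fuse_law_knowledge(rng.normal(size=(6, 8)), rng.normal(size=(4, 8)))
print(fused.shape)  # (6, 16)
```

The fused features would then feed a classifier over the article labels; dot-product scoring is one common attention variant, chosen here only for brevity.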