JOURNAL OF SHANDONG UNIVERSITY(NATURAL SCIENCE) ›› 2021, Vol. 56 ›› Issue (11): 24-30.doi: 10.6040/j.issn.1671-9352.1.2020.043


Law recommendation method driven by BERT and legal knowledge

TANG Guang-yuan1,2, GUO Jun-jun1,2, YU Zheng-tao1,2, ZHANG Ya-fei1,2, GAO Sheng-xiang1,2

  1. Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650500, Yunnan, China;
    2. Yunnan Key Laboratory of Artificial Intelligence, Kunming University of Science and Technology, Kunming 650500, Yunnan, China
  Published: 2021-11-15

Abstract: To address the insufficient use of legal knowledge in traditional law recommendation methods, this article combines the pre-trained BERT (bidirectional encoder representations from transformers) model and proposes a knowledge-driven law recommendation method for the judicial domain. First, the legal knowledge and the case description are each encoded with the pre-trained BERT model; a bidirectional LSTM then extracts features from the case description text, and an attention mechanism fuses these features with the legal knowledge, finally realizing intelligent recommendation of law articles. On the public Law Research Cup dataset, the method reaches an F1 value of 0.88 for law recommendation. This result shows that a BERT model fused with legal knowledge can significantly improve law recommendation and can effectively resolve the problem of recommending easily confused law articles.
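The fusion step described in the abstract (case-description features attended against legal-provision knowledge) can be sketched as follows. This is an illustrative NumPy sketch, not the authors' implementation: the function name `knowledge_attention`, the dot-product scoring, and the max-pooling over tokens are all assumptions, and the random matrices stand in for trained BERT/BiLSTM encoder outputs.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def knowledge_attention(case_feats, law_embeds):
    """Fuse case-description features with legal-provision embeddings.

    case_feats: (T, d) per-token BiLSTM outputs for the case description
    law_embeds: (K, d) encoder embeddings of K legal provisions
    Returns (K,) attention weights over provisions and a (d,) fused vector.
    """
    scores = case_feats @ law_embeds.T      # (T, K) token-provision affinities
    attn = softmax(scores.max(axis=0))      # (K,) weight per provision
    fused = attn @ law_embeds               # (d,) knowledge-weighted context
    return attn, fused

rng = np.random.default_rng(0)
case = rng.normal(size=(5, 8))              # 5 case tokens, dim 8
laws = rng.normal(size=(3, 8))              # 3 candidate provisions
attn, fused = knowledge_attention(case, laws)
```

The fused vector would then be concatenated with the case representation and fed to a classifier over law articles; the paper's actual scoring and pooling choices may differ.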

Key words: law recommendation, BERT model, legal knowledge fusion, attention mechanism

CLC Number: TP391
[1] LAUDERDALE B E, CLARK T S. The supreme courts many median justices[J]. American Political Science Review, 2012, 106(4):847-866.
[2] SEGAL J A. Predicting supreme court cases probabilistically: the search and seizure cases, 1962-1981[J]. American Political Science Review, 1984, 78(4): 891-900.
[3] ALETRAS N, TSARAPATSANIS D, PREOTIUC-PIETRO D, et al. Predicting judicial decisions of the European court of human rights: a natural language processing perspective[J]. PeerJ Computer Science, 2016. https://peerj.com/articles/cs-93/.
[4] LIU Y H, CHEN Y L, HO W L. Predicting associated statutes for legal problems[J]. Information Processing & Management, 2015, 51(1): 194-211.
[5] LONG W, TANG Y, TIAN Y. Investor sentiment identification based on the universum SVM[J]. Neural Computing and Applications, 2018, 30(2): 661-670.
[6] LUO B F, FENG Y S, XU J B, et al. Learning to predict charges for criminal cases with legal basis[C] //Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. Copenhagen: ACL, 2017: 2727-2736.
[7] JIANG X, YE H, LUO Z C, et al. Interpretable rationale augmented charge prediction system[C] //Proceedings of the 27th International Conference on Computational Linguistics: System Demonstrations. Santa Fe: ACL, 2018: 146-151.
[8] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[EB/OL].(2018-10-11)[2020-05-18]. https://arxiv.org/pdf/1810.04805.
[9] CHEN Q, ZHUO Z, WANG W. BERT for joint intent classification and slot filling[EB/OL].(2019-02-28)[2020-05-18]. https://arxiv.org/pdf/1902.10909.
[10] ADHIKARI A, RAM A, TANG R, et al. DocBERT: BERT for document classification[EB/OL].(2019-04-17)[2020-05-18]. https://arxiv.org/abs/1904.08398?context=cs.
[11] ALBERTI C, LEE K, COLLINS M. A BERT baseline for the natural questions[EB/OL].(2019-01-24)[2020-05-18]. https://arxiv.org/abs/1901.08634.
[12] JI Z C, WEI Q, XU H. Bert-based ranking for biomedical entity normalization[EB/OL].(2019-08-09)[2020-05-18]. https://arxiv.org/abs/1908.03548.
[13] MAO J, LIU W. Factuality classification using the pre-trained language representation model BERT[C] //Proceedings of the Iberian Languages Evaluation Forum(IberLEF 2019). Bilbao: CEUR Workshop Proceedings, 2019: 126-131.
[14] LI W, ZHAO J. TextRank algorithm by exploiting Wikipedia for short text keywords extraction[C] //Proceedings of the 2016 3rd International Conference on Information Science and Control Engineering(ICISCE). Tokyo: IEEE, 2016: 683-686.
[15] XIAO C J, ZHONG H X, GUO Z P, et al. CAIL 2018: a large-scale legal dataset for judgment prediction[EB/OL].(2018-07-04)[2020-05-18]. https://arxiv.org/abs/1807.0247.
[16] CAI J, LI J, LI W, et al. Deep learning model used in text classification[C] //Proceedings of the 2018 15th International Computer Conference on Wavelet Active Media Technology and Information Processing(ICCWAMTIP). Tokyo: IEEE, 2018: 123-126.
[17] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C] //Proceedings of the 31st International Conference on Neural Information Processing Systems. Long Beach: ACM, 2017: 6000-6010.