
《山东大学学报(理学版)》 [Journal of Shandong University (Natural Science)] ›› 2021, Vol. 56 ›› Issue (3): 77-82. DOI: 10.6040/j.issn.1671-9352.4.2020.281


Local rough set model based on similarity relation

ZHANG Jie, ZHANG Yan-lan*

  1. School of Computer Science, Key Laboratory of Data Science and Intelligence Application, Fujian Province University, Fujian Province Key Laboratory of Granular Computing and Its Application, Minnan Normal University, Zhangzhou 363000, Fujian, China
  • Published: 2021-03-16
  • About the authors: ZHANG Jie (1990- ), male, master's student; his research interests include covering rough sets. E-mail: 13142832142@163.com. *Corresponding author: ZHANG Yan-lan (1983- ), female, PhD, professor; her research interests include uncertainty processing. E-mail: zhangppff@126.com
  • Supported by: the Program for Cultivating Outstanding Young Scientific Research Talents in Fujian Province Universities



Abstract: Rough set theory is a supervised learning model that generally requires a certain amount of labeled data to train classifiers. In many practical problems, however, most of the data are unlabeled, and labeling them is prohibitively expensive. Concept approximation is a key problem in rough set theory, and rough sets based on similarity relations broaden the range of problems to which the theory applies. To cope with the limited availability of labels and the low computational efficiency encountered in big data, this paper introduces a local rough set model under a similarity relation and proposes a concept approximation model with linear time complexity. Theoretical proofs and an illustrative example demonstrate the advantages of the concept approximation model in the local rough set based on a similarity relation.

Key words: local rough set, similarity relation, concept approximation, limited labeled data
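To make the idea behind the linear-time concept approximation concrete, the following Python sketch contrasts classical approximations under a similarity relation, which examine the whole universe, with local approximations that examine only the labeled target concept. It follows the general local rough set formulation of Qian et al. [8], with similarity classes taking the place of equivalence classes; the exact definitions and any precision parameters used in this paper may differ, and all names in the code are illustrative.

```python
# Illustrative sketch (not the authors' implementation): classical vs. local
# concept approximation under a similarity relation, in the spirit of the
# local rough sets of Qian et al. [8]. Names and definitions are assumptions.

def similarity_class(x, universe, similar):
    """s(x) = {y in U : similar(x, y)}; `similar` should at least be reflexive."""
    return {y for y in universe if similar(x, y)}

def classical_approximations(X, universe, similar):
    """Classical lower/upper approximations of X: every object of U is examined."""
    lower = {x for x in universe if similarity_class(x, universe, similar) <= X}
    upper = {x for x in universe if similarity_class(x, universe, similar) & X}
    return lower, upper

def local_approximations(X, universe, similar):
    """Local approximations of X: only the labeled objects in X are examined,
    so the number of similarity-class computations is linear in |X|."""
    lower = {x for x in X if similarity_class(x, universe, similar) <= X}
    upper = set().union(*(similarity_class(x, universe, similar) for x in X)) if X else set()
    return lower, upper

if __name__ == "__main__":
    # Toy data: objects are integers; two objects are similar when their values
    # differ by at most 1 (reflexive and symmetric, but not transitive).
    U = set(range(10))
    similar = lambda x, y: abs(x - y) <= 1
    X = {2, 3, 4, 8}  # the small labeled target concept
    print("classical:", classical_approximations(X, U, similar))
    print("local:    ", local_approximations(X, U, similar))
```

Because local_approximations computes a similarity class only for the |X| labeled objects, its cost grows with the size of the labeled concept rather than with the size of the universe, which is the source of the linear time complexity claimed in the abstract.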

CLC number: TP18

References
[1] PAWLAK Z. Rough sets[J]. International Journal of Computer and Information Sciences, 1982, 11:341-356.
[2] ZADEH L A. Fuzzy logic=computing with words[J]. IEEE Transactions on Fuzzy Systems, 1996, 4(2):103-111.
[3] RADZIKOWSKA A M, KERRE E E. A comparative study of fuzzy rough sets[J]. Fuzzy Sets and Systems, 2002, 126(2):137-155.
[4] STEPANIUK J. Similarity based rough sets and learning[C] //Proceedings of the Fourth International Workshop on Rough Sets, Fuzzy Sets, and Machine Discovery. Tokyo: [s.n.], 1996: 18-22.
[5] SLOWINSKI R, VANDERPOOTEN D. A generalized definition of rough approximations based on similarity[J]. IEEE Transactions on Knowledge and Data Engineering, 2000, 12(2):331-336.
[6] LIN T Y. Granular computing on binary relations I: data mining and neighborhood systems[C] //Proceedings of the Rough Sets in Knowledge Discovery. Heidelberg: Physica-Verlag, 1998: 107-121.
[7] ZHANG W X, LEUNG Y. Theory of including degrees and its applications to uncertainty inferences[C] //Proceedings of the 1996 Asian Fuzzy Systems Symposium. Kenting: [s.n.], 1996: 496-501.
[8] QIAN Yuhua, LIANG Xinyan, WANG Qi, et al. Local rough set: a solution to rough data analysis in big data[J]. International Journal of Approximate Reasoning, 2018, 97:38-63.
[9] HU X H, CERCONE N. Learning in relational databases: a rough set approach[J]. Computational Intelligence, 1995, 11(2):323-338.
[10] LIANG Jiye, WANG Feng, DANG Chuangyin, et al. An efficient rough feature selection algorithm with a multi-granulation view[J]. International Journal of Approximate Reasoning, 2012, 53(6):912-926.
[11] LIANG Jiye, WANG Feng, DANG Chuangyin, et al. A group incremental approach to feature selection applying rough set technique[J]. IEEE Transactions on Knowledge and Data Engineering, 2013, 26(2):294-308.
[12] PEDRYCZ W, VUKOVICH G. Feature analysis through information granulation and fuzzy sets[J]. Pattern Recognition, 2002, 35(4):825-834.
[13] QIAN Y H, LIANG J Y, PEDRYCZ W, et al. Positive approximation: an accelerator for attribute reduction in rough set theory[J]. Artificial Intelligence, 2010, 174(9):597-618.
[14] WANG Guoyin, YU Hong, YANG Dachun. Decision table reduction based on conditional information entropy[J]. Chinese Journal of Computers, 2002, 25(7):759-766.
[15] WANG Guoyin, ZHAO Jun, AN Jiujiang, et al. A comparative study of algebra viewpoint and information viewpoint in attribute reduction[J]. Fundamenta Informaticae, 2005, 68(3):289-301.
[16] 黄宜纯,杨霁琳,张贤勇,等. 基于相似关系的条件熵属性约简及其算法[J]. 数学的实践与认识, 2019, 49(2):168-177. HUANG Yichun, YANG Jilin, ZHANG Xianyong, et al. Conditional entropy attribute reduction based on similarity relationship and its algorithm [J]. Mathematics in Practice and Theory, 2019, 49(2):168-177.
[17] 刘瑶瑶. 基于局部粗糙集研究不完备信息系统的理论[J]. 智能计算机与应用, 2019, 9(5):121-124. LIU Yaoyao. Theory of incomplete information system based on local rough set research [J]. Intelligent Computer and Application, 2019, 9(5):121-124.
[18] 李进金. 基于粗糙集与概念格的知识系统模型[M]. 北京:科学出版社, 2013. LI Jinjin. Knowledge system model based on rough set and concept lattice [M]. Beijing: Science Press, 2013.
[19] 刘妍琼,钟波. 变精度粗糙集模型中β参数范围的确定[J]. 湖南理工学院学报(自然科学版), 2008, 21(1):11-13. LIU Yanqiong, ZHONG Bo. Determination of the β parameter range in the variable precision rough set model[J]. Journal of Hunan Institute of Science and Technology (Natural Sciences), 2008, 21(1):11-13.
[20] 周爱武,周闪闪,邹武.一种变精度粗糙集模型阈值选取的方法[J]. 计算机技术与发展, 2009, 19(4):112-114. ZHOU Aiwu, ZHOU Shanshan, ZOU Wu. A method for threshold selection of variable precision rough set model [J]. Computer Technology and Development, 2009, 19(4):112-114.