JOURNAL OF SHANDONG UNIVERSITY (NATURAL SCIENCE) ›› 2020, Vol. 55 ›› Issue (11): 66-77. doi: 10.6040/j.issn.1671-9352.0.2020.268

Class-specific β distribution reduction in interval-valued decision systems

HAN Shuang-zhi1,2, ZHANG Nan1,2*, ZHANG Zhong-xi1,2   

  1. Key Laboratory for Data Science and Intelligence Technology of Shandong Higher Education Institutes, Yantai University, Yantai 264005, Shandong, China;
  2. School of Computer and Control Engineering, Yantai University, Yantai 264005, Shandong, China
  Published: 2020-11-17

Abstract: Attribute reduction is one of the important research topics in rough set theory. The goal of β distribution reduction in interval-valued decision systems is to keep the β distribution of every object unchanged. In practical applications, however, attribute reduction often needs to focus only on a specific decision class rather than on all decision classes. This paper proposes a framework for class-specific β distribution reduction in interval-valued decision systems. First, the basic concept of class-specific β distribution reduction is defined; then the discernibility matrix corresponding to the class-specific β distribution is constructed; finally, a class-specific β distribution reduction algorithm based on discernibility matrices (CSBRADM) is proposed. In the experiments, six UCI data sets are used to compare the reduction results and reduction efficiency of the BRADM and CSBRADM algorithms. The experimental results show that the reducts produced by the class-specific algorithm keep the β distribution of the specific class unchanged, that the reduct obtained for a specific class is no longer than the reduct obtained for all classes, and that the CSBRADM algorithm is more efficient than the BRADM algorithm.
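For readers less familiar with discernibility-matrix-based reduction, the sketch below illustrates the general idea in Python. It is not the paper's CSBRADM algorithm: attribute values are assumed to be symbolic rather than interval-valued, plain inequality stands in for the interval tolerance and β distribution conditions, and the names class_specific_discernibility and greedy_reduct are hypothetical.

# Illustrative sketch only (not the paper's CSBRADM implementation): a greedy
# reduction over a symbolic decision table using a class-specific
# discernibility matrix. Interval-valued similarity and the β distribution
# condition are simplified here to exact attribute (in)equality.

from itertools import combinations

def class_specific_discernibility(table, decisions, target_class):
    """Collect, for each object pair that must be told apart with respect
    to the target decision class, the set of attributes discerning the pair."""
    entries = []
    for i, j in combinations(range(len(table)), 2):
        # Only pairs with different decisions, at least one of which belongs
        # to the target class, contribute to the class-specific matrix.
        if decisions[i] == decisions[j]:
            continue
        if target_class not in (decisions[i], decisions[j]):
            continue
        diff = {a for a, (u, v) in enumerate(zip(table[i], table[j])) if u != v}
        if diff:
            entries.append(diff)
    return entries

def greedy_reduct(entries):
    """Greedily choose attributes until every discernibility entry is covered."""
    reduct, remaining = set(), list(entries)
    while remaining:
        # Pick the attribute that appears in the most still-uncovered entries.
        counts = {}
        for e in remaining:
            for a in e:
                counts[a] = counts.get(a, 0) + 1
        best = max(counts, key=counts.get)
        reduct.add(best)
        remaining = [e for e in remaining if best not in e]
    return reduct

if __name__ == "__main__":
    # Toy decision table: 3 condition attributes, decision classes 'd1'/'d2'.
    table = [(1, 0, 2), (1, 1, 2), (0, 1, 2), (0, 0, 1)]
    decisions = ["d1", "d1", "d2", "d2"]
    print(greedy_reduct(class_specific_discernibility(table, decisions, "d2")))

In the setting of the paper, the inequality test would be replaced by an interval-valued similarity test and the pair-selection condition by the class-specific β distribution criterion; the greedy covering step is only a common heuristic for reading a reduct off a discernibility matrix.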

Key words: rough sets, attribute reduction, interval values, class-specific, β distribution reduction, discernibility matrix

CLC Number: TP391