Boosted K-nearest neighbor classifiers based on fuzzy granules

Authors: Wei Li, Yumin Chen, Yuping Song
Year of publication: 2020
Subject:
Source: Knowledge-Based Systems. 195:105606
ISSN: 0950-7051
DOI: 10.1016/j.knosys.2020.105606
Description: K-nearest neighbor (KNN) is a classic classifier that is simple and effective. AdaBoost combines several weak classifiers into a strong classifier to improve classification performance. Both have been widely used in machine learning. In this paper, building on information fuzzy granulation, KNN, and AdaBoost, we propose two classification algorithms: fuzzy granule K-nearest neighbor (FGKNN) and boosted fuzzy granule K-nearest neighbor (BFGKNN). By introducing granular computing, we recast problem solving as a structured, hierarchical process; this focus on structured information processing improves both the accuracy and the robustness of data classification. First, a fuzzy set is introduced and atom attribute fuzzy granulation is performed on the samples of the classification system to form fuzzy granules. A fuzzy granule vector is then constructed from multiple attribute fuzzy granules. We design operators on fuzzy granule vectors, define a distance measure in the fuzzy granule space, and prove a monotonicity principle for this distance. We further define the K-nearest neighbor fuzzy granule vector and present the FGKNN and BFGKNN algorithms. Finally, we compare KNN, Back Propagation Neural Network (BPNN), Support Vector Machine (SVM), Logistic Regression (LR), FGKNN, and BFGKNN on UCI data sets. Theoretical analysis and experimental results show that, given appropriate parameters, FGKNN and BFGKNN outperform the other methods.
Database: OpenAIRE
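
The pipeline described in the abstract (per-attribute fuzzy granulation of a sample into a fuzzy granule vector, a distance measure between granule vectors, and a K-nearest-neighbor vote) can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the paper's method: the Gaussian membership function, the mean-absolute-difference granule distance, and the per-attribute granule centers are all assumptions made for the example, and the AdaBoost wrapping that yields BFGKNN is omitted.

# Minimal sketch of a fuzzy-granule K-nearest neighbor classifier.
# The Gaussian membership, the granule distance, and the per-attribute
# granule centers are assumptions for illustration, not the operators
# defined in the paper; AdaBoost-style boosting (BFGKNN) is omitted.
import numpy as np

def granule_vector(sample, centers, sigma=0.5):
    # One fuzzy membership value per attribute (assumed Gaussian membership).
    return np.exp(-((sample - centers) ** 2) / (2 * sigma ** 2))

def granule_distance(g1, g2):
    # Assumed distance between fuzzy granule vectors: mean absolute
    # difference of the attribute memberships.
    return np.mean(np.abs(g1 - g2))

def fgknn_predict(train_X, train_y, query, k=3, sigma=0.5):
    # Classify `query` by majority vote among the k training samples whose
    # fuzzy granule vectors are closest to the query's granule vector.
    centers = train_X.mean(axis=0)  # per-attribute granule centers (assumption)
    q_gran = granule_vector(query, centers, sigma)
    dists = np.array([granule_distance(granule_vector(x, centers, sigma), q_gran)
                      for x in train_X])
    nearest = np.argsort(dists)[:k]
    labels, counts = np.unique(train_y[nearest], return_counts=True)
    return labels[np.argmax(counts)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.random((20, 4))              # 20 samples, 4 normalized attributes
    y = (X[:, 0] > 0.5).astype(int)      # toy binary labels
    print(fgknn_predict(X, y, rng.random(4), k=3))

A boosted variant in the spirit of BFGKNN would wrap such a base classifier in an AdaBoost loop, reweighting training samples after each round; the paper's exact weighting scheme is not reproduced here.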