Volume 33, Issue 10, November 2011
Citation: A Novel Template Reduction K-Nearest Neighbor Classification Method Based on Weighted Distance[J]. Journal of Electronics & Information Technology, 2011, 33(10): 2378-2383. doi: 10.3724/SP.J.1146.2011.00051

A Novel Template Reduction K-Nearest Neighbor Classification Method Based on Weighted Distance

doi: 10.3724/SP.J.1146.2011.00051
  • Received Date: 2011-01-18
  • Revised Date: 2011-06-07
  • Publish Date: 2011-10-19
  • As a nonparametric classification algorithm, K-Nearest Neighbor (KNN) is efficient and easy to implement. However, traditional KNN assumes that all K nearest neighbors contribute equally to the decision, which makes it susceptible to noise. Moreover, for large data sets the computational cost of classifying patterns with KNN can be prohibitive. In this paper, a new Template reduction KNN algorithm based on Weighted distance (TWKNN) is proposed. First, points that lie far from the classification boundary are dropped by the template reduction technique. Then, during classification, the weights of the K nearest neighbors of a test sample are set according to the Euclidean distance metric, which enhances the robustness of the algorithm. Experimental results show that the proposed approach effectively reduces the number of training samples while maintaining the same level of classification accuracy as traditional KNN.
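The abstract describes two steps: a template reduction pass that discards training points far from the class boundary, followed by a distance-weighted K-nearest-neighbor vote. The Python sketch below illustrates this general idea under stated assumptions; the boundary criterion (keep a point only if some of its neighbors carry a different label) and the inverse-Euclidean-distance weights are illustrative choices rather than the paper's exact formulation, and the function names are hypothetical.

```python
import numpy as np

def reduce_templates(X, y, k=3):
    """Drop training points far from the class boundary.

    Hypothetical reduction rule for illustration: a point is kept only if at
    least one of its k nearest neighbors belongs to a different class, i.e.
    it is presumed to lie near the decision boundary. The paper's actual
    template reduction criterion may differ.
    """
    keep = []
    for i in range(len(X)):
        d = np.linalg.norm(X - X[i], axis=1)   # Euclidean distances to all points
        d[i] = np.inf                          # exclude the point itself
        nn = np.argsort(d)[:k]
        if np.any(y[nn] != y[i]):              # a neighbor disagrees -> near boundary
            keep.append(i)
    return X[keep], y[keep]

def weighted_knn_predict(X_train, y_train, x, k=3, eps=1e-12):
    """Distance-weighted KNN vote: each of the k nearest neighbors votes with
    weight 1 / (Euclidean distance + eps), so closer neighbors dominate and
    distant, possibly noisy neighbors contribute less."""
    d = np.linalg.norm(X_train - x, axis=1)
    nn = np.argsort(d)[:k]
    votes = {}
    for i in nn:
        w = 1.0 / (d[i] + eps)
        label = y_train[i]
        votes[label] = votes.get(label, 0.0) + w
    return max(votes, key=votes.get)

# Tiny usage example with synthetic 2-D data
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
    y = np.array([0] * 50 + [1] * 50)
    X_red, y_red = reduce_templates(X, y, k=3)
    print("training set reduced from", len(X), "to", len(X_red), "samples")
    print("prediction for [1.5, 1.5]:",
          weighted_knn_predict(X_red, y_red, np.array([1.5, 1.5]), k=3))
```

Any monotonically decreasing function of distance could serve as the neighbor weight; the inverse-distance form above is only one common choice used here to keep the sketch short.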
