Design of An Exponential-like Kernel Function Based on Multi-scale Resampling

HU Zhanwei, JIAO Liguo, XU Shengjin, HUANG Yong

Citation: HU Zhanwei, JIAO Liguo, XU Shengjin, HUANG Yong. Design of An Exponential-like Kernel Function Based on Multi-scale Resampling[J]. Journal of Electronics & Information Technology, 2016, 38(7): 1689-1695. doi: 10.11999/JEIT151101


doi: 10.11999/JEIT151101


Funds: The National Natural Science Foundation of China (11472158)

Abstract: Following the idea of multi-scale resampling, this paper constructs an exponential-like kernel (ELK) function and applies it to kernel regression analysis and support vector machine (SVM) classification, where the ELK shows an advantage in capturing local features. The ELK distribution is determined solely by the analysis scale, making it a single-parameter kernel function. Nadaraya-Watson regression of step and Doppler signals with the ELK shows that it outperforms the conventional Gaussian kernel in both noise reduction and step capture, with overall performance close to or better than locally weighted scatterplot smoothing (LOWESS). SVM experiments on several UCI datasets show that the ELK achieves classification accuracy comparable to the radial basis function (RBF) kernel but exhibits stronger locality, yielding a more finely resolved classification hyperplane, though it may produce more support vectors when classification results are poor. By comparison, the ELK is less sensitive to its tuning parameter, which helps reduce the computational cost of parameter selection. The single-parameter ELK's strong ability to capture local features should help this class of kernel functions find wider use in related fields.
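The abstract only names the construction; the exact ELK form obtained from multi-scale resampling is given in the full paper. The sketch below is a minimal Nadaraya-Watson regression on a noisy step signal, using a hypothetical Laplace-shaped kernel `k_elk` as a stand-in for the ELK and a Gaussian kernel for comparison. The kernel form, the scale `h = 0.02`, and the test signal are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

def k_gauss(u):
    # Standard Gaussian kernel.
    return np.exp(-0.5 * u ** 2)

def k_elk(u):
    # Hypothetical "exponential-like" (Laplace-shaped) kernel used as a
    # stand-in for the paper's ELK; its sharper peak gives stronger locality.
    return np.exp(-np.abs(u))

def nadaraya_watson(xq, x, y, h, kernel):
    # Nadaraya-Watson estimate: kernel-weighted average of y at each query point.
    u = (xq[:, None] - x[None, :]) / h        # shape (n_query, n_sample)
    w = kernel(u)
    return (w @ y) / w.sum(axis=1)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 400)
y = (x > 0.5).astype(float) + 0.1 * rng.standard_normal(x.size)  # noisy step

xq = np.linspace(0.0, 1.0, 400)
h = 0.02                                      # single analysis-scale parameter
truth = (xq > 0.5).astype(float)
for name, kern in [("ELK-like", k_elk), ("Gauss", k_gauss)]:
    est = nadaraya_watson(xq, x, y, h, kern)
    print(name, "RMSE vs. clean step:", np.sqrt(np.mean((est - truth) ** 2)))
```

A kernel of this kind can also be passed to an SVM as a custom similarity function (for example, scikit-learn's SVC accepts a callable that returns the Gram matrix), which is how the abstract's comparison against the RBF kernel could be reproduced in spirit.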
Publication History
  • Received: 2015-09-25
  • Revised: 2016-05-03
  • Published: 2016-07-19
