Volume 27 Issue 4
Apr.  2005
Citation: Sheng Shou-zhao, Wang Dao-bo, Wang Zhi-sheng, Huang Xiang-hua. Research on Model Selection Based on Function Set Information Quantity[J]. Journal of Electronics & Information Technology, 2005, 27(4): 552-555.

Research on Model Selection Based on Function Set Information Quantity

  • Received Date: 2003-12-09
  • Revised Date: 2004-03-26
  • Publish Date: 2005-04-19
  • The concepts of Subspace Information Quantity (SIQ) and Function Set Information Quantity (FSIQ) are presented. The problem of model selection based on FSIQ is then discussed explicitly, and an approximate method of model selection from limited samples corrupted by white noise is proposed; it resolves the underfitting and overfitting problems of model selection and considerably improves the generalization of the prediction model. A new suboptimal algorithm for model selection is given, and its reliability and advantages are illustrated through concrete tests (a brief illustrative sketch of the underlying trade-off follows below).
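The abstract does not spell out the FSIQ-based criterion itself, so the sketch below is only a generic illustration of the problem it targets: selecting, from a nested family of candidate models fitted to a small number of samples corrupted by white noise, the model that neither underfits nor overfits. The selection criterion used here is the corrected AIC (AICc) for Gaussian regression, not the paper's FSIQ measure, and the polynomial model family, sample size, and noise level are illustrative assumptions.

```python
import numpy as np

def fit_polynomial(x, y, degree):
    """Least-squares fit of a polynomial of the given degree; returns coefficients and RSS."""
    # Design matrix with columns 1, x, x^2, ..., x^degree
    X = np.vander(x, degree + 1, increasing=True)
    coef, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    residuals = y - X @ coef
    return coef, float(residuals @ residuals)

def aicc(rss, n, k):
    """One common Gaussian-regression form of corrected AIC (additive constants dropped)."""
    return n * np.log(rss / n) + 2 * k + 2 * k * (k + 1) / (n - k - 1)

def select_degree(x, y, max_degree=8):
    """Pick the polynomial degree that minimizes AICc on the noisy samples."""
    scores = {}
    for d in range(max_degree + 1):
        k = d + 1                    # free parameters (counting only the coefficients)
        if k >= len(x) - 1:          # AICc undefined once n - k - 1 <= 0
            break
        _, rss = fit_polynomial(x, y, d)
        scores[d] = aicc(rss, len(x), k)
    return min(scores, key=scores.get), scores

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 30                                   # limited sample size
    x = np.linspace(-1.0, 1.0, n)
    true_y = 1.0 - 2.0 * x + 0.5 * x**3      # unknown target function
    y = true_y + rng.normal(0.0, 0.2, n)     # white Gaussian noise
    best, scores = select_degree(x, y)
    print("selected degree:", best)
    print("AICc by degree:", {d: round(s, 2) for d, s in scores.items()})
```

With so few samples, the AICc penalty term 2k(k+1)/(n-k-1) grows quickly with the parameter count k, so high-degree candidates that merely chase the noise are rejected even though they attain the smallest training residuals; this is the same underfitting/overfitting balance that the FSIQ-based criterion is proposed to strike.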
