Research on Model Selection Based on Function Set Information Quantity

Sheng Shou-zhao, Wang Dao-bo, Wang Zhi-sheng, Huang Xiang-hua

Citation: Sheng Shou-zhao, Wang Dao-bo, Wang Zhi-sheng, Huang Xiang-hua. Research on Model Selection Based on Function Set Information Quantity[J]. Journal of Electronics & Information Technology, 2005, 27(4): 552-555.

  • Abstract: This paper introduces the concepts of subspace information quantity (SIQ) and function set information quantity (FSIQ) and discusses in detail model selection based on FSIQ. An approximate method for model selection from finite, noisy samples is given; it effectively overcomes the under-fitting and over-fitting problems that commonly arise in model selection and substantially improves the generalization performance of the predictive model. On this basis, a feasible suboptimal model selection algorithm is proposed. Finally, concrete examples verify the feasibility and advantages of the method.
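The abstract's goal, choosing a model complexity from finite noisy samples so as to avoid both under- and over-fitting, can be illustrated with a generic information-criterion sketch. The FSIQ criterion itself is not reproduced here; as a stand-in, this hypothetical example selects a polynomial order by minimizing the corrected AIC (AICc), a small-sample criterion, on synthetic noisy data. All function names and the synthetic data are illustrative assumptions, not the paper's method.

```python
import numpy as np

def fit_poly_rss(x, y, order):
    """Least-squares polynomial fit; return the residual sum of squares."""
    coeffs = np.polyfit(x, y, order)
    residuals = y - np.polyval(coeffs, x)
    return float(np.sum(residuals ** 2))

def aicc(rss, n, k):
    """Corrected AIC for Gaussian-noise regression with k free parameters."""
    return n * np.log(rss / n) + 2 * k + 2 * k * (k + 1) / (n - k - 1)

def select_order(x, y, max_order=8):
    """Pick the polynomial order minimizing AICc on the noisy sample."""
    n = len(x)
    scores = {}
    for order in range(1, max_order + 1):
        k = order + 1  # number of fitted coefficients
        scores[order] = aicc(fit_poly_rss(x, y, order), n, k)
    return min(scores, key=scores.get), scores

if __name__ == "__main__":
    # Synthetic sample: a cubic signal plus Gaussian noise (true order is 3).
    rng = np.random.default_rng(0)
    x = np.linspace(-1, 1, 40)
    y = 1.0 - 2.0 * x + 3.0 * x ** 3 + rng.normal(0.0, 0.1, x.size)
    best, _ = select_order(x, y)
    print("selected order:", best)
```

Orders that are too low leave structure in the residuals (under-fitting), while orders that are too high fit the noise (over-fitting); the small-sample penalty term in AICc discourages the latter, which is the same trade-off the FSIQ-based criterion is designed to resolve.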
Publication history
  • Received: 2003-12-09
  • Revised: 2004-03-26
  • Published: 2005-04-19
