Zhang Yi-Peng, Chen Liang, Hao Huan. An Improved Training Algorithm for Quantum Neural Networks[J]. Journal of Electronics & Information Technology, 2013, 35(7): 1630-1635. doi: 10.3724/SP.J.1146.2012.01417
An improved training algorithm is proposed to resolve the conflict between the objective functions of the weights and the quantum intervals that arises when training the Multilevel Activation Function Quantum Neural Network (MAF-QNN), a conflict that degrades both training speed and network performance. Under the least-mean-square-error criterion, the objective functions of the weights and the quantum intervals are unified so that both sets of parameters are trained simultaneously. The Levenberg-Marquardt (LM) algorithm is then introduced to reduce the probability that training becomes trapped in a local minimum, so the MAF-QNN can be trained effectively and efficiently. Simulation results show that the proposed algorithm substantially reduces the number of iterations and significantly improves convergence precision, making it applicable to tasks such as data classification and function approximation and thereby extending the range of applications of the MAF-QNN.
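The sketch below is a minimal, illustrative reading of the abstract, not the paper's implementation: it assumes the common multilevel-sigmoid form of the quantum neuron (an average of sigmoids shifted by quantum intervals), stacks the weights and quantum intervals into a single parameter vector under one least-mean-square-error objective, and uses SciPy's Levenberg-Marquardt solver in place of the authors' LM procedure. All names (`multilevel_sigmoid`, `n_s`, `beta`, the toy data) are hypothetical.

```python
# Hypothetical sketch: unified LM training of weights and quantum intervals
# for an MAF-QNN-style network (assumed multilevel-sigmoid activation).
import numpy as np
from scipy.optimize import least_squares

def multilevel_sigmoid(x, theta, beta=5.0):
    """Multilevel activation: average of n_s sigmoids shifted by quantum intervals theta."""
    # x: (n_samples,), theta: (n_s,)
    return np.mean(1.0 / (1.0 + np.exp(-beta * (x[:, None] - theta[None, :]))), axis=1)

def forward(params, X, n_hidden, n_s):
    """Single-hidden-layer network with one linear output unit."""
    n_in = X.shape[1]
    i = 0
    W1 = params[i:i + n_in * n_hidden].reshape(n_in, n_hidden); i += n_in * n_hidden
    theta = params[i:i + n_hidden * n_s].reshape(n_hidden, n_s); i += n_hidden * n_s
    w2 = params[i:i + n_hidden]
    H = np.column_stack([multilevel_sigmoid(X @ W1[:, j], theta[j]) for j in range(n_hidden)])
    return H @ w2

def residuals(params, X, y, n_hidden, n_s):
    """Unified objective: one least-mean-square error over weights AND quantum intervals."""
    return forward(params, X, n_hidden, n_s) - y

# Toy usage: approximate a 1-D function.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0])
n_hidden, n_s = 6, 3
n_params = 1 * n_hidden + n_hidden * n_s + n_hidden
p0 = rng.normal(scale=0.5, size=n_params)
fit = least_squares(residuals, p0, args=(X, y, n_hidden, n_s), method='lm')
print("final SSE:", 0.5 * np.sum(fit.fun ** 2))
```

Because weights and quantum intervals share one residual vector, a single LM step updates both parameter groups consistently, which is the point of unifying the objective functions rather than alternating between separate criteria.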