Zhang Ya-Hong, Li Yu-Jian, Zhang Ting. Detecting Multivariable Correlation with Maximal Information Entropy[J]. Journal of Electronics & Information Technology, 2015, 37(1): 123-129. doi: 10.11999/JEIT140053
Many measures, e.g., the Maximal Information Coefficient (MIC), have been presented to identify interesting correlations between pairs of variables, but few address triplets or higher-dimensional variable sets. To fill this gap, the Maximal Information Entropy (MIE) is proposed to measure the general correlation of a multivariable data set. For k variables, the maximal information matrix is first constructed from the MIC scores of all pairs of variables; then the maximal information entropy, which measures the degree of correlation among the k variables, is calculated from the maximal information matrix. Simulation results show that MIE can detect one-dimensional manifold dependence among triplets, and applications to real datasets further verify the feasibility of MIE.
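As a rough illustration of the two-step procedure described in the abstract, the sketch below computes pairwise MIC scores (here via the third-party minepy library), assembles them into a k x k maximal information matrix, and derives an entropy from that matrix. The entropy step shown (a von Neumann-style entropy of the trace-normalized matrix) is an assumption made only for illustration; the abstract does not give the paper's exact formula, which may differ.

```python
import numpy as np
from minepy import MINE  # pairwise MIC implementation


def maximal_information_entropy(data, alpha=0.6, c=15):
    """Illustrative MIE-style score for a (n_samples, k) data matrix."""
    n, k = data.shape
    mine = MINE(alpha=alpha, c=c)

    # Step 1: build the k x k maximal information matrix from pairwise MIC scores.
    # The MIC of a variable with itself is taken as 1 (diagonal entries).
    M = np.eye(k)
    for i in range(k):
        for j in range(i + 1, k):
            mine.compute_score(data[:, i], data[:, j])
            M[i, j] = M[j, i] = mine.mic()

    # Step 2 (assumed): compute an entropy from the matrix as the von Neumann-style
    # entropy of its trace-normalized eigenvalue spectrum. This is a placeholder for
    # the paper's definition, which is not specified in the abstract.
    eigvals = np.linalg.eigvalsh(M)
    p = eigvals / eigvals.sum()   # trace(M) = k, so this normalizes to a distribution
    p = p[p > 1e-12]              # drop numerically negligible eigenvalues
    return -np.sum(p * np.log(p))


# Usage: three variables lying on a one-dimensional manifold (a helix),
# the kind of triplet dependence the abstract says MIE can detect.
t = np.random.uniform(0, 2 * np.pi, 500)
xyz = np.column_stack([np.cos(t), np.sin(t), t])
print(maximal_information_entropy(xyz))
```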