Goodness of Fit Test and Test of Independence by Entropy

Maryam Sharifdoost, Nader Nematollahi, Einollah Pasha


To test whether a set of data follows a specific distribution, we can use a goodness of fit test. This test can be carried out with the Pearson X2-statistic or the likelihood ratio statistic G2, which are asymptotically equivalent, and, for continuous distributions, with the Kolmogorov-Smirnov statistic. In this paper, we introduce a new test statistic for the goodness of fit test that is based on entropy distance and can be applied for large sample sizes. We compare this new statistic with the classical test statistics X2, G2, and Tn in simulation studies. We conclude that the new statistic is more sensitive than the usual statistics in rejecting distributions that are close to the hypothesized distribution. For testing independence, we also introduce a new test statistic based on mutual information.
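The abstract does not give the paper's exact entropy-distance statistic, but the classical statistics it compares against, and their link to relative entropy, can be sketched. The following is a minimal illustration, assuming binned counts and a hypothesized cell-probability vector; the identity G2 = 2n D(p || q) shows how the likelihood ratio statistic is itself an entropy (Kullback-Leibler) distance between the empirical and hypothesized distributions.

```python
import math

def chi_square_stat(observed, expected):
    """Pearson X^2 = sum over cells of (O_i - E_i)^2 / E_i."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

def likelihood_ratio_stat(observed, expected):
    """G^2 = 2 * sum O_i * ln(O_i / E_i); empty cells contribute 0."""
    return 2.0 * sum(o * math.log(o / e)
                     for o, e in zip(observed, expected) if o > 0)

def kl_divergence(p, q):
    """Relative entropy D(p || q) = sum p_i * ln(p_i / q_i) (nats)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical example: n = 100 observations binned into 4 cells,
# hypothesized distribution uniform over the cells.
observed = [30, 24, 26, 20]
n = sum(observed)
q = [0.25, 0.25, 0.25, 0.25]          # hypothesized cell probabilities
expected = [n * qi for qi in q]
p = [o / n for o in observed]          # empirical cell probabilities

x2 = chi_square_stat(observed, expected)
g2 = likelihood_ratio_stat(observed, expected)
# G^2 coincides with 2 * n * D(p || q): the likelihood ratio statistic
# is an entropy distance between the empirical and null distributions.
```

Both X2 and G2 are asymptotically chi-squared with (number of cells - 1) degrees of freedom under the null, which is what makes them asymptotically equivalent.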


Keywords: Chi-squared test, goodness of fit test, test of independence, Kolmogorov-Smirnov test, likelihood ratio test, mutual information, relative entropy.
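For the independence test mentioned in the abstract, the underlying quantity is the mutual information of the empirical joint distribution. As a hedged sketch (the paper's precise statistic is not stated here), the empirical mutual information of a contingency table can be computed directly, and the classical G2 test of independence equals 2n times this quantity:

```python
import math

def mutual_information(table):
    """Empirical mutual information I(X;Y) in nats from a contingency table
    (list of rows of counts). I(X;Y) = sum p_ij * ln(p_ij / (p_i. * p_.j))."""
    n = sum(sum(row) for row in table)
    row_p = [sum(row) / n for row in table]                    # marginals of X
    col_p = [sum(table[i][j] for i in range(len(table))) / n   # marginals of Y
             for j in range(len(table[0]))]
    mi = 0.0
    for i, row in enumerate(table):
        for j, count in enumerate(row):
            if count > 0:
                p_ij = count / n
                mi += p_ij * math.log(p_ij / (row_p[i] * col_p[j]))
    return mi

# Hypothetical 2x2 table; under independence I(X;Y) = 0, and larger values
# indicate stronger dependence.
table = [[40, 10], [20, 30]]
n = 100
mi = mutual_information(table)
g2 = 2 * n * mi   # the likelihood ratio statistic for independence
```

Mutual information is zero exactly when the empirical joint distribution factorizes into its marginals, which is why it is a natural basis for an independence statistic.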




This work is licensed under a Creative Commons Attribution 3.0 License.