Likelihood ratio tests in random graph models with increasing dimensions
math.ST
/ Abstract
We explore the Wilks phenomena in two random graph models: the $β$-model and the Bradley-Terry model. For two null hypotheses of increasing dimension, a specified null $H_0: β_i=β_i^0$ for $i=1,\ldots, r$ and a homogeneous null $H_0: β_1=\cdots=β_r$, we reveal high-dimensional Wilks phenomena: the normalized log-likelihood ratio statistic, $[2\{\ell(\widehat{\mathbfβ}) - \ell(\widehat{\mathbfβ}^0)\} - r]/(2r)^{1/2}$, converges in distribution to the standard normal distribution as $r$ goes to infinity. Here, $\ell(\mathbfβ)$ is the log-likelihood function of the model parameter $\mathbfβ=(β_1, \ldots, β_n)^\top$, $\widehat{\mathbfβ}$ is its maximum likelihood estimator (MLE) over the full parameter space, and $\widehat{\mathbfβ}^0$ is the restricted MLE under the null parameter space. For the homogeneous null with fixed $r$, we establish Wilks-type theorems: $2\{\ell(\widehat{\mathbfβ}) - \ell(\widehat{\mathbfβ}^0)\}$ converges in distribution to a chi-square distribution with $r-1$ degrees of freedom as the total number of parameters, $n$, goes to infinity. When testing a specified null of fixed dimension, we find that the asymptotic null distribution is chi-square in the $β$-model; unexpectedly, however, this is not true in the Bradley-Terry model. By developing several novel technical methods for asymptotic expansion, we derive Wilks-type results in a principled manner; these methods should be applicable to a class of random graph models beyond the $β$-model and the Bradley-Terry model. Simulation studies and real network data applications further demonstrate the theoretical results.
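A minimal numerical sketch of the normalization behind the high-dimensional Wilks phenomenon: if the log-likelihood ratio $2\{\ell(\widehat{\mathbfβ}) - \ell(\widehat{\mathbfβ}^0)\}$ were exactly chi-square with $r$ degrees of freedom, then $(2\Lambda - r)/(2r)^{1/2}$ would be approximately standard normal for large $r$. This illustrates only the classical normalization, not the paper's graph-model results; the sample sizes below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
r = 2000        # number of constraints (illustrative choice)
n_rep = 50000   # Monte Carlo replications (illustrative choice)

# Draw 2*LLR ~ chi-square_r and apply the normalization
# (2*LLR - r) / sqrt(2r); for large r this is approximately N(0, 1).
two_llr = rng.chisquare(df=r, size=n_rep)
z = (two_llr - r) / np.sqrt(2 * r)

print(z.mean(), z.std())  # close to 0 and 1, respectively
```

In the paper, the same centering by $r$ and scaling by $(2r)^{1/2}$ is applied to the graph-model likelihood ratio, where the chi-square limit itself need not hold exactly.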