Statistical Modelling 13 (5&6) (2013), 409–429

Applications of a Kullback-Leibler divergence for comparing non-nested models

Chen-Pin Wang
Department of Epidemiology and Biostatistics, University of Texas Health Science Center, San Antonio, TX, USA
e-mail: wangc3@uthscsa.edu

Booil Jo
Department of Psychiatry and Behavioral Sciences, Stanford University, Stanford, CA, USA


Abstract:

Wang and Ghosh (2011) proposed a Kullback-Leibler divergence (KLD) that is asymptotically equivalent to the KLD of Goutis and Robert (1998) when the reference model (the model against which a competing fitted model is compared) is correctly specified and certain regularity conditions hold. While the properties of the KLD of Wang and Ghosh (2011) have been investigated in the Bayesian framework, this paper further explores its properties in the frequentist framework using four application examples, each fitted by two competing non-nested models.
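For orientation, the divergences above build on the standard Kullback-Leibler divergence between a reference density f and a competing density g; the display below gives only this familiar general form, not the specific construction of Wang and Ghosh (2011), which is defined in the body of the paper.

\[
  \mathrm{KL}(f \,\|\, g) \;=\; \int f(y)\,\log\frac{f(y)}{g(y)}\,dy ,
\]

where f denotes the density of the reference model and g the density of the competing fitted model; the divergence is non-negative and equals zero only when f and g agree almost everywhere.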

Keywords:

comparison of non-nested models; information criterion; Kullback-Leibler divergence