• Media type: E-Article
  • Title: Local Divergence and Association
  • Contributor: Blyth, Stephen
  • Imprint: Biometrika Trust, 1994
  • Published in: Biometrika
  • Language: English
  • ISSN: 0006-3444
  • Keywords: Miscellanea
• Description: In applications of differential geometry to problems of parametric statistical inference, the notion of divergence is often used to measure the separation between two parametric densities. In this paper we move away from the parametric framework of much of statistical differential geometry and instead employ divergence in a covariate setting. Many data analyses involve investigating how the conditional distribution $f(y \mid X = x)$ of a random variable Y changes with covariate values x. We propose using local divergence between conditional distributions as a measure of this change and thus as a general measure of association between Y and the covariates X. Under certain regularity conditions we define a class of divergences that are locally the Rao distance (LR). The limiting LR divergence is bounded below by the signal-to-noise ratio, with equality holding if and only if the conditional density comes from the natural exponential family. The correlation curve and, in particular, the correlation coefficient are simply transformations of the signal-to-noise ratio, and thus transformations of local divergence. We therefore obtain a differential-geometric interpretation of standard ideas of association. The class of LR divergences is broad and includes the Kullback-Leibler divergence and the Rényi α-information measures. We can therefore interpret local association as local utility or information gain of order α. We obtain comparable results without regularity conditions.
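As a gloss on the quantities the abstract invokes, here is a sketch under the standard definitions used in the correlation-curve literature; the symbols $\mu$, $\sigma_X$, $\eta$ and $\rho$ below are illustrative choices, not the paper's own notation. For densities $f$ and $g$, the Kullback-Leibler divergence and the Rényi α-information are

    \[
      \mathrm{KL}(f \,\|\, g) = \int f(y)\,\log\frac{f(y)}{g(y)}\,dy,
      \qquad
      D_\alpha(f \,\|\, g) = \frac{1}{\alpha-1}\,\log \int f(y)^{\alpha}\, g(y)^{1-\alpha}\,dy ,
    \]

and, writing $\mu(x) = E(Y \mid X = x)$, $\sigma^2(x) = \mathrm{var}(Y \mid X = x)$ and $\sigma_X^2 = \mathrm{var}(X)$, the signal-to-noise ratio and the correlation curve take the form

    \[
      \eta(x) = \frac{\sigma_X\,\mu'(x)}{\sigma(x)},
      \qquad
      \rho(x) = \frac{\eta(x)}{\sqrt{1 + \eta(x)^2}} .
    \]

The correlation curve is thus a monotone transformation of the signal-to-noise ratio, and in the linear homoscedastic case $\rho(x)$ reduces to the ordinary correlation coefficient; this is the sense in which the abstract describes these standard measures of association as transformations of the signal-to-noise ratio, and hence of local divergence.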