On the Applications of Divergence Type Measures in Testing Statistical Hypotheses
The fundamentals of information theory and their applications to testing statistical hypotheses have been known and available for some time. There is currently a new and heterogeneous development of statistical procedures, based on information measures, scattered through the literature. In this paper a unification is attained by the consistent application of the concepts and properties of information theory. Our aim is to examine a wide range of divergence type measures and their applications to statistical inference, with special emphasis on multinomial and multivariate normal distributions. The "maximum likelihood" and the "minimum discrepancy" principles are combined here in order to derive new approaches to the discrimination between two groups or populations. To study the asymptotic properties of divergence statistics, we propose a unified expression, called (h, φ)-divergence, which includes most divergences as particular cases. Under different assumptions it is shown that the asymptotic distributions of the (h, φ)-divergences are either normal or chi square. From these results a wide range of statistical hypotheses about the parameters of one or two populations can be tested. To help clarify the discussion and provide a simple illustration, examples are given.
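For orientation, a minimal sketch of the (h, φ)-divergence, assuming the standard form from the authors' related work (the paper itself states the exact regularity conditions on h and φ): for multinomial distributions P = (p_1, ..., p_M) and Q = (q_1, ..., q_M),

$$ D_\varphi^h(P, Q) = h\!\left( \sum_{i=1}^{M} q_i \, \varphi\!\left( \frac{p_i}{q_i} \right) \right), $$

where φ is convex with φ(1) = 0 and h is increasing with h(0) = 0. Taking h(x) = x and φ(x) = x log x recovers the Kullback-Leibler divergence; other choices of (h, φ) yield, for example, the Rényi and Matusita divergences. Under the null hypothesis P = P_0, with \hat{P} the maximum likelihood estimator, the chi-square limit takes the form

$$ \frac{2n}{h'(0)\, \varphi''(1)} \, D_\varphi^h(\hat{P}, P_0) \xrightarrow{\;d\;} \chi^2_{M-1}. $$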
Year of publication: 1994
Authors: Salicru, M.; Morales, D.; Menendez, M. L.; Pardo, L.
Published in: Journal of Multivariate Analysis. - Elsevier, ISSN 0047-259X. - Vol. 51.1994, 2, p. 372-391
Publisher: Elsevier
Similar items by person
- Divergence measures based on entropy function and statistical inference. Pardo, L. (1995)
- A test for homogeneity of variances based on Shannon's entropy. Pardo, L. (1995)
- Some bounds on probability of error in fuzzy discrimination problems. Pardo, L. (1991)