Adjusting the U.S. Census of 1990
Considering the difficulties, the Census Bureau does a remarkably good job at counting people. This article discusses techniques for adjusting the census. If there is a large undercount, these techniques may be accurate enough for adjustment. With a small undercount, adjustment could easily degrade the accuracy of the data. The Bureau argued that errors in the census were more serious than errors in the proposed adjustment, using "loss function analysis" to balance the risks. This procedure turns out to depend on quite unreasonable assumptions. With other and more realistic assumptions, the balance favors the census. The story has a broader moral. Statistical models are often defended on grounds of robustness. However, internally generated measures of precision may be critical. If the model is at all complicated, these measures of precision may turn out to be driven by assumptions not data—the antithesis of robustness.
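The "loss function analysis" mentioned in the abstract can be illustrated with a small, purely hypothetical sketch: compare the risk (expected weighted squared-error loss on population shares) of raw census counts against adjusted counts, where adjustment adds a noisy estimate of the undercount. The area counts, undercount rates, survey-error levels, and the choice of squared-error loss below are all assumptions for illustration, not the paper's data or the Bureau's actual procedure.

```python
import numpy as np

# Hypothetical loss-function comparison: risk of census vs. risk of adjustment.
# All numbers are invented for illustration only.
rng = np.random.default_rng(0)

true_pop = np.array([4.0, 2.5, 1.2, 0.8]) * 1e6            # assumed "true" counts by area
census = true_pop * np.array([0.985, 0.99, 0.97, 0.995])   # assumed differential undercount

def squared_error_loss(estimate, truth):
    """Weighted squared-error loss on population shares."""
    est_share = estimate / estimate.sum()
    true_share = truth / truth.sum()
    return np.sum((est_share - true_share) ** 2)

def risk_of_adjustment(noise_sd, n_sim=10_000):
    """Average loss when the estimated undercount carries survey error of size noise_sd."""
    losses = []
    for _ in range(n_sim):
        estimated_undercount = (true_pop - census) + rng.normal(0, noise_sd, size=census.size)
        adjusted = census + estimated_undercount
        losses.append(squared_error_loss(adjusted, true_pop))
    return np.mean(losses)

print("risk of census                          :", squared_error_loss(census, true_pop))
print("risk of adjustment (small survey error) :", risk_of_adjustment(noise_sd=5_000))
print("risk of adjustment (large survey error) :", risk_of_adjustment(noise_sd=60_000))
```

Under these assumed inputs the sketch reproduces the abstract's point: when the undercount is large and the adjustment estimates are precise, adjustment lowers the risk, but when the undercount is small or the adjustment is noisy, the unadjusted census can come out ahead, and the verdict hinges on the assumed error levels rather than on the data alone.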
Year of publication: 1994
Authors: Freedman, David A. ; Wachter, Kenneth W. ; Cutler, D. Richard ; Klein, Stephen P.
Published in: Evaluation Review. - Vol. 18 (1994), No. 3, p. 243-280
Saved in: Online Resource
Similar items by person
- Freedman, David A., (1993)
- Freedman, David A., (1991)
- Ecological Regression and Voting Rights / Freedman, David A., (1991)