Adjusting the Census of 1990
Considering the difficulties, the Census Bureau does a remarkably good job at counting people. This article discusses techniques for adjusting the census. If there is a large undercount, these techniques may be accurate enough for adjustment. With a small undercount, they are unlikely to improve on the census; instead, adjustment could easily degrade the accuracy of the data. The focus will be on sampling error, that is, uncertainty in estimates due to the luck of the draw in choosing the sample. Sampling error is a major obstacle to adjusting the 1990 census, even at the state level. To control sampling error, the Census Bureau used a smoothing model. However, the model does not solve the problem, because its effects depend strongly on unverified and implausible assumptions. This story has a broader moral. Statistical models are often defended on grounds of robustness, that is, on the grounds that estimates do not depend strongly on assumptions. But the standard errors, which are internally generated measures of precision, may be critical. Then caution is in order. If the model is at all complicated, the standard errors may turn out to be driven by assumptions, not data—the antithesis of robustness.
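To make the abstract's central concept concrete, the following is a minimal simulation sketch of sampling error in an undercount estimate. It is not the Census Bureau's methodology (which used a post-enumeration survey and dual-system estimation); it simply assumes a simple random sample and a hypothetical 2% true undercount rate, and shows how much estimates of that rate vary from sample to sample.

```python
import random

def simulate_undercount_estimate(true_rate, sample_size, seed):
    """Illustrative only: estimate an undercount rate from a simple
    random sample in which each person is missed with probability
    true_rate. Returns the fraction of sampled people who were missed."""
    rng = random.Random(seed)
    missed = sum(rng.random() < true_rate for _ in range(sample_size))
    return missed / sample_size

# Hypothetical numbers: a 2% true undercount, samples of 5,000 people.
estimates = [simulate_undercount_estimate(0.02, 5000, seed=s)
             for s in range(100)]
mean_est = sum(estimates) / len(estimates)
var = sum((e - mean_est) ** 2 for e in estimates) / (len(estimates) - 1)
se = var ** 0.5  # empirical sampling standard error across replications

print(f"mean estimate: {mean_est:.4f}, sampling SE: {se:.4f}")
```

With these assumed numbers, the sampling standard error is roughly 0.002, i.e., about a tenth of the quantity being estimated: when the true undercount is small, the noise is a substantial fraction of the signal, which is the obstacle the abstract describes.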
Year of publication: 1993
Authors: Freedman, David A.; Wachter, Kenneth W.; Coster, Daniel C.; Cutler, D. Richard; Klein, Stephen P.
Published in: Evaluation Review. - Vol. 17.1993, 4, p. 371-443
Online Resource
Similar items by person
- Adjusting the U.S. Census of 1990 / Functions, Loss, (1994)
- Freedman, David A., (1991)
- Ecological Regression and Voting Rights / Freedman, David A., (1991)
- More ...