Models defined by moment inequalities have become a standard framework for empirical economists, spanning a wide range of fields within economics. From the point of view of an empirical researcher, the literature on inference in moment inequality models is large and complex, ...
Persistent link: https://www.econbiz.de/10014247961
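For readers new to the topic, the basic structure behind the entry above (standard notation, not specific to this paper): a model defined by moment inequalities restricts a parameter $\theta$ only through conditions such as

$$ \mathbb{E}\left[\, m_j(W_i, \theta)\,\right] \ge 0, \qquad j = 1, \dots, J, $$

which in general identify a set of parameter values rather than a single point; this partial identification is what makes inference more involved than in standard moment-equality (GMM) settings.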
Panel or grouped data are often used to allow for unobserved individual heterogeneity in econometric models via fixed effects. In this paper, we discuss identification of a panel data model in which the unobserved heterogeneity both enters additively and interacts with treatment variables. We...
Persistent link: https://www.econbiz.de/10014322772
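As a minimal illustration of the kind of model described above (the functional form here is an assumption for exposition, not the paper's exact specification), let unobserved heterogeneity $\alpha_i$ enter both additively and through an interaction with a treatment $D_{it}$:

$$ Y_{it} = \alpha_i + \beta\, D_{it} + \gamma\,(\alpha_i \times D_{it}) + \varepsilon_{it}. $$

Standard fixed-effects transformations remove the additive $\alpha_i$ but not the interaction term, which is why identifying $\beta$ and $\gamma$ requires additional arguments.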
We study the interpretation of regressions with multiple treatments and flexible controls. Such regressions are often used to analyze stratified randomized controlled trials with multiple intervention arms, to estimate value-added (e.g., for teachers) with observational data, and to leverage the...
Persistent link: https://www.econbiz.de/10013334327
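For concreteness, a stylized version of such a regression (an illustrative specification; the paper's setup may differ): with mutually exclusive treatment-arm indicators $D_{ik}$ and a flexible vector of controls $X_i$,

$$ Y_i = \sum_{k=1}^{K} \beta_k\, D_{ik} + X_i'\gamma + \varepsilon_i, $$

and the interpretation question is how each estimated $\beta_k$ relates to the underlying treatment effects of arm $k$ across the strata or control cells.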
A central question in applied research is how to estimate the effect of an exogenous intervention or shock on an outcome. The intervention can affect the outcome and the controls on impact and over time. Moreover, there can be subsequent feedback between outcomes, controls and the intervention. Many of...
Persistent link: https://www.econbiz.de/10015056147
… show that IDT activity reduces the bid-ask spread and increases intra-day volatility and total volume traded. The volume traded …
Persistent link: https://www.econbiz.de/10014250145
A growing number of central authorities use assignment mechanisms to allocate students to schools in a way that reflects student preferences and school priorities. However, most real-world mechanisms incentivize students to strategically misreport their preferences. In this paper, we provide an...
Persistent link: https://www.econbiz.de/10014544713
We propose a new specification test to assess the validity of the judge leniency design. We characterize a set of sharp testable implications, which exploit all the relevant information in the observed data distribution to detect violations of the judge leniency design assumptions. The proposed...
Persistent link: https://www.econbiz.de/10014544734
Econometric software packages typically report a fixed number of decimal digits for coefficient estimates and their associated standard errors. This practice misses the opportunity to use rounding rules that convey statistical precision. Using insights from the theory of testing statistical hypotheses, ...
Persistent link: https://www.econbiz.de/10014486216
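One simple rule in the spirit of the entry above (a sketch only; the rule actually proposed in the paper may differ, and the helper name round_to_se is hypothetical) is to keep a fixed number of significant digits of the standard error and round the point estimate to the same decimal place:

    import math

    def round_to_se(estimate, se, sig_digits=2):
        # Round so the reported digits reflect statistical precision:
        # keep `sig_digits` significant digits of the standard error and
        # round the estimate to the same decimal place.
        # Illustrative rule only, not necessarily the paper's proposal.
        if se <= 0:
            raise ValueError("standard error must be positive")
        decimals = sig_digits - 1 - math.floor(math.log10(se))
        return round(estimate, decimals), round(se, decimals)

    # Example: estimate 0.123456 with SE 0.0478 is reported as (0.123, 0.048)
    print(round_to_se(0.123456, 0.0478))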
When economists analyze a well-conducted RCT or natural experiment and find a statistically significant effect, they conclude the null of no effect is unlikely to be true. But how frequently is this conclusion warranted? The answer depends on the proportion of tested nulls that are true and the...
Persistent link: https://www.econbiz.de/10014372423
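The dependence noted above can be made explicit with a standard Bayes calculation (a textbook identity, not a result of this paper): if a fraction $\pi_0$ of tested nulls are true, tests have size $\alpha$ and average power $\kappa$ against the alternatives actually faced, then among statistically significant findings the share for which the null is nevertheless true is

$$ \Pr(H_0 \text{ true} \mid \text{reject}) = \frac{\pi_0\,\alpha}{\pi_0\,\alpha + (1-\pi_0)\,\kappa}. $$

For example, with $\pi_0 = 0.5$, $\alpha = 0.05$ and $\kappa = 0.8$, this is $0.025/0.425 \approx 6\%$.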
We propose a new framework to explain the factor structure in the full cross section of Treasury bond returns. Our method unifies non-parametric curve estimation with cross-sectional factor modeling. We identify smoothness as a fundamental principle of the term structure of returns. Our approach...
Persistent link: https://www.econbiz.de/10014544750