The intermittency of a time series can be defined as a normalized difference of its scaling parameters. We establish a central limit theorem for estimates of intermittency under the null hypothesis of a random walk. Simulations of random walks indicate that the distribution of intermittency...
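As an illustration of the simulation setup described, here is a minimal Monte Carlo sketch. It assumes intermittency is measured as the normalized difference zeta(q)/q - zeta(1) of empirical scaling exponents estimated from moments of increments; this particular form, and the names `scaling_exponent` and `intermittency`, are assumptions for illustration, since the excerpt does not give the paper's exact definition. Under the random-walk null, zeta(q) = q/2, so the statistic is centered at zero.

```python
import numpy as np

rng = np.random.default_rng(0)

def scaling_exponent(x, q, lags):
    """Estimate the scaling exponent zeta(q) of a path x by regressing
    log mean |x(t+s) - x(t)|^q on log s over the given lags."""
    logs, logm = [], []
    for s in lags:
        incr = x[s:] - x[:-s]
        logs.append(np.log(s))
        logm.append(np.log(np.mean(np.abs(incr) ** q)))
    slope, _ = np.polyfit(logs, logm, 1)  # slope of log-log regression
    return slope

def intermittency(x, q=4, lags=range(1, 33)):
    """Hypothetical normalized difference of scaling parameters:
    I(q) = zeta(q)/q - zeta(1). For a random walk, zeta(q) = q/2,
    so I(q) = 0 in expectation."""
    return scaling_exponent(x, q, lags) / q - scaling_exponent(x, 1, lags)

# Monte Carlo under the random-walk null: empirical distribution
# of the intermittency estimator across independent sample paths.
n, reps = 2048, 500
estimates = np.array([
    intermittency(np.cumsum(rng.standard_normal(n))) for _ in range(reps)
])
print(f"mean = {estimates.mean():.4f}, sd = {estimates.std():.4f}")
```

The printed mean and standard deviation summarize the sampling distribution of the estimator under the null; a histogram of `estimates` would show the shape that the abstract's simulations refer to.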
Persistent link: https://www.econbiz.de/10012923881