Time Period and Risk Measures in the General Risk Equation
In an earlier paper, a general risk equation was derived that applies to all non-growth systems, including financial systems. It related the expected throughput capacity of any system to both system resources and positive risk of loss of throughput capacity. Two risk measures were required: a new <italic>MEL-risk</italic> measure and the conventional <italic>standard-deviation risk</italic> measure. In this paper we show that the two apparently distinct risk measures are intimately related, and that which measure is appropriate depends merely on the time period over which the risk is calculated. We show, ultimately by application of the Central Limit Theorem, that if the time period is altered sufficiently, at some point the need for one measure transitions into the need for the other, without any change in the underlying physical system. This leads to a comprehensive risk measure that defaults to either the MEL-risk measure or the standard-deviation measure, depending not on the physical system but merely on the time period over which the risk is calculated.
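The Central Limit Theorem argument in the abstract can be illustrated with a small simulation (this is a generic CLT sketch, not the paper's own model: the exponential loss distribution and the period counts below are illustrative assumptions). Per-period losses may be strongly skewed, but losses aggregated over many periods tend toward a normal distribution, which is the regime where a standard-deviation measure becomes the natural summary of risk:

```python
import random
import statistics

random.seed(42)

def sample_skewness(xs):
    """Sample skewness: mean of cubed standardized deviations."""
    mu = statistics.mean(xs)
    sd = statistics.pstdev(xs)
    return sum(((x - mu) / sd) ** 3 for x in xs) / len(xs)

# Per-period losses drawn from a heavily skewed distribution
# (exponential, skewness = 2), standing in for short-horizon behavior.
single_period = [random.expovariate(1.0) for _ in range(20000)]

# Losses aggregated over 100 periods: by the CLT the sum tends toward
# a normal distribution, so its skewness shrinks (roughly 2/sqrt(100)).
aggregated = [sum(random.expovariate(1.0) for _ in range(100))
              for _ in range(2000)]

skew_short = sample_skewness(single_period)  # strongly skewed
skew_long = sample_skewness(aggregated)      # nearly symmetric
```

Under these assumptions, `skew_short` stays close to 2 while `skew_long` falls near 0.2, showing how lengthening the aggregation period moves the loss distribution toward normality.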
Year of publication: 2007
Authors: Bradley, James
Published in: Journal of Risk Research. - Taylor & Francis Journals, ISSN 1366-9877. - Vol. 10.2007, 3, p. 355-369
Publisher: Taylor & Francis Journals
Similar items by person
-
The relation between student attitudes toward graphs and performance in economics
Cohn, Elchanan, (2004)
-
Determinants of undergraduate GPAs : SAT scores, high-school GPA and high-school rank
Cohn, Elchanan, (2004)
-
Do Graphs Promote Learning in Principles of Economics?
Cohn, Elchanan, (2001)