For random variables A and X, consider the relationship
E{XA} = E{X}E{A} + ρ_{XA}σ_Xσ_A,
which, up to a bit of arithmetic, is basically the definition of correlation. If A is a binary variable, though, we can do more with this; among other things, in that case σ_A² = E{A}(1 - E{A}). Conflating the variable A a bit with the "event" A = 1, and doing a bit of algebra, we get

E{X|A} - E{X} = ρ_{XA}σ_X√((1 - E{A})/E{A}).
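A quick numerical sanity check of that identity, sketched in Python; the particular joint distribution of X and A below is an arbitrary choice, just for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Binary A with P(A = 1) = p, and an X that co-varies with A
# (this construction is arbitrary; any correlated X would do).
p = 0.2
A = (rng.random(n) < p).astype(float)
X = 3.0 * A + rng.normal(0.0, 2.0, size=n)

EX, EA = X.mean(), A.mean()
sigma_X = X.std()
rho = np.corrcoef(X, A)[0, 1]

# Change in the expectation of X upon learning that A occurred...
lhs = X[A == 1.0].mean() - EX
# ...versus rho * sigma_X * sqrt of the odds against A.
rhs = rho * sigma_X * np.sqrt((1.0 - EA) / EA)

print(lhs, rhs)  # the two agree up to sampling noise
```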
The effect of the arrival of new information on the expected value of a variable is proportional to the square root of the odds ratio. Among other things, it can't be more than σ_X times the square root of the odds ratio; this bound, which (obviously?) is reached when X is a linear function of A and is therefore itself essentially a binary variable, can be derived more directly in that context.
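For concreteness (this particular linear choice is just to exhibit the equality case), take X = aA + b with a > 0. Then ρ_{XA} = 1 and σ_X = aσ_A = a√(E{A}(1 - E{A})), so

E{X|A} - E{X} = (a + b) - (aE{A} + b) = a(1 - E{A}) = σ_X√((1 - E{A})/E{A}),

and the bound is attained exactly.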