If participants receive a signal $s \in S$, they infer whether each state $\omega$ belongs to $\phi^{-1}(s)$ and assign the state a posterior probability:

$$
P(\omega \mid s) =
\begin{cases}
P(\omega)/P(\phi^{-1}(s)), & \omega \in \phi^{-1}(s),\; P(\omega) > 0, \\
0, & \omega \notin \phi^{-1}(s),\; P(\omega) > 0, \\
1, & P(\omega) = 0.
\end{cases}
\tag{1}
$$
In (1), the first equation follows from the definition of conditional probability. The second equation holds because the conditional event is impossible. The third equation holds because $P(\omega \mid s) = P(\omega)/P(\phi^{-1}(s))$ is then of the $0/0$ type; according to the monotonicity assumption (ii), $P(\omega \mid s)$ scales down as $P(\phi^{-1}(s))$ grows, so in this limiting case we can assign it the value 1. In an economic sense, formula (1) says that market participants form a new subjective (posterior) belief about the probability of the natural states according to the observed signal, and determine the value of this probability as the ratio of the objective probability of the natural state to the subjective prior probability of the signal, meeting the boundary conditions.
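The piecewise rule in (1) can be made concrete with a small sketch. Here the state space, the prior, and the signal function `phi` (with `phi_inv(s)` the set of states emitting signal `s`) are illustrative assumptions, not taken from the text:

```python
# Sketch of formula (1): posterior P(omega | s) under a signal function phi.
# Omega is a finite state space with prior P; phi maps each state to a signal;
# phi^{-1}(s) = {omega : phi(omega) == s}. All names here are illustrative.

def posterior(omega, s, prior, phi):
    """P(omega | s) as defined piecewise in (1)."""
    if prior[omega] == 0:
        return 1.0                      # third case: assigned by convention
    phi_inv_s = [w for w in prior if phi[w] == s]
    if omega not in phi_inv_s:
        return 0.0                      # second case: impossible event
    p_phi_inv = sum(prior[w] for w in phi_inv_s)
    return prior[omega] / p_phi_inv     # first case: conditional probability

# Example: two states, and the signal reveals the state exactly.
prior = {"up": 0.6, "down": 0.4}
phi = {"up": "good_news", "down": "bad_news"}
print(posterior("up", "good_news", prior, phi))    # 1.0
print(posterior("down", "good_news", prior, phi))  # 0.0
```

With a perfectly revealing signal, the posterior collapses onto the emitting state, as the first and second cases of (1) require.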
Define the degree of consistency of signal $s$ with natural state $\omega$:
$$
d : \Omega \times S \to \mathbb{R} \ \text{(the real set)},
$$

$$
d(\omega, s) = P(\omega \mid s) - P(\omega) =
\begin{cases}
P(\omega)\,[1 - P(\phi^{-1}(s))]/P(\phi^{-1}(s)), & \omega \in \phi^{-1}(s),\; P(\omega) > 0, \\
-P(\omega), & \omega \notin \phi^{-1}(s),\; P(\omega) > 0, \\
1, & P(\omega) = 0.
\end{cases}
\tag{2}
$$
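The closed form in (2) follows from substituting (1) into $P(\omega \mid s) - P(\omega)$, and the two sides can be checked numerically. The following sketch uses an assumed three-state example; the helper names are illustrative:

```python
# Sketch of formula (2): d(omega, s) = P(omega|s) - P(omega), checked against
# its piecewise closed form. States, prior, and phi are illustrative.

def posterior(omega, s, prior, phi):
    if prior[omega] == 0:
        return 1.0
    phi_inv_s = [w for w in prior if phi[w] == s]
    if omega not in phi_inv_s:
        return 0.0
    return prior[omega] / sum(prior[w] for w in phi_inv_s)

def d(omega, s, prior, phi):
    """Degree of consistency via the definition P(omega|s) - P(omega)."""
    return posterior(omega, s, prior, phi) - prior[omega]

def d_closed_form(omega, s, prior, phi):
    """Degree of consistency via the piecewise closed form in (2)."""
    if prior[omega] == 0:
        return 1.0
    phi_inv_s = [w for w in prior if phi[w] == s]
    p = sum(prior[w] for w in phi_inv_s)
    if omega in phi_inv_s:
        return prior[omega] * (1 - p) / p
    return -prior[omega]

prior = {"a": 0.2, "b": 0.3, "c": 0.5}
phi = {"a": "s1", "b": "s1", "c": "s2"}
for w in prior:
    for s in ("s1", "s2"):
        assert abs(d(w, s, prior, phi) - d_closed_form(w, s, prior, phi)) < 1e-12
        assert -1 <= d(w, s, prior, phi) <= 1
```

The assertions confirm that the definition and the closed form of (2) agree, and that $d$ stays within $[-1, 1]$.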
Obviously, $-1 \le d(\omega, s) \le 1$. In essence, market participants recognize the real situation of the natural states from the difference between the subjective posterior probability (based on the observed signal) and the objective probability of the natural states. Such recognition may be consistent with or contrary to the real situation.

Based on the above analysis, we give a formal definition of information as follows:
Definition of Information. For a given space $(\Omega, S, P, \phi, d)$: when $d(\omega, s) > 0$, the increased degree of recognition of the appearance of state $\{\omega\}$ caused by the observed signal $s$ is known as information (positive information); when $d(\omega, s) < 0$, the opposite degree of recognition of the appearance of state $\{\omega\}$ induced by the observed signal $s$ is known as noise (negative information).
In the terms of information theory, the significance of $d(\omega, s)$ can be re-explained as follows: $d(\omega, s) = 0$ denotes that the observed signal $s$ conveys no information about natural state $\{\omega\}$; $d(\omega, s) > 0$ means that the observed signal $s$ carries information about natural state $\{\omega\}$ to some extent; $d(\omega, s) < 0$ indicates that the observed signal $s$ carries negative information about natural state $\{\omega\}$ to some extent. The closer $d(\omega, s)$ is to 1, the stronger the information about natural state $\{\omega\}$ that the observed signal $s$ conveys; the closer $d(\omega, s)$ is to $-1$, the stronger the noise about natural state $\{\omega\}$ that the observed signal $s$ conveys.
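A worked numeric example illustrates both ends of this scale. The setup below (a rare "crash" state with its own warning signal) is an assumed illustration, not taken from the text:

```python
# Worked example: d near 1 is strong information, d near -1 is strong noise.
# States, probabilities, and signal names are illustrative assumptions.

prior = {"crash": 0.01, "calm": 0.99}
phi = {"crash": "alarm", "calm": "quiet"}   # each state emits its own signal

def posterior(omega, s):
    if prior[omega] == 0:
        return 1.0
    inv = [w for w in prior if phi[w] == s]
    return prior[omega] / sum(prior[w] for w in inv) if omega in inv else 0.0

def d(omega, s):
    return posterior(omega, s) - prior[omega]

print(d("crash", "alarm"))  # 0.99: the alarm carries strong information about a crash
print(d("calm", "alarm"))   # -0.99: the alarm is strong noise with respect to calm
```

Because the alarm is emitted only by the rare crash state, observing it lifts the posterior of "crash" from 0.01 to 1 ($d = 0.99$, near the information extreme) and drops the posterior of "calm" from 0.99 to 0 ($d = -0.99$, near the noise extreme).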