[Fig. 7.13 shows four bar charts of ATC-13 damage state frequencies, each giving 'Yes', 'No' and 'Unknown' counts for the corresponding performance modifier.]
7.13 Performance modifiers for building vulnerability: (a) discontinuous columns; (b) plan setback; (c) torsional imbalance; and (d) plan irregularity.
7.5.4 Conditional probability development
This example was implemented in the commercially available Bayesian network software Netica. Once the hierarchical structure is developed (Fig. 7.12), Netica 1 offers three alternatives for learning models: regular learning, expectation maximization (EM) learning, and gradient learning. Regular learning entails applying Bayesian conditional probability to statistical models built from the data loaded into the nodes of the network. For regular learning, the inputs (PI, VI, YC, DD, SSH) should be at the same level and directly connected to the output (BD). However, as shown in Fig. 7.12, where hidden variables are introduced (ID, DC, BV), the EM and/or gradient learning algorithm should be used.
1 Norsys Netica, Advanced Topics: Missing Data and Hidden Variable, Online Tutorial (http://www.norsys.com/tutorials/netica/secD/tut_D1.htm#EMLearning).
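Netica's EM learner is proprietary, but the idea it implements can be illustrated with a minimal sketch. The code below runs EM on a toy network X → H → D, where the binary node H plays the role of a hidden intermediate variable (analogous to ID, DC or BV): the E-step computes the posterior over the hidden state for each record, and the M-step re-estimates the conditional probability tables from the resulting expected counts. All names and the toy structure are illustrative assumptions, not Netica's API or the chapter's actual network.

```python
import math
import random

def em_learn(data, iters=50, seed=0):
    """EM for a tiny Bayesian network X -> H -> D with hidden binary H.

    data: list of (x, d) pairs with x, d in {0, 1}; H is never observed.
    Returns (p_h_given_x, p_d_given_h, log_likelihoods), where
    p_h_given_x[x] = P(H=1 | X=x) and p_d_given_h[h] = P(D=1 | H=h).
    """
    rng = random.Random(seed)
    # Random initialisation of the CPT entries that involve the hidden node.
    p_h = [rng.uniform(0.3, 0.7) for _ in range(2)]   # P(H=1 | X=x)
    p_d = [rng.uniform(0.3, 0.7) for _ in range(2)]   # P(D=1 | H=h)
    lls = []
    for _ in range(iters):
        # Expected-count accumulators (sufficient statistics for the M-step).
        num_h = [0.0, 0.0]; den_h = [0.0, 0.0]        # for P(H=1 | X=x)
        num_d = [0.0, 0.0]; den_d = [0.0, 0.0]        # for P(D=1 | H=h)
        ll = 0.0
        for x, d in data:
            # E-step: posterior q1 = P(H=1 | x, d), proportional to
            # P(H=h | x) * P(d | h) summed out over the hidden state.
            like = []
            for h in (0, 1):
                ph = p_h[x] if h == 1 else 1 - p_h[x]
                pd = p_d[h] if d == 1 else 1 - p_d[h]
                like.append(ph * pd)
            z = like[0] + like[1]
            ll += math.log(z)                         # log P(d | x)
            q1 = like[1] / z
            num_h[x] += q1; den_h[x] += 1.0
            for h, qh in ((0, 1 - q1), (1, q1)):
                num_d[h] += qh * d; den_d[h] += qh
        lls.append(ll)
        # M-step: maximum-likelihood CPTs from the expected counts.
        p_h = [num_h[x] / den_h[x] for x in range(2)]
        p_d = [num_d[h] / den_d[h] for h in range(2)]
    return p_h, p_d, lls
```

Each iteration is guaranteed not to decrease the observed-data log-likelihood, which is the property that makes EM usable when intermediate nodes such as ID, DC and BV have no data of their own.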
 