Algorithm 2.1 Inductive Causation Algorithm
1. For each pair of variables A and B in V search for a set S_AB ⊂ V (including
   S_AB = ∅) such that A and B are independent given S_AB and A, B ∉ S_AB. If
   there is no such set, place an undirected arc between A and B.
2. For each pair of non-adjacent variables A and B with a common neighbor C,
   check whether C ∈ S_AB. If this is not true, set the direction of the arcs
   A − C and C − B to A → C and C ← B.
3. Set the direction of arcs which are still undirected by applying recursively
   the following two rules:
   a. if A is adjacent to B and there is a strictly directed path from A to B
      (a path leading from A to B containing no undirected arcs), then set the
      direction of A − B to A → B;
   b. if A and B are not adjacent but A → C and C − B, then change the latter
      to C → B.
4. Return the resulting (completed partially) directed acyclic graph.
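
To make the first two steps of the algorithm concrete, the following is a minimal sketch in Python. The conditional independence oracle ci_indep(a, b, s) is a placeholder introduced here for illustration (in practice it would be a statistical test of conditional independence), and the orientation rules of step 3 are left out for brevity.

```python
# A minimal sketch of steps 1 and 2 of the Inductive Causation algorithm.
# ci_indep(a, b, s) is a hypothetical oracle answering "are a and b
# independent given the set s?"; in practice it would be a statistical test.

from itertools import combinations

def ic_step1(variables, ci_indep):
    """Step 1: connect A and B with an undirected arc unless some
    separating set S_AB (possibly empty) renders them independent."""
    undirected = set()   # frozenset({A, B}) for each undirected arc
    sepset = {}          # (A, B) -> separating set S_AB
    for a, b in combinations(variables, 2):
        others = [v for v in variables if v not in (a, b)]
        found = None
        for size in range(len(others) + 1):
            for s in combinations(others, size):
                if ci_indep(a, b, set(s)):
                    found = set(s)
                    break
            if found is not None:
                break
        if found is None:
            undirected.add(frozenset((a, b)))
        else:
            sepset[(a, b)] = sepset[(b, a)] = found
    return undirected, sepset

def ic_step2(variables, undirected, sepset):
    """Step 2: for non-adjacent A and B with a common neighbor C not in
    S_AB, orient the arcs as the v-structure A -> C <- B."""
    directed = set()     # (tail, head) pairs
    for a, b in combinations(variables, 2):
        if frozenset((a, b)) in undirected:
            continue                     # A and B are adjacent: skip
        for c in variables:
            if (frozenset((a, c)) in undirected
                    and frozenset((b, c)) in undirected
                    and c not in sepset.get((a, b), set())):
                directed.update({(a, c), (b, c)})
    undirected = undirected - {frozenset(arc) for arc in directed}
    return directed, undirected
```

Step 3 would then repeatedly apply rules (a) and (b) to the arcs that remain undirected until no further arc can be oriented.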
2.2 Static Bayesian Networks Modeling
The task of fitting a Bayesian network is usually called learning , a term borrowed
from expert systems theory and artificial intelligence ( Koller and Friedman , 2009 ).
It is performed in two different steps, which correspond to model selection and
parameter estimation techniques in classic statistical models.
The first step is called structure learning and consists in identifying the graph
structure of the Bayesian network. Ideally, it should be the minimal I-map of the
dependence structure of the data or, failing that, it should at least result in a
distribution as close as possible to the correct one in the probability space.
Several algorithms have been proposed in the literature for structure learning.
Despite the variety of theoretical backgrounds and terminology, they fall under
three broad categories: constraint-based, score-based, and hybrid algorithms. As
an alternative, the network structure can be built manually from the domain
knowledge of a human expert and from prior information available on the data.
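
As an illustration of the score-based family mentioned above, the sketch below performs greedy hill climbing over directed acyclic graphs, adding or removing one arc at a time. The scoring function score(dag, data), standing in for any network score such as BIC, and the parent-set representation of the graph are assumptions of this sketch rather than part of the text.

```python
# A minimal sketch of score-based structure learning by greedy hill climbing.
# dag is represented as a dict mapping each node to its set of parents;
# score(dag, data) is a placeholder for a network score such as BIC.

def is_acyclic(dag):
    """Depth-first search for directed cycles in the parent-set representation."""
    WHITE, GREY, BLACK = 0, 1, 2
    colour = {v: WHITE for v in dag}

    def visit(v):
        colour[v] = GREY
        for p in dag[v]:                      # walk child -> parent links
            if colour[p] == GREY:
                return False                  # back edge: directed cycle
            if colour[p] == WHITE and not visit(p):
                return False
        colour[v] = BLACK
        return True

    return all(visit(v) for v in dag if colour[v] == WHITE)

def hill_climb(variables, data, score):
    """Greedy search: starting from the empty graph, repeatedly apply a
    single arc addition or removal that improves the score, stopping at
    a local maximum."""
    dag = {v: set() for v in variables}
    best = score(dag, data)
    improved = True
    while improved:
        improved = False
        for child in variables:
            for parent in variables:
                if parent == child:
                    continue
                candidate = {v: set(ps) for v, ps in dag.items()}
                if parent in candidate[child]:
                    candidate[child].discard(parent)   # try removing parent -> child
                else:
                    candidate[child].add(parent)       # try adding parent -> child
                if not is_acyclic(candidate):
                    continue                           # keep the graph acyclic
                current = score(candidate, data)
                if current > best:
                    dag, best, improved = candidate, current, True
    return dag
```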
The second step is called parameter learning. As the name suggests, it deals with
the estimation of the parameters of the global distribution. This task can be
performed efficiently by estimating the parameters of the local distributions
implied by the structure obtained in the previous step.
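
For discrete data this local decomposition can be written down directly: the sketch below computes maximum likelihood estimates of each node's conditional probability table from the observed frequencies of the node given its parents. The parent-set representation of the graph and the record format are assumptions of this sketch.

```python
# A minimal sketch of parameter learning for discrete data: each node's
# conditional probability table is estimated locally from the observed
# frequencies of the node given its parents in the learned structure.

from collections import Counter

def fit_cpts(dag, records):
    """dag maps each node to its set of parents; records is a list of
    dicts mapping variable names to observed (discrete) values."""
    cpts = {}
    for node, parents in dag.items():
        parents = sorted(parents)
        joint = Counter()      # counts of (parent configuration, node value)
        marginal = Counter()   # counts of the parent configuration alone
        for row in records:
            config = tuple(row[p] for p in parents)
            joint[(config, row[node])] += 1
            marginal[config] += 1
        cpts[node] = {(config, value): count / marginal[config]
                      for (config, value), count in joint.items()}
    return cpts

# Tiny illustration with two discrete variables and the structure A -> B.
example_dag = {"A": set(), "B": {"A"}}
example_records = [{"A": "yes", "B": "on"}, {"A": "yes", "B": "off"},
                   {"A": "no", "B": "off"}]
print(fit_cpts(example_dag, example_records))  # relative frequencies for P(A), P(B | A)
```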
2.2.1 Constraint-Based Structure Learning Algorithms
Constraint-based structure learning algorithms are based on the seminal work of
Pearl on independence maps and its application to causal graphical models. His
Inductive Causation (IC) algorithm (Algorithm 2.1 above) provides the framework
for this class of algorithms.
 