mining, and machine learning algorithms. The presence of both known logical
rules and degrees of uncertainty requires a formalism that allows the representation
of both deterministic and uncertain aspects of the problem. In the following,
we introduce such a probabilistic-logical framework based on Markov logic and
show how description logic ontologies are represented in the language. Moreover,
we describe the application of an efficient probabilistic inference algorithm that
uses integer linear programming. The main reason for focusing on Markov logic
is its declarative and easy-to-understand syntax as well as its rather expressive
nature: numerous other statistical relational formalisms can be represented
with Markov logic networks. However, depending on the application and the
intended users, other statistical relational learning (SRL) formalisms might be
more appropriate, and we make no claim of Markov logic being superior to other
languages.
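To make the role of integer linear programming concrete, the following is a minimal sketch of the standard encoding of MAP inference over weighted ground clauses: one binary variable per ground atom, one per soft clause, and hard clauses as ordinary ILP constraints. The example clauses and the use of the PuLP solver are our own illustrative assumptions and not part of the framework introduced below.

```python
# A minimal sketch of MAP inference as an ILP (hypothetical example data).
# Each ground clause is (positive_atoms, negative_atoms, weight); weight=None
# marks a hard clause that every solution must satisfy. Assumes nonnegative
# weights for the soft clauses.
import pulp

clauses = [
    ({"smokes_a"}, set(), 1.2),          # soft: smokes(a)
    ({"cancer_a"}, {"smokes_a"}, 2.0),   # soft: smokes(a) => cancer(a)
    (set(), {"cancer_a"}, None),         # hard: !cancer(a)
]

atoms = {a for pos, neg, _ in clauses for a in pos | neg}
prob = pulp.LpProblem("MLN_MAP", pulp.LpMaximize)

# One binary variable per ground atom: its truth value in the MAP world.
x = {a: pulp.LpVariable(a, cat="Binary") for a in atoms}
# One binary variable per soft clause: whether that clause is satisfied.
z = {}

for i, (pos, neg, w) in enumerate(clauses):
    # A clause is satisfied iff some positive atom is true
    # or some negative atom is false.
    sat = pulp.lpSum(x[a] for a in pos) + pulp.lpSum(1 - x[a] for a in neg)
    if w is None:
        prob += sat >= 1     # hard clause: must hold in every solution
    else:
        z[i] = pulp.LpVariable(f"z{i}", cat="Binary")
        prob += sat >= z[i]  # z_i may be 1 only if clause i is satisfied

# Maximize the total weight of satisfied soft clauses.
prob += pulp.lpSum(w * z[i] for i, (_, _, w) in enumerate(clauses)
                   if w is not None)

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print({a: int(pulp.value(v)) for a, v in x.items()})
```

Solving this program returns a most probable possible world; here the hard clause forces cancer_a to be false, so the optimum makes smokes_a false as well, satisfying the heavier implication clause rather than the lighter atomic one.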
4.1 Markov Logic
A Markov logic network (MLN) is a set of first-order formulas with weights.
Intuitively, the more evidence we have that a formula is true, the higher the
weight of this formula. To simplify the presentation of the technical parts, we do
not include functions. Moreover, Markov logic makes several assumptions such as (a)
different constants refer to different objects (the unique names assumption) and
(b) the only objects in the domain are those representable using the constants
(the closed domain assumption) [80]. In addition, we assume that all (ground)
formulas of a Markov logic network are in clausal form and use the terms formula
and clause interchangeably.
Syntax. A signature is a triple S = (O, H, C) with O a finite set of observable
predicate symbols, H a finite set of hidden predicate symbols, and C a finite set
of constants. A Markov logic network (MLN) is a set of pairs {(F_i, w_i)} with each
F_i being a function-free first-order formula built using predicates from O ∪ H
and each w_i ∈ ℝ a real-valued weight associated with formula F_i. For most
applications of Markov logic, one also wants to include so-called hard formulas.
Hard formulas are constraints on the possible worlds: intuitively, whenever a
possible world violates a hard formula, the probability of that world is either zero
or close to zero. There are different ways to define Markov logic networks, depending
on whether one models hard formulas with large weights (resulting in
a probability close to zero for any possible world violating them) or with actual
constraints (resulting in a zero probability for any possible world violating them).
For the sake of simplicity, we will use the standard definition with weights only.
Later, when we cover some inference methods, we will also introduce the alternative definition.
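As a concrete illustration of the definition above, the following minimal sketch represents a signature and an MLN as plain data structures. All predicate names, constants, formulas, and weights are invented for illustration, and, following the weights-only definition, a large finite weight stands in for a hard formula.

```python
# A minimal sketch of a signature S = (O, H, C) and of an MLN as a set of
# weighted formulas. All names and weights below are illustrative only.
from typing import NamedTuple

class Signature(NamedTuple):
    observable: frozenset  # O: observable predicate symbols
    hidden: frozenset      # H: hidden predicate symbols
    constants: frozenset   # C: constants

S = Signature(
    observable=frozenset({"friends/2"}),
    hidden=frozenset({"smokes/1"}),
    constants=frozenset({"anna", "bob"}),
)

# An MLN is a set of pairs (F_i, w_i). Formulas are written as strings for
# readability; in clausal form, friends(x,y) ^ smokes(x) => smokes(y)
# becomes !friends(x,y) v !smokes(x) v smokes(y).
HARD = 1e6  # large weight standing in for a hard formula (weights-only view)

mln = {
    ("!friends(x,y) v !smokes(x) v smokes(y)", 1.1),  # soft formula
    ("!friends(x,x)", HARD),                          # hard: no self-friendship
}
```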
Semantics. Let M = {(F_i, w_i)} be a Markov logic network with signature
S = (O, H, C). A grounding of a first-order formula F is generated by substituting
each occurrence of every variable in F with constants in C. Existentially
quantified formulas are substituted by the disjunctions of their groundings over
the finite set of constants. A formula that does not contain any variables is
called ground.
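The following minimal sketch implements this grounding step for universally quantified clausal formulas, assuming a simple encoding of clauses as lists of signed literals (our own representation, chosen for readability); an existentially quantified variable would instead be expanded into a disjunction over C.

```python
# A minimal sketch of grounding: substitute every variable of a clause by
# every combination of constants from C. The literal encoding is our own.
from itertools import product

def ground_clause(clause, variables, constants):
    """Return all groundings of `clause`, a list of (sign, predicate, args)
    literals, obtained by substituting `variables` with constants."""
    groundings = []
    for values in product(constants, repeat=len(variables)):
        theta = dict(zip(variables, values))  # the substitution
        groundings.append([
            (sign, pred, tuple(theta.get(arg, arg) for arg in args))
            for sign, pred, args in clause
        ])
    return groundings

# Clausal form of friends(x,y) ^ smokes(x) => smokes(y):
clause = [(False, "friends", ("x", "y")),   # !friends(x, y)
          (False, "smokes", ("x",)),        # !smokes(x)
          (True,  "smokes", ("y",))]        # smokes(y)

for g in ground_clause(clause, ["x", "y"], ["anna", "bob"]):
    print(g)
# Two constants and two variables yield 2^2 = 4 ground clauses.
```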