homogeneity approach (Cheng, Chen, Chiu, and Xu, 1998) and the histon-based approach (Mohabey and Ray, 2000) exploit this correlation to improve the quality of segmentation. The concept of the histon, introduced by Mohabey and Ray (2000), is an encrustation of the histogram that visualizes the multi-dimensional color information in an integrated fashion. The concept has found applicability in boundary-region analysis problems. The histon encapsulates the fundamentals of color image segmentation in a rough-set theoretic sense and provides a direct means of segregating a pool of inhomogeneous regions into its components.
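To make the encrustation idea concrete before the formal treatment in section 10.3, the sketch below computes a histon for one color channel under a common formulation: every pixel whose neighborhood is color-similar contributes an extra count, so the histon always lies on or above the ordinary histogram. The function name, the precomputed boolean `similarity` map, and the `bins` parameter are illustrative assumptions, not the source's notation.

```python
import numpy as np

def histon(channel, similarity, bins=256):
    """Histon of a single color channel (illustrative sketch).

    channel    : 2-D array of intensities in [0, bins)
    similarity : boolean map, True where a pixel's neighborhood
                 color distance falls below a chosen threshold
    """
    # Similar pixels count twice; the rest count once, as in a histogram.
    weights = 1 + similarity.astype(int)
    h, _ = np.histogram(channel.ravel(), bins=bins,
                        range=(0, bins), weights=weights.ravel())
    return h
```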
In this chapter, we present a new technique for color image segmentation using a rough-set theoretic approach. The roughness index, obtained by correlating the histon with the upper approximation and the histogram with the lower approximation of a rough set, is used as the basis for segmentation (Mushrif and Ray, 2008).
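Reading the histogram value as the size of the lower approximation and the histon value as the size of the upper approximation suggests a per-bin roughness $\rho(g) = 1 - h(g)/H(g)$. The sketch below assumes that formulation; the function name and the treatment of empty bins are our own choices rather than details taken from the source.

```python
import numpy as np

def roughness_index(histogram, histon):
    """Per-bin roughness rho(g) = 1 - |lower| / |upper|, with the
    histogram as the lower and the histon as the upper approximation."""
    h = np.asarray(histogram, dtype=float)
    H = np.asarray(histon, dtype=float)
    # Bins the histon never touches describe an empty set; treat them as exact.
    ratio = np.divide(h, H, out=np.ones_like(h), where=H > 0)
    return 1.0 - ratio
```

The segmentation algorithm of section 10.4 builds on this index.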
In the next section, we present the basic concepts of rough set theory and some important properties of rough sets. In section 10.3, we describe the concept of the histon and the calculation of the roughness measure. Section 10.4 describes the segmentation algorithm; experimental results are given in section 10.5, followed by concluding remarks in section 10.6.
10.2 Rough-set theory and properties
Rough set theory, introduced by Z. Pawlak (Pawlak, 1991), represents a new mathematical approach to vagueness and uncertainty. The theory is especially useful for discovering patterns in data in real-life applications such as medical diagnosis (Tanaka, Ishibuchi, and Shigenaga, 1992), pharmacology, industry (Szladow and Ziarko, 1992), image analysis (Pal, Shankar, and Mitra, 2005), and others.
Rough set theory provides a possibilistic approach to the classification and extraction of knowledge from a data set. It supports granularity in knowledge and is concerned with understanding knowledge, finding means of representing knowledge, and automating the extraction of information from knowledge bases. Rough set theory addresses the issue of indiscernibility and is a formal framework for the automated transformation of data into knowledge. Knowledge is primarily defined by the ability of the system to classify data or objects. Thus, it is necessarily connected with the variety of classification patterns related to specific parts of the real or abstract world, called the universe of discourse.
In this section, we introduce some preliminary concepts of rough-set theory that are relevant
to this chapter.
Given a finite set $U \neq \emptyset$ (the universe) of objects, any subset $X \subseteq U$ of the universe is called a concept or a category in $U$, and any family of concepts in $U$ is referred to as abstract knowledge about $U$. Categories lead to a classification, or partition, of the universe $U$ into families $C = \{X_1, X_2, \ldots, X_n\}$ such that $X_i \subseteq U$, $X_i \neq \emptyset$, $X_i \cap X_j = \emptyset$ for $i \neq j$, $i, j = 1, 2, \ldots, n$, and $\bigcup_i X_i = U$.
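As a small worked illustration of this definition (ours, not the source's), the snippet below checks whether a family of subsets forms such a classification of a universe:

```python
def is_partition(universe, family):
    """True iff `family` consists of non-empty, pairwise-disjoint
    subsets of `universe` whose union is the whole universe."""
    sets = [set(x) for x in family]
    if any(not s or not s <= set(universe) for s in sets):
        return False  # each X_i must be a non-empty subset of U
    union = set().union(*sets)
    disjoint = sum(len(s) for s in sets) == len(union)
    return disjoint and union == set(universe)

# {1, 2}, {3}, {4} partition U = {1, 2, 3, 4}.
print(is_partition({1, 2, 3, 4}, [{1, 2}, {3}, {4}]))  # True
```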
A knowledge base is a relational system $K = (U, R)$, where $U \neq \emptyset$ and $R$ is a family of equivalence relations over $U$. If $P \subseteq R$ and $P \neq \emptyset$, then $\bigcap P$ is also an equivalence relation, denoted by $IND(P)$ and known as the indiscernibility relation over $P$. Moreover,
\[
[x]_{IND(P)} = \bigcap_{R \in P} [x]_R \qquad (10.1)
\]
Thus, $U/IND(P)$, or simply $U/P$, denotes the knowledge associated with the family of equivalence relations $P$, called the $P$-basic knowledge about $U$ in the knowledge base $K$. The equivalence classes of $IND(P)$ are called the basic categories of knowledge $P$; these $P$-basic categories are the elementary building blocks of our knowledge about $U$.
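Equation (10.1) can be made concrete with a short sketch. If each equivalence relation $R \in P$ is encoded as an attribute function (an encoding of our own choosing, not from the source), two objects are $IND(P)$-indiscernible exactly when they agree on every attribute, and the intersection of classes in Eq. (10.1) falls out of grouping by the full attribute signature:

```python
from collections import defaultdict

def ind_classes(universe, attributes):
    """Equivalence classes of IND(P): objects agreeing on every
    attribute in P share a class, i.e. [x]_IND(P) is the
    intersection of the classes [x]_R over all R in P."""
    classes = defaultdict(set)
    for x in universe:
        signature = tuple(a(x) for a in attributes)  # one value per relation
        classes[signature].add(x)
    return list(classes.values())

# Toy universe U = {0, ..., 7} with P = {parity, magnitude}:
U = range(8)
P = [lambda x: x % 2, lambda x: x < 4]
print(ind_classes(U, P))  # [{0, 2}, {1, 3}, {4, 6}, {5, 7}]
```

Each block of the resulting partition is a $P$-basic category: the finest distinction the knowledge base can express with the relations in $P$.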