When the learning process starts, GER uses and maintains the Q-tree to select production rules in the grammar. GER tries to select the fittest production rule by using the Q-values. A Q-value represents a reward; it is computed and updated when the individual is evaluated during the evaluation process.
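The rule-selection step described above can be sketched as a Q-value-driven choice among the alternative productions of a grammar node. The following Python sketch is our illustration under an epsilon-greedy assumption; the function names and the update rule are not taken from the authors' implementation.

```python
import random

def select_production(q_values, epsilon=0.1):
    """Pick a production-rule index for one grammar node.

    q_values: one Q-value per alternative production rule.
    With probability epsilon we explore (random rule); otherwise we
    exploit the rule with the highest Q-value.
    """
    if random.random() < epsilon:
        return random.randrange(len(q_values))
    return q_values.index(max(q_values))

def update_q(q_values, rule, reward, alpha=0.5):
    """Move the chosen rule's Q-value toward the reward obtained
    when the individual was evaluated (simple running average)."""
    q_values[rule] += alpha * (reward - q_values[rule])
```

With `epsilon=0` the selection is purely greedy, which makes the behaviour easy to check by hand.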
An important drawback of GE and GER is that they can only guarantee syntactically correct programs; nothing about semantic correctness is assured. To address this drawback, a GER with semantic rules has recently been developed [18]. Semantics are included via translation schemes [19]. We have no space here to describe the process in detail, but semantic rules represent semantic constraints and consist of attributes and rules. They are added to the original grammar rules to prevent semantically wrong individuals from being generated. Both GE and GER can build whatever structure we can describe by means of a grammar, and we will use this capability to describe a vocabulary made up of words. Thus, the following grammar would describe a simple vocabulary with the words wall, box, ball and dice.
<vocabulary> ::= <word-set>  [<vocabulary>.value='vocabulary']  (0)

<word-set>   ::= <word>  [<word-set>.value=<word>.value]  (0)
               | <word-set> ; <word>
                 [if <word-set>.value "contains" <word>.value then semantic=wrong]
                 [<word-set>.value=<word-set>.value+<word>.value]  (1)

<word>       ::= wall  [<word>.value='wall']  (0)
               | box   [<word>.value='box']   (1)
               | ball  [<word>.value='ball']  (2)
               | dice  [<word>.value='dice']  (3)
The numbers on the right-hand side are used during the translation and learning processes to choose the rules. The sentences in square brackets represent the semantic constraints and can be interpreted as follows: each non-terminal has an attribute value, and some non-terminals can include a semantic constraint as a semantic rule. In the example, the rule if <word-set>.value contains <word>.value then semantic=wrong prevents the generation of individuals with two or more repeated words. This way, we discard individuals with repeated words because they are useless.
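The "contains" constraint amounts to a duplicate check over the derived word set. The sketch below is our own illustration of that check (the function name is hypothetical, not from the paper):

```python
def semantically_valid(word_set):
    """Return False when <word-set>.value already 'contains' the
    new <word>.value, i.e. when any word appears twice; this is
    the condition that triggers semantic=wrong in the grammar."""
    seen = set()
    for word in word_set:
        if word in seen:   # duplicate word: semantic=wrong
            return False
        seen.add(word)
    return True
```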
4 Incremental Model and Environment

4.1 Incremental Model of Lexicon Consensus
Basically, the incremental model is supported by a modified GER. The modified version of GER tries to maintain the solutions previously found by the original GER. To clarify how the incremental model is implemented, we summarize the algorithm in figure 1.
The fitness value is calculated from two contributions. First, we consider the individual's skill at generating unique words of the vocabulary. The maximum value of this contribution is the number of unique words in the vocabulary.
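Assuming a fixed target vocabulary, this first contribution might be computed as below; the function and parameter names are our own illustrative assumptions:

```python
def unique_word_score(individual_words, vocabulary):
    """First fitness contribution: the number of distinct
    vocabulary words the individual generates. Its maximum equals
    the number of unique words in the vocabulary."""
    return len(set(individual_words) & set(vocabulary))
```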