With enough data in each group—enough choices—recombinations like this can
occur following the basic models from the database. By using the term 'recombina-
tion' here, I'm referring to a Markov chaining process that uses probabilities as just
described. Unfortunately, this process can produce excessively long or almost imme-
diately short outputs, neither of which really imitates the melodies shown. Therefore,
the program needs more rules that will, on the one hand, provide more than next-pitch
probabilities to give shape and logic to these melodies and, on the other hand, ensure
that the output won't simply repeat one of the melodies exactly.
As a simple example of how this might occur, imagine that the program's groups
are labeled with single-digit numbers, each containing a significant number of options within it. The models from which the program creates new output might then be:
12345678
14358672
32145768
These numbers tell us that the orders provided here are acceptable. The number 1 can be followed by 2 or 4, with 4 the more likely choice by a probability of two to one. The number 2 can be followed by 3, 1, or nothing (a cadence in music), each with equal chance. Thus, correctly recombining the models could produce the outputs 34358672 and 12356768, and so on. Both of these are correct as far as they go.
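To make this concrete, here is a minimal sketch in Python of the recombination just described. It is my own illustration, not the program's actual code; the names (MODELS, build_table, generate) are hypothetical, and a cadence is modeled as drawing an end marker.

import random
from collections import defaultdict

# The three models from the text; None marks a cadence (end of melody).
MODELS = ["12345678", "14358672", "32145768"]

def build_table(models):
    """Map each number to everything observed to follow it. Duplicates
    are kept so random.choice reflects the odds (e.g. 1 -> ['2', '4', '4'])."""
    table = defaultdict(list)
    for model in models:
        for current, nxt in zip(model, model[1:]):
            table[current].append(nxt)
        table[model[-1]].append(None)   # end of a model = possible cadence
    return table

def generate(table, start="1"):
    """Forward-chain from `start` until a cadence (None) is drawn."""
    output, current = [start], start
    while True:
        choice = random.choice(table[current])
        if choice is None:              # cadence reached: stop here
            return "".join(output)
        output.append(choice)
        current = choice

table = build_table(MODELS)
print(generate(table))                  # e.g. '12356768'

Note that nothing in this sketch constrains length: a run can end after two numbers or wander on at length, which is exactly the weakness just noted.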
What I've just described is often called a First Order Markov forward-chaining
process. First order, because only one member (the previous one) of a chain of items
is necessary to predict the next. A Second Order Markov forward-chaining process
requires two numbers to gauge the next, and so forth. In this manner, almost any
algorithmic process can be described as a Markov chain. As an example, using the previous models, the pair 1 and 2 can only be followed by 3, while the pair 1 and 4 can be followed by 3 or 5 with equal probability. This process certainly provides more context for the choices made. Unfortunately, second-order chaining does not help end an output at a reasonable phrase length. Furthermore, the higher the order used, the more likely the output is to repeat one of the inputs exactly.
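Extending the earlier sketch to second order only requires keying the table on the previous two numbers rather than one; again, a hypothetical illustration rather than the program's code.

from collections import defaultdict

MODELS = ["12345678", "14358672", "32145768"]   # as before

def build_second_order(models):
    """Second order: the previous *two* numbers together select the next."""
    table = defaultdict(list)
    for m in models:
        for a, b, c in zip(m, m[1:], m[2:]):
            table[(a, b)].append(c)
        table[(m[-2], m[-1])].append(None)      # cadence after the final pair
    return table

table = build_second_order(MODELS)
print(table[("1", "2")])   # ['3']       -- 1, 2 can only be followed by 3
print(table[("1", "4")])   # ['3', '5']  -- 3 or 5 with equal probability

Generation then works as before, except that the last two numbers of the growing output select each next choice.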
To make the process more constrained and contextual, therefore, I now add orders
of backward-chaining Markov processes used in conjunction with the forward-
chaining ones. This means that the just-described process will also include the objects
that follow the to-be-chosen one. Based on the previous models as input, 1 can be followed by 2 or 4, with 4 having a two-to-one better chance of being chosen, as previously mentioned. However, backward chaining suggests that the 3 group determines the opposite: that 2 and 4 are equally probable, given that 3 is the most probable of the following numbers.
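The text gives no code for this, but the backward-chaining idea can be sketched as follows with the same models; extracting three-number sets and conditioning on the likeliest third number are my own illustrative reading.

from collections import Counter

MODELS = ["12345678", "14358672", "32145768"]   # as before

def number_sets(first, models, size=3):
    """All runs of `size` numbers that begin with `first`,
    e.g. '1' yields ['123', '143', '145'] for these models."""
    return [m[i:i + size]
            for m in models
            for i in range(len(m) - size + 1)
            if m[i] == first]

forward = number_sets("1", MODELS)               # ['123', '143', '145']
backward = [s[::-1] for s in forward]            # ['321', '341', '541']

# Forward chaining alone: the middle numbers give 4 a two-to-one edge over 2.
print(Counter(s[1] for s in forward))            # Counter({'4': 2, '2': 1})

# Backward chaining: condition on the most probable *following* number (3)
# and ask what could precede it; sets 123 and 143 leave 2 and 4 equally likely.
likeliest_third = Counter(s[2] for s in forward).most_common(1)[0][0]
print(Counter(s[1] for s in forward if s[2] == likeliest_third))
# Counter({'2': 1, '4': 1})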
To make this clearer, the number sets 123, 143, and 145 give the probability that 1 will be followed by 2 one third of the time or by 4 two thirds of the time. Taking the third number and working backward gives the number sets 321, 341, and 541, clearly showing that 4 is the more likely choice and thus balancing the forward-chaining process. While subtle, this kind of contextual framework produces