Fig. 7.21. Relationships between variables in the ICU database.
We applied IMANNs to classify the ICU database. After the neuron generation/annihilation process, the PLNN has a 15 × 15 square lattice of neurons in its hidden layer, as
shown in Fig. 7.22. A total of 61 neurons were classified into 20 subgroups in the
lattice of the T-cell NN. The T-cell NN learns the relation between an input pattern and its allocated category using the two divided PLNNs. The B-cell NNs are then trained separately on the training cases of each of the 20 subgroups. The total correct rate on the test data was 91.7% (629/686). The correct rates for the 20 subgroups were 90.9%, 94.7%, 84.6%, 97.7%, 88.5%, 96.9%, 100.0%, 17.6%, 94.4%, 92.9%, 100.0%, 77.3%, 98.7%, 76.0%, 97.0%, 84.6%, 100.0%, 41.7%, 95.0%, and 90.0%, respectively. The correct rate for the eighth subgroup was very low: although the B-cell NN for this group was trained to output “alive,” most of the test cases assigned to this group by the T-cell NN were “dead.” To improve this low accuracy, the macrophage NN was trained until the squared error became very small.
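The two-stage routine described above can be outlined in code. The following Python sketch is only illustrative: it models the T-cell NN as a lattice of reference vectors that routes each case to the unit (subgroup) with maximum activity, and stands in for each B-cell NN with a trivial majority-class predictor. The class names, the distance-based activity measure, and the toy data are assumptions, not the original implementation.

```python
# Minimal sketch of the T-cell / B-cell two-stage classification (assumed names).
import numpy as np

class TCellLattice:
    """Lattice of reference vectors; a case is routed to the closest unit."""
    def __init__(self, n_subgroups, n_features, rng):
        self.weights = rng.normal(size=(n_subgroups, n_features))

    def assign(self, x):
        # "Maximum output activity" is modeled here as minimum distance
        # to a reference vector (one unit per subgroup) -- an assumption.
        return int(np.argmin(np.linalg.norm(self.weights - x, axis=1)))

class BCellNet:
    """Placeholder per-subgroup classifier: predicts the majority class
    of the training cases routed to its subgroup."""
    def fit(self, y):
        self.label = int(np.round(y.mean())) if len(y) else 0
        return self

    def predict(self, x):
        return self.label

def evaluate(tcell, bcells, X_test, y_test):
    """Total and per-subgroup correct rates, as reported in the text."""
    hits, counts = {}, {}
    for x, y in zip(X_test, y_test):
        g = tcell.assign(x)
        counts[g] = counts.get(g, 0) + 1
        hits[g] = hits.get(g, 0) + int(bcells[g].predict(x) == y)
    total = sum(hits.values()) / sum(counts.values())
    per_group = {g: hits[g] / counts[g] for g in counts}
    return total, per_group

# Purely illustrative usage with random data:
rng = np.random.default_rng(0)
X, y = rng.normal(size=(100, 5)), rng.integers(0, 2, size=100)
tcell = TCellLattice(n_subgroups=20, n_features=5, rng=rng)
groups = np.array([tcell.assign(x) for x in X])
bcells = {g: BCellNet().fit(y[groups == g]) for g in range(20)}
print(evaluate(tcell, bcells, X, y))
```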
We applied a search algorithm to the subspaces divided by the PLNN (T-cell NN) and extracted If-Then rules from the trained IMANN without using an expert's explicit knowledge. By presenting all possible patterns of the input vector, the search algorithm may extract not only explicit knowledge but also new, previously unknown knowledge from the network, as shown in Fig. 7.23 [22].
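As a rough illustration of this exhaustive presentation of input patterns, the sketch below assumes the inputs have been discretized to binary attributes so that every possible input vector can be enumerated and fed to the trained network. The function name network_predict and the binary encoding are assumptions; the actual algorithm operates on the subspaces of the PLNN.

```python
# Sketch: enumerate all binary input patterns and group them by the
# class the trained network assigns, yielding candidate If-Then rules.
from itertools import product

def extract_rules(network_predict, n_inputs):
    rules = {}  # output class -> list of input patterns that lead to it
    for pattern in product([0, 1], repeat=n_inputs):
        label = network_predict(pattern)
        rules.setdefault(label, []).append(pattern)
    return rules

# Illustrative use with a toy stand-in for the trained network:
toy = lambda p: int(sum(p) >= 2)          # hypothetical decision
print(extract_rules(toy, n_inputs=4)[1][:3])
# Each entry reads as "If inputs == pattern Then class == label".
```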
The IMANN makes a rough classification of the training cases and checks whether the classification result is correct. The T-cell NN selects the one subgroup with the maximum output activity in its lattice; this subgroup is considered to have the typical characteristics of the input-output patterns. Each subgroup is then labeled according to these characteristics. For example, we may find that the characteristics of the output signal categorize the subgroup.
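A minimal sketch of this labeling step, under the assumption that the "characteristics of the output signal" can be approximated by the dominant class ("alive" or "dead") among the training cases that the T-cell NN routes to each subgroup:

```python
# Label each subgroup by the majority outcome of its assigned cases (assumed criterion).
from collections import Counter, defaultdict

def label_subgroups(assignments, outcomes):
    """assignments[i]: subgroup index chosen by the T-cell NN for case i;
    outcomes[i]: observed class of case i (e.g. 'alive' or 'dead')."""
    by_group = defaultdict(list)
    for g, y in zip(assignments, outcomes):
        by_group[g].append(y)
    return {g: Counter(ys).most_common(1)[0][0] for g, ys in by_group.items()}

# Example: subgroup 8 is labeled 'dead' because most of its cases are 'dead'.
print(label_subgroups([8, 8, 8, 3], ['dead', 'dead', 'alive', 'alive']))
```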