\[
G_2 = I_2 \otimes H_2 \otimes I_2 =
\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \otimes
\begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix} \otimes
\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} =
\begin{bmatrix}
1 & 0 & 1 & 0 & 0 & 0 & 0 & 0 \\
0 & 1 & 0 & 1 & 0 & 0 & 0 & 0 \\
1 & 0 & -1 & 0 & 0 & 0 & 0 & 0 \\
0 & 1 & 0 & -1 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 1 & 0 & 1 & 0 \\
0 & 0 & 0 & 0 & 0 & 1 & 0 & 1 \\
0 & 0 & 0 & 0 & 1 & 0 & -1 & 0 \\
0 & 0 & 0 & 0 & 0 & 1 & 0 & -1
\end{bmatrix}
\]

\[
G_3 = I_4 \otimes H_2 =
\begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \otimes
\begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix} =
\begin{bmatrix}
1 & 1 & 0 & 0 & 0 & 0 & 0 & 0 \\
1 & -1 & 0 & 0 & 0 & 0 & 0 & 0 \\
0 & 0 & 1 & 1 & 0 & 0 & 0 & 0 \\
0 & 0 & 1 & -1 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 1 & 1 & 0 & 0 \\
0 & 0 & 0 & 0 & 1 & -1 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & 0 & 1 & 1 \\
0 & 0 & 0 & 0 & 0 & 0 & 1 & -1
\end{bmatrix}
\]
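As a hedged illustration of this factorization, the sketch below builds the factor matrices with NumPy Kronecker products and checks that their product reproduces the order-8 Hadamard matrix. The variable names are mine, and the definition G1 = H_2 ⊗ I_4 is an assumption (only G_2 and G_3 are written out above), chosen so that G_1 · G_2 · G_3 = H_8.

```python
import numpy as np

# Elementary blocks used in the factorization above
H2 = np.array([[1, 1],
               [1, -1]])
I2 = np.eye(2, dtype=int)
I4 = np.eye(4, dtype=int)

# Factor matrices of the order-8 Hadamard matrix
G1 = np.kron(H2, I4)               # assumed: H2 (x) I4
G2 = np.kron(np.kron(I2, H2), I2)  # I2 (x) H2 (x) I2, as written above
G3 = np.kron(I4, H2)               # I4 (x) H2, as written above

# Sanity check: the product of the sparse factors is H8 = H2 (x) H2 (x) H2
H8 = np.kron(np.kron(H2, H2), H2)
assert np.array_equal(G1 @ G2 @ G3, H8)
```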
We can then calculate R by three matrix multiplications:
\[
R = [0.8;\ 0.0;\ -1.4;\ 1.8;\ -1.0;\ -1.8;\ 0.0;\ 3.2] \cdot G_1 \cdot G_2 \cdot G_3
  = [0.8;\ 0.0;\ -1.4;\ 1.8;\ 1.8;\ -4.6;\ 1.6;\ 1.6]
\]
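As a minimal continuation of the sketch above (reusing the assumed G1, G2, G3), the three multiplications can be chained directly; with a different ordering convention for the factors, only the arrangement of the result changes.

```python
# Soft input vector of the worked example
r = np.array([0.8, 0.0, -1.4, 1.8, -1.0, -1.8, 0.0, 3.2])

# R computed by three sparse matrix multiplications (G1, G2, G3 from the previous sketch)
R = r @ G1 @ G2 @ G3
```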
The matrices G_i are sparse, with only two non-zero elements per column. Moreover, the factorization of the Hadamard matrix H_N comprises a number of factor matrices proportional to log(N). The total computation cost of the fast transform is therefore in N·log(N), instead of N^2 for the direct method. Figure 8.6 presents the graph of the calculations for the fast Hadamard transform in the case N = 8.
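To make the N·log(N) cost concrete, here is a hedged sketch (not taken from the chapter) of an in-place fast Hadamard transform: each of the log2(N) stages applies one sparse factor G_i as N/2 butterflies, one addition and one subtraction each.

```python
import numpy as np

def fast_hadamard(r):
    """Fast Hadamard transform in O(N log N) operations (N a power of two).

    Each pass of the outer loop applies one sparse factor G_i:
    samples at distance `half` are combined by one addition and one
    subtraction (the butterflies of Figure 8.6 for N = 8).
    """
    y = np.asarray(r, dtype=float).copy()
    n = len(y)
    half = n // 2
    while half >= 1:
        for start in range(0, n, 2 * half):
            for k in range(start, start + half):
                a, b = y[k], y[k + half]
                y[k], y[k + half] = a + b, a - b
        half //= 2
    return y
```

For N = 8 this amounts to 3 stages of 8 additions/subtractions, i.e. 24 operations, instead of the 64 multiply-accumulates of a direct product with H_8.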
In terms of error-correcting performance, article [8.6] shows that we obtain the same performance as the Chase-Pyndiah algorithm with half as many iterations.
8.4.5 Other soft-input decoding algorithms
There are still other algorithms for decoding block codes with weighted input and
output. One of the oldest is Farrell et al.'s algorithm [8.5] for decoding by local