dimensions p_1 × p_1 and p_2 × p_2, respectively, and the covariance matrix between the two blocks, R_12 (with dimension p_1 by p_2), giving

    R = \begin{pmatrix} R_1 & R_{12} \\ R_{12}^t & R_2 \end{pmatrix}    (7.1)

where R_{12}^t is the transpose of R_12. The variance-covariance of the combined data can thus be thought of as having three distinct parts, the first two of which are the variance-covariance matrices within each block (R_1, R_2) and the third, which is the covariance between the two blocks (R_12).
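To make this partition concrete, here is a minimal sketch in Python/NumPy; the block sizes, the random example data, and the variable names Y1, Y2, R1, R2, and R12 are assumptions made for this illustration, not part of the text:

    import numpy as np

    # Hypothetical example data: n specimens, p1 and p2 variables per block
    rng = np.random.default_rng(0)
    n, p1, p2 = 50, 4, 3
    Y1 = rng.normal(size=(n, p1))
    Y2 = rng.normal(size=(n, p2))

    # Centre each block, then form the covariance blocks of Equation 7.1
    Y1c = Y1 - Y1.mean(axis=0)
    Y2c = Y2 - Y2.mean(axis=0)
    R1  = Y1c.T @ Y1c / (n - 1)   # p1 x p1 within-block covariance
    R2  = Y2c.T @ Y2c / (n - 1)   # p2 x p2 within-block covariance
    R12 = Y1c.T @ Y2c / (n - 1)   # p1 x p2 between-block covariance

    # Assemble the combined variance-covariance matrix of Equation 7.1
    R = np.block([[R1, R12], [R12.T, R2]])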
The covariance between the two blocks can be quantified by Escoufier's coefficient
(Escoufier, 1973), which is a multivariate extension of the ordinary univariate correlation.
That coefficient is given by the expression:
    RV = \frac{\mathrm{trace}(R_{12} R_{12}^t)}{\sqrt{\mathrm{trace}(R_1 R_1^t) \, \mathrm{trace}(R_2 R_2^t)}}    (7.2)
The numerator is the summed squared covariances between the two sets of variables
and the denominator is the square root of the product of the summed squared variances
within each block. Escoufier's coefficient thus ranges from 0 (no covariance) to 1 (complete covariance). The statistical significance of RV can be tested by randomizing the order of observed values (the rows of the matrix Y_1, for example) and recomputing the value of the coefficient for the permuted version of Y_1 and Y_2 (Klingenberg, 2009). Note that this will alter the covariance between blocks, but not the variance within each. If the observed value of RV lies outside the confidence interval of values obtained by the permutations, for some chosen α, then the observed RV (and therefore the covariance between the two blocks) is statistically significant.
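A sketch of how Equation 7.2 and this permutation test might be implemented follows (Python/NumPy; the function names, the number of permutations, and the summary as a one-sided p-value rather than the confidence-interval comparison described above are choices made for this example):

    import numpy as np

    def rv_coefficient(Y1, Y2):
        # Escoufier's RV coefficient (Equation 7.2) between two blocks
        n = Y1.shape[0]
        Y1c = Y1 - Y1.mean(axis=0)
        Y2c = Y2 - Y2.mean(axis=0)
        R1 = Y1c.T @ Y1c / (n - 1)
        R2 = Y2c.T @ Y2c / (n - 1)
        R12 = Y1c.T @ Y2c / (n - 1)
        num = np.trace(R12 @ R12.T)
        den = np.sqrt(np.trace(R1 @ R1.T) * np.trace(R2 @ R2.T))
        return num / den

    def rv_permutation_test(Y1, Y2, n_perm=999, seed=0):
        # Permuting the rows of Y1 destroys the between-block covariance
        # while leaving each within-block covariance unchanged
        rng = np.random.default_rng(seed)
        observed = rv_coefficient(Y1, Y2)
        permuted = np.array([rv_coefficient(Y1[rng.permutation(len(Y1))], Y2)
                             for _ in range(n_perm)])
        p_value = (np.sum(permuted >= observed) + 1) / (n_perm + 1)
        return observed, p_value

In this formulation, the observed RV is judged significant when it exceeds most of the permuted values, that is, when the p-value falls below the chosen α.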
MATHEMATICAL DETAILS OF TWO-BLOCK PLS
Given that matrix, Y_1, of p_1 variables measured on n specimens, and the other block, Y_2, of p_2 variables, we compute the variance-covariance matrix, R, as discussed above, which comprises the within-block variance-covariance matrices of blocks Y_1 and Y_2 (R_1 and R_2, respectively) and the covariance matrix between the two blocks, R_12 (as shown in Equation 7.1). We then perform a singular value decomposition (SVD) of R_12:

    R_{12} = U S V^t    (7.3)
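A rough sketch of this decomposition in Python/NumPy (the helper name two_block_pls is an assumption for this example; the quantities it returns are described in the next paragraph):

    import numpy as np

    def two_block_pls(Y1, Y2):
        # Cross-block covariance (Equation 7.1) followed by its SVD (Equation 7.3)
        n = Y1.shape[0]
        Y1c = Y1 - Y1.mean(axis=0)
        Y2c = Y2 - Y2.mean(axis=0)
        R12 = Y1c.T @ Y2c / (n - 1)       # p1 x p2 between-block covariance
        U, s, Vt = np.linalg.svd(R12)     # R12 = U S V^t
        p_min = min(Y1.shape[1], Y2.shape[1])
        # Keep the p_min paired singular axes (columns of U and V) and values
        return U[:, :p_min], s[:p_min], Vt.T[:, :p_min]

Note that numpy.linalg.svd already returns the singular values in decreasing order, so the paired columns of U and V need no re-sorting.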
S is a p_1 × p_2 diagonal matrix whose entries are the p_min singular values, λ_i (there are as many singular values as there are variables in the smaller block, p_min). The matrices U and V have dimensions p_1 × p_min and p_2 × p_min, respectively; their columns are the Singular Axes (SAs). The first columns of U and V comprise the paired SAs corresponding to the first singular value λ_1, just as the first Principal Component (PC1) is the axis corresponding to the first eigenvalue of the variance-covariance matrix. The SAs are ordered by decreasing singular values, just as PCs are ordered by decreasing eigenvalues. Scores on