Agriculture Reference
In-Depth Information
and then go for the study of the form of the association, that is, the regression analysis.

The simplest and most widely used measure of correlation is Karl Pearson's correlation coefficient. The correlation coefficient measures the degree of closeness of the linear association between any two variables and is given as

    r_{xy} = \frac{\mathrm{Cov}(x, y)}{s_x \cdot s_y},

where (x_1, y_1), (x_2, y_2), (x_3, y_3), ..., (x_n, y_n) are the n pairs of observations and

    \mathrm{Cov}(x, y) = \frac{1}{n} \sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y}) = S_{xy}.

Thus,

    r_{xy} = \frac{\frac{1}{n}\sum_{i=1}^{n} x_i y_i - \bar{x}\,\bar{y}}{\sqrt{\left(\frac{1}{n}\sum_{i=1}^{n} x_i^2 - \bar{x}^2\right)\left(\frac{1}{n}\sum_{i=1}^{n} y_i^2 - \bar{y}^2\right)}}.

Properties of Correlation Coefficient:
1. The correlation coefficient between any two variables is independent of the change of origin and scale in value but depends on the signs of the scales. Let us consider (x_1, y_1), (x_2, y_2), (x_3, y_3), ..., (x_n, y_n), the n pairs of observations for the two characters x and y having the means \bar{x} and \bar{y} and the variances s_x^2 and s_y^2. We take another two variables u and v such that

    x_i = a + b u_i and y_i = c + d v_i \Rightarrow \bar{x} = a + b\bar{u}, \; \bar{y} = c + d\bar{v}, \; s_x^2 = b^2 s_u^2, and s_y^2 = d^2 s_v^2,

where i = 1, 2, 3, ..., n; a, b, c, and d are constants; a and c are changes of origin; and b and d are changes of scale. Then

    (i)   \mathrm{Cov}(x, y) = \frac{1}{n}\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y}) = bd\,\mathrm{Cov}(u, v),
    (ii)  s_x^2 = \frac{1}{n}\sum_{i=1}^{n}(x_i - \bar{x})^2 = b^2 s_u^2, and
    (iii) s_y^2 = \frac{1}{n}\sum_{i=1}^{n}(y_i - \bar{y})^2 = d^2 s_v^2.

Thus, the correlation coefficient between x and y becomes

    r_{xy} = \frac{\mathrm{Cov}(x, y)}{\sqrt{s_x^2 \cdot s_y^2}} = \frac{bd}{|b| \cdot |d|}\, r_{uv}.

Thus, the numerical values of r_{xy} and r_{uv} are the same, but the sign of r_{xy} relative to r_{uv} depends on the signs of b and d: if both b and d are of the same sign, then r_{xy} = r_{uv}; on the other hand, if b and d are of opposite signs, then r_{xy} = -r_{uv}.
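The definitional formula and the origin-and-scale property can be checked numerically. The following is a minimal Python sketch (the function name and the sample data are my own, not from the text), using the divisor-n convention of the formulas above:

```python
import math

def pearson_r(xs, ys):
    """Karl Pearson's correlation coefficient
    r = Cov(x, y) / (s_x * s_y), with the covariance and the
    standard deviations both computed with divisor n."""
    n = len(xs)
    x_bar, y_bar = sum(xs) / n, sum(ys) / n
    cov = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / n
    s_x = math.sqrt(sum((x - x_bar) ** 2 for x in xs) / n)
    s_y = math.sqrt(sum((y - y_bar) ** 2 for y in ys) / n)
    return cov / (s_x * s_y)

# Property 1: x_i = a + b*u_i, y_i = c + d*v_i leaves |r| unchanged;
# the sign flips only when b and d are of opposite signs.
u = [1, 2, 3, 4, 5]
v = [2, 1, 4, 3, 5]
x = [10 + 2 * ui for ui in u]     # a = 10, b = 2
y = [5 + 3 * vi for vi in v]      # c = 5,  d = 3  (same sign as b)
y_neg = [5 - 3 * vi for vi in v]  # d = -3         (opposite sign)

assert abs(pearson_r(x, y) - pearson_r(u, v)) < 1e-12      # r_xy = r_uv
assert abs(pearson_r(x, y_neg) + pearson_r(u, v)) < 1e-12  # r_xy = -r_uv
```

The tiny tolerance only absorbs floating-point rounding; algebraically the two coefficients are exactly equal in magnitude.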
2. The correlation coefficient r_{xy} lies between -1 and +1, that is, -1 \le r_{xy} \le +1.
3. The correlation coefficient between x and y is the same as the correlation coefficient between y and x; it may be noted that we have considered the two variables x and y irrespective of their dependency.
4. Being a ratio, the correlation coefficient is a unit-free measure. So it can be used to compare the degrees of linear association between different pairs of variables.
5. Two independent variables are uncorrelated, but the converse may not be true. Let us consider the following two variables:

     x :  -4   -3   -2  -1  0  1  2   3   4     Σx  = 0
     y :  16    9    4   1  0  1  4   9  16     Σy  = 60
    xy : -64  -27   -8  -1  0  1  8  27  64     Σxy = 0

Therefore,

    r_{xy} = \frac{\mathrm{Cov}(x, y)}{s_x \cdot s_y} = \frac{\frac{1}{n}\sum_{i=1}^{n} x_i y_i - \bar{x}\,\bar{y}}{s_x \cdot s_y} = \frac{\frac{1}{9}\cdot 0 - 0 \cdot \frac{60}{9}}{s_x \cdot s_y} = 0.

Clearly, the relationship between x and y is y = x^2. Thus, a zero correlation coefficient between two variables does not necessarily mean that the variables are independent.
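The counterexample in property 5 can be verified directly. This short Python sketch (variable names are illustrative) reproduces the sums in the table and the resulting zero covariance:

```python
# x takes values symmetric about 0 and y = x^2 depends on x exactly,
# yet Cov(x, y) = (1/n) * sum(x*y) - x_bar * y_bar = 0, so r_xy = 0.
x = [-4, -3, -2, -1, 0, 1, 2, 3, 4]
y = [xi ** 2 for xi in x]               # 16, 9, 4, 1, 0, 1, 4, 9, 16
xy = [xi * yi for xi, yi in zip(x, y)]  # -64, -27, ..., 27, 64

n = len(x)
cov = sum(xy) / n - (sum(x) / n) * (sum(y) / n)

print(sum(x), sum(y), sum(xy))  # -> 0 60 0
print(cov)                      # -> 0.0
```

Zero covariance forces r_{xy} = 0 even though y is completely determined by x, which is exactly the point of the property.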