function of the number of variables in the model. The adjusted $R^2$ is defined as

$$\bar{R}^2 = R^2_{\text{adj}} = 1 - \frac{\sum_{i=1}^{n} (y_i - \hat{y}_i)^2 / (n-k)}{\sum_{i=1}^{n} (y_i - \bar{y})^2 / (n-1)},$$

where $\hat{y}_i$ is the value of $y_i$ fitted by the model, $n$ is the number of observations, and $k$ is the number of estimated parameters (including the intercept).
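As a numerical illustration of this definition, the ratio of residual mean square to total mean square can be computed directly with NumPy; the data below are fabricated purely for illustration:

```python
import numpy as np

# Fabricated data: n = 8 observations, intercept plus two regressors
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(8), rng.normal(size=8), rng.normal(size=8)])
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(scale=0.3, size=8)

n, k = X.shape                      # k counts all estimated parameters (incl. intercept)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta                    # fitted values

rss = np.sum((y - y_hat) ** 2)      # residual sum of squares
tss = np.sum((y - y.mean()) ** 2)   # total sum of squares

# Adjusted R^2 exactly as in the definition above
adj_r2 = 1 - (rss / (n - k)) / (tss / (n - 1))
print(adj_r2)
```

Any other dataset can be substituted; only the degrees of freedom $n-k$ and $n-1$ matter for the adjustment.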
We have

$$R^2 = \frac{RgSS}{TSS} = \frac{TSS - RSS}{TSS} = 1 - \frac{RSS}{\sum_{i=1}^{n} (Y_i - \bar{Y})^2}.$$

On the other hand,

$$\bar{R}^2 = 1 - \frac{RMS}{TMS} = 1 - \frac{RSS/(n-k)}{TSS/(n-1)} = 1 - \frac{n-1}{n-k} \cdot \frac{RSS}{TSS} = 1 - \frac{n-1}{n-k} \cdot \frac{TSS - RgSS}{TSS}.$$
Therefore,

$$\bar{R}^2 = 1 - \frac{n-1}{n-k}\left(1 - R^2\right).$$
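This shortcut can be checked numerically against the definitional mean-square form; the random data below are an assumption for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 20, 4                        # n observations, k estimated parameters
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
y = rng.normal(size=n)              # arbitrary response, no real relationship

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
rss = resid @ resid
tss = np.sum((y - y.mean()) ** 2)

r2 = 1 - rss / tss
adj_def = 1 - (rss / (n - k)) / (tss / (n - 1))   # definitional form
adj_short = 1 - (n - 1) / (n - k) * (1 - r2)      # shortcut just derived
assert np.isclose(adj_def, adj_short)             # the two forms agree
```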
In the regression model, we have:

1. For $k > 1$, $\bar{R}^2 < R^2$, thereby indicating that as the number of independent variables increases, $\bar{R}^2$ increases less than $R^2$.

2. $\bar{R}^2 = 1 - \frac{n-1}{n-k}\left(1 - R^2\right)$, so when $R^2 = 1$, $\bar{R}^2 = 1$.

3. When $R^2 = 0$,
$$\bar{R}^2 = 1 - \frac{n-1}{n-k} = \frac{n-k-n+1}{n-k} = \frac{1-k}{n-k},$$
which is negative for $k > 1$.
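The three properties follow directly from the shortcut formula; a minimal sketch (the values of $n$ and $k$ are arbitrary choices for illustration):

```python
import math

def adj_r2(r2, n, k):
    """Adjusted R^2 via the shortcut 1 - (n-1)/(n-k) * (1 - r2)."""
    return 1 - (n - 1) / (n - k) * (1 - r2)

n, k = 30, 5
assert adj_r2(0.8, n, k) < 0.8                              # property 1: below R^2 when k > 1
assert adj_r2(1.0, n, k) == 1.0                             # property 2: perfect fit unaffected
assert math.isclose(adj_r2(0.0, n, k), (1 - k) / (n - k))   # property 3: equals (1-k)/(n-k)
assert adj_r2(0.0, n, k) < 0                                # ...which is negative for k > 1
```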
Thus, the adjusted $R^2$ can be negative if $k > 1$; when the value of $\bar{R}^2$ becomes negative, its value is taken as zero.

While dealing with a multiple variable regression equation, particularly with respect to the variables to be retained in the equation, we generally follow two procedures, (a) stepwise forward and (b) stepwise backward regression, to get the actual relationship. In the forward regression technique, the variable to be included first in the model is chosen using theoretical and logical ideas about the variables under consideration, further supported by a high value of the correlation coefficient with the dependent variable. Similar decisions in subsequent steps are taken to include the other variables in a stepwise manner. If the inclusion of a new variable increases the explanatory power of the model, that is, increases the value of $R^2$ to a great extent, without hampering the nature of the coefficient(s) of the previously included variable(s), and the new variable has a significant coefficient, then the variable is retained in the model and is useful. If the inclusion of a new variable does not increase the explanatory power of the model to a great extent, then the variable is redundant. If the inclusion of a new variable does not increase the explanatory power of the model but rather changes the nature of the coefficient(s) of the variable(s) already in the model, then the variable is detrimental.

In the backward regression technique, all the variables under consideration are included in the model at the first instance to get the multiple regression equation; the $R^2$ value and the nature of the coefficients of the individual variables are noted. The variable with the most nonsignificant coefficient (at a preassigned probability level) is dropped first, and the multiple regression equation is framed again without it. In subsequent steps, the same procedure is followed to discard the non-useful variables in a stepwise manner. The process is continued till one gets a regression equation in which all the variables have significant coefficients at the preassigned level of significance.

Example 12.2. The following table in the next page gives the yield attributing characters along with yield for 37 varieties. We are to work out the linear relationship of yield with the other yield components having significant coefficients. From the given data, one should first make the correlation table as per the method suggested in Chap. 8 (Table 12.2). From the correlations of yield with the other variables, it is found that the order of the correlation coefficients is X11 > X12 > X7 > X10 > X13 and so on. So for stepwise forward regression, one should start with X11 first and then proceed in subsequent steps as per the order of the correlation coefficients shown above (Table 12.3).
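The forward procedure just described can be sketched in code. The data below are fabricated (the book's 37-variety yield table is not reproduced here), and the fixed t critical value and $R^2$-gain threshold are assumptions standing in for the preassigned significance level:

```python
import numpy as np

def forward_stepwise(X, y, names, t_crit=2.0, min_gain=0.01):
    """Greedy forward selection: try candidates in decreasing order of
    |corr(x, y)|; retain a variable only if it raises R^2 by at least
    min_gain and its coefficient's |t| exceeds t_crit (a crude stand-in
    for a proper t-test at a preassigned level)."""
    n = len(y)
    order = sorted(range(X.shape[1]),
                   key=lambda j: -abs(np.corrcoef(X[:, j], y)[0, 1]))
    chosen, best_r2 = [], 0.0
    for j in order:
        cols = chosen + [j]
        Z = np.column_stack([np.ones(n), X[:, cols]])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2 = 1 - resid @ resid / np.sum((y - y.mean()) ** 2)
        k = Z.shape[1]
        sigma2 = resid @ resid / (n - k)
        se = np.sqrt(np.diag(sigma2 * np.linalg.inv(Z.T @ Z)))
        t_new = beta[-1] / se[-1]          # t for the newly added variable
        if r2 - best_r2 >= min_gain and abs(t_new) > t_crit:
            chosen, best_r2 = cols, r2     # useful: retain
        # otherwise redundant or detrimental: drop and move on
    return [names[j] for j in chosen], best_r2

# Fabricated data using the book's naming style (X7, X10, X11, ...)
rng = np.random.default_rng(42)
X = rng.normal(size=(37, 5))
y = 3.0 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(scale=0.5, size=37)
names = ["X11", "X12", "X7", "X10", "X13"]
kept, r2 = forward_stepwise(X, y, names)
print(kept, round(r2, 3))
```

Here only the genuinely informative columns should survive; detecting a detrimental variable (a sign change in earlier coefficients) would need an extra check on the previously fitted betas, omitted to keep the sketch short.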