18.3 PREDICTING APPLE SCAB RISK BASED ON THE PHYSICAL ENVIRONMENT
The Mills system for predicting apple scab infection periods is used worldwide in
various forms to provide a rational basis for implementing scab control strategies.
Developed in New York State, the Mills system was important initially for
scheduling applications of sulphur during rain (Mills, 1944; Mills and LaPlante,
1951). Sulphur, the main fungicide available in the 1940s, was effective only when
applied immediately ahead of infection events. Unnecessary applications of sulphur
dusts and sprays could be avoided by delaying each treatment until a few hours
before an infection period was anticipated under the Mills criteria. The Mills
system was also used to
establish the need and timing for emergency post-infection application of lime
sulphur. High rates of lime sulphur would control scab when applied within a few
hours after a predicted infection event but the treatment was used sparingly to avoid
possible reductions in fruiting.
Today, the Mills system is the foundation for most integrated pest management
(IPM), integrated fruit production (IFP) and stewardship programmes for apple
disease management (Cross, 1994; Jones, 1995; Sutton, 1996). It is used for
improving the timing of fungicides with post-infection activity and for establishing
when protective spray programmes may need to be backed up with post-infection
sprays. Depending on the degree of their kick-back or reach-back activity, post-
infection fungicides are applied within 24 to 96 hours after the onset of a predicted
infection period. The evolution of this scab prediction system since its introduction
in the 1940s provides an excellent case history for students interested in how
forecast systems are introduced and then maintained as production systems change.
The principles of disease forecasting are discussed in Chapter 9.
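The kick-back timing rule described above can be sketched as a simple deadline calculation. The fungicide classes and their reach-back windows below are illustrative placeholders, not actual product labels or recommendations; the only figure taken from the text is the 24 to 96 hour range.

```python
from datetime import datetime, timedelta

# Hypothetical kick-back (reach-back) windows in hours for three
# illustrative fungicide classes -- placeholders, not label values.
KICK_BACK_HOURS = {"class_A": 24, "class_B": 48, "class_C": 96}

def latest_spray_time(infection_onset: datetime, fungicide: str) -> datetime:
    """Latest time a post-infection spray can still reach back to the
    onset of a predicted infection period."""
    return infection_onset + timedelta(hours=KICK_BACK_HOURS[fungicide])

# Example: infection period begins at 06:00 on 12 April.
onset = datetime(2024, 4, 12, 6, 0)
print(latest_spray_time(onset, "class_B"))  # 2024-04-14 06:00:00
```

In practice the onset time comes from the same leaf-wetness and temperature records used to declare the infection period, so the deadline can be computed as soon as the wetting event begins.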
Mills's publication of curves (Fig. 18.1) and later a table relating infection by the
scab fungus to the number of hours of wetting at various temperatures was a seminal
event in apple scab epidemiology. Initial inoculum was assumed to be abundant - a
realistic assumption considering that sulphur is not especially efficacious and was
the predominant fungicide used at that time. Then, based on correlations with leaf
wetness and temperature, predictions were made for slight, moderate, or severe
infection based on the duration of continuous wetness at temperatures between 5.5
and 25.5°C. Research in Wisconsin by Keitt and Jones (1926) provided Mills with
the baseline for his curves. This baseline was the minimum number of hours of
wetting that was necessary for infection of leaves on inoculated apple trees placed in
temperature-controlled dew chambers. Because of his uncertainty about the
minimum conditions for infection in the field, Mills made the curve for light
infection from ascospores much thicker than the curves for moderate and severe
infection (Mills, 1944). The ambiguous way in which Mills described light infection
has been overlooked in many reproductions of Mills's original curves. The thick line
was replaced by the word 'approximate' in the heading for the Mills table. Research
aimed at improving the accuracy of this infection curve has been intensive and
ongoing in many countries since the 1940s.
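The logic of the Mills table can be sketched as a lookup that classifies a continuous-wetness period by its duration and temperature. The threshold hours below are illustrative placeholders only, NOT Mills's published values: in the real table the required wetting duration varies with temperature, so a faithful implementation would interpolate the temperature-dependent entries (Mills, 1944) rather than use fixed cut-offs.

```python
def scab_infection_risk(wet_hours: float, temp_c: float,
                        thresholds=(9.0, 12.0, 18.0)) -> str:
    """Classify an apple scab infection period in the spirit of the
    Mills table. `thresholds` are placeholder hours of continuous
    wetness for light, moderate and severe infection -- not Mills's
    actual temperature-dependent values."""
    if not (5.5 <= temp_c <= 25.5):
        # outside the temperature range covered by the Mills curves
        return "none"
    light, moderate, severe = thresholds
    if wet_hours >= severe:
        return "severe"
    if wet_hours >= moderate:
        return "moderate"
    if wet_hours >= light:
        return "light"
    return "none"

print(scab_infection_risk(14, 18.0))  # 'moderate' under these placeholders
```

Note that the "light" boundary is exactly the part of the system Mills drew as a thick band rather than a line, so any single threshold for it overstates the precision of the original curves.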