the capacity of the facility. Determining the 'required' size for a treatment wet-
land is often complicated by uncertainty regarding the full range of wastewater
volumes and component character likely to be encountered over the lifetime of
the operation. The traditional response is to err on the side of caution
and oversize, which has inevitable cost implications and also affects the
overall water budget. If the effluent character is known, or a
sample can be obtained, its BOD can be found and it is then a relatively simple
procedure to use this to calculate the necessary system size. However, this should
only ever be taken as indicative. For one thing, bio-engineered treatment systems
typically have a lifespan of 15-20 years and the character of the effluent being
treated may well change radically over this time, particularly in response to shifts
in local industrial practice or profile. In addition, though BOD assessment is a
useful point of reference, it is not a uniform indicator of the treatment require-
ments of all effluent components. For the bioamelioration process to proceed
efficiently, a fairly constant water level is necessary. Although the importance
of this in a drought scenario is self-evident, an unwanted influx of water can
be equally damaging, disturbing the established equilibrium of the wetland and
pushing contaminants through the system before they can be adequately treated.
Provision to include sufficient supplementary supplies, and exclude surface water,
is an essential part of the design process.
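The BOD-based sizing calculation mentioned above is commonly expressed as a first-order areal model for subsurface-flow wetlands (a Kickuth-type relation, A = Q·ln(Ci/Ce)/k). A minimal sketch follows; the function name, the default rate constant and the figures in the example are illustrative assumptions, not design values, and, as the text stresses, any such result should be taken as indicative only.

```python
import math

def wetland_area_m2(flow_m3_per_day: float,
                    bod_in_mg_l: float,
                    bod_out_mg_l: float,
                    k_bod_m_per_day: float = 0.1) -> float:
    """First-order areal sizing estimate for a subsurface-flow wetland.

    A = Q * ln(Ci / Ce) / k, where Q is the daily wastewater flow,
    Ci and Ce are the inlet and required outlet BOD concentrations,
    and k is an areal rate constant.  The default k of 0.1 m/day is
    purely illustrative; real values are site- and effluent-specific.
    """
    if bod_out_mg_l >= bod_in_mg_l:
        raise ValueError("outlet BOD must be below inlet BOD")
    return flow_m3_per_day * math.log(bod_in_mg_l / bod_out_mg_l) / k_bod_m_per_day

# Example: 50 m3/day of effluent, reducing BOD from 250 mg/l to 25 mg/l
area = wetland_area_m2(50.0, 250.0, 25.0)  # roughly 1150 m2
```

Because the rate constant is uncertain and effluent character drifts over a 15-20 year lifespan, a sensitivity run over a range of k values is usually more informative than a single figure.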
One aspect of system design which is not widely appreciated is the importance
of providing a substrate with the right characteristics. A number of different
materials have been used with varying degrees of success, including river sands,
gravels, pulverised clinker, soils and even waste-derived composts, the final
choice often being driven by issues of local availability. The main factors in
determining the suitability of any given medium are its hydraulic permeability
and adsorption potential for nutrients and pollutants. In the final analysis, the
substrate must be able to provide an optimum growth medium for root development
while also allowing for the uniform infiltration and through-flow of wastewater.
A hydraulic permeability of between 10⁻³ and 10⁻⁴ m/s is generally accepted as
ideal, since lower infiltration tends to lead to channelling and flow reduction, both
of which severely restrict the efficiency of treatment. In addition, the chemical
nature of the chosen material may have an immediate bearing on system efficacy.
Soils with low inherent mineral content tend to encourage direct nutrient uptake
to make good the deficiency, while highly humic soils have been shown to
have the opposite effect in some studies. The difficulties sometimes encountered
in relation to phosphorus removal within wetland systems have been mentioned
earlier. The character of the substrate medium can have an important influence
on the uptake of this mineral, since the physico-chemical mechanisms responsible
for its abstraction from wastewater in an aquatic treatment system rely on
the presence of aluminium or iron within the rhizosphere. Obviously, soils with
high relative content of these key metals will be more effective at removing the
phosphate component from effluent, while clay-rich substrates tend to be better
suited to lowering heavy metal content.
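The permeability criterion described earlier can be checked numerically with Darcy's law, Q = K·A·i, where K is the hydraulic conductivity, A the wetted cross-section and i the hydraulic gradient. The sketch below is a simple screening aid under those textbook assumptions; the function names and the example bed geometry are hypothetical.

```python
def darcy_flow_m3_per_day(k_m_per_s: float,
                          cross_section_m2: float,
                          hydraulic_gradient: float) -> float:
    """Darcy's law, Q = K * A * i, converted from m3/s to m3/day."""
    return k_m_per_s * cross_section_m2 * hydraulic_gradient * 86_400

def in_ideal_range(k_m_per_s: float) -> bool:
    """True if conductivity falls in the 10^-4 to 10^-3 m/s band
    described above as generally accepted for wetland substrates."""
    return 1e-4 <= k_m_per_s <= 1e-3

# A gravel bed at K = 5e-4 m/s, 2 m2 wetted cross-section, 1% gradient
q = darcy_flow_m3_per_day(5e-4, 2.0, 0.01)  # about 0.86 m3/day
ok = in_ideal_range(5e-4)                   # True: within the ideal band
```

A candidate medium whose conductivity falls below the band would flag the channelling risk noted above, while the throughput figure can be compared against the design flow used in sizing.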