On the Horizon
For decades, statistical analysis has been recognized as a necessary component of scientific R&D.
Since the work of Ronald Fisher and others in the 1930s, statistical methods have been applied to
everything from process control in automobile factories to predicting the results of presidential
elections to estimating crop yields. Many computer-aided statistical methods, such as Monte Carlo
methods, were first applied in nuclear physics and soon thereafter migrated to other fields of research
and engineering. Today, the desktop microcomputer has made it possible for every researcher, student,
and layperson to explore statistical principles. With the ever-decreasing cost of computation, this
trend of moving statistical concepts out of the laboratory and into everyday public use is expected to
continue.
Consider that, since the introduction of genetically modified crops in the mid-1990s, a great deal of
public attention has been focused on the likelihood that these crops could either contaminate
traditional crops or have an adverse effect on consumers. Statistical methods have been embraced
by politicians, scientists, and farmers in the EU and elsewhere to back their particular perspectives on
the issues. For example, the British government has used statistical models to establish buffer zones
separating organic and genetically modified crops: genetically modified maize can't be planted within
200 meters of organic crops, preventing it from cross-fertilizing organic maize. With the various
special interest groups each using statistical analyses to back their positions on genetically modified
foods, statistical methods may well hold the key to whether at least one end product of bioinformatics
R&D survives.