needs. Unfortunately, this is still an unresolved and pending matter. Many precision agriculture applications never reached product status because of their complexity of use and, above all, of data interpretation and management. This fact resulted in a general disenchantment among both researchers and manufacturers, who had to abandon promising ideas because of the difficulties in deploying practical solutions. Motivated by the
lessons learned over the past two decades, a necessary goal for the near future is to simplify both system architectures and management instructions. Compact designs will surely offer the robust solutions that best fit the demands of next-generation vehicles: sensors fully integrated into the chassis, and touch screens with intuitive menus, no more complex than browsing on a cell phone, ready to save perception information in standard formats and on conventional memory cards. Some of these
designs are already available for GPS-based applications, but off-the-shelf solutions in the field of perception and awareness, most of which are based on machine vision, are still to come and are not even on the agendas of many manufacturers.
Another issue that needs urgent attention is consumers' not-always-positive perception of the usefulness and cost efficiency of these systems. In addition to simplifying the use and configuration of perception engines, reasonable costs are necessary to encourage users to adopt a methodology that, despite carrying some initial uncertainty, will eventually deliver the tangible results that make the entire investment worthwhile. For this to happen, applications will have to be very practical and field-oriented, sensors reliable and cost-effective, and output information useful, instructive, and easy for a layperson to interpret. At this point, and with regard to surrounding-perception systems, it seems wiser to secure small steps with modesty than to plan grandiloquent missions with little or no viability.