constantly increasing while cost and sensor size are going down. The result is data growth at a rate exceeding Moore's law (a doubling approximately every two years), according to leading researchers in the field, which becomes even more impressive as the trend predicted by Moore's law is slowly coming to an end (Hilbert and López 2011). In other words, hardware development alone cannot keep up in providing the resources expected for real-time and near real-time analysis of such Big Data (Laney 2001; Snow 2012; IBM 2013). It is just as important, if not more so, to advance the development of software that scales to huge volumes of data in all aspects of working with it: from collecting and storing the data, to searching, processing, and analyzing it, to visualizing the results.
Traditionally, raster data repositories are implemented in a one-file-per-image manner, where images often serve as "dead" backdrops that can be displayed but not analyzed further within the data repository. In terms of data management and provisioning, file-based solutions rely on format-specific models (typically induced by the design choices of the data exchange format used) and often lack a clearly stated, informationally coherent model spanning different formats; e.g., GeoTIFF and NetCDF do not know about each other unless format rules are introduced under a coherent information model. Consequently, every extension or add-on introducing new structures causes severe implementation, performance, and interoperability problems, and providing flexible, real-time answers requires tremendous effort.
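To make the notion of a coherent information model more concrete, the following sketch shows one possible format-neutral array abstraction into which data originating from, e.g., a GeoTIFF band or a NetCDF variable could be mapped, so that queries are written once against a common model rather than against individual formats. The Coverage class, its fields, and the spatial_mean function are purely hypothetical illustrations and are not drawn from any existing library or standard.

    # Illustrative only: a minimal, format-neutral array model of the kind a
    # "coherent information model" implies. The names here are hypothetical.
    from dataclasses import dataclass
    from typing import Dict, Tuple
    import numpy as np

    @dataclass
    class Coverage:
        """Format-neutral view of a gridded dataset: values plus named axes."""
        values: np.ndarray            # the cell/pixel array itself
        axes: Tuple[str, ...]         # e.g. ("lat", "lon") or ("time", "lat", "lon")
        metadata: Dict[str, str]      # CRS, units, provenance, ...

    def spatial_mean(cov: Coverage) -> float:
        """A query written once against the common model, regardless of
        whether the data originally came from GeoTIFF, NetCDF, or elsewhere."""
        return float(cov.values.mean())

    # Two datasets that might have been ingested from different formats end up
    # as the same kind of object, so the same query applies to both.
    sea_surface_temp = Coverage(
        values=np.random.rand(180, 360).astype(np.float32),
        axes=("lat", "lon"),
        metadata={"crs": "EPSG:4326", "units": "degC", "source": "example NetCDF"},
    )
    elevation = Coverage(
        values=np.random.rand(1024, 1024).astype(np.float32),
        axes=("y", "x"),
        metadata={"crs": "EPSG:32633", "units": "m", "source": "example GeoTIFF"},
    )
    print(spatial_mean(sea_surface_temp), spatial_mean(elevation))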
Array databases provide a more convenient way to process large arrays. For example, queries can be quicker than in file-based systems because of the potential for optimization and for organizing data physically around the kinds of queries one expects. Database technology adds substantial advantages by offering scalable, flexible storage and retrieval/manipulation of arrays of (conceptually) unlimited size, but it has to be extended and adapted in several ways. This is the goal of the recently emerged research field of Array Analytics, which combines insights from databases, programming languages, high-performance computing, and further areas with the domain expertise of the Earth, Space, Life, and Social sciences and engineering.
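As an illustration of why physical organization around expected queries matters, the following sketch shows a tiled array store in which a window aggregation reads only the tiles intersecting the query region, instead of decoding entire image files. It is not modeled on any particular array DBMS; the TiledArrayStore class, the TILE size, and the mean_over method are assumptions made for this example.

    # A minimal sketch of tiled, query-aware physical organization: a spatial
    # aggregation touches only the tiles that intersect the query window.
    import numpy as np

    TILE = 256  # tile edge length in pixels (a typical chunking choice)

    class TiledArrayStore:
        """Stores a large 2-D raster as a dictionary of fixed-size tiles."""

        def __init__(self, array: np.ndarray):
            self.shape = array.shape
            self.tiles = {}
            for i in range(0, array.shape[0], TILE):
                for j in range(0, array.shape[1], TILE):
                    self.tiles[(i // TILE, j // TILE)] = array[i:i + TILE, j:j + TILE]

        def mean_over(self, r0: int, r1: int, c0: int, c1: int) -> float:
            """Average pixel value over a window, reading only intersecting tiles."""
            total, count = 0.0, 0
            for ti in range(r0 // TILE, (r1 - 1) // TILE + 1):
                for tj in range(c0 // TILE, (c1 - 1) // TILE + 1):
                    tile = self.tiles[(ti, tj)]
                    # Clip the query window to this tile's extent.
                    rows = slice(max(r0 - ti * TILE, 0), min(r1 - ti * TILE, tile.shape[0]))
                    cols = slice(max(c0 - tj * TILE, 0), min(c1 - tj * TILE, tile.shape[1]))
                    block = tile[rows, cols]
                    total += block.sum()
                    count += block.size
            return total / count

    # Usage: averaging over a small window touches a handful of tiles,
    # not the full 4096 x 4096 raster.
    raster = np.random.rand(4096, 4096).astype(np.float32)
    store = TiledArrayStore(raster)
    print(store.mean_over(1000, 1200, 2000, 2300))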
Effective analytics at the pixel level is more than simply a matter of shuffling high volumes of data or of improving search performance by introducing a database system that stores file locations; it heralds an era of finding insights in historical and emerging information in a more agile and adaptive way, and of answering questions that previously went unasked because they were thought to be intractable. This paradigm shift is fundamentally impacting all geo-scientific disciplines, such as Cryospheric, Atmospheric, Solid Earth, and Ocean research. The promise is a better understanding of the Earth system, e.g., of global warming and climate change.