data into chunks and coordinating the processing across a distributed environment for rapid, efficient
analysis and results. Christensen explains:
IBM gave us an opportunity to turn our plans into something that was very tangible right from the
beginning. IBM had experts within data mining, Big Data, and Apache Hadoop, and it was clear to
us from the beginning that if we wanted to improve our business, not only today but also to prepare
for the challenges we will face in three to five years, we had to go with IBM.
Maintaining energy efficiency in its data center
For a company committed to addressing the world's energy requirements, it's no surprise that as
Vestas implemented its Big Data solution, it also sought a high-performance, energy-efficient
computing environment that would reduce its carbon footprint. Today, the platform that drives its
forecasting and analysis comprises a hardware stack based on the IBM System x iDataPlex
supercomputer. This supercomputing solution, one of the world's largest to date, enables the company
to use 40% less energy while increasing computational power. Twice the number of servers can be run
in each of the system's 12 racks, reducing the amount of floor space required in its data center.
“The supercomputer provides the foundation for a completely new way of doing business at
Vestas, and combined with IBM software, delivers a smarter approach to computing that optimizes
the way we work,” says Christensen.
Overall, the deployment of Hadoop-based processing has become a game changer for Vestas and
has provided critical insights that differentiate its services and capabilities. This kind of solution
architecture is not mandatory for every situation or organization, but the approach can be used to
design scalable platforms for Big Data problems that demand agility and flexibility.
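The core Hadoop pattern referenced here, splitting data into chunks, processing the chunks in parallel, and combining the partial results, can be illustrated independently of Vestas's actual IBM deployment. The sketch below is a minimal, assumed example in Python: the file name, column layout, and per-site mean aggregation are hypothetical and are not taken from the case study. A production system would express the same map and reduce steps as a Hadoop or BigInsights job running over HDFS rather than a local process pool.

```python
"""Minimal sketch of the chunk-and-combine (MapReduce-style) pattern.

Assumptions (not from the case study): readings.csv holds rows of
"site_id,wind_speed"; we aggregate the mean wind speed per site.
A real deployment would run equivalent map/reduce logic as a Hadoop
job over HDFS instead of a local multiprocessing pool.
"""
from collections import defaultdict
from itertools import islice
from multiprocessing import Pool


def map_chunk(lines):
    """Map step: turn one chunk of raw lines into partial (sum, count) per site."""
    partial = defaultdict(lambda: [0.0, 0])
    for line in lines:
        site_id, wind_speed = line.strip().split(",")
        acc = partial[site_id]
        acc[0] += float(wind_speed)
        acc[1] += 1
    return dict(partial)


def reduce_partials(partials):
    """Reduce step: merge per-chunk partials into global means per site."""
    totals = defaultdict(lambda: [0.0, 0])
    for partial in partials:
        for site_id, (total, count) in partial.items():
            totals[site_id][0] += total
            totals[site_id][1] += count
    return {site_id: total / count for site_id, (total, count) in totals.items()}


def chunked(iterable, size):
    """Yield successive fixed-size chunks (the 'split the data' step)."""
    it = iter(iterable)
    while chunk := list(islice(it, size)):
        yield chunk


if __name__ == "__main__":
    # Chunks are mapped in parallel, then the partial results are combined.
    with open("readings.csv") as f, Pool() as pool:
        partials = pool.map(map_chunk, chunked(f, 100_000))
    print(reduce_partials(partials))
```

The design point is that each chunk is processed independently, so adding workers (or cluster nodes, in Hadoop's case) scales the map phase horizontally; only the comparatively small partial results are brought together at the end.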
Case study 2: Streaming data
This case study, also contributed by IBM and republished with permission, focuses on streaming data.
Summary
This case study deals with an extreme volume of data that must be processed after acquisition but
before it reaches storage. The volume of data to process is approximately 275 Mb of acoustic data
from 1,024 individual sensor channels, which translates to 42 TB of new data per day. The data
must be processed through statistical algorithms, and the current processing takes hours when the
requirement is subsecond response. As you read the case study, you will see how combining
unconventional approaches to data processing with structured techniques solves the architecture
needs of TerraEchos.
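The streaming requirement, applying statistical screening to 1,024 sensor channels while the data is in flight rather than after it lands in storage, can be sketched in outline. The example below is an assumed illustration in Python; the window size, the three-sigma anomaly test, and the synthetic feed are hypothetical and are not the IBM InfoSphere Streams implementation TerraEchos actually used. Its only purpose is to show per-channel, in-flight computation as opposed to store-then-analyze batch processing.

```python
"""Minimal sketch of in-flight (streaming) statistical screening.

Assumptions (not from the case study): each incoming sample is a
(channel_id, value) pair, and a sample is flagged when it deviates
from its channel's rolling mean by more than three rolling standard
deviations. The real system described here runs on IBM InfoSphere Streams.
"""
import random
import statistics
from collections import defaultdict, deque

WINDOW = 256          # hypothetical rolling-window length per channel
NUM_CHANNELS = 1024   # sensor channels, as described in the case study

# One bounded window per channel; old samples fall off automatically.
windows = defaultdict(lambda: deque(maxlen=WINDOW))


def process_sample(channel_id, value):
    """Screen one sample as it arrives, before it is ever written to storage."""
    window = windows[channel_id]
    if len(window) >= 2:
        mean = statistics.fmean(window)
        stdev = statistics.pstdev(window)
        if stdev > 0 and abs(value - mean) > 3 * stdev:
            print(f"channel {channel_id}: anomalous reading {value:.3f}")
    window.append(value)


def synthetic_stream(num_samples):
    """Stand-in for the acoustic sensor feed (purely synthetic data)."""
    for _ in range(num_samples):
        yield random.randrange(NUM_CHANNELS), random.gauss(0.0, 1.0)


if __name__ == "__main__":
    for channel_id, value in synthetic_stream(100_000):
        process_sample(channel_id, value)
```

The point of the sketch is the shape of the computation: state is kept per channel, each sample is evaluated once on arrival, and only flagged events need to be persisted, which is what makes subsecond response possible at this data rate.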
Surveillance and security: TerraEchos
Streaming data technology supports covert intelligence and surveillance sensor systems.