layout is complete, then metadata and contextualization can be designed, and lastly, the database and
data infrastructure integration can be completed. The steps are as follows:
Data inventory.
List all data inputs across the organization.
Create an auto-discovery search and scan the enterprise for new data and sources.
Create alerts and processes to handle new data as it is discovered (a minimal discover-and-alert sketch follows this list).
MDM.
Once data discovery is complete, data virtualization can drive the operational master data management process.
Use data virtualization to manage gold-copy redistribution and replication.
Data source management.
Data virtualization can identify new data based on patterns and contexts. Once a new pattern is discovered, create the data lineage and provide the data to the platform for use when executing queries.
Data delivery.
Data virtualization delivery rules can be implemented to deliver data upon ingestion and preprocessing to a business user or group, accelerating the discovery of corporate or legal classes of data (a sample rule sketch also follows this list).
Agile development.
Development programs can effectively use data virtualization to manage agile development and testing of data integration.
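As one illustration of the inventory and discovery steps above, the following Python sketch keeps a simple inventory of known sources, scans a catalog of candidates, and raises an alert for anything new. This is a minimal sketch under assumed names (DataSource, scan_catalog, alert); it is not the API of any particular data virtualization product.

```python
# Minimal sketch of the auto-discovery and alerting steps above.
# All names here are illustrative, not tied to a specific product.
from dataclasses import dataclass


@dataclass(frozen=True)
class DataSource:
    name: str
    location: str  # e.g., a JDBC URL, file share, or API endpoint
    kind: str      # e.g., "relational", "file", "stream"


def scan_catalog(catalog: list[DataSource],
                 inventory: set[str]) -> list[DataSource]:
    """Return sources present in the scanned catalog but missing
    from the current data inventory."""
    return [s for s in catalog if s.name not in inventory]


def alert(source: DataSource) -> None:
    # In practice this would notify a data steward or open a ticket;
    # here we simply log the discovery.
    print(f"ALERT: new source {source.name} ({source.kind}) at {source.location}")


if __name__ == "__main__":
    inventory = {"sales_db", "crm_extract"}  # known sources
    catalog = [
        DataSource("sales_db", "jdbc:postgresql://dw/sales", "relational"),
        DataSource("iot_feed", "kafka://broker:9092/iot", "stream"),
    ]
    for new_source in scan_catalog(catalog, inventory):
        alert(new_source)               # the handle-new-data process
        inventory.add(new_source.name)  # keep the inventory current
```

In a real deployment the catalog would be built from database system tables, file shares, and API registries, and the alert would feed a stewardship or ticketing workflow rather than a log line.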
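The data delivery step can likewise be pictured as a small rule table: each rule pairs a predicate over record metadata with a target user group. This too is a schematic sketch; delivery-rule syntax in the platforms named below is product-specific, and the tags and group names here are invented.

```python
# Schematic sketch of tag-based delivery rules; real platforms use
# their own, product-specific rule definitions.
from typing import Callable

# A rule pairs a predicate over record metadata with a target group.
Rule = tuple[Callable[[dict], bool], str]

RULES: list[Rule] = [
    (lambda meta: "legal" in meta.get("tags", []), "legal-review-group"),
    (lambda meta: meta.get("domain") == "finance", "finance-analysts"),
]


def deliver(record_meta: dict) -> list[str]:
    """Return the groups that should receive a newly ingested record."""
    return [group for predicate, group in RULES if predicate(record_meta)]


# Example: a freshly ingested contract is routed to the legal reviewers.
print(deliver({"tags": ["legal", "contract"], "domain": "procurement"}))
# -> ['legal-review-group']
```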
Pitfalls to avoid when using data virtualization
Overengineering. Virtualizing everything is not a good idea; in some cases it can have a negative effect on performance.
Underengineering. The power of data virtualization lies in automated data discovery; incorrect rules or poorly defined discovery contexts create an underperforming virtualization environment.
Adding too many data quality rules to data virtualization can defeat the agility enabled by the
platform.
Adding too many sources at the same time can slow down data processing and acquisition.
Incorrect configuration across databases and servers can lead to uneven performance.
As we can see from this overview, data virtualization provides a compelling option for integrating data across the enterprise while leveraging current and new infrastructure investments. There are four well-known platforms in the market to evaluate and select from: IBM, Informatica, Composite Software, and Denodo Technologies.
In-memory technologies
The biggest benefit of the commoditization of hardware is the steady increase in infrastructure capabilities within the same form factors. What we mean by this is that the hardware you could purchase five years ago is still physically the same size today, but, configuration-wise, is 200× faster and cheaper, with more horsepower and storage. One of these advancements is a set of powerful hardware accelerators utilizing the power of flash memory, memory chips, and solid-state disks. Though the common term for specialized memory-based computing is in-memory computing, the actual technology may include very high-speed solid-state disks along with pure memory.