New Research: Data Virtualization Perceptions and Market Trends

A newly published BI Leadership Benchmark Report concludes that data virtualization software is the key to creating an agile, cost-effective data management infrastructure.

The report, Data Virtualization: Perceptions and Market Trends, which includes survey results from 192 BI professionals, was authored by Wayne Eckerson, Director, BI Leadership, a TechTarget research service.

Distributed Data beyond the Warehouse

According to Eckerson, data in organizations is hopelessly distributed across multiple operational and analytical systems and, increasingly, external data sources, such as social media and syndicated data services. Traditionally, Eckerson writes, organizations physically consolidate data within a single environment (e.g., a data warehouse) before building applications that query data.

“BI professionals increasingly understand that federation and virtualization are the future; they can no longer deliver enterprise data by physically consolidating all of it,” Eckerson said. “As more IT managers learn about this unique category of software, it will grow its footprint within corporate data infrastructures.”

Data Virtualization Key to Agility

“This approach is not very agile, and that is where data virtualization software can help,” Eckerson said. “Once implemented, data virtualization software can accelerate application delivery because developers no longer have to source, integrate, and clean data on their own or wait for the IT department to do the work. Instead of hunting down relevant data using a variety of tools and access methods, developers can use a single tool with a uniform interface to access data both inside and outside the organization.”

According to The Data Warehousing Institute, it takes organizations an average of 7.8 weeks to add a new data source to their data warehouses and seven weeks to build a complex dashboard or report. “Given the fast pace of business today, this is too slow to meet business needs,” Eckerson said. “Data virtualization creates a logical view of distributed data and eliminates the need to always physically consolidate it in a local database. All this speeds delivery times and liberates developers from data collection grunt work.”
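The “single tool with a uniform interface” idea can be illustrated with a minimal sketch: a thin virtualization layer that exposes one query method over several registered back-end sources, filtering at query time instead of copying data into a warehouse first. All class, source, and field names below are hypothetical illustrations, not taken from the report or any particular product.

```python
# Minimal sketch of a data virtualization layer: one uniform query
# interface over several distributed sources, with no physical
# consolidation. All names and data here are hypothetical.

class VirtualDataLayer:
    def __init__(self):
        self._sources = {}  # source name -> callable returning rows

    def register(self, name, fetch):
        """Register a back-end source by the function that fetches its rows."""
        self._sources[name] = fetch

    def query(self, source_name, predicate=lambda row: True):
        """Query any registered source through the same interface,
        filtering rows on the fly rather than consolidating them first."""
        return [row for row in self._sources[source_name]() if predicate(row)]

# Two hypothetical back ends: an internal CRM table and an external feed.
crm_rows = [{"id": 1, "name": "Acme", "region": "EU"},
            {"id": 2, "name": "Globex", "region": "US"}]
social_rows = [{"id": 1, "mentions": 42}, {"id": 2, "mentions": 7}]

layer = VirtualDataLayer()
layer.register("crm", lambda: crm_rows)
layer.register("social", lambda: social_rows)

# Developers hit one interface regardless of where the data lives.
eu_customers = layer.query("crm", lambda r: r["region"] == "EU")
```

In a real product the registered sources would be connectors to databases, APIs, and files, but the shape of the interaction — register once, query uniformly — is the same.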

Additional Findings

The research report also found:

  • More organizations will look to data virtualization to extend their query reach beyond relational databases
  • Data virtualization software works best if it addresses all the data within an organization as well as the external data business users need
  • Data virtualization accelerates time to insight while making it easier for administrators to move, change and consolidate back-end data systems without affecting downstream applications
  • A classic use case for data virtualization software is creating a 360-degree view of customers; another is augmenting a data warehouse or business application with real-time data maintained elsewhere
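The 360-degree-view use case in the last bullet can be sketched in the same spirit: a virtual join that assembles a single customer record from several systems at query time, without materializing a combined table. Again, the system names and fields below are hypothetical examples, not from the report.

```python
# Hypothetical sketch of a 360-degree customer view: merge the records
# that separate systems hold for one customer at query time, rather
# than physically consolidating them in a warehouse.

crm = {1: {"name": "Acme", "segment": "enterprise"}}
support = {1: {"open_tickets": 3}}
billing = {1: {"balance": 1250.0}}

def customer_360(customer_id):
    """Build one unified view from each system's slice of the customer."""
    view = {"id": customer_id}
    for system in (crm, support, billing):
        view.update(system.get(customer_id, {}))
    return view

profile = customer_360(1)
```

Because the view is assembled on demand, an administrator can move or consolidate any of the back-end systems and only the per-system lookup changes; downstream consumers of `customer_360` are unaffected, which is the decoupling the bullet above describes.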

Learn More

Data Virtualization: Perceptions and Market Trends, along with over a dozen additional data virtualization thought leadership white papers from leading analysts, is available to download on the Data Virtualization Leadership Series website.

You can also see a video of Wayne discussing data virtualization with Rick van der Lans at Data Virtualization Day 2012.
