Data Abstraction: The Lingua Franca for Data Silos

Enterprises are seeking ways to improve their overall profitability, cut costs, reduce risk and more through better leverage of their data assets.

Significant volumes of complex, diverse data spread across various technology and application silos make it difficult for organizations to achieve these business outcomes. To further complicate matters, a range of problems arises, such as:

  • Separate access mechanisms, syntax, and security for each source
  • Lack of proper structure for business user or application consumption and reuse
  • Incomplete or duplicate data
  • Varying latency across sources

Data abstraction overcomes these challenges by transforming data from its native structure and syntax into views and data services that are much easier for business intelligence and analytics developers to use when creating new decision-making applications.

Enterprises can approach data abstraction in three ways:

  • Manual data abstraction
  • Data warehouse schemas
  • Data virtualization

Of the three approaches, data virtualization is the superior solution for data abstraction because it enables the most flexibility and agility when you need to provide simple, consistent, business-formatted data from different data locations and sources.

As a complement to Cisco’s Data Virtualization software and services, Cisco also provides data abstraction best practices that help you accelerate your data abstraction activities. Composed of three distinct layers (application layer, business layer and physical layer), these best practices support a data reference architecture that rationalizes multiple, diverse data silos for a range of BI and analytic applications. The architecture aligns closely with analyst best practices mapped out by both Forrester and Gartner on the topic of data virtualization. Using these best practices will enable your company to access the right data for the business, gain agility and efficiency, maintain end-to-end control, and increase security of your data across all your data silos.
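The three layers can be illustrated with a minimal sketch. Here SQLite stands in for a data virtualization server, and all table, view, and column names are hypothetical: the physical layer exposes a raw source table, the business layer renames and normalizes it into business terms, and the application layer shapes it for one consuming report.

```python
import sqlite3

# In-memory database standing in for one physical data silo.
conn = sqlite3.connect(":memory:")

# Physical layer: raw source table with source-specific naming and units.
conn.execute("CREATE TABLE src_cust (c_id INTEGER, c_nm TEXT, rev_cents INTEGER)")
conn.executemany("INSERT INTO src_cust VALUES (?, ?, ?)",
                 [(1, "Acme", 125000), (2, "Globex", 98000)])

# Business layer: a view that renames cryptic columns and converts units,
# hiding the physical structure from consumers.
conn.execute("""
    CREATE VIEW customer AS
    SELECT c_id AS customer_id,
           c_nm AS customer_name,
           rev_cents / 100.0 AS revenue_usd
    FROM src_cust
""")

# Application layer: a consumer-specific view shaped for one BI report.
conn.execute("""
    CREATE VIEW top_customers AS
    SELECT customer_name, revenue_usd
    FROM customer
    ORDER BY revenue_usd DESC
""")

rows = conn.execute("SELECT * FROM top_customers").fetchall()
print(rows)  # [('Acme', 1250.0), ('Globex', 980.0)]
```

The point of the layering is that a change in the physical source (a renamed column, a new unit) is absorbed in the business layer, leaving application views and their reports untouched.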

To learn more about data abstraction best practices using Cisco Data Virtualization, check out our white paper.

Active Archiving with Big Data

Historical data is now an essential tool for businesses as they struggle to meet increasingly stringent regulatory requirements, manage risk and perform predictive analytics that help improve business outcomes. While recent data is readily accessible in operational systems and some summarized historical data is available in the data warehouse, the traditional practice of archiving older, detail-level data on tape makes analysis of that data challenging, if not impossible.

Active Archiving Uses Hadoop Instead of Tape

What if the historical data on tape were loaded into a similarly low-cost yet accessible storage option, such as Hadoop? And what if data virtualization were then applied to access and combine this data with the operational and data warehouse data, in essence intelligently partitioning data access across hot, warm and cold storage options? Would it work?

Yes, it would! In fact, it does every day at one of our largest global banking customers. Here’s how:

Adding Historical Data Reduces Risk

The bank uses complex analytics to measure risk exposure in their fixed income trading business by industry, region, credit rating and other parameters. To reduce risk while making more profitable credit and bond derivative trading decisions, the bank wanted to identify risk trends using five years of fixed income market data rather than the one month (400 million records) they currently stored online. This longer time frame would allow them to better evaluate trends, and use that information to build a solid foundation for smarter, lower-risk trading decisions.

As a first step, the bank installed Hadoop and loaded five years of historical data that had previously been archived on tape. Next, they installed Cisco Data Virtualization to integrate the data sets, providing a common SQL access approach that made it easy for the analysts to combine the data. Third, the analysts extended their risk management analytics to cover five years. Up and running in just a few months, the bank was able to use this long-term data to better manage fixed income trading risk.
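The federation pattern described above can be sketched in miniature. Two SQLite databases stand in for the "hot" warehouse (recent data) and the "cold" Hadoop archive (older history); a single view presents one SQL surface over both tiers, so the analyst's query never needs to know where a row physically lives. All table and column names here are illustrative, not the bank's actual schema.

```python
import sqlite3

# "hot" stands in for the warehouse; an attached second database
# stands in for the cold Hadoop archive of older history.
hot = sqlite3.connect(":memory:")
hot.execute("ATTACH DATABASE ':memory:' AS cold")

hot.execute("CREATE TABLE trades (trade_date TEXT, region TEXT, exposure REAL)")
hot.execute("CREATE TABLE cold.trades (trade_date TEXT, region TEXT, exposure REAL)")

# Recent month lives in the hot tier; older years live in the cold tier.
hot.execute("INSERT INTO trades VALUES ('2013-05-01', 'EMEA', 1.5)")
hot.execute("INSERT INTO cold.trades VALUES ('2009-03-15', 'EMEA', 2.0)")
hot.execute("INSERT INTO cold.trades VALUES ('2011-07-09', 'APAC', 0.7)")

# Virtual view: one SQL surface over both stores.
hot.execute("""
    CREATE TEMP VIEW all_trades AS
    SELECT * FROM main.trades
    UNION ALL
    SELECT * FROM cold.trades
""")

# The risk query spans five years without caring about storage tiers.
risk_by_region = hot.execute("""
    SELECT region, ROUND(SUM(exposure), 2)
    FROM all_trades
    GROUP BY region
    ORDER BY region
""").fetchall()
print(risk_by_region)  # [('APAC', 0.7), ('EMEA', 3.5)]
```

In a real deployment the virtualization server would push each sub-query down to the appropriate tier (warehouse SQL, Hive on Hadoop) rather than union in-process, but the consumer-facing contract is the same: one view, one query, all five years.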

To learn more about Cisco Data Virtualization, check out our Data Virtualization Video Portal.

The Fourth V in Big Data

Bob Eve, Director, Product Management

View Bob Eve’s original post on Cisco Data Center’s Blog

At Cisco Live! Melbourne, I was invited to speak at the Executive Symposium to nearly 100 of Cisco’s top customers in the Australia and New Zealand region. In my talk, Gaining Insight from the Big Data Avalanche, I covered big data business opportunities and technology challenges.

To level set at the start, I opened with a definition of big data, covering velocity, volume, and variety, the three V’s everyone hears about when it comes to big data. But then I challenged the audience to consider a fourth, and in fact most important, V, holding back on identifying it so the audience could consider what was missing.

After an appropriate pause, I told them the most important V was value. Value is the only reason to work on big data. This value must be seen in better business outcomes such as:

  • Higher Customer Profitability
  • Faster Time to Market
  • Reduced Cost
  • Improved Risk Management
  • Better Compliance
  • Greater Business & IT Agility

It is interesting how people get caught off guard by the big data buzzwords. So go back to the basics. Start by getting your business case in order. Once the value to the business is understood, juggling higher data velocity, volume and/or variety becomes an engineering problem. Certainly, it is a new class of engineering problem, requiring new technologies and skills, but it is a fully solvable engineering problem nonetheless.

For IT, big data is as much an organizational change challenge as a technology challenge. Practical first steps that seem to work well include:

  • Experiment with a smaller, “SWOT” team on a selected set of projects. This is a great way to introduce something new.
  • Go for some quick and easy wins, rather than boiling the ocean with large-scale initiatives. That is a proven technique for gaining momentum.
  • Implement a solution with revenue impact, such as a next-best-offer analytic to improve upsell performance or a predictive churn analytic that helps reduce customer defection. These high-visibility projects will ease business funding challenges and improve executive visibility and sponsorship.


Counting Down to Data Virtualization Day 2013

Take Big Advantage of Your Data with Composite and Cisco

Today, the difference between business leaders and also-rans is how well they leverage their data.
And with big data and cloud causing data to be more distributed than ever, world-class data virtualization and networking technology have become critical to this success.

With the acquisition of Composite Software, only Cisco provides this powerful combination.

Read More

Data Virtualization Leadership Blog Selected in Top 30 MIS Blogs of 2012

We are proud to announce that our Data Virtualization Leadership Blog has been awarded as one of the Top 30 Management of Information Systems (MIS) Blogs of 2012.

Read More

Get Certified On Data Virtualization, Virtually!

A First in Data Virtualization

Last week we announced the industry’s first Data Virtualization certification program, the Composite Information Server Specialist Certification.

Read More

Analytic Sandboxes and Data Virtualization

Analytics – Opportunities and Challenges

I have spent a number of years in the BI space including time with innovators including Business Objects and EMC. In my opinion, the opportunity for business people to perform new types of analysis to gain greater insight into their business and customers has never been greater.

Read More

How Organizational Hurdles Can Delay Data Virtualization Adoption

Because of both organizational resistance and technological concerns, enterprises often struggle when adopting new data management technologies such as data virtualization. In this blog, I will examine five common organizational hurdles to data virtualization adoption.

Read More

Data Virtualization Best Practices from Our Customers – Part 2

Five Lessons from Leading Data Virtualization Adopters

In this series of three blog posts, I am passing along the best practice lessons from the ten organizations profiled in Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility.

Read More

Data Virtualization Best Practices from Our Customers – Part 1

Five Lessons from Leading Data Virtualization Adopters

Enterprise adoption of data virtualization continues to accelerate, driven by organizational demands for greater business agility and lower IT costs.

My professional services team has deployed Composite data virtualization offerings in support of hundreds of implementations. From this work, a number of best practices have emerged.

Read More
