This post is part of a series. Read part one, part two, part three, and part four to learn more about the urgent need for Hybrid Integration.

Smart, intelligent, predictive — whatever you call it, I feel like the promise of analytics has been around for a long time. Marketers are constantly seduced by the allure of real-time insights about our customers that we can act upon immediately. But in practice, instead of Real Genius, it is more like Dumb and Dumber.

Do you remember the opening scene from that movie?

Lloyd Christmas: Why you going to the airport? Flying somewhere?

Mary Swanson: How’d you guess?

Lloyd Christmas: I saw your luggage. Then when I noticed the airline ticket, I put 2 and 2 together.

In many cases, analytics has not progressed beyond that kind of thinking. It remains in the realm of simple data points that lead to obvious conclusions. How do we take it to the next level? For the most part, the data exists. The issue is that it's locked in enterprise systems, scattered across multiple clouds, or sitting in unstructured formats you can't pull into your applications to act upon.

At the top level, it's an issue with integration. But when you break it down, it is actually three distinct integration issues: how you integrate the right analytics applications into your environment, how you connect to the new devices and sources that supply the data, and how you deal with the velocity-intensive Big Data integration problem that comes from those devices and sources. It's a triple whammy — like when you got no food, you got no job, and your pets' heads are falling off.

How do you avoid making big mistakes when it comes to supporting analytics-driven applications (mistakes like driving a sixth of the way across the country in the wrong direction)? Excellent question. And since I am not nearly smart enough to come up with something myself, I went and asked our friends at IDC. They looked into a few use cases and found some interesting trends around using predictive analytics as part of a larger application.

Here’s what they determined:

  • Moving data from a source to the target using Extract, Transform and Load (ETL) or file transfer technology may not be fast enough.
  • Transmitting individual data events over a network to the target system may not be reliable enough.
  • The protocols and formats of the source data are often incompatible with traditional adapters used to connect to source or target systems.
  • Traditional integration approaches to normalizing the data cannot handle the high volumes involved.
  • Data may be received out of order, which is problematic when the time of data origination is critical (see the sketch after this list).
  • Third-party data not under control of internal resources may be leveraged, requiring additional validation and verification checks or arm’s-length integration to avoid contamination or compliance issues.
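
The out-of-order problem is the easiest of these to make concrete. Here is a minimal sketch (my illustration, not something from the IDC report) of one common pattern: buffer arriving events briefly, then release them in event-time order once a lateness window has passed. The Event and ReorderBuffer names and the five-second window are assumptions for illustration, not any particular product's API.

    import heapq
    import time
    from dataclasses import dataclass, field

    @dataclass(order=True)
    class Event:
        event_time: float                     # when the data originated at the source
        payload: dict = field(compare=False)  # excluded from ordering comparisons

    class ReorderBuffer:
        """Releases events in event-time order, tolerating arrivals up to
        allowed_lateness_s seconds late."""

        def __init__(self, allowed_lateness_s=5.0):
            self.allowed_lateness_s = allowed_lateness_s
            self._heap = []                   # min-heap keyed on event_time

        def add(self, event):
            heapq.heappush(self._heap, event)

        def drain_ready(self, now=None):
            """Emit, in order, every buffered event older than the lateness
            window; newer events wait in case an earlier-stamped record is
            still in flight."""
            now = time.time() if now is None else now
            watermark = now - self.allowed_lateness_s
            ready = []
            while self._heap and self._heap[0].event_time <= watermark:
                ready.append(heapq.heappop(self._heap))
            return ready

Events that arrive within the window come out in origination order even if the network scrambled them in transit; anything later than the window would need a separate late-data path, which is exactly the kind of design decision batch-era integration never had to make.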

To overcome some of these issues, we are seeing organizations move from batch delivery to a more event-driven design for their analytics-based applications. In fact, 41% of the 6,068 respondents to IDC's 2017 CloudView Survey had already implemented an event-driven architecture for their real-time analytics initiatives, up from only 29% of respondents in the 2016 survey.
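
To see what that shift means in miniature, here is a hedged sketch (mine, not IDC's or IBM's): the batch version scores records only when a scheduled extract runs, while the event-driven version scores each record the moment it lands on a queue. score, run_batch, run_event_driven, and the in-memory queue are hypothetical stand-ins for your own analytics step and messaging layer.

    import queue

    def score(record):
        """Stand-in for the predictive-analytics step (hypothetical)."""
        print("scored:", record)

    # Batch style: insight lags the data by up to a full ETL cycle.
    def run_batch(daily_extract):
        for record in daily_extract:
            score(record)

    # Event-driven style: each record is scored moments after it originates.
    def run_event_driven(events):
        while True:
            record = events.get()       # blocks until the next event arrives
            if record is None:          # sentinel to shut down cleanly
                break
            score(record)

    # The same records flow through either path; only the timing differs.
    q = queue.Queue()
    for r in [{"id": 1}, {"id": 2}, None]:
        q.put(r)
    run_event_driven(q)                 # scores each record as it arrives
    run_batch([{"id": 1}, {"id": 2}])   # scores them only when the job runs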

What we are seeing is early adoption, with those smart enough to make the change realizing a new competitive advantage. Where will you fall in the 2018 survey?

Before people think you can't get any dumber, download the IDC report, The Urgent Need for Hybrid Integration, or go to the IBM Integration website to learn more about IBM's view on hybrid cloud integration and TOTALLY REDEEM YOURSELF.
