Fixing Predictive, Making Anticipatory Work

Thanks for coming along… to refresh your memory: this is a three-part series.

The concept? Predictive analytics was badly implemented, and new technologies and knowledge have made those implementations obsolete.  There’s a new, interesting model that can be used instead: anticipatory analytics.

This third and final post in the series will show you how to make it work – the earlier posts were more theoretical.  Now I want to explore how to actually make the art of the possible work.

How is this possible? It involves five elements:

  • Fast data management – the technologies we use to manage data have changed dramatically in the past five years or so. From in-memory processing that requires no storage, to the ability to understand unstructured data, to faster throughput and better connections between systems, we can now manage data orders of magnitude faster than in the past – leading us to real-time (or near-real-time) use of data that lets computers automate decision making. This is what allows us to create, in near real time, customized offers with a far greater likelihood of being accepted (see the sketch after this list).
  • Sharing of data – in the age of the customer, consumers are sharing more and more data thanks to social channels, social networks, and online communities. Information that was once hard to obtain or understand is today freely shared in public places for organizations to leverage and use to deliver mass personalization. These highly customized and attractive offers can only happen when data about sentiment is freely shared and analyzed.
  • Loads of data – there is no doubt that more and more data is being generated, but the flip side (organizations being able to manage that massive load of data) is more interesting. Long gone are the days when using more data meant buying more expensive storage to accumulate it; today either storage is so cheap that it is not an issue, or the processing happens in near real time and makes storage obsolete and unnecessary. The volume that was once suspected of dragging down performance becomes a load of welcome data when properly filtered.
  • Outcome driven – as more and more consumers and customers push to co-create and collaborate, we see more organizations willing to focus on the outcome from the perspective of the consumer, not the company. After all, the outcome for the consumer also benefits the company, as it usually means more products and services being sold. An ideal outcome for a consumer will almost always call for more of a product or service that the organization, or one of its partners in the ecosystem, can offer and benefit from selling. Focusing on the outcome from the consumer perspective is a win-win for both parties.
  • Experience based – this is not just a generational shift but a societal move that reflects the rise of the middle class around the world: more and more consumers are shifting away from single interactions toward complete experiences. This shift, and the demands consumers make on organizations to deliver against it, is making more and more brands and providers consider ecosystems and complete experiences. In the airline example from the last post, the airline could offer complete experiences versus a simple trip to create a competitive advantage. Knowing what consumers want from those experiences requires a complete predictive model that leverages all data and sources from all partners to deliver better experiences to consumers.
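To make the fast-data element a little more concrete, here is a minimal sketch of a near-real-time offer loop: events stream in, every offer is re-scored in memory, and the strongest one surfaces. Everything here – the Event class, the signal names, the OFFERS catalog, the weights – is a made-up illustration of the idea, not any vendor’s actual product or API.

```python
# A minimal sketch of near-real-time offer scoring. All names and numbers
# are hypothetical illustrations, not any vendor's actual API.
from dataclasses import dataclass

@dataclass
class Event:
    customer_id: str
    signal: str      # e.g. "booked_flight", "mentioned_beach", "shared_review"
    weight: float    # how strongly the signal suggests interest

# Hypothetical offer catalog: offer -> signals that make it relevant
OFFERS = {
    "snorkeling_tour": {"mentioned_beach": 1.0, "booked_flight": 0.4},
    "spa_package":     {"shared_review": 0.6, "booked_flight": 0.3},
}

def score_offers(events: list[Event]) -> dict[str, float]:
    """Score every offer against the live event stream, in memory,
    with no intermediate storage step."""
    scores = {offer: 0.0 for offer in OFFERS}
    for event in events:
        for offer, signals in OFFERS.items():
            scores[offer] += signals.get(event.signal, 0.0) * event.weight
    return scores

# As events arrive, re-score and surface only the strongest offer.
stream = [Event("c1", "booked_flight", 1.0), Event("c1", "mentioned_beach", 0.8)]
print(max(score_offers(stream).items(), key=lambda kv: kv[1]))
```

The point is not the scoring math (a real system would use proper models) but the shape of it: no batch job, no warehouse round-trip – just events in, decisions out.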

Of course, none of this happens automatically – organizations must undertake the process to understand what they have to do, set up the ecosystems, implement analytics, and optimize as they go. There are three steps organizations must take to embrace anticipatory analytics:

  1. Understand Your Customers.  This goes past the concept of demographics and who they are, and more toward their expectations, wants, needs, and demands.  What are they really asking for? How do they expect to get it? When? What is their impression of your duties and responsibilities toward them? This type of understanding is what drives the analysis and decision making you will undertake.  If you don’t know what you need to deliver, you don’t know how the data can help you anticipate their needs (and, by the way, make sure you have processes in place to update those needs and wants, as your customers are wont to change them).
  2. Tie to KPIs.  If you are going to invest time, money, and resources into making this work, you had better be prepared to show how it supports the ever-moving goals of the organization.  We are not talking about numbers or metrics that are relevant to the process, but about how those functions tie to the numbers that tell the story of how the organization is evolving.  The outcomes you are seeking had better have a tie-in with the numbers that show the health of the organization – if they don’t, find one.
  3. Evolve. There is no simple way to get there from here.  There is no silver bullet or magic potion that will give you an understanding of your customers’ expectations and how they evolve, nor is there a shortcut to understanding how specific actions tie to KPIs.  Further, it is likely you have already implemented the technology you need – no one will sell you an “anticipatory analytics engine in a box” to deploy.  This is about making commitments to see outcomes become realities, and about finding the best way to use the technology you already have to support the new goals you are setting.  Leverage, repeat, learn, and evolve.

Well, that’s it for this series – thanks for reading along… what do you think? Something you could do? Something you see your organization doing?

Would love to hear what you are doing or thinking…

ICYMM: Three Tips to Power Your Knowledge Management Initiatives

ICYMM: In case you missed me, linking to articles or other places on the web where you can find my research and content.

A few weeks ago I had the chance to talk to the lovely Lori Alcala about Knowledge Management.  She was interested in learning more about the research and findings from a report on Knowledge Management I ran, sponsored by IntelliResponse.

She wrote a great summary, and I am happy with how it came out – despite it being in a publication I don’t appreciate much – so I wanted to link you to it and give you some quotes from it.

Quoted:

Kolsky noted that only 34 percent of companies have maintenance processes in place for their knowledge.

“Maintenance costs increase an average of 8 percent per year, but less than 3 percent of organizations would invest 5 percent or more on maintenance over the previous year.”

“Most people see knowledge management as a technology,” he concluded, “but in reality, it’s a lifelong initiative of an organization.”

Read the entire article including the three tips to “power your knowledge management” here.

Disclaimer: you know me and my model – vendors sponsor research but don’t own any of the planning, deployment, analysis, or delivery.  They get to use the findings, but I get approval of the final content and model, etc.  IntelliResponse was kind enough to sponsor my research last year on KM (together with Transversal and Parature) and they got a good report (I will provide more info in later installments) and data.  They created an infographic, which is why Lori became interested and we talked.  That does not change any of my opinions about any vendor – but you know that already.  Right?

A Model for Anticipatory Analytics

I hope you read my last post about what’s wrong with Predictive Analytics – that’s the basis for this post.

I would like to explain how Anticipatory Analytics should work – and give you an idea of what the value is.  This is, in essence, what predictive analytics should’ve been in generation 1.0, and how we evolve from that definition of predictive to today’s model.  As I mentioned before, I see predictive as being badly implemented more than anything and am hoping this model can improve and replace those faulty implementations.

The outcome for this model: explore the art of the possible.  Let’s start with a theoretical example to illustrate it.

A consumer purchases an airline ticket to a tropical destination. In today’s world of traditional marketing, the email confirmation would contain links to typical tourist attractions with whom the airline has established relationships. If the consumer buys, the airline gets a percentage of the purchase price – but the number of consumers who take those offers is extremely low.

Any incentive to purchase, or any special offer, is not customized to that specific consumer to increase the chance of purchase.

Fast forward to predictive analytics, and the airline may do better than just offering a bunch of random attractions: it might filter by the age and gender of the consumer, or even look at past events to see if they opted for one of the previous offers before making a new one. Since the confidence in these offers is greater, and the recipient is thought to be better known, the offers the consumer gets may be more customized and even addressed to their individual preferences (based on information the airline captured before).


In a world where everything is possible, we see a different scenario. The airline would use data from its own repositories, but also from other sources. Compiling information from social channels, communities, partners and alliances – even credit card information from the past – it can, in close to real time, construct a very effective profile of which offers would or wouldn’t attract the interest of the consumer, and extend deeply discounted, but almost guaranteed, offers customized to the consumer’s preferences. These are based not necessarily on information that was stored before, but on analytics conducted ad hoc on the many data streams related to the consumer.

Each choice the customer makes would then change the potential outcomes of the many other options – which would be recalculated and the most likely one chosen: not the next best, but the next most likely.

In this example we see a consumer who, instead of getting an email with 10-12 “opportunities”, would get a highly customized package of offers that is almost guaranteed to be interesting and appropriate. Further, if the airline could obtain financial information about the user from a partner, the offers could be of higher or lower value as appropriate to each user, further increasing the chances of adoption.

Even more interesting, any choice the consumer makes would alter the calculations for the many other options – in real time, resulting in better offers being tendered instantaneously.
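To illustrate that recalculation, here is a minimal sketch – with an invented offer list, probabilities, and lift factors, since this is about the mechanics and not any real engine – of how one accepted offer re-weights everything else before the next offer is tendered:

```python
# A minimal sketch of "next most likely" recalculation. The offers,
# probabilities, and lift factors are all made-up numbers for illustration.
offers = {"snorkeling_tour": 0.30, "spa_package": 0.25, "sunset_cruise": 0.20}

# Hypothetical co-occurrence lift: accepting the key makes each value
# more (>1.0) or less (<1.0) likely.
lift = {
    "snorkeling_tour": {"sunset_cruise": 1.5, "spa_package": 0.6},
    "spa_package":     {"sunset_cruise": 1.2, "snorkeling_tour": 0.7},
}

def accept(offer: str) -> None:
    """Remove the accepted offer and re-weight what is left."""
    offers.pop(offer)
    for other in offers:
        offers[other] *= lift.get(offer, {}).get(other, 1.0)

accept("snorkeling_tour")
# Tender the next MOST LIKELY offer - not a static "next best" from a fixed list.
print(max(offers.items(), key=lambda kv: kv[1]))  # ('sunset_cruise', 0.3)
```

Note the distinction the sketch makes: a static campaign walks down a pre-ranked list, while here every acceptance changes the ranking itself.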

This new model – moving from expecting a consumer to repeat someone else’s past behavior, to foretelling what a consumer may do based on his or her individual data and needs, and adapting along the way based on their choices and other data – is the art of the possible today.

Great, you say – so how do I make this work?  That’s next week’s third and final post in this series… stay tuned, once more.

Fixing The Suckiness of Predictive Analytics

You have been so nice in responding to my publishing of “old” work that was never shown before that I want to continue doing it.

What follows is the first of a three-part series (when you are breaking down a 3-4 page writeup into pieces, three parts seems to be about ideal).

I was tasked last year with figuring out how to make predictive better.  I was never a fan of how predictive analytics was implemented (I am fine with the concept, but I don’t think anyone cares about the concept – instead it deteriorated into a parrot act of repetitiveness with no good results).

In conducting the research I came upon long-forgotten concepts and ideas, and mulling them over gave me a new idea: the one you will read about in next week’s installment :-)

First, let me set the stage.

Predictive analytics is finally changing.

An art form of sorts, revived by the recent interest in Big Data and analytics shown by global corporations in the past decade, predictive was never intended for its current uses.

The definition of predictive analytics puts it at odds with the current usage.

Predicting a behavior was not intended to be used as a harbinger of a customer’s intention to purchase but rather as a lagging indicator of an occurrence or event so that the knowledge could be used to build better analytics models.

The thought that an occurrence will repeat many times over because past data points indicate a similar setup is criticized by many analytics experts – even as it is adopted by most organizations.

The difference is the narrowness of what predictive can do today. We are simply focused on one path, one way to get from point A to point B. If last time we were at point A we took a bus to get to point B, we will do the same today. The complexity of today’s world makes those “guesses” just about impossible. What if, for example, it is raining heavily and I am in a rush? Could I take a taxi instead? Or, what if I have time and it is a beautiful day? Could I walk? Or, what if I am with someone who owns a motorcycle? As you can see, the many variables that are traditionally ignored by predictive (we look for a pattern, and then try to repeat it when similar data points are recognized in the same sequence) are what make the new models far more interesting.

Keep in mind, this is not what predictive was intended to be – but what it became from the poor implementations along the way.

A successful bad implementation will be repeated.  A failed good implementation never sees the light of day again.  This is how Twitter came to be used for Customer Service (but I digress).

Instead of trying to predict behavior step-by-step as most predictive applications do, why not use the pattern as a loose guideline toward a sought outcome, break down the steps, and consider the many options available at each step? What yesterday would have been a monumental feat of computation and analytics is very possible today thanks to advances in data capture, storage, management, and analysis.

The “Big Data” era brought the capability to analyze just about any data set in real time and to add many more variables to the analysis, yielding far more interesting insights.

And it is within this new approach that we find not predictive analytics – but anticipatory analytics: the ability to dynamically and actively generate insights at each step of the way, based on variables and elements that were previously impossible to include: intent, decision making by users in real time, and unstated goals and objectives.

As a result, my phone may hail a taxi for me (and maybe offer me a discount) if it detects rain, or nudge me toward walking on a nice day – not because I did it before, but because I am about to do it. This is where predictive transforms to become the art of the possible.
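As a sketch of that A-to-B example – with invented options, weights, and context flags, since the point is the approach rather than any real engine – anticipatory scoring looks at the variables of the moment instead of replaying last time’s choice:

```python
# A minimal sketch of context-aware option scoring for the A-to-B example.
# Options, base scores, and adjustments are illustrative assumptions only.
def best_option(raining: bool, in_a_rush: bool, with_motorcycle_owner: bool):
    options = {"bus": 0.5, "walk": 0.6, "taxi": 0.4, "motorcycle": 0.0}
    if raining:                      # heavy rain: walking drops, taxi rises
        options["walk"] -= 0.5
        options["taxi"] += 0.4
    if in_a_rush:                    # time pressure favors the taxi
        options["taxi"] += 0.3
        options["bus"] -= 0.2
    if with_motorcycle_owner:        # a new variable opens a new option
        options["motorcycle"] += 0.8
    return max(options.items(), key=lambda kv: kv[1])

print(best_option(raining=True, in_a_rush=True, with_motorcycle_owner=False))
# ('taxi', 1.1) -> hail a taxi in a rainy rush
print(best_option(raining=False, in_a_rush=False, with_motorcycle_owner=False))
# ('walk', 0.6) -> nudge toward walking on a nice, unhurried day
```

A pattern-repeating system would have returned “bus” every time, because that is what happened last time; here the variables of the moment do the work.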

What does anticipatory analytics look like?  Come back next week to see…

ICYMM: The Modern Customer Interaction

(ICYMM: In Case You Missed Me: linking to writing done elsewhere)

Another area where I spent a long time working in 2014 is customer interactions.  I mean, I am supposed to be a customer strategist… and customer interactions are the representation of what companies and customers do together… so why not? Right?

I talked to many organizations that are embarking on initiatives to redo their customer interactions.  Unfortunately, most of them are merely focused on documenting and extending what they are already doing (usually via <gulp> Journey Mapping and Customer Experience projects… sigh) instead of focusing on transforming them into something better.

I tried to imagine what a new customer interaction would look like, and talked to vendors, organizations, and even consumers in an effort to get a better idea. I read most of what was written about it, then set out to summarize this information and came up with a new model for customer interactions.

Outcome-driven Customer Interactions.

Thanks to the sponsorship and support of my friends at InsideView, I published initial thoughts and models on their blog in a three-part series.  Here are links and quotes to start the conversation.

Part 1: The Modern Customer Interaction – Introducing the concept and delineating the four outcomes – intent, satisfaction, knowledge, and resolution – that lead to engagement.  Quoting from the post:

All of these outcomes have the same thing in common: they use insights derived from the exchange of information at the moment of interaction to co-create value for both the customers and organizations. (…)

It is about delivering complete interactions at each step that co-create value to both sides resulting in long-term engagement.

Part 2: Outcome-driven Customer Interactions – Introducing a visual representation of this model – one that extends the work I’ve done previously with customer engagement.  Quoting from the post:

These outcomes ensure that even if there is no final resolution (or engagement over the long run) significant value is created at each stage that can then be used to move to the next step – or used in the future to generate better interactions.

Part 3: Customer-centric Outcomes – Finalizing the series by giving more details on the outcomes introduced and how organizations can start planning for them.  Quoting from the post:

These outcomes, in turn, will force organizations to change the way they work, the processes they utilize, and the way they leverage their systems, but more importantly how they manage the information triumvirate (data, content, and knowledge) to seek engagement over the long run with customers.

Does this resonate?  Are you planning your interactions with customers with these goals in mind? Should you? Want to?

Leave me a brief note, let me know your thoughts… would love to talk and expand this model.

Disclaimer: InsideView is a client and they have generously sponsored this research. This research was conducted as part of my sponsored research model, where kind and generous clients pay me to do research I needed to do anyway, and in exchange they get to use the content and have their name associated with it. If you know me and my work, you know this research is a continuation of work I’ve done in the past few years – and that no one, not even sponsors, gets a say in editorial control or content. In other words, they kindly sponsored me to write what I find out there.

How SAP Missed An Opportunity

Hello, it’s me again.  Your Friendly SAP — foe?

If you follow me you know that I admire the technical prowess of SAP, but despise about twice as much their marketing “acumen” (there are no other words that are politically correct and fit nicely between quotation marks).

I wrote about it in more detail in my post talking about their Reverse Dichotomy.  They innovate in technology and throw it away in marketing.

The latest launch event, yesterday, of S4/HANA (S stands for simple, 4 is the release number, and HANA stands for — er, well… HANA #LeSigh) continued that trend, with a twist.

There were high points worth mentioning:

  • The reduced and centralized data model (finally, finally, finally — but wait, does that include SuccessFactors?  I don’t think so… we are still not there)
  • The use of metadata (I think they call it customization, but by any name the concept is to allow customers to customize services via the use of metadata – one of the very few companies to do that)
  • A complete rewrite of the platform (yay, yay, yay – 30+ years of spaghetti code and band-aids had seen their day – good riddance)
  • Performance improvements galore (yes, finally – but seriously, 8-second responses are not something I’d brag about, even if 4-6x better than what they offer today – and don’t even ask the SuccessFactors clients for their opinion of those “great response times”)
  • A better way to cloud (will discuss the downside of this later in the post; but there were steps forward).

All in all, a good evolution for SAP as it introduces the first major innovation in their product since version 3 (R/3).

Good – right?

Yes – but mostly no.

SAP had been late to the cloud, late to platforms, and late to offering what their customers demanded in the form of a platform.  Salesforce.com and Microsoft have already addressed this, Oracle is working some of their marketing magic to convince their customers they have it (although they don’t, so much), but SAP had been very, very quiet.

Well, not really.

Marketing-wise they have been playing musical chairs for the past few years, relabeling and renaming their products under different launches, names, and what-nots – but never with a centralized perspective.

This is why they were behind companies like Salesforce (who spent a good 4-5 years retooling to create the Salesforce1 platform, only to waste the opportunity with a mobile-client marketing message that was not so good… or as my iPhone auto-correct would say, ducking socks) and Microsoft (who recently launched XRM, but how long it took to build depends on who you ask).

For some time there have been several different incarnations and versions of CRM – for cloud, HANA, on-premises, and on-demand – with different names and mostly the same product, slightly repackaged.  The same happened to other products in the lineup.

I was looking forward to this release, as it had been touted to me as the centralized, all-in-one release that would unite all products, fix the platform, change their approach to cloud and platform, and overall drive adoption for the next generation.

They fell short.

They had an opportunity to do that (I liked HANA from the very early conversations about in-memory and improved performance, and each time they showed what it could do with analytics and data management my mouth watered at the possibilities; they had some very interesting architectural approaches for CRM and the acquisition of SuccessFactors brought top quality talent to help them move forward; and more) but they did not take advantage of that opportunity.

They let the opportunity to deliver a market-leading platform that would match their competitors languish.

They missed the opportunity for the same reason Oracle chose not to chase it (as I wrote before): the biggest worry was moving forward with their late-adopter customer base versus doing something innovative and changing the conversation – or worse, leading the market (the necessary components and thinking are widespread throughout the company, just not properly utilized or, in some cases, even recognized).

I get it, and I am not going to chastise them for playing it safe: retaining revenue and ensuring it continues to trickle in for the foreseeable future.  Alas, they gave up the ability to both impress and capture new customers in exchange for servicing their existing ones.  A safe move.  A lost opportunity.

Some of the items from their recent launch that caused me to write this?

  • The admission that multitenancy may not be all it’s cracked up to be (how I wish I’d said that before…  wait, I did) while still offering it (mostly because after many years of saying it is essential you can’t just walk away – and because…)
  • The insistence on offering an on-premises version of the solution in addition to public cloud (and they won’t even stop to answer questions about private and hybrid clouds… sigh); worse was the reasoning – some verticals cannot do it – which is antiquated and wrong, but that’s another thread/topic/post.
  • Not building on the concept of a three-tier, public, open cloud in favor of retaining the “platform” in HANA, with little ability for it to be replaced or to use supporting services from other vendors (yes, like any other vendor – they want to retain “ownership” of the client via their platforms; old habits die hard for all of them)

Short version of the complaint? They stuck by the slow-moving, late-adopting mass that is the majority of their customer base instead of using the potential of HANA to create something new and innovative.

Just like Oracle before them, it sucks – but I guess it is the way they had to go.

Next!

Stop Talking About Digital Transformation

What?

I mean, what????

Seriously?  Just last year two of my seven posts (yeah, I didn’t do that well writing in my blog last year – but I’ve been working to remedy that by posting links to my other writings around the world – but I digress) were about digital transformation.

I have been talking about digital transformation for nearly four years now, and began to write about the transformative power of data (what digital refers to) over 15 years ago (when I began to cover EFM at Gartner, ‘member?).  Why on earth would I want to stop talking about it now – when it’s finally reached the peak of the hype cycle and is beginning to be adopted?

Because it’s too limiting.

In my (now) business transformation model, data has a key place right in the middle (see figure below).

[Figure: DT New Framework]

In conversations and work I’ve done these past 6-8 months with organizations and vendors, data remains the main focus.  Top investments for 2015 are focused around data and analytics.  Talk of Big Data and related concepts is taking over the world – and my colleagues (analysts, influencers, and pundits) are all super-busy around the topic of data.

Data has taken front-and-center positioning among organizations’ plans and strategies for 2015 – and it is too limiting.

We need to amplify the conversation.

If we are going to talk about transformation, we need to talk about more than just digital.  Data is but one piece of the pie.  Data, together with content and knowledge, becomes part of the information layer (see figure below).

[Figure: Information Creation]

If we are to talk about a complete transformation of the business, we need to also talk about content beyond marketing and knowledge beyond service, just as we talk about data.  We need to understand that the three work together to create information, and that the flow of that information, freely via public clouds, is what will transform the business.  We also need to understand how new, old, and still-unknown data will be used to push the business forward.

Data and digital are still part of the transformation, but we cannot forget the remaining pieces — and talking about digital simply limits the conversation to a small piece of it.

Let’s stop talking about digital transformation.

Let’s talk about business transformation.

What do you think?
