Telling a Story

28 12 2009

“What we’ve got here is a failure to communicate.” – Luke in Cool Hand Luke (played by Paul Newman)

A friend of mine sent this video along to a number of friends in the Business Intelligence space, saying we need to be better storytellers (thanks, Katie McCray).  We do spend an enormous amount of time talking about data structures, common data dictionaries, ease of use, speed, consistency, etc.  What we typically fail to do is show our clients how to turn data into information – how to tell the story convincingly enough to capture attention and, more importantly, enable action.

As analysts, we typically spend more time talking about data discovery and the calculations we used than we do making our point up front.  We create 50 charts to explain everything rather than the one chart that most simply illustrates the point.  This not only wastes time, it loses our audience.

In the next couple of presentations you sit through, count the number of slides that build up to the point being made.  With each slide, listeners pay less and less attention because they have lost the thread.  As learners, we need the point made first.  We need to see how it all comes together, then have it explained how we got there.  Leading with the point provides context: people understand what to listen for and why they are listening.

On a slightly different note, last week I wrote about the housing market and the Dangers of Leading Indicators.  I had to update the post due to a new story with a different viewpoint that ran in the Globe on the 23rd.  Amazing how storytellers can tell such dramatically different stories.





Analytics Process

23 11 2009

Over the last couple of months I have been writing about a handful of US economic indicators.  While I have reviewed these over the years, I had not done so on a regular basis.  This inconsistent, let’s call it casual, curiosity led to never really understanding the implications behind the numbers.  Sure, I could talk about them, but I could not leverage them.  While not an expert by any means, I can see a lot more now than I did when I started this blog series.

This is similar to ad-hoc analysis without purpose.  We do something once and create a little hype.  When we have no vehicle to take advantage of the newly found ideas, the idea dies, and so does the learning.

Think about the process of how you handle ad-hoc analytics within your organization:

  • Do you have the right minds constantly looking for new issues?
  • Or, do you put the right minds on solving issues when they arise?
  • Can you name your best analytical minds?  Are they assigned to thought leadership and problem solving?
  • Do you use your analytical minds to challenge the knowledge levels of others?
  • How do you foster new thinking?

 

Consistency breeds familiarity, and familiarity breeds knowledge





Analytics Competency Center

28 09 2009

We spend a lot of time on Business Intelligence, Master Data Management, Data Governance, Standardization, off-shoring, etc., yet I rarely hear of organizations spending time and energy on actually analyzing the data.  We have cubes, we can do all sorts of things with reports and dashboards, yet I still hear people say “I need more information!”

It is impossible that we are short on data!

  • How then are we not getting enough information out to the organization?
  • Is it possible that we are spending all of our time and energy on data preparation and data movement?
  • Are we creating value, or just planning to create value?
  • What about creating a center of excellence around the business user?
  • Or something around the levers of the business?




Predictive Analytics Gets Closer

17 09 2009

I am always a little shocked by a company’s resistance to using predictive analytics.  My guess is that it is a combination of not really understanding the value, fear that they won’t get it right, and not having the right talent to use it.  It has long been labeled “white lab coat stuff,” and perhaps that is a bit accurate.  But software is making it easier, and MBAs are studying it, so this label should be fading.

The Value:  Reducing costs, increasing returns, quicker identification of issues – these are all critical wants of every organization.  If we can only chase five opportunities with roughly the same makeup, a little predictive analytics should be able to tell you which is more likely to have a higher customer lifecycle value.  If you can only cover 10% of the market with a marketing campaign, predictive analytics can help you determine which 10% is likely to have the greatest yield.
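
To make the 10% example concrete, below is a minimal sketch of scoring a prospect list and keeping the top decile.  The file names and columns (past_campaign.csv, prospect_list.csv, revenue, tenure_months, segment_score, responded) are hypothetical, purely for illustration:

import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical data: past campaign results with a binary "responded" outcome
history = pd.read_csv("past_campaign.csv")    # revenue, tenure_months, segment_score, responded
prospects = pd.read_csv("prospect_list.csv")  # same feature columns, no outcome yet

features = ["revenue", "tenure_months", "segment_score"]

# Fit a simple response model on what we already know
model = LogisticRegression(max_iter=1000)
model.fit(history[features], history["responded"])

# Score every prospect with the likelihood of responding
prospects["score"] = model.predict_proba(prospects[features])[:, 1]

# Cover only the 10% of the market most likely to yield a response
top_decile = prospects.nlargest(int(len(prospects) * 0.10), "score")
print(top_decile.head())

Any classifier would do here; logistic regression is simply the easiest place to start and the easiest to explain.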

The Fear:  I understand this, but it is a little irrational, as all decisions involve some level of risk.  All predictive analytics does is inform decisions with an elevated likelihood of being right.  If I told you I could make you 10% smarter, wouldn’t you listen?

The Talent:  This is perhaps a realistic barrier, but one that is easily corrected.  Predictive analytics, while getting easier every day, is still about advanced computations.  Not only do you need to understand how to do them, you need to understand when and where to use them.  And more importantly, you need to understand how to transform the results into terms an executive team can put into action.

Where do you begin:

  1. Find someone in the organization with a good statistical and business mind (or hire one).  This may not be on the technical team – it often takes a slightly different skill set.  Or find a small team.
  2. Find a business process where there is pretty good data and that will add value at the end of the day – customer attraction, attrition, fraud detection, scrap reduction, etc.
  3. Put a small project in place to try it.
  4. Enter my favorite stats word – parsimony: find the simplest answer.  The simplest model is easier to explain, easier to digest, and easier to put into action.  (Why such a strange word for the simplest answer?)  It is easy to end up tweaking a project to death – don’t do it on the first pass.  You get lost in the data and often find it far more difficult to explain and complete the project.
  5. Try it and accept the results.  There is tremendous learning in failing (and chances are you won’t fail if you didn’t bite off too much).  A minimal sketch of such a first-pass project follows this list.
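
To make the steps concrete, here is a minimal sketch of a first-pass attrition model.  The file and column names (customers.csv, tenure_months, orders_last_year, support_tickets, churned) are assumptions for illustration, not a prescription:

import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Hypothetical customer extract with a known churn outcome
customers = pd.read_csv("customers.csv")

# Keep it parsimonious: three features, one simple model
features = ["tenure_months", "orders_last_year", "support_tickets"]
X_train, X_test, y_train, y_test = train_test_split(
    customers[features], customers["churned"], test_size=0.3, random_state=42
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# Accept the result, good or bad: the hold-out AUC tells you whether the simple model has signal
print("hold-out AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))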

Examples:

  • Let’s say you can identify customers who are likely to abandon you and then work to make sure those customers are treated better.  If your abandonment rate drops by 10%, what is the value to the bottom line?  (A worked example with hypothetical numbers follows this list.)
  • If you can identify customer segments that are less price sensitive, what is the value of a 1% increase in average deal size (note that the entire amount really should drop to the bottom line as well)?
  • What if you can reduce fraud by 5%?
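
To put rough numbers on the first bullet, here is a worked example with purely hypothetical figures (50,000 customers, $2,000 of annual revenue each, a 20% abandonment rate):

# Hypothetical figures purely for illustration
customers = 50_000
annual_revenue_per_customer = 2_000          # dollars
abandonment_rate = 0.20                      # 20% of customers leave each year

lost_revenue = customers * abandonment_rate * annual_revenue_per_customer
saved_by_10_pct_drop = lost_revenue * 0.10   # abandonment falls from 20% to 18%

print(f"revenue lost to abandonment: ${lost_revenue:,.0f}")              # $20,000,000
print(f"retained by a 10% relative drop: ${saved_by_10_pct_drop:,.0f}")  # $2,000,000

Even a modest relative improvement in retention is worth roughly $2 million a year under those assumptions – usually more than enough to justify a small pilot.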

The numbers show that predictive analytics is very real.  It is not about guessing, it is about reducing the risk of guessing.  And if you follow many blogs, all of a sudden there is a lot more information on predictive analytics.  IBM is finally putting some wood behind the arrow of its SPSS purchase, which may also begin to influence more decision makers in the space.






Design for Information

4 08 2009

All too often, reports are designed to provide data, not information.  There are charts and tables with little interpretation or description.  While I am not a great fan of PowerPoint, it can often make up for Enterprise BI limitations.  We can call out certain areas within the charts and graphs, as well as add commentary to help us communicate our point.

A safe assumption is that the person reading the report will not have the same understanding of the material as the report designer or analyst.  It is then our job to make sure that the report communicates the point clearly.  The last thing you want to hear is “what are you trying to show me?”

Below is a good example of presenting data without telling us much.  Here we see that the author has a few fans who are frequent contributors, and that tweet volume picks up around the lunch hour.  There is not much variation across the days of the week, with a little drop-off on the weekend.  August is also the most popular month.

[twitter2008-1: chart of 2008 Twitter activity by contributor, hour of day, day of week, and month]

What would be helpful to know is why this data is important to us.  Perhaps most important would be knowing the subject material, so we could do things like tweet just before lunch, since that seems to be the most popular time to inspire a reaction.  Or learn that August tweets were up due to an embarrassing grammatical error.

As we design reports, we need to make sure the information has a purpose.  Most specifically, know the audience and know the potential actions the information is going to inspire.
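
As a small illustration of calling out the point directly on the chart rather than leaving the reader to hunt for it, here is a sketch using matplotlib with made-up hourly tweet counts standing in for the data above:

import matplotlib.pyplot as plt

# Hypothetical hourly tweet counts, standing in for the chart above
hours = list(range(24))
tweets = [2, 1, 0, 0, 0, 1, 3, 5, 8, 10, 12, 18, 22, 15, 10, 9, 8, 7, 6, 5, 4, 4, 3, 2]

fig, ax = plt.subplots()
ax.bar(hours, tweets)
ax.set_xlabel("Hour of day")
ax.set_ylabel("Tweets")

# Put the conclusion on the chart itself, not in the reader's lap
ax.annotate("Volume peaks around lunch –\npost just before noon to maximize reaction",
            xy=(12, 22), xytext=(14, 18),
            arrowprops=dict(arrowstyle="->"))

plt.tight_layout()
plt.show()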





Data Warehouse Design

24 06 2009

One of the main problems with Data Warehouses is that they are designed to answer any question.  The problem is that they usually fail to answer the one someone is asking.  DWs are usually good for referential information – meaning I can answer questions like “how many customers do we have that have spent over $100,000” or “which customers bought the blue widget.”
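
Those referential questions are exactly what a warehouse handles well.  As a quick illustration, here is a sketch against a hypothetical order-level extract (orders.csv with customer_id, product, and amount columns – the names are assumptions):

import pandas as pd

# Hypothetical order-level extract from the warehouse
orders = pd.read_csv("orders.csv")   # customer_id, product, amount

# "How many customers have spent over $100,000?"
spend = orders.groupby("customer_id")["amount"].sum()
print("customers over $100,000:", (spend > 100_000).sum())

# "Which customers bought the blue widget?"
blue_widget_buyers = orders.loc[orders["product"] == "blue widget", "customer_id"].unique()
print("blue widget buyers:", len(blue_widget_buyers))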

There are a number of points of failure that hamper DW projects:

  • They are usually complex and very costly
  • The business changes (regions, product lines, sales hierarchies, etc.) in the middle of the process
  • The end use is not well defined
  • Business users lack the analytical skill and knowledge of the data structures needed to get the right data
  • The end result is too complex for the users to understand where to go to get the right information
  • No one tells the organization “thou shalt use the data warehouse,” so people pull data from all different sources, making a common version of the truth difficult to reach
  • There are often no rules of engagement for how to use the environment, or data in general

If organizations only use 6-10% of the data they collect, how do you design the DW for greater adoption?

For starters, understand the common business questions and the potential levers that can be pulled.  For example, one of the areas that always surprises me is the lack of information around the success of marketing campaigns.  Marketing campaigns and price are really the only levers we can pull in the short term to increase revenues.  What we often fall back on is the sales whip – putting more pressure on the sales team to perform.  This is a strategy of hope (which is not recognized as a successful strategy practice).  We apply the pressure without providing much in the way of support.

Instead, let’s say we are ending the third quarter, our numbers are a little behind, and the pipeline is not as strong as we would like.  We know we have some time, but the programs have to be very tactical to find low-hanging fruit.  Too often, instead of reviewing potential marketing programs or trying something new, we cross our fingers and yell at the sales team.  We could cull the DW to find large groups of customers who had not bought specific groups of products and offer incentives for them to buy.  We could also identify the groups/verticals of customers with the shortest sales cycle and build a promotion and program for them.
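
Here is a sketch of what that cull might look like, assuming hypothetical warehouse extracts (dim_customer.csv and fact_orders.csv) with made-up column names:

import pandas as pd

# Hypothetical extracts from the warehouse
customers = pd.read_csv("dim_customer.csv")   # customer_id, vertical, avg_sales_cycle_days
orders = pd.read_csv("fact_orders.csv")       # customer_id, product_group, order_date

# Customers who have never bought the "blue_widget" product group
bought = orders.loc[orders["product_group"] == "blue_widget", "customer_id"].unique()
targets = customers[~customers["customer_id"].isin(bought)]

# Verticals with the shortest average sales cycle – the best bets for a tactical Q3 push
fast_verticals = (targets.groupby("vertical")["avg_sales_cycle_days"]
                         .mean()
                         .nsmallest(3)
                         .index)

campaign_list = targets[targets["vertical"].isin(fast_verticals)]
print(len(campaign_list), "customers for the incentive program")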

Yet why do we not do this?  Because we typically lack the information in a format we can use in a timely manner.

So if we design the data warehouse (or perhaps data marts) around specific business levers we stand a better chance of answering the one question we need. We just might trigger some very interesting questions about our business.






KPI Design: Better than average

16 06 2009

In the June 1st issue of ESPN Magazine there was an interesting story about Rafael Nadal.  In the story there is a callout with some facts about his play.  One of the items is his rotations per ground stroke versus the average pro.

[Nadal Math Smaller: callout comparing Nadal’s rotations per ground stroke to the average pro]
While this is fantastic information to explain why he is better than average, what might have been more relevant to the article, which is about his excellence, would be to compare him against the other top players.  What if all the top players are hitting 5,000-6,000 rotations per ground stroke?

As we are designing KPIs and targets we need to make sure we are measuring against a relevant target, not just an industry average.
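
One simple way to put that into practice is to set the target from the peer group rather than the industry mean.  The numbers below are hypothetical, purely to show the mechanics:

import statistics

# Hypothetical peer values (e.g., rotations per ground stroke for the top players)
peers = [5200, 5400, 4800, 5600, 5900, 5100, 4700, 5300]
industry_average = 3200   # the "average pro" figure – also an assumption

peer_median = statistics.median(peers)
peer_top_quartile = statistics.quantiles(peers, n=4)[2]   # 75th percentile of the peer group

print("target vs. industry average:", industry_average)
print("target vs. peer median:", peer_median)
print("target vs. peer top quartile:", peer_top_quartile)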







