Advanced Analytics

22 03 2010

One concept organizations consistently grapple with is advanced analytics.  They want it, but have little idea how to use the various tools to make it happen.  Unfortunately, too much information often blurs the lines between those tools.

For example, I watched a sales presentation on Predictive Analytics where the key outcome showed how to build databases with the tool, yet almost completely missed the fact that the real benefit should have been something like “we were able to identify two segments to target a marketing program for more effectiveness.  Instead of spending $500k on a generic campaign, we were able to identify key attributes that drove increased customer interaction and focus the campaign to only $200k on those segments.”

Why is this? The primary reason is that we do not truly understand the tools and how best to use them.  A Swiss Army knife is not good for home repair, but it is the perfect tool to throw in a hockey bag or car trunk for occasional use to get you out of a jam – a screw needs to be tightened, a shoelace needs to be cut, or an apple peeled.  We need to understand which tool fits which situation instead of thinking of the various tools as universal.

Business Intelligence, Planning, What-If Scenario Tools, Optimization, Dashboarding, Scorecarding, Cubes, Cluster Analysis, and Predictive Analytics are all different tools for vastly different purposes, yet they have superficially similar uses.

Advanced Analytical Tools

Here are the core elements of Advanced Analytical tools:

  • Business Intelligence – great for creating an enterprise-wide data visualization platform.  If you do this right, you should create a single version of the truth for the various terms within an organization.  It should enable better reporting consistency standards for the organization.  In the end, it reports what the data says.
    • Scorecards & Dashboards – These are primarily BI tools with a more organized or structured methodology for presenting, ideally, the Key Performance Indicators.  These are great tools, but to be most effective, they need a specific purpose that is highly integrated into a management process.
  • Enterprise Scenario Planning – Most enterprise planning exercises are giant what-if scenarios that try to plan out financial outcomes based on a series of drivers (employees, widgets, sales reps, etc.).  We build out plans based on a number of assumptions, like the average sales rep drives $2mil in business, or benefit costs for the year are going to be # of employees * average salary * 2.  We do this primarily to lay out a game plan for the year and we do it as part of an annual or rolling cycle.
  • Tactical or Ad-Hoc What-If Scenario Analysis – Besides the full-scale project we do to plan out the company’s cash outlays, we also do a significant amount of smaller, typically tactical “what-if” scenario tests.  These are traditionally done in Microsoft Excel: we dump a bit of data into a spreadsheet, make a number of assumptions, and try to build out likely scenarios.  For example, “if we were to create a customer loyalty program, what would be the cost and the likely reward?”  We are doing this to test ideas, so yes, it might be ideal to bolt those into the enterprise planning tool, but that typically takes too much overhead.  It is easier to just get something done quickly, then make a go/no-go decision.
    • Data Visualization can also be a great help with this – to bolt on a couple of reports to see the data and how different scenarios impact the various facts and dimensions.  This can help us with our conclusions and recommendations.
  • Predictive Analytics – This tool is best used when we have historical data, or a representative data set, and we want to draw a conclusion based on mathematics.  The key is math.  This is not guessing; it is improving the chances of being right with math – a structured approach to removing risk from decision making.  With a planning tool, by contrast, we primarily use assumptions to create plans.  We cannot use predictive analytics for all decisions, but we can for a few specific types of decisions:
    • What transaction details and customer insight can we use to determine credit card fraud?
    • What customer attributes create our buying segments?
    • Which customers are most likely to abandon our offering?
    • What products are most often purchased together?
    • Which taxpayers most likely need to be audited?
  • Optimization Analytics – This is perhaps the most specific advanced analytics tool, built to answer the business question: “With the given parameters of these trade-offs, which mix of resources creates the most effective (or efficient) use of those resources?”  This helps make decisions around production locations and product investment.  Like predictive analytics, it is mathematically based (though you may need to make a couple of assumptions as well) in how it determines the answer.
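That resource-mix question can be made concrete with a tiny sketch. The profit and capacity figures below are invented for illustration, and a brute-force search stands in for a real linear-programming solver, which any non-toy problem would require:

```python
# Toy product-mix problem: maximize profit subject to capacity limits.
# All numbers are hypothetical, purely for illustration.
PROFIT = {"widget": 40, "gadget": 30}        # profit per unit
MACHINE = {"widget": 2, "gadget": 1}         # machine hours per unit
LABOR = {"widget": 1, "gadget": 2}           # labor hours per unit
MACHINE_CAP, LABOR_CAP = 100, 80             # hours available

best = (0, 0, 0)  # (profit, widgets, gadgets)
for w in range(MACHINE_CAP // MACHINE["widget"] + 1):
    for g in range(LABOR_CAP // LABOR["gadget"] + 1):
        machine_used = w * MACHINE["widget"] + g * MACHINE["gadget"]
        labor_used = w * LABOR["widget"] + g * LABOR["gadget"]
        if machine_used <= MACHINE_CAP and labor_used <= LABOR_CAP:
            p = w * PROFIT["widget"] + g * PROFIT["gadget"]
            if p > best[0]:
                best = (p, w, g)

print(f"Best mix: {best[1]} widgets, {best[2]} gadgets -> ${best[0]} profit")
# -> Best mix: 40 widgets, 20 gadgets -> $2200 profit
```

The answer is the trade-off in action: neither all widgets (more profitable per unit) nor all gadgets, but the blend that exhausts both machine and labor capacity at once.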

Advanced Analysts

Another reason we lack understanding is our analysts.  Our analysts commonly come from the IT team, trained in data structures, or from the finance team, trained in accounting.  Neither is wrong; they just have a default mindset that falls back on the tool they know best.  What is missing is the business/statistically trained person who can both lay out the hypothesis and, more importantly, explain the results.

We do not want correlation explained in R-squared values, “63% of the variation of the data is explained by our independent variables.”  While this may make sense to other statisticians and mathematicians, it is lost on the business.   One key value of using a math-based concept is that the explanation should sound more like, “We have found a way to decrease fraud by 3.2%, which should result in a $576K return to the business every quarter” or “We have tested our marketing campaigns and have found three segments that are 25% more likely to purchase based on the campaign, which should result in a payback period of 3 months.”
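The arithmetic behind a statement like the fraud example is worth making explicit. The quarterly loss baseline below is an assumption (the post does not state it), chosen so that the figures line up:

```python
# Hypothetical baseline: assumed quarterly fraud losses before the model.
quarterly_fraud_losses = 18_000_000
fraud_reduction = 0.032  # the 3.2% decrease the model delivers

quarterly_savings = quarterly_fraud_losses * fraud_reduction
print(f"Projected return: ${quarterly_savings:,.0f} per quarter")
# -> Projected return: $576,000 per quarter
```

One multiplication turns a model metric into a dollar figure an executive can act on, which is exactly the translation the paragraph above is asking for.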

The right tool with the right skill set is imperative to successfully using advanced analytics.  We also need the discipline to have the right people using the right tools for the right information to drive action.  If you have an algorithm that predicts customer defection, you need to use it and test the results.  It is never going to be perfect, but in most cases, you can bet it will be better than not using it at all.
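Testing a churn predictor against a do-nothing baseline can be this lightweight. Both the inactivity rule and the holdout records below are hypothetical, just to show the shape of the comparison:

```python
# Hypothetical holdout records: (months_inactive, actually_churned).
holdout = [
    (1, False), (2, False), (3, False), (8, True), (10, True),
    (5, False), (7, False), (11, True), (0, False), (9, True),
    (12, True), (4, True), (6, False), (2, True), (10, True),
]

def predict(months_inactive):
    """Naive rule: flag a customer as a churn risk after six months."""
    return months_inactive > 6

correct = sum(predict(m) == churned for m, churned in holdout)
baseline = sum(not churned for _, churned in holdout)  # "no one churns"

# The rule gets 12 of 15 right; the do-nothing baseline only 7 of 15.
print(f"model:    {correct}/{len(holdout)} correct")
print(f"baseline: {baseline}/{len(holdout)} correct")
```

Imperfect, as the paragraph says, but measurably better than not using it, and the test is what proves that.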





Predictive Analytics, Business Intelligence, and Strategy Management

9 12 2009

I was having a discussion with one of my clients this week, and I thought he did a nice job summing up Predictive Analytics.

So in the World According to Reed (WOTR) – “queries answer questions, analytics creates questions.” My response was “and Strategy Management helps us to focus on which questions to answer.”

Reed Blalock is exactly right: traditional BI is about answering the questions we know. Analytics is really what we create with data mining – we look for nuances, things that might give us new insight into old problems. We use human intellect to explore and test. And yes, there is a little overlap. But what is really happening is that we have a different level of human interaction with the data.

BI is about history; analytics attempts to get us to think, to change, and ideally to act.

The danger with both of these is that they can be resource intensive. Neither tool, nor mindset, should be left to its own devices. What is needed is a filter to identify the priority and purpose. This is where strategy management and scorecarding come into play. We have built out massive informational assets without understanding where, when, and how to use them. We have pushed out enormous reporting structures and said “it’s all there, you can find anything you need,” yet we scratch our heads when we see adoption levels are low.

What we have typically not done all that well is build out that informational asset by how it helps us be more productive along product lines, divisions, sales regions, etc. We have treated all dimensionality the same. Why? Because it was easy. The BI tools are tremendous in how quickly you can add any and all dimensions.

“But because you can, doesn’t mean you should”

As we built out these data assets, we did not align them to performance themes.  We have gotten better with some key themes, like supply chain management and human resource management, but what about customer performance?  We might look at sales performance, but that is a completely different lens than customer performance.

How do we determine which assets to start with?  What assets do we need to be successful 3-5 years from now, and what are our biggest gaps to close today?  Think about customer value, or employee satisfaction (and that doesn’t mean more HR assets).  Think about your gaps in strategy.

How often do we discuss…

  • Are our customers buying more or less frequently?
  • What are our best and better customers doing?
  • What are the costs associated with serving our least profitable customers?
  • Where are our biggest holes in understanding?
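The first of these questions can be answered with a very small aggregation once the transactions are in hand. The customers and quarters below are invented for illustration:

```python
from collections import defaultdict

# Hypothetical order log: (customer, quarter) pairs.
orders = [
    ("acme", "Q1"), ("acme", "Q1"), ("acme", "Q2"),
    ("bolt", "Q1"), ("bolt", "Q2"), ("bolt", "Q2"), ("bolt", "Q2"),
    ("cora", "Q1"), ("cora", "Q1"), ("cora", "Q1"), ("cora", "Q2"),
]

# Count orders per customer per quarter.
counts = defaultdict(lambda: {"Q1": 0, "Q2": 0})
for customer, quarter in orders:
    counts[customer][quarter] += 1

for customer in sorted(counts):
    q1, q2 = counts[customer]["Q1"], counts[customer]["Q2"]
    trend = "more" if q2 > q1 else "less or equally"
    print(f"{customer}: Q1={q1}, Q2={q2} -> buying {trend} frequently")
```

The point is not the code but the habit: the data asset only earns its keep when someone routinely runs questions like these against it.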




Design for Information

4 08 2009

All too often, reports are designed to provide data, not information.  There are charts and tables with little interpretation or description.  While I am no great fan of PowerPoint, it can often make up for Enterprise BI limitations: we can call out certain areas within the charts and graphs, as well as add commentary to help communicate our point.

A safe assumption is that the person reading the report will not have the same understanding of the material as the report designer or analyst.  It is then our job to make sure that the report communicates the point clearly.  The last thing you want to hear is “what are you trying to show me?”

Below is a good example of presenting data while not telling us much.  Here we see that the author has a few fans who are frequent contributors, and that tweet volume picks up around the lunch hour.  There is not much variation across the days of the week, with a little drop-off on the weekend.  August is also the most popular month.

[Chart: Twitter activity for 2008 – tweets by contributor, hour of day, day of week, and month]

What would be helpful to know is why this data is important to us.  Perhaps most important would be to know the subject material, so we could do things like tweet just before lunch, as that seems to be the most popular time to inspire reaction.  Or learn that August tweets were up due to an embarrassing grammatical error.
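Pulling the actionable fact (“tweet just before lunch”) out of raw timestamps takes only a few lines. The hours below are invented, standing in for real tweet data:

```python
from collections import Counter

# Hypothetical tweet timestamps, reduced to hour of day (24h clock).
tweet_hours = [9, 10, 11, 12, 12, 12, 13, 12, 11, 15, 12, 17, 13, 12, 8]

by_hour = Counter(tweet_hours)
peak_hour, peak_count = by_hour.most_common(1)[0]
print(f"Peak activity: {peak_count} tweets around {peak_hour}:00")
# -> Peak activity: 6 tweets around 12:00
```

The chart alone shows the lunch-hour spike; a line of interpretation next to it (“post just before noon”) is what turns the data into information.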

As we design reports, we need to make sure that the information has a purpose.  Most specifically, know the audience and know the potential actions the information is going to inspire.





Relevance and Context

20 05 2009

With times a little tight these days, BI projects need to be more focused.  While there are a number of ways to do this, such as starting with very specific briefing books targeted at a management process or a departmental data mart, think about using relevance and context.  Don’t just recreate the hundreds and thousands of reports that have been created before; use new money to rethink old ways.

For example, compare a Google search with the WolframAlpha search engine.  Reporting environments can often look like a Google search list – while there is some relationship, there may not be much relevance, and I may have to search around quite a bit to find what I need.  Enter WolframAlpha: it requires the user to provide appropriate context.  With the right context, WolframAlpha works wonderfully; without it, not so much.  It only does what it was designed to do.

Like BI tools, each search engine is designed to do different things.  By training users to use the right tool with the right context, you have a greater chance of providing people with the information they need to make decisions.  You would not ask the same business questions of cubes, reports, dashboards, scorecards, etc.

 

And yes, perhaps I did force the argument so I could say “WolframAlpha.”  And with that said I should probably give a shout out to the folks at Cuil.  





Business Intelligence vs Business Analytics

14 04 2009

There is a growing debate over Business Intelligence vs. Business Analytics and what the future holds.  Clearly the Business Intelligence world has been shaken, with Hyperion, Business Objects, and Cognos all now smaller parts of bigger companies.  This has created a number of marketing opportunities for the likes of MicroStrategy and SAS.  The obvious marketing play was independence.  Now it is clear that SAS is taking a slightly different tack by claiming that Business Intelligence is dead and the future is Analytics.

Marketing messages aside, what we need to be focusing on is how we use information and the management process.  Call it data, information, intelligence, analytics, or whatever we come up with next; it is all irrelevant if we don’t understand how to use it.  A basement full of great tools doesn’t mean the house stays maintained.
  • Do you have rules on when to use the specific tools in the BI suite?
  • Do your people have the analytical skills required?
  • Do you have a process where the information can be discussed and actions agreed upon?
We all agree that organizations need to make fact-based decisions.  The other thing we should all be working on is creating a common vernacular for each of the tools.  As analysts, consultants, pundits, and bloggers, we do little good if we don’t teach the value of how to use each of the tools.  You don’t need predictive analytics for an exception report.  You don’t need sexy-looking reports that do little to explain the goal.  Organizations don’t need real-time scorecards.

What organizations do need are ways to make people comfortable enough to take decisive action.  We also need these actions to align to company goals and strategy.  The tools we use need to be consistent enough for us to trust them, and the minds that analyze them need to be able to use the tools well enough to communicate only what matters in a digestible presentation.





Because you can…doesn’t mean you should

14 04 2009

We do a number of things in the name of business intelligence.  We say we have to have real time information.  We have to have hundreds of reports.  We have to be able to look at everything in every direction.

Business Intelligence software promises us this and makes it seem like an achievable goal.  And yes, it would be great to know everything about everything and get a perfect 360-degree view of the organization.

Yet it is not really achievable – actually, not even close.  Instead, ask what the goals and objectives of the organization are, and how this supports that end.  We are very quick to say “we can do that,” but we need to temper that with “why should we do that?”  Think of the goal of a dashboard: to provide real-time information on a specific subject.  I have known many managers who constantly stare at the screen to see if anything moved.

What we really need is to understand how to use the function of time and integrate that into an analytical management process.  Which would you get more out of: a tactical dial that shows one KPI, or a meeting at the end of the day to review a number of KPIs?





Scorecard or Fact sheet

10 04 2009

A common Scorecard design is to list a bunch of business facts – how many customers, total square feet, total employees, inputs, etc.  While these can be important business facts that executives need to know, they may not be manageable numbers.  By adding them to the scorecard, we let them take up valuable real estate and misdirect focus.

As you are thinking through your scorecard design, take some time to consider if an item is a REAL KPI, or just a business fact.  Then design the scorecard to focus on objectives with potential links to business fact report(s).





Analytics & Actionable Information

5 03 2009

I work on many projects where the desired outcome is “just provide us actionable information.” While this is always the goal, I find most people use the term quite loosely, as if it were merely an additional option. In reality, this is quite difficult to create. Many things need to come together to create action, and it takes far more than just information or a report.

To create effective actionable information, we need to integrate people, information, and tools. We also need to have the right skills at different times. All too often, the expectation is for IT to write a single report that will answer all questions. Yet, what typically happens is the report creates more questions as IT cannot predict every need. All this has done is create more activity for IT and delayed action.

Let’s view this from a process point of view…how would it look:

First, we have a tremendous amount of data. And it would be easy to argue way too much data, hence the need to create layers of relevance. How often do we get lost looking for what we need, or recreate something because we don’t understand the business rules of the data we find? This wasted effort costs the business money and time.

We have the information; now we need a good analytical mind to review the data and create analytical models or what-if scenarios. What typically happens here is that a finance or IT analyst runs a few numbers. This is probably OK in many instances, but the best option would be someone with both a mind for the business and a statistical curiosity (though at this stage we need more of a statistician). IT and Finance often lack both of these to some degree, as their primary skill is data or fiscal governance.

Now we have some level of analytical information, but we still have work to do. In general, the statistical mind tries to cram in too much detail and wants to discuss the process of discovery instead of the finding. To transform analytical information into action, we need the business to present the finding in executive terms – value created. The presentation is more than likely to include multiple reports, synthesized into a couple of charts. The next step is to foster a discussion of the recommendations and potential options. The discussion will focus on gathering feedback and coalescing it into an agreed-upon plan.

It is common here for people not to feel comfortable with the information and to ask for additional information and analysis, but we need to fight the urge to delay and put our best foot forward. There will be times when the need for rework is great, but if the discussion includes the right people and the facts, then there should be enough to make a decision and move forward. Otherwise, the risk is creating a culture of endless analysis.