The analyst function is dead

8 09 2011

The role of the operational analyst has moved out of the business and into both Finance and IT.  The Finance team typically focuses only on the financial outcomes of the business and has left the operational side to the IT team.

Here is a conversation a client of mine recently had with their analyst…

ANALYST:  “Here is the report on units sold this year.”

BUSINESS:  “What happened here?”

ANALYST:  “That is a spike in the data.”

BUSINESS:  “Right.  But what happened?”

ANALYST:  “That is what the data is showing.”

Sadly, this is not uncommon in the business world today.  Billions of dollars are spent every year on Business Intelligence software to help us visualize what is happening within the business, yet we are really no better off in terms of insight.

WHY is this happening?

  1. The biggest reason this is happening is that we have changed the role of the analyst.  It used to be a marketing person looking at marketing data, or an operations person looking at manufacturing information.  We have now moved that role to IT, or IT has promised that they can do it better with their understanding of data structures.
  2. We have wrongly assumed that a picture is worth a thousand words.  In BI terms, a chart is worth a handful of questions.  IT cannot predict the next series of questions and is then left to prioritize which questions to tackle next.
  3. The pace of business, or at least the pace and variety of business questions (like the data we collect) has risen exponentially and scaled faster than our ability to respond.
  4. IT is overburdened and lacks the political power and will to say “no.”  They are in complete reaction mode and lack the resources to cover the demand.

WHAT can we do to fix this?

  • First off, we need to understand the analytical gap within the organization.  IT can manage the data and needs to partner with the business, but the business needs to own the intelligence.  It is easier to teach the business a little about technology than to teach the IT resources about the business.  The business side needs to find the type of person who understands a little about technology but has a solid mathematical or statistical mind and a curiosity about improving the business.
  • The organization needs to find a better way to integrate analysis back into the management process.  We need to give the analysts a frame of reference in which to explore ideas and present results.  Some of this will follow reporting on weekly/monthly operational outcomes, while most will likely be ad-hoc hypotheses or what-if scenarios about some aspect of the business.
  • The culture has to reward critical thinking.  This is not true in most corporate cultures.  All too often, the analyst is criticized for not “going along” with the current belief.  If the culture does not reward new thinking, then the analysis will quickly fall in line with visualizations that support the status quo.
  • Invest in tools and training beyond just the core cubes and reports of the BI market.  While a good portion of analysis can be done with Microsoft Excel and a data dump, the more we want out of our analysts, the more we need to give them.  We need them to look at market baskets, threshold containment, frequency curves, optimization models, assumption testing, correlations, and many other types of analytical tools.
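To make one of those techniques concrete, here is a minimal market-basket sketch in plain Python.  The transactions are entirely hypothetical, and a real analysis would add support/confidence measures over far more data; the point is only that pair-counting is within reach of any analyst.

```python
from collections import Counter
from itertools import combinations

# Hypothetical transactions -- each set is the products in one order
transactions = [
    {"printer", "paper", "ink"},
    {"paper", "ink"},
    {"printer", "ink"},
    {"paper", "stapler"},
    {"printer", "paper", "ink"},
]

# Count how often each pair of products lands in the same basket
pair_counts = Counter()
for basket in transactions:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# The most frequent pairs suggest products to merchandise together
for pair, count in pair_counts.most_common(3):
    print(pair, count)
```

Even this toy version surfaces the kind of question an analyst should take back to the business: why do these products travel together, and what should we do about it?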

 





Analytics: Frequency Distribution & Bell Curves

8 11 2010

A statistical method we often overlook is the distribution curve.  I think most of the time it is dismissed because people get nervous about using statistics if they are uncomfortable with math.  While there are some advanced concepts around using a frequency curve, it can also be used visually as a simple tool to explain results.

A simple stats lesson….

Normal Bell Curve – roughly 68% of the population is within 1 standard deviation (measure of variation) of the average and 95% is within two standard deviations. Below is an example of IQ scores.  The average score is 100 and 68% of the data is between 85 & 115.
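The 68/95 rule is easy to verify for yourself.  Here is a quick sketch using simulated IQ-like scores; the mean of 100 and standard deviation of 15 match the example above, but the data itself is randomly generated, not a real population.

```python
import random
import statistics

# Simulate an IQ-like population: mean 100, standard deviation 15
random.seed(42)
scores = [random.gauss(100, 15) for _ in range(50_000)]

mean = statistics.mean(scores)
sd = statistics.pstdev(scores)

# Share of the population within 1 and 2 standard deviations of the mean
within_1sd = sum(abs(s - mean) <= sd for s in scores) / len(scores)
within_2sd = sum(abs(s - mean) <= 2 * sd for s in scores) / len(scores)

print(f"within 1 sd: {within_1sd:.1%}")   # roughly 68%
print(f"within 2 sd: {within_2sd:.1%}")   # roughly 95%
```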

While this visualization doesn’t do a tremendous amount for us, this is what we assume when we think of populations, like customers and employees.  And because of our limited statistical training we make a large number of assumptions based on averages.  We love to look at average revenue: average revenue per employee, average revenue per customer, etc.  This thinking also gets us looking into the outliers (that <5% that sits way out to the left or right of the chart).  How much time do you spend on less than 5% of the business?

OK, so back to thinking of this in terms of running a business….

Let’s map out our revenue per customer.  I would be willing to bet it looks something like the following:

If this is the customer revenue distribution and we use the average in our analyses, we can quickly generate a number of wrong assumptions.  First and foremost, our typical customer looks larger than reality.  It might lead us to think we are serving mid-sized businesses when we are more likely serving smaller-market customers.  I am also willing to bet our profitability per customer has a similar curve to it.  In this case we are likely spending money on the wrong customers and aligning our better services to a lower-profit customer (or more likely a profit-destroying customer).
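To see how badly the average misleads on a skewed book of business, here is a sketch using a hypothetical lognormal revenue distribution.  The parameters are made up, but the right-skewed shape is typical of revenue per customer.

```python
import random
import statistics

# Hypothetical revenue per customer: a right-skewed (lognormal) book of business
random.seed(7)
revenue = [random.lognormvariate(10, 1) for _ in range(10_000)]

mean_rev = statistics.mean(revenue)
median_rev = statistics.median(revenue)

# With a skewed distribution the mean sits well above the typical customer
print(f"mean:   ${mean_rev:,.0f}")
print(f"median: ${median_rev:,.0f}")
print(f"the 'average' customer is {mean_rev / median_rev:.1f}x the median customer")
```

The mean lands well above the median, so any analysis built on “average revenue per customer” describes a customer that barely exists.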

Do we need to use it for everything?  Of course not, but it might help every once in a while to challenge our overuse of the mathematical average and reassess our perspective of the business.  A great place to start is to map out the customer base in terms of revenue (profitability is better, but takes a lot longer to do).  It might just lead you to understand your customers (think customer segmentation) better.

Real life example…I was once part of a research project to understand discounting on one side of the outliers (<1% of the business).  The outcome was to focus on reducing discounting to that <1% of the business.  What I argued was to focus on the larger part of the business, where the same efforts would have resulted in millions more in profits.  It was a clear lesson in where to apply process improvement.





Visualization Methods

14 10 2010

I thought this was worth sharing…the Periodic Table of Visualization Methods.  It is a nice visualization of the different types of visualization, showing some good examples and some not-so-good ones.  Make sure you mouse over the different elements.

Rules of visualization designed to create action:

  1. Keep it simple, clear, and concise – with the emphasis on simple.  Don’t use complex charts to explain simple ideas.
  2. Know your audience.  Don’t present glorious details of each step in the analytical process to executives – trust me, they don’t care.
  3. Find a chart style that works well with the data.  Line charts show historical trending; bar charts do a better job of showing relative size.
  4. Don’t use 10 charts when 1 could suffice.
  5. Label well.  Take the time to make sure all of the information is explained.  The last thing you want to happen is for someone to look at it and say “what does it mean?”
  6. Understand there is a difference in analysis and presentation.  If you are trying to convince someone to act, then make sure the data (and you) tell the story.
  7. Start with the big picture, then explain (if necessary) how you got there.  People learn by seeing the picture first, then seeing how the parts go together.
  8. Document your assumptions.
  9. Explain your conclusions, don’t expect your audience to jump to the same answer.
  10. Highlight the relevant points within the data that augment your argument – use a color scheme that calls out the item if you can (red bars vs gray).  Do not be afraid to use the power of a printed report and some hand written notes with arrows to the corresponding areas.
  11. Understand where and why the data does not support your conclusions.  Be prepared to defend against those points, because your audience will likely be looking for ways to contest your conclusions.
  12. Practice what you want to say.  The more proficient you sound the more convincing you will be.
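Rule 10 (highlighting with color) can be sketched in a few lines.  This assumes matplotlib is available and uses made-up regional data; the technique is simply graying out everything except the bar you are arguing about.

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen
import matplotlib.pyplot as plt

# Hypothetical units sold by region; we want the eye drawn to the West region
regions = ["North", "South", "East", "West"]
units = [120, 135, 110, 210]
highlight = "West"

# Rule 10: gray out everything except the point you are making
colors = ["red" if r == highlight else "gray" for r in regions]

fig, ax = plt.subplots()
ax.bar(regions, units, color=colors)
ax.set_title("West drove the spike in units sold")
ax.set_ylabel("Units sold")
fig.savefig("units_by_region.png")
```

Notice the title states the conclusion (rules 7 and 9), not just “Units by Region.”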




Advanced Analytics

22 03 2010

A major item organizations grapple with is the concept of advanced analytics.  They want it, but have little idea how to use the various tools to make it happen.  Unfortunately too much information often blurs the lines.

For example, I watched a sales presentation on Predictive Analytics where the key outcome showed how to build databases with the tool, yet it almost completely missed the fact that the real benefit should have been something like: “We were able to identify two segments to target a marketing program for more effectiveness.  Instead of spending $500k on a generic campaign, we identified the key attributes that drove increased customer interaction and focused a $200k campaign on only those segments.”

Why is this?  The primary reason is we do not truly understand the tools and how best to use them.  A Swiss army knife is not good for home repair, but it is the perfect tool to throw in a hockey bag or car trunk for occasional use to get you out of a jam – a screw needs to be tightened, a shoelace needs to be cut, or an apple peeled.  We need to understand which tool fits which situation instead of thinking of the various tools as universal.

Business Intelligence, Planning, What-If Scenario Tools, Optimization, Dashboarding, Scorecarding, Cubes, Cluster Analysis, and Predictive Analytics are all different tools for vastly different purposes, yet they have deceptively similar uses.

Advanced Analytical Tools

Here are the core elements of Advanced Analytical tools:

  • Business Intelligence – great for creating an enterprise-wide data visualization platform.  Done right, it should create a single version of the truth for the various terms within an organization and enable better reporting consistency standards.  In the end, it reports what the data says.
    • Scorecard & Dashboards – These are primarily BI tools that have a more organized or structured methodology for presenting ideally the Key Performance Indicators.  These are great tools, but to be most effective, they need a specific purpose that is highly integrated into a management process.
  • Enterprise Scenario Planning – Most enterprise planning exercises are giant what-if scenarios that try to plan out financial outcomes based on a series of drivers (employees, widgets, sales reps, etc.).  We build out plans based on a number of assumptions, like the average sales rep drives $2mil in business, or benefit costs for the year are going to be #of employees * average salary * 2.  We do this primarily to lay out a game plan for the year and we do it as part of an annual or rolling cycle.
  • Tactical or Ad-Hoc What-if Scenario Analysis – Besides the full-scale project we do to plan out the company’s cash outlays, we also do a significant amount of smaller, typically tactical “what-if” scenario tests.  This is traditionally done in Microsoft Excel.  We dump a bit of data into Excel, make a number of assumptions, and try to build out likely scenarios.  For example, “If we were to create a customer loyalty program, what would be the cost and the likely reward?”  We are doing this to test ideas, so yes, it might be ideal to bolt these into the enterprise planning tool, but that typically takes too much overhead.  It is easier to just get something done quickly, then make a go/no-go decision.
    • Data Visualization can also be a great help with this – to bolt on a couple of reports to see the data and how different scenarios impact the various facts and dimensions.  This can help us with our conclusions and recommendations.
  • Predictive Analytics – This tool is best used when we have historical data, or a representative data set, and we want to make a conclusion based on mathematics.  The key is math.  This is not guessing; it is improving the chances of being right with math – a structured approach to remove risk from decision making.  With a planning tool, we primarily use assumptions to create plans.  We cannot use predictive analytics for all decisions, but for a few specific types of decisions:
    • What transaction details and customer insight can we use to determine credit card fraud?
    • What customer attributes create our buying segments?
    • Which customers are most likely to abandon our offering?
    • What products are most often purchased together?
    • Which taxpayers most likely need to be audited?
  • Optimization Analytics – This is perhaps the most specific advanced analytics tool, aimed at the business question: “Given the parameters and trade-offs, which mix of resources creates the most effective (or efficient) use of those resources?”  This helps make decisions around production locations and product investment.  Like predictive analytics, it is mathematically based (though you may need to make a couple of assumptions as well) in how it determines the answer.
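As a concrete sketch of what an optimization tool does, here is a hypothetical product-mix problem solved with SciPy’s `linprog` – the products, profits, and constraints are all made up for illustration.

```python
from scipy.optimize import linprog

# Hypothetical product mix: profit of $3/unit for product A, $5/unit for product B.
# linprog minimizes, so we negate the profit coefficients to maximize profit.
c = [-3, -5]

# Resource constraints (hours available on three production lines):
#   1*A         <= 4
#          2*B  <= 12
#   3*A  + 2*B  <= 18
A_ub = [[1, 0], [0, 2], [3, 2]]
b_ub = [4, 12, 18]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])

a_units, b_units = res.x
print(f"Make {a_units:.0f} of A and {b_units:.0f} of B for ${-res.fun:.0f} profit")
```

The solver finds the mix that maximizes profit within the stated capacity limits – exactly the “best use of resources” question, answered with math rather than opinion.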

Advanced Analysts

Another reason we lack understanding is our analysts.  Our analysts commonly come from the IT team, trained in data structures, or from the finance team, trained in accounting.  Neither is wrong; they just have a default mindset that falls back on the tool they know best.  What is missing is the business/statistically trained person who can both lay out the hypothesis and, more importantly, explain the results.

We do not want correlation explained in R-squared values: “63% of the variation in the data is explained by our independent variables.”  While this may make sense to other statisticians and mathematicians, it is lost on the business.  One key value of using a math-based concept is that the explanation should sound more like, “We have found a way to decrease fraud by 3.2%, which should result in a $576K return to the business every quarter,” or “We have tested our marketing campaigns and have found three segments that are 25% more likely to purchase based on the campaign, which should result in a payback period of 3 months.”
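For reference, the R-squared the statistician is quoting comes from a calculation like the following sketch; the spend/revenue data here is simulated, not from any real business.

```python
import random
import statistics

# Hypothetical question: how much of the variation in revenue does spend explain?
random.seed(1)
spend = [float(i) for i in range(50)]
revenue = [2.0 * s + 10.0 + random.gauss(0, 8) for s in spend]

# Ordinary least-squares fit: revenue = intercept + slope * spend
mean_x = statistics.mean(spend)
mean_y = statistics.mean(revenue)
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(spend, revenue))
         / sum((x - mean_x) ** 2 for x in spend))
intercept = mean_y - slope * mean_x
fitted = [intercept + slope * x for x in spend]

# R-squared: share of the variation in revenue the fitted line explains
ss_res = sum((y - f) ** 2 for y, f in zip(revenue, fitted))
ss_tot = sum((y - mean_y) ** 2 for y in revenue)
r_squared = 1 - ss_res / ss_tot

print(f"{r_squared:.0%} of the variation in revenue is explained by spend")
```

The analyst’s job is to translate that number into the dollar-impact sentences above, not to stop at the percentage.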

The right tool with the right skill set is imperative to successfully using advanced analytics.  We also need the discipline to have the right people using the right tools for the right information to drive action.  If you have an algorithm that predicts customer defection, you need to use it and test the results.  It is never going to be perfect, but in most cases, you can bet it will be better than not using it at all.





Telling a Story

28 12 2009

“What we’ve got here is a failure to communicate” Luke in Cool Hand Luke (played by Paul Newman)

A friend of mine sent this video along to a number of friends in the Business Intelligence space, saying we need to be better storytellers (thanks, Katie McCray).  We do spend an enormous amount of time talking about data structures, common data dictionaries, ease of use, speed, consistency, etc.  What we typically fail to do is tell our clients how to create information – to tell the story in a convincing enough manner to capture attention and, more importantly, enable action.

As analysts, we typically spend more time talking about data discovery and the calculations we used than making our point up front.  We try to create 50 charts to explain everything, rather than the one chart that most simply illustrates our point.  This not only wastes time, but we also lose our audience.

Watch the next couple of presentations you sit through and count the slides that build up to the point being made.  With each slide, listeners pay less and less attention because they have lost the thread.  As learners, we need the point made first.  We need to see how it all comes together, then have it explained how we got there.  That provides the context for the point; people then understand what to listen for and why they are listening.

On a slightly different note, last week I wrote about the housing market and the Dangers of Leading Indicators.  I had to update the post due to a new story with a different viewpoint that ran in the Globe on the 23rd.  Amazing how story tellers can tell such dramatically different things.





Mass Layoffs August 2009

24 09 2009

Yesterday the Mass Layoff report was issued by the United States Department of Labor – Bureau of Labor Statistics.  The data here is interesting in a few ways.  The Mass Layoff report highlights the number of events where 50 or more people were laid off by the same firm.

  • Perhaps a little good news for the US economy
  • A little analytics lesson

First off, the US economy.  We can look at a couple of things here that probably tell us the situation is still bad, but perhaps also that we are rebounding.  In the first chart, the bars represent the events (not total layoffs – but the two numbers are highly related).  You can see that the number for August appears to be quite a bit better than July and the previous 12 months, but August is also typically lower in general.  When you consider the raw volume of the last 12 months, perhaps we just ran out of people to lay off.  My initial assessment is that while it looks like we are heading in the right direction, we may just be witnessing the normal August dip.  Call it cautious optimism.

Aug 2009 Mass Layoff Peaks

Now looking at the data from a visual standpoint…below is how we typically look at this type of data.  Here we would conclude that we have hit bottom and are moving in the right direction.

Aug 2009 Mass Layoff Raw Data

Yet if we look a little more closely at the data above (and perhaps dig at some of the underlying regional or industry data) we can make a lot of different potential comments.

  • We are coming out of a major event – any data is going to be a little blurry.  Any investments are going to be risky, but with that risk comes the upside reward of potentially being a first mover.
  • The general trend might be improving, but the volumes are still way above normal levels.  How long can we continue to shed people like we have for the last 12 months?
  • The peaks and troughs also show that we are still greater than 2x normal levels.  Clearly, there are still problems in the economy.
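The “still greater than 2x normal” check and the “normal August dip” check can both be done with trivial arithmetic.  This sketch uses made-up monthly event counts, not the actual BLS figures – the point is the method: compare against a pre-recession baseline, and compare August against its own seasonal pattern rather than just against July.

```python
# Hypothetical monthly mass-layoff event counts (NOT the actual BLS figures)
baseline_2007 = [1200, 1100, 1150, 1180, 1220, 1190,
                 1250, 1050, 1160, 1210, 1240, 1300]   # pre-recession "normal"
recent_12mo = [2600, 2500, 2900, 2700, 2950, 2800,
               3000, 2400, 2750, 2850, 2900, 3100]     # trailing 12 months

normal_level = sum(baseline_2007) / len(baseline_2007)
current_level = sum(recent_12mo) / len(recent_12mo)

# Ratio to baseline: are we recovering, or still far above normal levels?
ratio = current_level / normal_level
print(f"Current levels are {ratio:.1f}x the pre-recession baseline")

# The "August dip": August runs below July even in a normal year,
# so a July-to-August drop alone is not evidence of recovery
august_vs_july = recent_12mo[7] / recent_12mo[6]
print(f"August is {august_vs_july:.0%} of July -- check prior Augusts before celebrating")
```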




Indicators & KPIs

9 09 2009

In a recent Wired magazine article, “American Vice: Mapping the 7 Deadly Sins” (the original was the Las Vegas Sun’s One Nation, Seven Sins), a group of KSU students did a great job mapping data geographically.  While the data is in no way perfectly accurate, it is a logical indication of behavior.  You can spend time arguing the merit of the work, or spend that same time debating the implications of the information.  Either way, it is a rather entertaining visual display of information.

In the business world, we struggle with trying to be perfect, or are perhaps afraid of not being completely accurate.  Indicators do not need to be perfect; they just need to trigger a discussion by highlighting a potential issue.  The downside is that when we try, we often try to find indicators for everything (the spandex rule), and “a point in every direction is like no point at all” (Harry Nilsson, for those who like eccentric music).  With too many indicators, we spend too much time on data collection and visualization with no time left for analysis and discussion.  The point is to discuss information.





Perfection to Value

16 07 2009

One of the areas where performance takes a giant hit is project initiation or closure.  And this is further complicated by personal preferences, politicking, and portfolio management.

In the diagram below there are three lines.  Line A is the corporate or organizational expectation of the trade-off between speed and perfection.  Projects or tasks with little value (lower left corner) should carry lower expectations of research, analytical thought, and discussion, while projects that are higher in value (farther up to the right) should carry higher expectations of quality of thought and preparation.

Perfection to Value Trends

What happens all too often is line C, where people don’t have the capacity or time to do the job right and throw something together.  In the end we deliver far less than desired while wasting resources.  The small blue box is the value received, the red box is the wasted resources, and the green box was the original expected value of the project.  The arc is the value frontier, which demonstrates the trade-off between speed and quality – what we expect in terms of value created from a combination of the two.

Quality vs Speed - Speed

Or we have line B, where we basically have a failure to launch because we spend all of our time debating how to be perfect.  It is very similar to the situation with line C: we deliver far less than originally desired while wasting similar resources.

Quality vs Speed - Quality

Portfolio Management

Is this an individual issue, or a management issue?  If we were to plot out the results of the individual projects how would your organization look?

Perfection to Value Management

If we were to see trends like the circles above, it would indicate a management problem: either management did not get the individual(s) to move back to the expected line, or management places too high a premium on speed or perfection, thus artificially altering expectations.

What I have witnessed is that line B is more often the norm.  Line C typically causes painful exposure, which causes people to be more fearful, thus needing more inputs and more support.  This creates more meetings, more approvals, more time, more people, which again causes more information, more analysis, more debate.  It is a vicious circle.

Failure to Act is a companion blog.





Simplicity and Creativity

21 06 2009

Often the best messages are the simplest and most straightforward.  If you want a perfect example, check out Common Craft.  They simply explain things – there is no PowerPoint, just simple visuals that clearly articulate their points.  The visuals border on a junior high art project, but I am sure you will see that doing the same thing in PowerPoint is just not the same.

And while I am at it, if you want to learn about Twitter, blogging, social networking, etc., they have some great samples.  And if you want a short video to explain what you do, they just might be worth contacting.  And no, I do not represent them in any manner.  I just thought they were a great example of performance.

If you are trying to create a presentation to convince the executives to alter course, rethink the tired old PowerPoint and bar chart approach.