Clean Up

6 07 2009
  • When was the last time you cleaned up your reporting environment?
  • When was the last time you reorganized your computer files?
  • How many versions of old files do you keep in multiple backup locations, and how much space is littered with junk?

Most BI environments and network file structures are collections of everything we have ever used. We have files that are used daily sitting right next to files that have never been used. We have multiple drafts of things with nearly the same name in the same folder.

When you designed the folder structure, did you think about the lifecycle of that folder (or system) and the things within it? This is why we end up with things we no longer need, and it makes finding the things we do need all the more difficult.





Data Warehouse Design

24 06 2009

One of the main problems with Data Warehouses is that they are designed to answer any question. The problem is that they usually fail to answer the one someone is actually asking. DWs are usually good for referential information – meaning I can answer questions like “how many customers do we have that have spent over $100,000” or “which customers bought the blue widget.”

There are a number of points of failure that hamper DW projects:

  • They are usually complex and very costly
  • The business changes (regions, product lines, sales hierarchies, etc.) in the middle of the process
  • The end use is not well defined
  • Business users lack the analytical skill and knowledge of the data structures needed to get the right data
  • The end result is too complex for the users to understand where to go to get the right information
  • No one tells the organization “thou shalt” use the data warehouse – so people pull data from all different sources, making a common version of the truth difficult to reach
  • There are often no rules of engagement for how to use the environment, or data in general

If organizations only use 6-10% of the data they collect, how do you design the DW for greater adoption?

For starters, understand the common business questions and the potential levers that can be pulled. For example, one of the areas that always surprises me is the lack of information around the success of marketing campaigns. Marketing campaigns and price are really the only levers we can pull in the short term to increase revenues. What we often fall back to is the sales whip – where we put more pressure on the sales team to perform. This is a strategy of hope (which is not recognized as a successful strategy practice). We apply the pressure without providing much in the way of support.

Instead, let’s say we are ending the third quarter, our numbers are a little behind, and the pipeline is not as strong as we would like. We know we have some time, but the programs have to be very tactical to find low-hanging fruit. Yet instead of reviewing potential marketing programs or trying something new, we cross our fingers and yell at the sales team. We could cull the DW to find large groups of customers who have not bought specific groups of products and offer incentives for them to buy. We could identify the groups or verticals of customers with the shortest sales cycle and build a promotion and program for them as well.
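
For illustration, here is a minimal sketch of what that kind of pull could look like, using Python and pandas against hypothetical customer, order, and order-line extracts from the warehouse; the file names, columns, and product group are assumptions for the example, not a reference schema.

```python
import pandas as pd

# Hypothetical warehouse extracts; file and column names are illustrative only.
customers = pd.read_csv("customers.csv")      # customer_id, vertical
orders = pd.read_csv("orders.csv")            # order_id, customer_id, close_days
order_lines = pd.read_csv("order_lines.csv")  # order_id, product_group, amount

# 1. Customers who have never bought the target product group -> incentive candidates.
target_group = "blue_widget"
target_orders = order_lines.loc[order_lines["product_group"] == target_group, "order_id"]
buyer_ids = orders.loc[orders["order_id"].isin(target_orders), "customer_id"].unique()
non_buyers = customers[~customers["customer_id"].isin(buyer_ids)]

# 2. Verticals with the shortest average sales cycle -> quick-turn promotion candidates.
cycle_by_vertical = (orders.merge(customers, on="customer_id")
                           .groupby("vertical")["close_days"]
                           .mean()
                           .sort_values())

print(f"{len(non_buyers)} customers have never bought {target_group}")
print(cycle_by_vertical.head(3))  # the three fastest-closing verticals
```

Nothing fancy, but it turns “cross our fingers” into two concrete target lists a campaign could act on before the quarter closes.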

Yet why do we not do this? Because we typically lack the information in a format we can use in a timely manner.

So if we design the data warehouse (or perhaps data marts) around specific business levers, we stand a better chance of answering the one question we need answered. We just might trigger some very interesting questions about our business.






Manage vs. Monitor

21 05 2009

It has always struck me as a little odd that a great deal of marketing literature in the Business Intelligence and Performance Management space talks about “monitoring” performance. Isn’t the entire goal of this space to help companies actively “manage” their business? My concern is that “monitoring” assumes that all is well unless some alarm is triggered.

While it is fine for the thermostat to monitor temperature, business is perhaps a tad more complex. Instead of waiting for things to reach a threshold, we need to understand a number of things that all work more or less together to explain a more complex concept.

Instead of just showing up for a meeting, what if we focus on creating a culture of being prepared for it? We can then use Business Intelligence as an organized and focused set of tools to help with our prep work.





Relevance and Context

20 05 2009

With times a little tight these days, BI projects need to be more focused. There are a number of ways to do this, such as starting with very specific briefing books targeted at a management process or a departmental data mart, but also think about using relevance and context. Don’t just recreate the hundreds and thousands of reports that have been created before; use new money to rethink old ways.

For example, compare a Google search with the WolframAlpha search engine. Reporting environments can often look like a Google search list – while there is some relationship, there may not be much relevance, and I may have to search around quite a bit to find what I need. The WolframAlpha search, by contrast, requires the user to provide appropriate context. With the right context, Wolfram works wonderfully; without it, not so much. It only does what it was designed to do.

Like BI tools, each search engine is designed to do different things. By training users to use the right tool with the right context, you have a greater chance of providing people with the information they need to make decisions. You would not ask the same business questions of cubes, reports, dashboards, scorecards, etc.

 

And yes, perhaps I did force the argument so I could say “WolframAlpha.”  And with that said I should probably give a shout out to the folks at Cuil.  





Business Intelligence vs Business Analytics

14 04 2009

There is a growing debate over Business Intelligence vs. Business Analytics and what the future holds. Clearly the Business Intelligence world has been shaken, with Hyperion, Business Objects, and Cognos all now smaller parts of bigger companies. This has created a number of marketing opportunities for the likes of Microstrategy and SAS. The obvious marketing play was independence. Now it is clear that SAS is taking a slightly different tack by claiming that Business Intelligence is dead and the future is Analytics.

Marketing messages aside, what we need to be focusing upon is how we use information and the management process. Call it data, information, intelligence, analytics, or whatever we come up with next; it is all irrelevant if we don’t understand how to use it. A basement full of great tools doesn’t mean the house remains maintained.
  • Do you have rules on when to use the specific tools in the BI suite?
  • Do your people have the analytical skills required?
  • Do you have a process where the information can be discussed and actions agreed upon?
We all agree that organizations need to make fact-based decisions. The other thing we should all be working on is creating a common vernacular for each of the tools. As analysts, consultants, pundits, and bloggers, we do little good if we don’t teach the value of how to use each of the tools. You don’t need predictive analytics for an exception report. You don’t need sexy-looking reports that do little to explain the goal. Organizations don’t need real-time scorecards.

What organizations do need are ways to make people comfortable taking decisive action. We also need these actions to align with company goals and strategy. The tools we use need to be consistent enough for us to trust them, and the minds that analyze the data need to be able to use the tools well enough to communicate only what matters in a digestible presentation.





Because you can…doesn’t mean you should

14 04 2009

We do a number of things in the name of business intelligence.  We say we have to have real time information.  We have to have hundreds of reports.  We have to be able to look at everything in every direction.

Business Intelligence software promises us this and makes it seem like an achievable goal. And yes, it would be great to know everything about everything and get a perfect 360-degree view of the organization.

Yet it is not really achievable; actually, not even close. Instead, ask what the goals and objectives of the organization are, and how this supports that end. We are very quick to say “we can do that,” but we need to temper that with “why should we do that?” Think of the goal of a dashboard – to provide real-time information on a specific subject. I have known many managers who constantly stare at the screen to see if anything moved.

What we really need is to understand how to use the function of time and integrate it into an analytical management process. Which would you get more out of: a tactical dial that shows one KPI, or a meeting at the end of the day to review a number of KPIs?





Scorecard or Fact sheet

10 04 2009

A common Scorecard design is to list a bunch of business facts – how many customers, total square feet, total employees, inputs, etc. While these can be important business facts that executives need to know, they may not be manageable numbers. Adding them to the scorecard takes up valuable real estate and misdirects focus.

As you are thinking through your scorecard design, take some time to consider whether an item is a REAL KPI or just a business fact. Then design the scorecard to focus on objectives, with potential links to business fact report(s).





Key Performance Indicators (KPIs) & Key Risk Indicators (KRIs)

6 04 2009

Key Risk Indicators (KRIs) are an interesting concept, or twist on Key Performance Indicators (KPIs). Instead of thinking of a KPI as measuring performance, think of it as really just an indicator that an objective is at risk. They are really the same thing.

If your objective is to Maintain Salesforce Effectiveness, a solid indicator might be revenue per sales rep. If our revenue per sales rep is declining, it should be treated as a trigger for a broader discussion about the objective, not just about the KPI itself. At the same time, we will want to analyze a number of other performance indicators for a deeper, richer discussion.
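
As a loose sketch of treating the indicator as a trigger rather than an end in itself, the snippet below flags a sustained decline in revenue per rep as a prompt to discuss the objective; the figures and the three-period rule are made up for illustration.

```python
# Quarterly revenue per sales rep, oldest first; the numbers are invented for illustration.
revenue_per_rep = [310_000, 295_000, 288_000, 274_000]

def needs_discussion(series, periods=3):
    """True if the indicator has declined for `periods` consecutive observations."""
    recent = series[-(periods + 1):]
    return all(later < earlier for earlier, later in zip(recent, recent[1:]))

if needs_discussion(revenue_per_rep):
    print("Revenue per rep has fallen for three straight quarters -- time to discuss")
    print("the Maintain Salesforce Effectiveness objective, not just the number itself.")
```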

Take a KRI like employee turnover, for example: is this not just a performance measure against the objective Retain Great Employees?

In the end, though, we are just splitting hairs by calling something a KPI or a KRI. It matters far more that we have the discussion about the objective(s) than that we build separate processes to measure subtle nuances.





Scorecards & Dashboards

16 03 2009

These are two terms that the BI world uses interchangeably. The only thing they should have in common is that they both can visually display data.

Defined:

  • Scorecards are tools that help facilitate discussions around strategy and operational performance management. The indicators (KPIs) should foster discussions about corporate direction, resource allocation, priorities, and initiatives.
  • Dashboards should be used for tactical discussion triggers, like inventory orders, technical support, phone coverage, etc. 

What should be happening with these tools is a far more structured use for each (and throw in reporting as well). All too often these tools are used without discipline, which leads to multiple versions of the truth, lack of focus, red herrings, miscommunication, and ultimately a waste of time and energy.

IT and business users need to work together to better understand what each tool can provide, when that tool will be used, how it will be used, how it will NOT be used, and who should be using them.





Align to Customer Value

16 03 2009

One thing to consider in terms of developing KPIs (Key Performance Indicators) is how they are aligned to the customer’s wants. All too often we ignore this perspective, yet it is perhaps one of the most important factors.

For example, one of the growing cost saving tools companies use is call automation services.  “For sales, press 1.  For customer service, please hold while we test your patience.”  

Companies do this because they are measuring cost per call, or efficiency. What the customer really wants is a convenient resolution to their call, or effectiveness. Clearly these goals are working against each other, and in most cases this destroys customer loyalty and brand value.

In the end, we need to balance costs with value, and we need to understand customer and corporate strategy. Is customer intimacy our core business focus, or operational excellence? Are we measuring the business in a manner that reinforces our business model and customer value creation, or strictly by the bottom line?





Efficiency vs. Effectiveness KPIs

13 03 2009

Key Performance Indicators (KPIs) should be measures of risk to annual goals or strategic objectives.  If we can keep this list of KPIs minimal, we stand a much greater chance of keeping the organizational focus on improving key processes.

To derive these KPIs we need to understand the organizational inputs, outputs, and desired outcomes. While this is a little academic, it is a good way to start to organize and define your KPIs. Outputs / Inputs are measures of efficiency, while Outcomes / Inputs are measures of effectiveness. By overlaying the organizational or departmental focus, we can align and define these KPIs to make sure they are driving the desired behaviors.

Traditionally, Sales and Marketing goals are to be effective, so revenue per head or win percentage are better measures, while Finance and IT are generally geared for efficiency, where cost per order or IT spend per target are more common.
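
To make the outputs-over-inputs versus outcomes-over-inputs framing concrete, here is a small worked example in Python; every figure is invented purely for illustration.

```python
# All figures below are invented for illustration.
inputs = {"processing_cost": 500_000, "sales_headcount": 25}
outputs = {"orders_processed": 4_000}                                   # activity the inputs produced
outcomes = {"revenue": 12_000_000, "wins": 180, "opportunities": 450}   # results the business wanted

# Efficiency = Outputs / Inputs
cost_per_order = inputs["processing_cost"] / outputs["orders_processed"]   # $125 per order

# Effectiveness = Outcomes / Inputs
revenue_per_head = outcomes["revenue"] / inputs["sales_headcount"]         # $480,000 per rep

# Win percentage compares outcomes to the opportunities pursued -- another effectiveness view.
win_percentage = outcomes["wins"] / outcomes["opportunities"]              # 40%

print(f"Cost per order (efficiency): ${cost_per_order:,.0f}")
print(f"Revenue per head (effectiveness): ${revenue_per_head:,.0f}")
print(f"Win percentage (effectiveness): {win_percentage:.0%}")
```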

KPI design is far more difficult than people expect and is often unique to the environment as strategies, objectives, and priorities vary organization to organization.





Analytics & Actionable Information

5 03 2009

I work on many projects where the outcome is  “just provide us actionable information.” While this is always the goal, I find most people use the term quite loosely, as if it were merely an additional option. In reality this is quite difficult to create. Many things need to come together to create action, and it is far more than just information or a report.

To create effective actionable information, we need to integrate people, information, and tools. We also need to have the right skills at different times. All too often, the expectation is for IT to write a single report that will answer all questions. Yet what typically happens is that the report creates more questions, as IT cannot predict every need. All this does is create more activity for IT and delay action.

Let’s view this from a process point of view…how would it look:

First, we have a tremendous amount of data – it would be easy to argue way too much data, hence the need to create layers of relevance. How often do we get lost looking for what we need, or recreate something because we don’t understand the business rules of the data we find? This wasted effort costs the business money and time.

We have the information; now we need a good analytical mind to review the data and create analytical models or what-if scenarios. What typically happens here is a finance or IT analyst runs a few numbers. This is probably OK in many instances, but the best option would be both a mind for the business and a statistical curiosity (though at this stage we need more of a statistician). IT and Finance often lack both of these to some degree – as their primary skill is data or fiscal governance.

Now we have some level of analytical information, but still have work to do. In general, the statistical mind tries to cram in too much detail and wants to discuss the process of discovery instead of the finding. To transform analytical information into action, we need the business to present the finding in executive terms – value created. The presentation is more than likely to include multiple reports, synthesized into a couple of charts. The next step is to foster a discussion of the recommendations and potential options. The discussion will focus on gathering feedback and coalescing it into an agreed-upon plan.

It is common here for people not to feel comfortable with the information and to ask for additional information and analysis, but we need to fight the urge to delay and put our best foot forward. There will be times when the need for rework is great, but if the discussion includes the right people and the facts, then there should be enough to make a decision and move forward. Otherwise, the risk is creating a culture of endless analysis.





External & Market Indicators

25 02 2009

One item most organizations struggle with is leveraging external indicators. Early last year, the price of gas created a chain reaction: most companies’ cost of goods sold increased to the point where they were forced to raise their prices as their margins eroded.

Even when we do react, we typically do not have a systematic way to incorporate the learning into a business process. What we would need is the ability to understand the external indicators, know of potential sources for the information, and work these into ongoing environmental scans.

What is the value of understanding how the consumer price index impacts your revenues? What happens if you were able to move before your customer in terms of supply chain interruption? In some cases, this could mean millions to your top or bottom line. There are a number of organizations that knew the market was struggling in 2008, but did nothing to prepare.  And a number of those names will never be the same (GM, AIG, Circuit City, etc).
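
As a rough sketch of one piece of such a scan, the snippet below checks how month-over-month CPI moves relate to revenue a quarter later; the file names, columns, and three-month lag are assumptions for illustration, and a real scan would look at several indicators and lags.

```python
import pandas as pd

# Hypothetical monthly series; file and column names are assumptions for the example.
cpi = pd.read_csv("cpi_monthly.csv", parse_dates=["month"], index_col="month")["cpi"]
revenue = pd.read_csv("revenue_monthly.csv", parse_dates=["month"], index_col="month")["revenue"]

cpi_change = cpi.pct_change()
revenue_change = revenue.pct_change().shift(-3)   # revenue response three months later

correlation = cpi_change.corr(revenue_change)
print(f"Correlation between CPI moves and revenue three months out: {correlation:.2f}")
```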

When is the last time you did a formal environmental scan, discussed the results, and put new actions into place?





Defining Operational Performance Management (OPM)

12 02 2009

OPM is one of a number of buzzwords within the industry that are often used, yet poorly defined. It is part of the Performance Management family (which is also overused, generally accepted, yet not well defined). For us to make the niche more credible, it is important to have a generally accepted definition of what it means.

I have tried to frame OPM as a methodology, a framework, a process in which the focus is upon creating value with the customer in mind. Where Financial Performance Management strives to improve the budget development and budget management processes to enhance shareholder value, OPM takes us beyond the constraints of the financial mindset. We need to look at the processes and initiatives that drive customer value creation. Processes and initiatives like sales and marketing, operations, supply chain, pricing and discounting, etc.

It is clear these two legs (OPM & FPM) must work together, and one should not take priority at the expense of the other. All too often our budgetary process becomes our measure of success, even though it is a lagging indicator. Where OPM becomes particularly valuable is that if we are building customer value correctly, it leads to greater financial results. Are we better off having hit our budgets in a year when the market was wildly successful? If the market grew at 10%, yet we grew at our budgeted 8%, was management successful?