Reporting is NOT Analytics

Reporting is about what happened; Analytics is about answering "what if" and "what's best" questions.  Most of the materials that land on a VP/Director’s desk (or inbox) are examples of reporting with no analytical value added.


Reporting tells us what has happened: sales, orders, production, system downtime, labor utilization, forecast accuracy. Reports leave it to the reader to digest this information and, based on their experience and expertise about the world around them, construct a ‘story’ as to why it may have happened that way.

Really good reports will look for known causes of routine issues. For example, if I know a store is low on inventory of a specific product, I could just report that. If I flag it as an exception, the person receiving the report may even notice it among the sea of other facts. To go the extra mile, it would be wise to see whether I can (automatically) find any disruptions back in the supply chain (either with inventory or flow of goods) and include that information to answer the routine questions my report will raise; a sketch of what that might look like in code follows below. But the report builder must anticipate these needs at the point they are writing the report, and for more complex issues that’s just not realistic.
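
As a hedged illustration only (pandas, with invented table and column names), an exception report enriched with a known upstream cause might look like this:

```python
import pandas as pd

# Invented store inventory snapshot; names are assumptions for this sketch.
inventory = pd.DataFrame({
    "store": ["S01", "S02", "S03"],
    "product": ["P100", "P100", "P100"],
    "on_hand": [4, 120, 2],
    "reorder_point": [20, 50, 20],
})

# Invented upstream disruption log, keyed by product.
disruptions = pd.DataFrame({
    "product": ["P100"],
    "known_cause": ["Carrier delay at the regional DC"],
})

# Flag the exceptions: stores below their reorder point.
exceptions = inventory[inventory["on_hand"] < inventory["reorder_point"]]

# Enrich each exception with any known upstream cause, so the report
# answers the obvious "why?" before it is asked.
report = exceptions.merge(disruptions, on="product", how="left")
print(report)
```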

Analytics is all about finding a better story or, if you prefer, insight from data. We’ll talk about tools for finding insights in a moment, but much of this is about approach: develop a working theory about what may be happening and test it out with the data you have available. Revise your theory if needed, rinse and repeat: this is very definitely an iterative and interactive process.

At the simplest level, a lot of really good analytics is enabled by being able to interact with the data: filter it to see specific features; sort it to find your exceptions; drill down into more detail to see (for example) which stores are causing the issue in your region; chart it to see trends across time or perhaps even relationships between variables (like temperature and sales of ice cream by region). Generally available tools (like Excel) can get you a long way toward intuitively understanding your data and finding some insight.
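
A minimal sketch of those same filter, sort, and drill-down moves (pandas; the data and column names are invented for illustration):

```python
import pandas as pd

# Invented weekly sales data; the column names are assumptions.
sales = pd.DataFrame({
    "region": ["West", "West", "East", "East"],
    "store": ["S01", "S02", "S03", "S04"],
    "units": [35, 210, 180, 175],
})

# Filter: look at one region's rows only.
west = sales[sales["region"] == "West"]

# Sort: surface the exceptions first.
print(west.sort_values("units"))

# Drill down: which stores drive each regional total?
print(sales.groupby(["region", "store"])["units"].sum())
```
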
A further step (and one I fully understand most analysts cannot take; see What is ‘analysis’ and why do most ‘analysts’ not do it?) would be to run some descriptive statistics on the data (a code sketch follows the list below):
  • Measures of ‘average’ (mean, median, mode)
  • Measures of ‘spread’ (standard deviation, percentile ranges, min/max)
  • Frequency histograms and boxplots to visually show the distribution of the data
  • Scatter plots to view interactions
  • Correlation matrices to spot high-level interactions
  • Outlier detection
  • Dealing with missing values
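
A minimal sketch of these checks, assuming pandas and matplotlib and an invented two-column dataset:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Invented data; in practice this would be your own extract.
df = pd.DataFrame({
    "temperature": [18, 22, 25, 30, 33, 35, 28, 21],
    "ice_cream_units": [120, 150, 210, 260, 310, 400, 240, 140],
})

# Measures of 'average' and 'spread' in one call.
print(df.describe())            # mean, std, min/max, percentiles
print(df.median(), df.mode())   # median and mode

# Distribution: frequency histogram and boxplot.
df["ice_cream_units"].plot.hist()
df.boxplot()

# Interactions: scatter plot and correlation matrix.
df.plot.scatter(x="temperature", y="ice_cream_units")
print(df.corr())

# A simple outlier flag: beyond three standard deviations from the mean.
z = (df - df.mean()) / df.std()
print(df[(z.abs() > 3).any(axis=1)])

# Missing values: count them before deciding how to handle them.
print(df.isna().sum())
plt.show()
```
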
If these options seem strange, perhaps even archaic, and of little relevance to the business world, you may need to trust me when I say that these are exceptionally valuable capabilities: they increase understanding of the data, uncover insights, and prepare you to step into the world of (predictive) modeling. Thinking back over the last two weeks of work (store clustering, system diagnostics, algorithm development and even some ‘reporting’), I can confirm that I have used every one of these multiple times and to good effect.

In Predictive Modeling we build a mathematical model around the problem you are trying to solve. Once the model is built and (very importantly) validated to confirm that it really does behave as you expect the world to, you can start asking questions like the following (a toy sketch of this build-validate-ask loop follows the list):
  • What happens if I add another warehouse to this network?
  • What is the best combination of price and promotion to maximize profitability?
  • How much inventory do I need in each warehouse?
  • What is the best way to load this truck?
  • What is the best assortment for each of my stores?
  • Why are sales dropping in the Western region?
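
A toy sketch only, assuming scikit-learn and an invented price-versus-demand history; the point is the shape of the loop, not the model:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Invented history: prices offered and units sold at each price.
rng = np.random.default_rng(0)
price = rng.uniform(2.0, 6.0, size=200).reshape(-1, 1)
units = 500 - 60 * price[:, 0] + rng.normal(0, 20, size=200)

# Hold out data to validate that the model behaves as we expect.
X_train, X_test, y_train, y_test = train_test_split(price, units, random_state=0)
model = LinearRegression().fit(X_train, y_train)
print("validation R^2:", model.score(X_test, y_test))

# Only once validated do we ask the "what if" question:
# what happens to demand if we price at 4.50?
print("predicted units at 4.50:", model.predict([[4.5]])[0])
```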

Everyone is familiar with the phrase “Garbage In, Garbage Out”, relating to how computers will happily calculate (and report) garbage when you throw bad inputs at them. With modeling, the structure of the model itself is one of those inputs, and many of you may have experienced the complete junk that comes out of a bad model even when you put good data into it (a small illustration follows below). Picking the right modeling tools for the job and applying them correctly is a very skilled task. Predictive modeling covers an extraordinarily broad field spanning statistics, mathematics and operations research. Just as for data handling (see The Right Tool for the Job), this is not something you are likely to do well without appropriate training; understanding the field you are trying to apply these tools to helps enormously too.
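
As a hedged toy illustration of bad model structure with good data (scikit-learn; the data are invented): a straight line fitted to a clearly curved relationship produces junk, and validation is what catches it:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Good data with a clearly curved (quadratic) relationship.
x = np.linspace(-3, 3, 100).reshape(-1, 1)
y = x[:, 0] ** 2

# Structurally wrong model: a straight line through curved data.
bad_model = LinearRegression().fit(x, y)
print("linear fit R^2:", bad_model.score(x, y))   # near zero: junk out

# Right structure for this job: include the squared term as a feature.
X2 = np.hstack([x, x ** 2])
good_model = LinearRegression().fit(X2, y)
print("quadratic fit R^2:", good_model.score(X2, y))  # essentially 1.0
```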

So why go to the trouble and expense of ‘Analytics’ rather than ‘Reporting’? Reporting is essential, but well-built Analytics or Predictive Models can find insights and opportunities that you will never find by any other means.

A really good analyst will do this work and, while willing, ready and able to take you to whatever depth of complexity you wish, will furnish you with a simple (perhaps even one-page) report that answers your questions and makes the end result look … obvious.
