Analytics has a COVID headache

One common challenge large businesses face is accurately predicting the demand the market will have for their product or service in the coming months. It's an area called "Demand Planning" or "Forecasting", and almost every business does it in some way, shape or form.

When performing this Demand Planning, sometimes the business wants to look at the overarching trend of demand (and nullify the effects of seasonal variation). Imagine that you are in charge of the ice cream brand "Streets" in Australia. If you just looked at your monthly sales volumes, it would be hard to tell whether your business is becoming more popular, since the trend is hidden within a consistent seasonal pattern of higher sales in the Summer months followed by lower sales in the Winter months.

To account for this and make the trend easier to distinguish, the forward forecasts are sometimes "Seasonally Adjusted". This just means that we equalize for the seasonal variation by lowering the sales in the hot months and boosting the sales in the cold months. Voilà — we have a smooth sales trend.
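As a minimal sketch of what this looks like in code (all figures below are invented for illustration; they are not Streets' actual sales):

```python
import numpy as np
import pandas as pd

# Three years of invented monthly ice cream sales with a gentle upward
# trend and a Summer peak (January, in Australia).
months = pd.date_range("2017-01", periods=36, freq="MS")
trend = 100 + 0.5 * np.arange(36)
seasonality = 1 + 0.4 * np.cos(2 * np.pi * (months.month - 1) / 12)
sales = pd.Series(trend * seasonality, index=months)

# Seasonal index: each calendar month's average sales relative to the
# overall monthly average (> 1 in Summer, < 1 in Winter).
monthly_avg = sales.groupby(sales.index.month).mean()
seasonal_index = monthly_avg / monthly_avg.mean()

# Dividing by the index lowers hot-month sales and boosts cold-month
# sales, leaving the smooth underlying trend.
adjusted = sales / seasonal_index.loc[sales.index.month].values
```

Real demand planning tools use more sophisticated decomposition methods (e.g. STL or X-13ARIMA-SEATS), but the principle is the same: the adjustment is only as good as the history it is learned from.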

So what’s the problem?

Well, typically this adjustment is performed by looking at the seasonal variation over the last few years (say, 3 years) to determine that, generally, Summer has higher sales and Winter has lower sales. As you can imagine, this will create some serious headaches for businesses next year, when the adjustments they make to their forward forecasts are based on historic sales that include 2020! Businesses would be expecting a massive 'seasonal' drop in sales in March and April that was, of course, due to a pandemic and nothing seasonal at all!
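Continuing the illustrative sketch above, here is what happens to the seasonal index once the 3-year window rolls forward to include a pandemic-hit 2020 (again, all figures are invented):

```python
import numpy as np
import pandas as pd

# The same invented series, but the 3-year window now covers 2018-2020,
# and March-April 2020 sales collapsed for pandemic (not seasonal) reasons.
months = pd.date_range("2018-01", periods=36, freq="MS")
trend = 100 + 0.5 * np.arange(36)
seasonality = 1 + 0.4 * np.cos(2 * np.pi * (months.month - 1) / 12)
sales = pd.Series(trend * seasonality, index=months)
sales.loc["2020-03":"2020-04"] *= 0.4  # a one-off 60% crash

monthly_avg = sales.groupby(sales.index.month).mean()
seasonal_index = monthly_avg / monthly_avg.mean()

# The March and April indices are dragged down by a single anomalous
# year, so any forecast "re-seasonalised" with this index bakes in a
# phantom dip every March and April from now on.
print(seasonal_index.loc[[3, 4]])
```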

More generally, data-based decision making usually relies on inferring future behaviour from prior outcomes. This has revolutionised how efficiently and effectively businesses operate; however, it presents considerable risk when it is not reasonable to assume that the future will replicate the past.

All industries are currently grappling with how to make sense of their pre-COVID data insights in a post-COVID world that may be fundamentally very different. I don't believe there is an industry in the world that won't struggle with this; for an obvious example, consider retail banking. The decision of whether to accept or decline a mortgage application is typically made by an algorithm that estimates the likelihood that the applicant will default on the mortgage.

The issue is that in most cases, the mortgage default probabilities were estimated from a dataset of applications (and associated outcomes) that occurred during a period where unemployment was stable at ≤ 5%. Obviously, that is no longer the case, so it is not reasonable to assume that the model learned from that data will continue to represent applicant behaviour adequately (not remotely!).
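A minimal sketch of the problem, with entirely made-up data and a deliberately simplified logistic model (real credit-risk models are far richer, but the extrapolation issue is the same):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000

# Invented training data: applications observed while unemployment sat
# in a narrow 4-5% band, as in the years before 2020.
dti = rng.uniform(0.1, 0.6, n)              # debt-to-income ratio
unemployment = rng.uniform(0.04, 0.05, n)   # the only regime ever seen
true_logit = -6 + 6 * dti + 40 * unemployment
defaulted = rng.random(n) < 1 / (1 + np.exp(-true_logit))

X = np.column_stack([dti, unemployment])
model = LogisticRegression().fit(X, defaulted)

# Scoring an applicant at 10% unemployment forces the model to
# extrapolate far outside its training support; the number it returns
# looks precise but carries little evidential weight.
applicant = np.array([[0.35, 0.10]])
print(model.predict_proba(applicant)[0, 1])
```

Because the training data only ever saw unemployment between 4% and 5%, the model cannot have learned how defaults respond to a jump to 10%; it will happily produce a probability anyway.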

Exacerbating this issue is the fact that businesses have abstracted much of this work away from experts and into the hands of ERP software that automatically produces the forecasts, without any human checking whether the output still makes sense. It's a warning of what can happen when businesses become overly reliant on platforms over people.

So given these challenges, what should businesses be doing? In an unpredictable world, it's more important than ever that businesses leverage their data in a way that is appropriate for the current conditions — and that will only be possible through the synergy of technology and people.

The TL;DR is that if you think an expert is expensive — wait until you see what an amateur can end up costing you.

How can we synergize?

Contact us today to learn how you can bring your data assets together and make your business smarter tomorrow.

Looking For A Reliable Partner for your Data + AI Challenges?