Data drives decisions – but are they good or bad decisions? Computers have become an intrinsic part of our lives and have enriched everything we do, especially our working lives. I think it’d be hard for most of us to even recall what our jobs were like before computers. One thing is for sure – the capacity to build ever more sophisticated systems to help with decision making has continued to grow.
There is one catch though, and it remains one of the most critical aspects of getting the most out of the systems we design. It really doesn’t matter how smart, flashy, mobile, or sexy the UI is – if the data coming in is wrong, the maths, reports, and suggestions coming out will also be wrong. My good friend Brian Kolubinskyj gave a perfect analogy of this reality in an Exceedra Byte vlog, demonstrating the catastrophic effects on a big V8 sports car if diesel (the wrong fuel) is used instead of high-octane petrol.
Let’s have a think about this data challenge for TPM.
If you’ve been lucky enough to be involved in a deployment of TPM software, you will likely have taken part in the initial data discussions surrounding product, customer, and sales history. Sounds simple, right? My background in IT started around Demand Planning, and it was the same discussion – we wanted data presented at multiple levels so that statistical forecasting engines could interpret trends and forecast effectively.
For TPM, the level at which promotions are planned and reconciled back to the finance systems is crucial, as is the lowest level at which ACCURATE sales history can be received. On the customer side, this is often called the ‘SOLD TO’ level, which may be several levels ‘up’ from the ‘SHIP TO’ level. To add one (of many) layers of complication, it’s quite possible that the desired promotion planning level may vary according to which route to market is being considered. Many projects must spend considerable time looking at ways to simplify and consolidate both product and customer hierarchies to create the conformity that enables all downstream inter-dependent activities and processes.
Over the years there have been significant data mart/warehouse/ERP clean-up operations – especially once ERP systems gained the capability to enforce what are called parent-child relationships between the attributes of a customer or a product. One of the most important features of planning systems is the ability to plan at different levels of these hierarchies, and once you add the capability to change data at different levels, the relationship between a brand and a product SKU, or between a country region and a customer, becomes critical to ensuring that the allocation of overrides works correctly. I sense thousands of consultants around the world (should this blog ever reach them!) would agree: the question ‘are your data and hierarchies accurate?’ is invariably answered with ‘of course, our data is great…’
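To make the override idea concrete, here is a minimal sketch of how a parent-level (brand) volume override might be spread down to child SKUs in proportion to their sales history. The SKU names, numbers, and allocation rule are illustrative assumptions, not taken from any particular TPM product.

```python
# Hypothetical sketch: allocate a brand-level volume override down to SKUs
# in proportion to each SKU's historical share. Names and figures are
# illustrative only.

history = {"SKU-A": 600, "SKU-B": 300, "SKU-C": 100}  # units sold per SKU
brand_children = ["SKU-A", "SKU-B", "SKU-C"]          # parent-child links

def allocate_override(total, children, history):
    """Split a parent-level override across child SKUs by historical share."""
    base = sum(history[c] for c in children)
    return {c: total * history[c] / base for c in children}

plan = allocate_override(2000, brand_children, history)
# SKU-A takes 60% of the override, SKU-B 30%, SKU-C 10%
```

If a parent-child link is wrong – a SKU hangs off the wrong brand – the override silently flows to the wrong products, which is exactly why the hierarchy accuracy question matters so much.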
Most countries will have major retailers that the manufacturer supplies directly. A proportion of smaller customers may then be managed through wholesale or distributor customers – hence the relationship between the manufacturer and these ‘smaller’ retailers is ‘indirect’. There are then other channels such as Food Service and Out of Home – and I’m sure many more, depending on the region and industry. Somehow, all these variables need to be considered to either aggregate or harmonize the source data coming into a TPM solution. We were talking about sales volumes in this example, but tied in with the sales volumes will also be a significant variety of pricing information, which, depending on its availability and level, may also need to be harmonized by applying min/max/average assumptions to the price data.
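The min/max/average harmonization can be sketched very simply. Assuming (purely for illustration) that sales lines arrive with a product, volume, and price, one common approach is to keep the minimum, maximum, and a volume-weighted average price per product:

```python
# Illustrative sketch of harmonizing mixed-price sales lines into a single
# price summary per product. Field names are assumptions for the example.

lines = [
    {"product": "P1", "volume": 100, "price": 2.00},
    {"product": "P1", "volume": 300, "price": 1.80},
    {"product": "P1", "volume": 100, "price": 2.20},
]

def harmonize(lines):
    """Summarize price spread: min, max, and volume-weighted average."""
    prices = [l["price"] for l in lines]
    total_vol = sum(l["volume"] for l in lines)
    wavg = sum(l["price"] * l["volume"] for l in lines) / total_vol
    return {"min": min(prices), "max": max(prices), "avg": round(wavg, 4)}

summary = harmonize(lines)
```

The weighted average here lands at 1.92 rather than the simple mean of 2.00 – a reminder that the choice of harmonization rule is itself an assumption that should be agreed with the business.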
Another data stream that has become far more prevalent in global FMCG is point of sale (POS) data. This is the data that retailers capture when a product is scanned at the checkout. Historically, intermediary vendors such as Nielsen provided a service of collecting this information from the retailers, cleansing and manipulating it, and providing it to the manufacturers for a fee. In this data we find the date of sale, volume sold, selling price, product GTIN (barcode), and other attributes that may have been statistically estimated by the vendor – such as base or incremental volume.
For the manufacturer, this information is extremely useful for understanding the influence a consumer-driven promotion actually had at that retailer – you could say at ‘store level’. In reality, though, the number of stores, promotions, and products most manufacturers would need to hold in a database to analyze store-level data would run into billions of rows, without necessarily driving outcomes or decisions that would help them. In general, POS data aggregated to some level above store – usually geographical – is sufficient to understand promotional influence and trend. The ‘gotcha’ with POS data tends to be the effort required to continuously manage the mapping from the source product/customer/date in the POS data to the ERP product/customer/date that feeds into the TPM solution. More recently, it appears that retailers have discovered the power of this information and in some cases have restricted access to it for both the third-party providers and the manufacturers.
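The mapping ‘gotcha’ is easiest to see in code. This sketch (all table and field names are hypothetical) translates incoming POS rows onto ERP keys via maintained lookup tables, and routes anything unmapped to an exceptions list – the part that needs continuous care as retailers add products and regions:

```python
# Minimal sketch of the POS-to-ERP mapping step: translate retailer GTINs
# and regions onto ERP product/customer keys, flagging unmapped records so
# the mapping tables can be maintained. All names are illustrative.

gtin_to_sku = {"00012345678905": "SKU-A"}          # maintained mapping table
region_to_customer = {"NORTH": "CUST-RETAILER-1"}  # maintained mapping table

pos_rows = [
    {"gtin": "00012345678905", "region": "NORTH", "units": 40},
    {"gtin": "00099999999990", "region": "NORTH", "units": 5},  # unmapped GTIN
]

def map_pos(rows):
    """Split POS rows into mapped records and exceptions needing attention."""
    mapped, unmapped = [], []
    for r in rows:
        sku = gtin_to_sku.get(r["gtin"])
        cust = region_to_customer.get(r["region"])
        target = mapped if (sku and cust) else unmapped
        target.append({**r, "sku": sku, "customer": cust})
    return mapped, unmapped

mapped, unmapped = map_pos(pos_rows)
```

The exceptions list is the operationally important output: left unworked, those unmapped rows quietly disappear from every downstream report.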
In addition to POS data, best-in-class TPM systems drive further insights through aggregated feeds collected from Retail Execution (RE) solutions, including DMS systems, loyalty data, and store-back KPIs, all of which help define the top-down channel and customer requirements needed to create effective TPM plans.
Promotional data is perhaps the hardest to assimilate into a new TPM system. Depending on the customer’s TPM journey, there may be no promotional history, or the history may be stored in legacy systems that are difficult to extract from. The data may also relate to obsolete customers or products, or in many cases be inaccurate, because retailer/manufacturer promotional plans and strategies change at the last minute and are not corrected in the source systems. For the parts of TPM systems that aim to predict future volume and price (and therefore the commercial liability on promotions), promotion history needs to be there. Most implementations accept an approach where this statistical analysis comes ‘online’ a few months after go-live, once the account managers have entered their programs for the year. Make no mistake though – without constructive management of how this information is maintained, any predictive modelling, whether statistical or machine learning, will be subject to error (bad data in, bad results out).
The final piece of data required to close a full TPM process is information supplied back from the customer (retailer, distributor, wholesaler, etc.), specifically their understanding of the discounts the manufacturer (hopefully) agreed to, and hence how much money is owed back. This information can come in the form of invoices, remittance advices, claims or deductions, and probably a host of other forms/names that I’ve not had the joy of discovering yet!
Somehow, the manufacturer must try to reconcile ALL the pricing deals, presented in a multitude of different mechanics devised by the sales organization, for every single customer. This is why TPM systems can become so complex – deal and pricing structures are the main levers the sales teams use to ensure ‘their’ product ends up front and center for the consumer in the retailer’s store. The finance team gets the unfortunate responsibility of trying to link promotions, rebates, terms, and many other deal types back to the information provided by the retailer. This paper trail is huge, and (perhaps it’s my old age) somewhat deliberately difficult to manage, so that the beneficiaries of the discounts always maximize their profits from these discounting methods. One of the biggest potential wins for any company starting a journey to implement a TPM solution is to go in ‘hard’ on simplifying and consolidating the entire deal mechanics and process BEFORE starting the implementation. It’s in the retailer’s favor to have a thousand different ways to reach a final discounted price – but at the end of the day there is a list price and the price they actually pay. If the percentage variance between the two remains roughly the same, but with 10 levers instead of a thousand, proceduralizing this into a solution is vastly less complex. This area usually offers the lowest-hanging fruit in the form of ROI for any TPM initiative – which is why we offer additional support in the form of our Settlement Services Team, who can help you manage your claims/deductions if your team is too lean to do so on its own.
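At its core, the reconciliation step is a comparison of what the retailer claims against what the manufacturer accrued for each deal. This is a hedged sketch of that idea only – the promotion IDs, amounts, and tolerance are invented for illustration, and real settlement processes involve far more matching logic:

```python
# Illustrative sketch of claim reconciliation: compare each retailer claim
# to the accrued promotion liability and classify it. All names, amounts,
# and the 1% tolerance are assumptions for the example.

accruals = {"PROMO-17": 10000.00, "PROMO-18": 2500.00}  # expected liability
claims = [
    {"promo": "PROMO-17", "amount": 10000.00},
    {"promo": "PROMO-18", "amount": 2750.00},  # claimed more than accrued
]

def reconcile(claims, accruals, tolerance=0.01):
    """Classify each claim as matched, over-claimed, or short vs. accrual."""
    results = {}
    for c in claims:
        expected = accruals.get(c["promo"], 0.0)
        diff = c["amount"] - expected
        if abs(diff) <= tolerance * max(expected, 1):
            results[c["promo"]] = "matched"
        else:
            results[c["promo"]] = "over-claimed" if diff > 0 else "short"
    return results

status = reconcile(claims, accruals)
```

Collapsing a thousand deal mechanics into ten makes the `accruals` side of this comparison computable in the first place – which is the simplification argument above in miniature.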
Reporting is critical to driving smart decisions for all users working with the system. Clearly, from the points discussed in this blog, those reports are not going to give ACCURATE insights unless the data management process for every attribute has been thought through end to end. If it has – and there is a good team in place ensuring the maintenance and accuracy of the data – the insights provided will achieve the key objectives most companies look for: what’s working, what’s not, and what needs to change.