Sep 23, 2020 | Analytics, Management, Operations
When you want superior performance, implement a number of key cross-functional metrics.
Gaining agreement on a set of metrics that genuinely track a project's cross-functional performance is not a simple task. KPIs are usually focussed on functional performance, whereas optimal performance requires that cross-functional dependencies are reflected in the KPIs put in place.
The standard response of functional management to such an idea is that if they cannot control a process, how can they be held accountable for its performance?
Getting past this reasonable question requires agreement across three domains, and collaboration around the tactical implementation of a process improvement.
Let us use a reduction of Working Capital requirements as an example, requiring four steps.
Agreement on strategic objectives, and accompanying KPIs.
The strategic objective becomes making the enterprise more resilient, and therefore able to adjust to unforeseen shocks. One of the strategies agreed is the reduction of Working Capital. There are many parts that make up working capital, inventory being a major one in a manufacturing environment. As the joint objective is to make the enterprise more resilient, it is agreed that inventory levels must be reduced.
Agreement on what ‘success’ looks like.
The absence of an outcome that signals success means that any improvement will do. There are numerous measures that can be applied: how much, when, what outcomes, compliance to standards, variation from the mean, and many others. In this case, a reduction of inventory levels by 15% without compromising customer service is the agreed metric of success. Agreement across functions that this is a sensible measure will deliver the opportunity for cross-functional alignment, and will contribute to delivering the strategic objective of resilience.
Agreeing on tactical diagnostics.
Tactical diagnostics are aimed at tracking and optimising the short term performance detail of the components of the agreed objective. Which parts of a project are working as expected, and which are not. You can make the changes in these on the run, experiment, learn, adjust. It is usually not necessary to have these on the high level dashboard, they are for the teams and individuals responsible for the execution of a strategy to determine the best way of doing them. What is critical at the tactical level, is that those involved clearly understand the wider objective, and their role in achieving it.
Application of the diagnostics.
As the old saying goes, ‘what gets measured, gets done’. In this case, to reduce inventory without compromising customer service, requires the co-ordination of many moving parts, some of which will need some sort of a scoreboard to track progress on the tactical improvements. For example, transparency of raw materials inventory and incoming delivery schedules to those doing production planning, matching production to real demand, improving forecast accuracy, managing DIFOT levels, levelling production flow between work stations, and many others. These should be made visual to the teams engaged in the work, at the place where the work gets done.
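A tactical scoreboard like this is easy to prototype. Here is a minimal sketch of the two mutual measures discussed, DIFOT and inventory reduction against the agreed 15% target; the field names and sample figures are invented for illustration, not taken from any real client:

```python
from dataclasses import dataclass

@dataclass
class Order:
    qty_ordered: int
    qty_delivered: int
    on_time: bool

def difot(orders):
    """Share of orders Delivered In Full, On Time (DIFOT)."""
    hits = sum(1 for o in orders if o.on_time and o.qty_delivered >= o.qty_ordered)
    return hits / len(orders)

def inventory_reduction(baseline_value, current_value):
    """Percentage reduction in inventory value against the agreed baseline."""
    return (baseline_value - current_value) / baseline_value * 100

# Illustrative figures only.
orders = [Order(100, 100, True), Order(50, 45, True),
          Order(80, 80, False), Order(60, 60, True)]
print(f"DIFOT: {difot(orders):.0%}")  # 2 of 4 orders in full and on time
print(f"Inventory down {inventory_reduction(2_000_000, 1_700_000):.0f}% vs baseline")
```

Put numbers like these on a whiteboard at the workstation, not buried in a monthly report, and the teams doing the work can see the effect of their experiments daily.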
For all this to work, the KPIs need to be simple, visual, apparent to everyone, and as far as possible mutually dependent across functions. In other words, build mutual KPIs that reflect both sides of a challenge.
For example, stock availability and inventory levels. Generally those responsible for selling do some of the forecasting, so they always want inventory, manufactured yesterday, to be available when a customer needs it. As a result of uncertainty, they tend to over-forecast to ensure stock availability when an order arrives. By contrast, Operations tends to like to do long runs of products to satisfy productivity KPIs, so you end up running out of stock of the fast movers, while having too much stock of the slow lines.
The solution is to make the sales people responsible for inventory levels, and the operations people responsible for stock availability. In that way, they collaborate to achieve the optimum mix of production and inventory. This mutuality ensures functional collaboration at the tactical level, leading to making decisions for which they are jointly accountable.
You are, in effect, forcing cross-functional collaboration where it does not naturally exist in a traditional top-down management model.
None of this is easy. If it was, everybody would be doing it. That is the reason you should be on this journey, it is hard, and so delivers competitive sustainability.
Sep 7, 2020 | Analytics, Management, Marketing
As a marketer, I want data to better understand the risks and impact of investments in marketing. I am a true believer in data, which also means that the limitations of data are factored into my thinking.
The nonsense pushed around for decades that by default, human beings respond to stimuli in a binary way is increasingly being recognised for the bunkum it is. Marketing effectiveness is not as easily subject to risk analysis and probability based reasoning as most, including myself, would like to be the case.
Data that represents what has happened in the past might be objectively true, but as we see every day, can easily be interpreted and presented differently to deliver the message the carrier wants to be heard.
If we can do it with real data collected from past activities, imagine the vagaries that can be built into the data that is supposed to be telling us what will happen!
The selling point of all the digital data around is that it is both accurate and actionable. Tactically this is partly true; strategically it ranks with the fortune teller at the local fete as a base from which to make long term choices.
The two fundamental drivers of calculating an objective assessment of the impact of a marketing investment are:
Attribution.
Attribution is a particularly difficult and often overlooked problem. Is that purchase because of the anonymous display ads on Google, the annoying branded email that follows you around for weeks after a casual search, the fact that the delivery truck that went past your door was clean, the TV advertising, or that the packaging looked good on a supermarket shelf? All these factors play a role in creating a successful marketing investment, but how do you sort out the relative weights of the impact with one-dimensional data?
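To make the problem concrete, here is a sketch of one common heuristic, position-based ('U-shaped') attribution. The touchpoint names and the 40/20/40 weighting are illustrative assumptions, not a method endorsed here, and heuristics like this are exactly the kind of one-dimensional weighting the paragraph above warns about:

```python
def position_based_attribution(touchpoints, first=0.4, last=0.4):
    """Position-based ('U-shaped') heuristic: credit the first and last
    touches heavily, and split the remainder across the middle touches.
    The 40/20/40 weights are a convention, not a discovered truth."""
    n = len(touchpoints)
    if n == 1:
        return {touchpoints[0]: 1.0}
    if n == 2:
        return {touchpoints[0]: 0.5, touchpoints[1]: 0.5}
    middle = (1.0 - first - last) / (n - 2)
    credit = {}
    for i, tp in enumerate(touchpoints):
        w = first if i == 0 else last if i == n - 1 else middle
        credit[tp] = credit.get(tp, 0.0) + w
    return credit

journey = ["display ad", "branded email", "TV spot", "shelf pack"]
print(position_based_attribution(journey))
# the first and last touches get 0.4 each, the two middle touches 0.1 each
```

The arbitrariness of those weights is the point: changing them reshuffles the 'measured' contribution of each channel without any new information about the customer.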
The unpredictability of human behaviour.
Then you have the fact that people simply do not act rationally, or always in their own best interests, the two foundational assumptions of classical economics. They act on a range of impulses and learned behaviours that have little to do with rational economics, and everything to do with psychology. We are only just beginning to understand the impact of psychology on an individual's decision making.
Between them, these two factors make assessment of marketing effectiveness an elusive target. It is best served with the combination of data, and intelligent hindsight, mixed with a high degree of qualitative sensitivity to the drivers in the market, and instinct. These characteristics are only gathered with deep experience, years down in the marketing weeds, learning by doing. It does not come from a textbook, online course, or a few years following instructions.
Aug 17, 2020 | Analytics, Marketing
The implication of the word ‘research’ is that you are setting out to understand something. All too often over the years, I have observed situations where that is not the case.
Market research can be a money trap, consuming resources with little or no payback. It can also be a huge capability to be leveraged for great benefit when done well.
The challenge is that it is a set of interrelated disciplines from statistics, psychology, behavioural economics, and science, and therefore requires a wide breadth of skill and acquired wisdom to be useful.
Doing commercially productive market research is a bit like learning to swim. No matter how much you read about it, study wise texts, and observe others, until you get into the pool, immerse yourself, you will never really understand it.
Some things are relatively easy to research. Usually they are adverse outcomes that have happened, and have been quantified. The research is aimed at understanding the drivers of those adverse outcomes. More challenging is research that seeks to put a shape around the future. If this happens, what then?
Most material published on the topic is about the techniques, the templates to use. They are very useful, but fail to accommodate the realities that intrude in real commercial situations and impact on a research outcome.
Following are some of the hard won lessons from doing marketing and market research over the last 40 years. The tools have changed dramatically in the last decade, the principles remain unchanged.
Not understanding the ‘scientific method’.
Most are familiar with ‘the scientific method’. Identify a problem, form a hypothesis, test that hypothesis, adjust the hypothesis based on the results, rinse and repeat. However, most do not recognise that the foundation of the scientific method is to set out to disprove an idea. This objective to disprove a proposition ensures that all relevant information is made available. All contrary data, opinions and untested ideas are brought to the table for examination. It often happens that information that may be relevant is not considered. Ego, confirmation bias, existing standard procedures, and plain lack of critical thinking cloud the process. Over the years I have seen piles of research that set out to prove a theory, and do so by, usually unconsciously, excluding data that might not confirm the proposition.
Failure to identify the problem/opportunity.
Useful research depends on providing answers to problems, or offering insight into the scale and location of opportunities. In the absence of clarity of the objective of the research, you cannot reasonably expect there to be any value delivered.
Asking poor questions.
Not all questions are created equal. Asking good questions implies that there has been enough work done to identify what is, and what is not, a good question. Also important is the manner in which the questions are asked. It is easy to generate different responses by seemingly subtle variations in the way questions are asked. E.g. ‘How big do you like the fruit pieces to be in your brand of yoghurt?’ This implies that the fruit in yoghurt is in pieces, and ignores the possibility that the fruit may be added as a puree. Those who may prefer a homogeneous product using puree are thus precluded from giving an accurate response. Such a question would be relevant to the marketer of fruited yoghurt seeking a point of differentiation, which would influence the product ingredients and choice of processing equipment.
Less than rigorous & neutral data collection & analysis.
We all know numbers can lie, we see it every day. Numbers can be used to support any proposition you choose, when managed to that end. The absence of rigour from research methodology and analysis will lead to flawed conclusions every time.
Not knowing what you will do with the outcomes.
In the absence of a clear use for the research, why do it? The answer to this is usually found amongst ego, seeking validation of a currently expressed position, or as a crutch to avoid making a decision. How often have we heard the phrase: ‘More research is required’?
Selective use of results.
Selective use of research outcomes is standard practice in many places. Parts of the research that supports a proposition are used in the presentation of a position, and any parts that do not support the position are ignored. You see this all the time in the political discourse in this country. Politicians of differing parties, taking the same research reports and claiming opposite conclusions is common. Exactly the same process exists in corporate bureaucracies.
Lack of understanding of the techniques.
You do not have to be a statistician to be able to understand the outcomes of data. However, you do need to understand what the terms mean, and the implications they carry. This applies from sampling techniques, to the tools of statistical analysis and results presentation. You must understand the principles sufficiently well to be able to ask informed questions, and recognise gobbledygook when it comes back to you.
Not considering Anthropology & Context.
Anthropology might seem a bit misplaced in market research, as it is the study of behaviour in varying cultural settings. However, consider how different the answer to a question about your work might be if asked while sitting at your desk absorbed by a task, compared to when asked the same question while on holiday. Same question, different context.
These days we are often allocated to teams at work that are set up to solve problems, generate ideas, or just manage work flow. How different are our reactions inside those groups, to those to which we choose to belong outside the work context, and how differently do we behave?
Conducting research in the absence of such considerations can generate misleading outcomes. E.g. Conducting research on a new piece of packaging around a group discussion table will evoke responses, and a conclusion. How different might the reactions of those same people be when confronted by the new pack while shopping in a supermarket?
Failure to understand the drivers of Behaviour.
Psychology plays a huge role in the development and reporting of research. Our brains are hard wired to reduce cognitive load, so can be easily tempted to accept a conclusion not supported by research. It is relatively easy to persuade others of the veracity of a conclusion, simply by the manner in which it is presented. E.g. Which milk is better for you: one that contains 3% fat, or one that is 97% fat free? They are identical products, but in research, a significant majority will choose the second: 97% fat free.
Similarly in a qualitative group discussion, a proposition seemingly supported by most around the table can gather overwhelming support, irrespective of the accuracy of the proposition. This outcome has been repeated endlessly in first year psychology experiments, based on Solomon Asch’s 1951 experiment seeking to examine the power of a group to influence the expressed opinion of an individual.
What people say they do and what they actually do can be very different.
When you ask questions, they are answered from within the existing frame of reference of those being questioned. Their ‘mental models’ dominate how they see things. Henry Ford was right when he quipped that he would not consult customers on what they wanted, because he already knew: a faster horse. Steve Jobs expressed exactly the same opinion, in different words, on several occasions, and was again proven correct.
Too much research is aimed at connecting the future dots to give a sense of certainty about the future, just to make people feel more comfortable. If we could tell the future accurately, we would all be at the local casino for a few nights until we got banned for winning too much.
Respecting the status quo too much.
We humans are keen to retain the status quo, simply because it has been proven to work, and change involves risk. We are hard wired to avoid risk, a function of evolutionary psychology, when taking risks often meant you became breakfast for something nasty. The promise of a reward must be many times stronger than the downside of a behaviour before most of us are prepared to entertain the risk.
Presenting a research finding that is inconsistent with the well known view of the Managing Director is a risky undertaking that is often avoided. This is commonly called a HiPPO (Highest Paid Person's Opinion) and is pervasive. It is particularly challenging when the person concerned (often a bloke) is repeating the opinion of someone else. In consumer products, this is often his partner.
Poor presentation of results & Conclusions.
The errors I have seen in presentations are myriad. However, the worst are:
- Lack of clarity and simplicity in the conclusions, which limits useability.
- They do not answer the question. Generally this is because the question was ambiguous, unnecessary or stated a proposition someone wanted verified.
Every research project can be placed somewhere on the matrix in the header. The further to the right and the higher you go, the greater the degree of uncertainty involved. In the bottom left quadrant, you are seeking answers that are quantifiable, things that have happened, that you are seeking to understand. The top right quadrant is the future, and contains things we do not know much, if anything, about. Often we do not even see them. Research that puts numbers against hypotheses that fall into this quadrant should not be believed. At best they are an estimate of a probability, at worst, just a WAG (Wild Arsed Guess). What is important in these circumstances is that you understand the risks. Remember that old cliché: ‘plan for the worst, hope for the best’.
Aug 14, 2020 | Analytics, Management
You will not hear the term ‘normalising’ the P&L very often. When you do, it is often an indication that the business is in a frame of mind open to change.
It is a common starting point of valuing a business, a process that has two basic buckets:
- Financial value. This is where any valuation process will start, with the numbers.
- Strategic value. Far more qualitative than the numbers, a potential buyer will set out to put a value on such things as market share, customer profiles, geographic location, cultural fit, and so on.
Valuing a business is a complex exercise, particularly valuing the contents of the ‘strategic bucket’.
Creating a financial value is much better understood, and almost always starts at the same place: EBITDA. Earnings Before Interest, Tax, Depreciation, and Amortisation.
EBITDA is a construction of the Profit and Loss account, which reflects the trading results. Usually the P&L is completed on a monthly basis, and so long as the classification of expenses remains consistent, it can be used for comparisons over time to give a good picture of trends.
However, the P&L can also be the repository of all sorts of costs and activities that bear little relationship to the competitive trading health that determines the value of a business. Therefore, an exercise to arrive at a value will seek to remove from, or add back, items that reflect more accurately the trading health. The usual term is to ‘normalise’ the P&L.
This is particularly relevant in the sale process of a private company, less subject to the rigors of governance that apply to listed companies with professional rather than family management.
The common items to be ‘normalised’ I have seen are:
Related party revenue or expenses. Purchases from, or sales to another business related in some way to the one being investigated, that are above or below market value. A common practice is for the owner of a private business to have their superannuation fund own the premises from which the business operates. The premises are then leased back to the operating business at a rate not reflective of competitive market value.
Owner bonuses and benefits. Often the owner of a private business will pay themselves and family members more than the market value of their contribution to the business. It also works in reverse, owners are sometimes the worst paid staff members, working longer hours than anyone else, just to keep the wheels turning over. These anomalies need to be ‘normalised’.
Support of redundant assets. Every business has redundant assets that would be jettisoned by a new owner. This stretches from old inventory still carried on the books, to premises not utilised, to the country retreat used occasionally for a sales conference, but usually for the summer holidays of the owner. These do not realistically impact the performance of the business, and a new owner, unencumbered by the past, and by costs not associated with the trading position of the business, will remove them from the P&L.
Asset and expense recognition. Treating an expense as an asset, ‘capitalising’ an expense, is a common practice that will boost short term profitability by moving items from the P&L to the balance sheet. While this practice is subject to the scrutiny of tax and accounting rules and independent audit, it is pretty common. It is particularly common in the treatment of repairs and maintenance. As with many items, the accounting treatment can be used both ways to ‘manage’ short term profitability.
One time costs. Items such as litigation, insurance claim recoveries, one-off professional fees, even charitable donations, that are not a normal part of trading operations need to be identified and ‘normalised’ to build the picture of repeatable trading outcomes.
Inventories. Every business has inventories, for many it is a significant item. Manufacturing businesses have physical inventories in raw materials, Work in Progress, and finished goods, while service providers have projects in various stages of completion. The method of valuation of inventories is subject to all sorts of shenanigans, and the amount of inventory, subject to mismanagement, sloppy processes, and a host of other curses. Aggressive and consistent inventory valuation is a vital part of understanding the working capital needs of a business, and it is often the most contested piece in the valuation puzzle.
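The add-backs above reduce to simple arithmetic. A minimal worked sketch, with all figures and category labels invented for illustration only:

```python
reported_ebitda = 1_200_000  # illustrative figure only

# Adjustments: positive numbers add profit back, negatives remove it.
adjustments = {
    "rent to owner's super fund raised to market rate": -150_000,
    "owner salary above market value added back": 220_000,
    "country retreat running costs (redundant asset)": 60_000,
    "one-off litigation costs": 90_000,
    "capitalised repairs moved back to the P&L": -45_000,
}

normalised_ebitda = reported_ebitda + sum(adjustments.values())
print(f"Reported EBITDA:   ${reported_ebitda:,}")
print(f"Normalised EBITDA: ${normalised_ebitda:,}")
```

The arithmetic is trivial; the contest is always over which items belong on the list, and at what value.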
When you have all that out of the way, you should be able to calculate a reliable figure for the free cash flow generated, or consumed, by the business. A further vital number, and one upon which many acquisition/divestment decisions have been taken.
As a consultant looking to help businesses improve their financial and strategic performance, I often quietly do a ‘normalisation’ exercise on a client's P&L. This process almost always offers up those difficult questions that need to be asked and answered before an improvement process can be truly effective.
Header cartoon credit: Dilbert and Scott Adams again capture the idea.
Aug 10, 2020 | Analytics, Management, Strategy
Data is inherently tactical, just numbers without intelligence. It takes structure, capability development, and governance to turn it into a useable asset that adds value. In the absence of a structure that is designed to enable the identification, analysis, and leveraging of that data, and to turn it into useable intelligence, it will remain just data.
To go about that task, ask yourself a number of questions:
What are the data flows?
Through the enterprise, who uses the data, how do they use it, and to what outcome?
Where are the interconnections that occur, and to what extent are they compounding positively? Data can also compound negatively, usually because it reinforces an existing confirmation bias that is flawed.
Data is functionally agnostic, should be readily available to all, and the outcomes of use transparent so they can be built upon and compounded.
Who ‘owns’ the data?
Too many times I see the IT department generating data, and keeping it to themselves. Similarly, the finance department is guilty, as are all functions. This is usually not malicious; it is just a reflection of a lack of cross-functional collaboration. It is becoming more common that marketing is driving a large part of the data agenda, enabled by digital tools, but few marketers have the capability to do it effectively.
Often, there is an expectation that ‘digitisation’ of the enterprise will change the way data is used. Not so. It is no more than putting a new coat of paint on the building: unless the internal structures are changed as well, nothing really changes, and you just get a few press releases and nice photos for the annual report.
What data is used?
Piles of data are generated, often collated, and distributed, or made available, but never put to productive use. Usually the missing ingredient is curiosity. Those who are curious approach the data with a ‘why’ and ‘what if’ attitude; they ask questions which identify holes in the data, drive them to be filled, and seek out new sources.
Where does the data add competitive value? Competitive value is a two sided coin. On one side is the need to keep up with what your competition is doing, to leverage the opportunities for productivity and not fall behind in your customers' eyes. The second is to find ways for data, and more specifically the knowledge that comes from analysing data, to give you a competitive edge. If a proposed investment does not do at least one of these two things, why would you proceed?
How well do the data outcomes reflect alignment with strategy?
Data, and the analysis that goes with it, leading to conclusions that are inconsistent with, or divergent from, the stated strategy must cause you to question the data, its analysis, and the strategy. In these circumstances, it makes sense to deploy the scientific method: create a hypothesis, test it, collect more data, and rinse and repeat until you have alignment between the strategy and its supporting data.
Where are you on the digital adoption curve?
Data is just another asset: it requires explicit actions to build the capabilities necessary to generate, use and fund it. There have to be explicit policies and priorities given to the investments in data development and the capabilities required, or it will not happen. There needs to be a clear picture of the structure of data domains, from engineering and finance to marketing and sales, and they need to be prioritised and organised to deliver the best return in the long term.
The tools being used to accumulate, process and analyse data are just tools, no different to the hammer that drives a nail. It is how we use them that makes the difference. Tools everyone should have are those that ensure the data is both clean and robust. Decisions based on data that fails either of these ‘sanitary’ tests will be sub-optimal at best.
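Cleanliness checks of that kind can be automated before any analysis begins. A minimal sketch, assuming tabular sales records with invented column names:

```python
def sanity_check(rows, required=("date", "sku", "qty", "revenue")):
    """Run basic clean/robust checks on a list of dict records.
    Returns a list of human-readable problems; empty means the data passed."""
    problems = []
    seen = set()
    for i, row in enumerate(rows):
        for col in required:
            if row.get(col) in (None, ""):
                problems.append(f"row {i}: missing {col}")
        if isinstance(row.get("qty"), (int, float)) and row["qty"] < 0:
            problems.append(f"row {i}: negative qty")
        key = (row.get("date"), row.get("sku"))
        if key in seen:
            problems.append(f"row {i}: duplicate date/sku {key}")
        seen.add(key)
    return problems

# Illustrative records: one duplicate key, one missing value, one negative quantity.
rows = [
    {"date": "2020-08-01", "sku": "A1", "qty": 10, "revenue": 500},
    {"date": "2020-08-01", "sku": "A1", "qty": 4, "revenue": 200},
    {"date": "2020-08-02", "sku": "B2", "qty": -3, "revenue": None},
]
for p in sanity_check(rows):
    print(p)
```

A checklist like this is crude, but running it routinely catches the duplicates, gaps and impossible values that otherwise quietly corrupt every downstream analysis.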
We have entered the digital world. Data and its organisation, funding, leveraging and governance are rapidly becoming the key to competitive survival.
How well are you, and your enterprise placed?
Header cartoon: courtesy Tom Gauld at tomgauld.com.
Jul 20, 2020 | Analytics, Marketing
Being a useful marketer has many foundations, most of them untouched in the course of a marketing degree.
One of the ‘must have’ but seemingly rare skills amongst most so-called marketers I see, is a relationship with numbers.
In a seeming paradox, I do not like numbers, the piles of them I often see squeezed onto dense spreadsheets, with little thought or imagination beyond getting as much data as possible assembled in the one place. This drives me nuts.
On the other hand, I love numbers for what they can tell me once the data has been cleaned and organised in a way that enables smart, curious questions to be asked, then answered. Data that moves towards knowledge, then to the source of insight, is essential to success. It also clearly demonstrates the boundaries of the holes in the data, and your ability to address the challenges presented.
Analytical skill is a foundation of successful marketing.
Typically, marketing is seen as a creative exercise. I think this is why many marketers appear almost innumerate, and why the accountants and engineers who run many organisations have little time for those supposed to be running marketing. They love numbers, and assume anyone who does not is an idiot.
Well used, numbers tell a story, and marketing is all about stories. However, stories that do not have some sort of quantitative foundation are commonly called fairy tales. Children love fairy tales, but the accountant in the corner office making the resource allocation decisions, thinks they are for his grandchildren only.
Being analytical is way more than just having the numbers. It requires that they are turned from just numbers into actionable insights, which generate further numbers to be understood and used to gain leverage for the investments being made. It does not matter if the investment is in brand building, or buying a new machine; they are both investments, upon which a return should be expected.
We are not generally taught to have this sort of intimacy with numbers. We are not taught that they are key enablers of critical thinking, curiosity, and creativity.
A hypothesis without the means to test and validate it is at best, a nice idea.
I managed to pass (just) a reasonably high level of maths at the HSC, almost 50 years ago. I passed purely because I worked at remembering the formulas and the circumstances where they worked. I never had the slightest idea of where this gobbledygook stuff might be useful, so by the time I had recovered from the post exam hangover I had forgotten everything. The absence of that key item, understanding, is why many of us shy away from numbers: we were never taught where and why they might be useful. We had formulas jammed down our young throats, and hated it, a dislike that coloured the rest of our lives.
Get over it, and allow numbers to speak to you, to help you understand the stories they are hiding.
- Look for, and identify the trends, and patterns in the data, and when there is an anomaly, be able to ask and find the answer to the simple question: Why?
- Find the gems of truth hidden amongst the averages we always seem to be fed.
- Understand what ‘normal’ looks like, so you can see the bits sticking out, and again find out the ‘why’.
- Find the boundaries of an idea, circumstance, impact, and potential.
- Discover variances, and use the boundaries of those variances to improve performance over time. This is the core technique of continuous improvement in factories, engineers love it, and I have found it just as useful in many other circumstances.
- Numbers enable some sort of quantitative boundary to be thrown around uncertainty, particularly useful at the moment. By testing the numbers, then revising and retesting, you can progressively increase the level of certainty, reducing risk.
- Enable yourself to use perhaps the oldest and most useful tool in the marketer's arsenal, the 80/20 rule, courtesy of Italian economist Vilfredo Pareto. In 45 years of commercial life, this simple technique has been used over, and over, and over again to uncover many ‘whys’.
- Understanding the data enables you to be ‘numerically ambidextrous’. You can zoom out to see the whole picture, and then zoom in to see the details of anything that for one reason or another looks different, interesting, or just a hole in the data that might lead to an insight.
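The 80/20 rule itself takes only a few lines to apply to any revenue table. A minimal sketch, with invented customer names and figures:

```python
def pareto_cut(revenue_by_customer, share=0.8):
    """Return the smallest set of customers accounting for `share` of revenue,
    largest first. A direct application of Pareto's 80/20 observation."""
    total = sum(revenue_by_customer.values())
    running, top = 0.0, []
    for name, rev in sorted(revenue_by_customer.items(), key=lambda kv: -kv[1]):
        top.append(name)
        running += rev
        if running / total >= share:
            break
    return top

# Illustrative figures only.
customers = {"Acme": 400, "Bolt": 250, "Cargo": 150, "Dux": 100,
             "Echo": 50, "Flux": 30, "Grit": 20}
print(pareto_cut(customers))  # the three largest customers cover 80% of revenue here
```

Run the same cut on products, complaints, or stock-outs, and the short list that comes back is where the ‘why’ questions should start.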
All these skills are just as useful to a marketer as they are to an accountant or engineer. When you have them, your credibility with those in the corner office will soar.