Efficient does not always mean Optimal.


Seeking highly efficient processes is the holy grail of most operational managers.

Is it the right goal?

‘Garbage in, garbage out’ still applies, even if the garbage gets a slick coat of paint on the way through.

The process as implemented might be efficient, even optimised, but does it deliver the outcome in the most effective way?

A typical example is from a while ago when the NBN was (compulsorily) connected.

The technician turned up just inside the time window, did the connecting work quickly and, it seemed, efficiently.

After about 45 minutes, he informed me it was all done, all I had to do from there was connect up the modems around the house.

When I expressed surprise, pointing out that the job was not complete until everything worked, I was told: ‘Not my job. I have 7 connections today, and I am behind by almost an hour’.

Clearly there was an optimised process of installation by NBN subcontractors in place, the final few feet being the responsibility of the retailer. However, as far as I was concerned, I had paid the compulsory $172 for ‘connection’ and it was not complete until everything worked.

It may have been an efficient process from the perspective of the NBN, but from the perspective of someone who had paid for a service, it sucked.

The technician was prevailed upon to ensure that the job was complete, to my eyes. The problem for him was that he failed to meet the stupid KPI imposed by someone seeking an efficient process, rather than one that optimised the outcome.

Header image is obviously courtesy of AI, and is therefore not optimised by a human.


The critical unasked question that can kill a ‘5-why’ analysis.


‘Five Whys’ is a commonly used tool, widely seen as one that, when used well, gives you answers to challenging operational problems.

Mostly it will, but what happens when the answer lies hidden outside the scope of the effort to identify the cause-and-effect chains that lead to the problematic outcomes?

To solve any challenging problem, there are four stages:

    • Collection of data
    • Analysis, segmentation, and classification of the data
    • Generation of a theory that might explain the condition, and
    • Experiments that identify the cause of the outcome, rather than just observations of it.

What happens when the third stage fails to produce a theory that, under experimentation, explains the outcome?

Go back to basics by looking at the data more widely, as clearly something is missing. Often it pays to reverse the process, starting at the problematic result and asking yourself ‘what could have caused this outcome?’

Years ago, Dairy Farmers Limited had a monopoly in retail UHT-processed long-life custard. It was a modest-sized niche market that was quite profitable. There had been several attempts by competitors to grab a piece of the action, all of which had failed. Suddenly we started having problems at seemingly random times: when opened, the custard was the consistency of water. The costs of lost production were substantial, but the far greater costs were those of the product recall from retail shelves, and the loss of consumer confidence.

The condition was caused by either the presence of an enzyme called amylase, or a failure of the CIP system. Amylase is a naturally occurring enzyme in starch, which had been eliminated by processing from the complex hydrocolloid (starch) ingredient we used in the custard. We had accepted the assurances of the supplier that the ingredient supplied was amylase free, as per our specifications. We assumed therefore that the problem lay with the processing plant. The plant was torn apart several times, cleaned meticulously, and on one occasion, underwent some expensive engineering changes.

All efforts failed to fix the problem.

A valuable question to ask in this circumstance is: ‘What would have to be true to…?’ In this case, the answer would have been: ‘there is no amylase present in the hydrocolloid ingredient’. That may have sparked, much earlier than it did, the further question: ‘Is a test with a sensitivity of 1 part per million a reliable indication that there is no amylase?’

When we finally asked this question of ourselves, the answer was clearly ‘No’. We set about refining the test our suppliers used to a sensitivity of 1 part per 10 million. This more sensitive test revealed, at random intervals, the presence of amylase in the supplied ingredient.

5-Why is a great tool. However, like any tool, it must be used by an expert in order to deliver an optimum result.

Header is courtesy of a free AI image generator, depicting some tortured engineers doing a root cause analysis.

 

The 2 mutually reinforcing ingredients to success:


If there is a magic ingredient to success, it is captured in two words: ‘Leverage’ and ‘Compounding’.

We all understand the concept of leverage, using a small amount of force to generate a larger outcome.

Compounding is a little more difficult to understand, although if you currently have a mortgage, you are suffering the compounding results of higher interest rates eating away at your growth in equity as you pay the monthly piper.
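A quick numeric sketch makes the point (the rate and amounts here are invented, not from the article): compounding is just repeated multiplication, so growth accelerates over time.

```python
def compound(principal, annual_rate, years):
    """Value of a principal compounding once a year at annual_rate."""
    return principal * (1 + annual_rate) ** years

# At 7% p.a., money roughly doubles every decade:
print(round(compound(1000, 0.07, 10)))   # 1967
print(round(compound(1000, 0.07, 20)))   # 3870
```

The same arithmetic run in reverse is what makes mortgage interest feel so punishing: the lender's balance compounds against you.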

The question is: how do you find and build on them to generate a sustainable level of profitability?

Our commercial entities are built on the correct assumption that you need leverage to scale. As you build scale, it becomes necessary to add management layers to leverage the capabilities of those the next level down. That is why our organisation structures are always pictured as pyramids: they are pyramids, shaped by the leverage they generate.

Leverage leads to compounding, and compounding leads to greater leverage: a self-sustaining cycle, until the system becomes gummed up with friction.

Friction in management terms ends up being hidden in the layers of authority necessary to act. The transaction costs, which are almost always hidden from easy view, can be commercially fatal.

Leverage also delivers power to those in a position to exercise it, and as we know, power is a drug with many side effects, some of them not so good.

Technology has changed the ratios between leverage and compounding, but not the basic arithmetic. They remain mutually reinforcing, but their management has become significantly more complex.


The reliable way to forecast manufacturing costs.


Several years ago I became aware of ‘Wright’s Law’. In the 1930s, Theodore Wright, an aeronautical engineer, proposed that: ‘For every cumulative doubling of units produced, costs will fall by a constant percentage’. This insight came from observing the performance of his own factories building aircraft during the thirties and over the course of the war.

While I do not have the numbers, intuitively after 50 years of observation, it holds very true.

That truth seems to hold across any manufacturing I have seen or read about, unlike its much better known sibling, Moore’s Law. Gordon Moore observed the rate at which the number of transistors that could be fitted onto a silicon chip was growing, and predicted that a doubling at a regular interval, roughly every two years, would hold consistently over the long term.

Therein lies the significant difference that manufacturers have come to rely on.

Moore’s law refers to technology improvements over time.

Wright’s law refers to the manufacturing cost reductions that come with scale.
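Wright's Law can be written as a simple power curve: if cost falls by a constant percentage per cumulative doubling, unit cost after n cumulative units is the first-unit cost multiplied by n raised to log2 of the retained fraction. A minimal Python sketch, assuming an illustrative 15% reduction per doubling (the actual rate varies by industry and is not from the article):

```python
import math

def wright_cost(first_unit_cost, cumulative_units, retained=0.85):
    """Unit cost under Wright's Law.

    retained is the fraction of cost kept per cumulative doubling:
    0.85 means cost falls 15% each time cumulative output doubles.
    (Illustrative value, not from the article.)
    """
    return first_unit_cost * cumulative_units ** math.log2(retained)

# Each doubling cuts unit cost by the same 15%:
for n in (1, 2, 4, 8, 16):
    print(n, round(wright_cost(100.0, n), 2))
    # unit costs: 100.0, 85.0, 72.25, 61.41, 52.2
```

Note the driver is cumulative volume, not elapsed time: a factory that stops growing output stops descending the curve.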

I would suggest that the cumulative impact of the combination has had a potent effect on the manufacturing costs of everything from simple widgets to solar panels, to the cost of human genome mapping. Wright’s Law applies as scale builds, and technology provides a catalyst to a tipping point that radically alters the growth curve, after which the graph finds a new normal in the relationship between volume and cost.

Australia, for lack of leadership, foresight, and capital, has shied away from the investment required to light that catalytic fire many times in the past.

A primary example is solar panels. We have known for a hundred years that solar energy could be harnessed. As a kid I used to burn leaves, paper, ants, and occasionally myself, with a magnifying glass. However, it took researchers at UNSW inventing PERC (Passivated Emitter and Rear Cell) technology in 1983 to make Australia the international leader in solar cell technology. Funding and the foresight to commercialise could not be assembled here, so the technology was used to develop the manufacturing industry in China, where Wright’s Law has facilitated the growth of a dominating share of the world market for wafers, cells, and completed solar modules.

Forecasting manufacturing costs is at the core of every successful manufacturer. While in the early stages of commercialisation there will be a host of variables you need to be able to model, understanding the relationship between your cost base and scale will remove a significant weight from your shoulders when planning capital requirements.

Australia again finds itself on the cusp of being an international leader in Quantum computing, biotechnology, Hydrogen sourced energy, and rare earth extraction and value addition. Let’s not allow ourselves to be distracted this time, we may not get another chance.

Successful economies all have one thing in common: they manufacture stuff others want to buy. Australia’s history is littered with great ideas, and technical innovations that are commercialised elsewhere for lack of foresight, leadership and capital. We would be desperately stupid to let it happen again!

 

The simple solution to supply chain disruption. 

While there is no silver bullet, there is a lot of tactical advice around that will increase the dependability and resilience of your supply chains.

Shortening lead times, removing steps in the chain, paying a premium for service to specification, creative logistics management, making information transparent, and many others.

All will deliver some benefit, and together they can make a dramatic difference, but they miss the essential nature of significantly improving supply chain performance.

When you ‘flip’ the chain, changing the drivers of the chain from supply to demand, the game changes.

Developing a clear view of demand, and responding only to the signals of demand, rather than the often functional signals coming from within the vertical management hierarchies of supply chain participants, alters the nature of the challenges being faced.

It becomes a demand chain, rather than a supply chain, or even a value chain.

In lean parlance, there is the concept of ‘takt time’. This is a measure of the ‘pull’ exerted on a supply organisation by customer demand: the available production time divided by the units customers require, which sets the pace production must run at to meet that demand.
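The takt calculation itself is one line of arithmetic. A sketch with invented figures:

```python
def takt_time(available_minutes, demand_units):
    """Takt time: available production time divided by customer demand,
    i.e. the pace at which units must be completed to match the market."""
    return available_minutes / demand_units

# A line with 450 available minutes per shift, facing demand of
# 900 units, must complete one unit every 0.5 minutes (30 seconds):
print(takt_time(450, 900))   # 0.5
```

Producing faster than takt builds inventory; slower than takt builds backorders, which is why it is the natural pacing metric for a demand-driven chain.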

The so-called ‘bullwhip effect’, the magnification of fluctuations in orders back through the supply chain, will be at least mitigated by applying a metric that reflects real demand from the market.

Remember the panic buying of toilet paper, amongst other things, at the beginning of the pandemic? The underlying demand had not changed; we still all went to the loo at about the same rate. However, the sudden shortage on supermarket shelves created by panic buying resulted in supermarkets increasing their orders on suppliers, who in turn increased orders on their suppliers. At each point in the supply chain, because of the uncertainty, everybody was increasing their orders, building inventory, and magnifying the boom/bust cycle of supply, creating a ‘bullwhip’ effect: the movement at the tip of the whip is progressively magnified as it travels back through the length of the whip. Swung hard enough, it will ‘crack’, just like your supply chain.
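That amplification can be sketched with a toy model (all figures invented): each tier, seeing orders rise above its baseline, pads its own orders upstream with a safety margin, so a single retail spike grows at every step back up the chain.

```python
def amplify(orders, safety_margin=0.25):
    """One tier's orders on its supplier: pass observed orders through,
    padding any rise above baseline with a safety margin (illustrative)."""
    baseline = orders[0]
    return [o + safety_margin * max(o - baseline, 0) for o in orders]

retail = [100, 100, 120, 100, 100]   # one week of panic buying at retail
tier = retail
for name in ("supermarket", "wholesaler", "manufacturer"):
    tier = amplify(tier)
    print(name, [round(o) for o in tier])
# The 20-unit retail spike grows to ~25, ~31, then ~39 units upstream.
```

Driving every tier from the same retail demand signal, rather than from the tier below's inflated orders, is what removes the whip.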

The challenge is to match the whole supply chain to the real level of demand coming from the marketplace, demand uninfluenced by short term hiccups in the chain. If there is a silver bullet, that is it.

A marketers explanation of ‘Lean Accounting’


The double entry bookkeeping system we are familiar with, or should be, has been around for centuries. In the form we now know it, double entry bookkeeping was codified by Franciscan monk Luca Pacioli, a collaborator of Leonardo da Vinci, in a mathematics text published in 1494.

It remained largely unchanged, just increasingly complex, until the 1920s, when Alfred Sloan, who ran General Motors for decades, developed the system of management accounting we still use, with standard product costs as a foundation.

As the ‘lean manufacturing’ movement, pioneered by Toyota, extended throughout the western world from the late 1970s onwards, the system of standard costs became increasingly problematic.

It tends to set in stone the assumptions built into the standard product costs, rather than using them as a basis for continuous improvement. Even worse, management KPIs tend to be centred around functional silos that have little to do with the overall productivity of assets in delivering value to customers.

I have been subjected many times to ‘stalking’ variances: those that never seem to go away, but persist in defiance of management edict. The easiest way to get rid of them is to adjust the standard. Not very smart, but accepted practice, and often the only way to achieve KPIs in a corporate environment. It also has the effect of hiding opportunities for improvement, and ensuring reliable data is not available in real time.

In parallel, we have increasingly digitised operational processes via multimillion dollar installations of MRP (Manufacturing Resource Planning) and its more expensive sibling, ERP (Enterprise Resource Planning) systems. These tend to set standard costs and variances in stone down to the micro-transaction level contained in work orders, which complicates and adds cost to the reporting and management processes without adding value for customers.

One of the core ideas of Lean is ‘Flow’, which is at odds with standard costing systems. Standard costing gives precedence to operational efficiency at individual stages in a process, rather than flow through the whole system, ignoring varying capacity and efficiency constraints. This results in several usually uncomfortable conflicts.

Two examples:

  • Lean seeks to reduce inventory of all types, raw material, work in progress and finished goods, seeing it as a cost, tying up working capital. Traditional accounting treats inventory as an asset, and when a Lean project reduces inventory, it reduces the current assets in the balance sheet, giving a misleading perception of financial performance.
  • Lean focusses on capacity utilisation and ‘Flow’ through the processes necessary to create a product. Capacity is the key operational constraint, but does not appear anywhere in the general ledger other than by inference, as a function of capital invested, the calculated value of inventory, and unit sales. Delivering capacity is only of value when that capacity is used to add value in some way, usually by producing more product from the same fixed cost base. Standard costing ignores this reality of operational management.
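The inventory conflict above can be made concrete with a toy absorption-costing calculation (all numbers invented, and deliberately simplified). Under standard costing, fixed overhead is absorbed into the units produced, so a Lean period that makes fewer units in order to draw stock down reports lower profit on identical sales:

```python
def absorption_profit(units_sold, units_made, price, variable_cost,
                      fixed_overhead):
    """Reported profit when fixed overhead is absorbed into units made
    (simplified single-period model, illustrative only)."""
    unit_cost = variable_cost + fixed_overhead / units_made
    return units_sold * (price - unit_cost)

# Identical sales; the only difference is production volume:
building_stock = absorption_profit(1000, 1250, price=100, variable_cost=40,
                                   fixed_overhead=50_000)   # 20000.0
draining_stock = absorption_profit(1000, 1000, price=100, variable_cost=40,
                                   fixed_overhead=50_000)   # 10000.0
print(building_stock, draining_stock)
```

The cash position is better in the second case, with less capital tied up in stock, yet the reported number is worse: exactly the misleading signal a Lean inventory reduction sends through a traditional P&L.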

There are no easily GAAP (Generally Accepted Accounting Principles) conforming measures for calculating immediate capacity utilisation and flow, and no sensible calculation of actual product costs on a short-term basis that conforms to the standard cost model. A second set of measures is necessary: one that uses the same database as GAAP accounting, but in different ways.

While it will take work to set up these alternative measures, once deployed they will reduce the reporting workload and the error rates inherent in highly transaction-based standard cost models, delivering both utility and accuracy to operational reporting and analysis. Deployment is, however, not like installing an ERP system; it is a process of continuous improvement.

Setting out to implement a Lean accounting environment in the absence of collaboration and mutual understanding at the senior executive level is akin to climbing Everest in a t-shirt. Success requires a complete change of mindset from that taught by most accounting institutions, where the concentration is on financial and reporting compliance, rather than gathering and critically analysing the information that enables better management decision making and continuous improvement.

 

Header credit: Nick Katco from ‘The Lean Accounting CFO’