Video: CSCOs Are Solving Inventory Challenges

All content by Russell W. Goodman, SupplyChainBrain

Bill Benton, co-founder of GAINS, details why planning processes fail to meet and solve inventory challenges.

There are several primary contributors to inadequate inventory planning, Benton says. For one thing, methodologies applied in inventory policy are often based on “pretty rudimentary rules.” That’s true “whether those are simple things like targeting segments of inventory classes [or] fixing service levels without regard to things like cost, supply variability or demand forecast variance.”
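Benton's point about fixing service levels "without regard to things like cost, supply variability or demand forecast variance" can be illustrated with the textbook normal-approximation safety-stock formula. The sketch below is not from GAINS; the numbers are hypothetical, and it simply contrasts a rule that treats lead time as constant with one that acknowledges supply variability.

```python
from math import sqrt
from statistics import NormalDist

def safety_stock(service_level, mean_demand, sd_demand,
                 mean_lead_time, sd_lead_time):
    """Safety stock under the standard normal approximation, combining
    demand variance over the lead time with lead-time (supply) variance."""
    z = NormalDist().inv_cdf(service_level)  # service-level z-score
    return z * sqrt(mean_lead_time * sd_demand**2
                    + mean_demand**2 * sd_lead_time**2)

# A rudimentary rule: fixed 95% service level, lead time assumed constant.
naive = safety_stock(0.95, mean_demand=100, sd_demand=20,
                     mean_lead_time=4, sd_lead_time=0)
# The same SKU once supply variability is acknowledged.
adaptive = safety_stock(0.95, mean_demand=100, sd_demand=20,
                        mean_lead_time=4, sd_lead_time=1.5)
print(round(naive), round(adaptive))
```

Ignoring supply variance materially understates the buffer for the same nominal service level, which is one way "pretty rudimentary rules" go wrong.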

Even where policies are adaptive, they rarely consider upstream effects. “If you’re a manufacturer with a bill of materials, how do you optimize and synchronize up and down that chain, which could be many layers deep?” Benton asks. “Secondly, if you’re a distributor or a manufacturer distribution network, how do you manage across those tiers? If you sum up all the tiers of distribution, plus, if applicable, all the tiers of the bill of material, you have a very complicated web of interdependencies, and that’s rarely taken care of.”
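The "web of interdependencies" Benton describes can be made concrete with a minimal requirements explosion through a multi-level bill of materials. The part names and quantities below are purely illustrative; the point is only that demand at the top cascades, tier by tier, into gross requirements several layers down.

```python
# Hypothetical three-tier bill of materials: parent -> {child: qty per unit}.
BOM = {
    "finished_good": {"subassembly_a": 2, "subassembly_b": 1},
    "subassembly_a": {"component_x": 4},
    "subassembly_b": {"component_x": 1, "component_y": 3},
}

def explode(item, qty, reqs=None):
    """Accumulate gross requirements at every BOM tier for `qty` of `item`."""
    if reqs is None:
        reqs = {}
    reqs[item] = reqs.get(item, 0) + qty
    for child, per_unit in BOM.get(item, {}).items():
        explode(child, qty * per_unit, reqs)
    return reqs

# 10 units of finished goods pull component_x in via two separate paths:
# 10 * 2 * 4 = 80 through subassembly_a, plus 10 * 1 * 1 = 10 through subassembly_b.
print(explode("finished_good", 10))
```

Planning each item's inventory in isolation misses exactly these shared, multi-path dependencies.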

While technology investment is important, Benton cites a statistic that 67% of chief financial officers report negative ROI on digital transformations. He feels many such initiatives are simply too large. "It's overly ambitious, and they deliver improvements only on a very elongated timeframe. If they had the ability to deliver more quick wins and sustain momentum and show improvement, that would help."

Benton sees value in trying to digitally simulate supply chains, but remains skeptical of some promises about digital twins. “I want to be careful,” he says, “because I do think that the concept of trying to do simulations that emulate your supply chain is very valuable. But trying to completely create a twin is overwrought, and that overreach prevents you from doing smaller-scale but very useful scenario planning and analysis. Despite tremendous leaps in computing power and machine learning, we’re not really even close to doing that.”
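The "smaller-scale but very useful scenario planning" Benton favors, as opposed to a full digital twin, can be as modest as a Monte Carlo check of one policy parameter. The sketch below is a hypothetical example, not a GAINS method: it estimates the stockout probability during replenishment for two candidate reorder points on a single SKU.

```python
import random

def stockout_rate(reorder_point, mean_daily_demand, sd_daily_demand,
                  lead_time_days, trials=10_000, seed=42):
    """Monte Carlo estimate of the chance that demand during the
    replenishment lead time exceeds stock on hand at the reorder point."""
    rng = random.Random(seed)
    stockouts = 0
    for _ in range(trials):
        demand = sum(max(0.0, rng.gauss(mean_daily_demand, sd_daily_demand))
                     for _ in range(lead_time_days))
        if demand > reorder_point:
            stockouts += 1
    return stockouts / trials

# Compare two candidate reorder points (all numbers are illustrative).
for rp in (420, 480):
    print(rp, stockout_rate(rp, mean_daily_demand=100,
                            sd_daily_demand=25, lead_time_days=4))
```

A few dozen lines like this answer a concrete what-if question without emulating the entire supply chain.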