As you approach the end of 2016, it’s a good time to review your company’s approach to supplier quality assurance and supplier scorecards. A comprehensive review should begin by asking what the very basis of the approach will be. Will it be based on QA performance data, or on expectations? Will it include the capability to analyze trends, positive or negative, or are you simply expecting a continuation of an absence of serious breakdowns, if that’s the way it’s been in 2016?
Betting on an absence of breakdowns (“so far, so good”) has cost companies serious losses of profit and reputation in recent years, and has even led to criminal prosecution of senior managers. Rather than simply expecting your company to move forward without incident, adopting a more proactive approach is more prudent and responsible, reducing the possibility of losses across all domains of your company’s future.
New Pressures From The Environment
The increasing complexity of supply chains, coupled with tightening regulation and greater consumer awareness fueled by social media, makes staying abreast of change critical, especially in those parts of the chain outside the enterprise. Companies must periodically review their supply quality systems to make sure those systems can meet new challenges. A proactive end-of-year assessment is a useful exercise.
One of the challenges for companies in undertaking a process that may affect how management views the status quo in their supply quality management systems is the problem of cognitive biases inherent in assessing ambiguous threats.
Breakdowns in quality are never foreseen with certainty; they always come as a surprise.
The tendency is to err on the side of “it won’t happen here, to us.” In addition, suppliers are perceived as being far away. The farther away they are, whether geographically or in their position on a tier of a complex chain, the smaller the perceived threat. If the threat is invisible, it’s no threat at all, in terms of perception. This cognitive bias can be a source of management inattention leading to dangerous supply chain contamination. It can allow counterfeit, fraudulent, or harmful materials to flow through the supply chain all the way to the end user without management oversight.
Battling Cognitive Bias: Helicopter View
The overarching, top level conversation which reviews the status quo and re-assesses the company’s risk at the end of the year should include a consideration of the value of supply chain quality management.
Careful analytical thought includes a consideration of the “shape” of the supply chain, with some concern for the potential for new, perhaps ambiguous, threats. Are the systems, processes, and management methodology (or even staffing) equal to handling increased exposure in new markets, new countries, new processes, new products?
This leads to a consideration of measurement. The top level of considering measurement is to understand measurement systems themselves, beyond any specific measurement. Are your measurement systems refined enough to meet tighter requirements in key domains? Should there be more bite and nuance in specifications, moving from nominal measures (yes/no) to ordinal measures (more/less)?
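To make the distinction concrete, here is a minimal sketch in Python of the difference between a nominal (yes/no) check and an ordinal (more/less) grade. The moisture measurement, the 12% limit, and the grade bands are all hypothetical values chosen for illustration, not taken from any real specification.

```python
# Hypothetical example: moving a spec from a nominal (pass/fail)
# check to an ordinal (graded) measure for a moisture reading.
# The 12.0% limit and the grade bands are illustrative assumptions.

def nominal_check(moisture_pct, limit=12.0):
    """Pass/fail: all we learn is whether the lot met the limit."""
    return "pass" if moisture_pct <= limit else "fail"

def ordinal_grade(moisture_pct):
    """Graded bands: we also learn how close a lot is to the limit,
    so a drift toward the edge of the spec becomes visible early."""
    if moisture_pct <= 8.0:
        return "A"       # well within spec
    elif moisture_pct <= 10.0:
        return "B"       # acceptable
    elif moisture_pct <= 12.0:
        return "C"       # near the limit: watch this supplier
    else:
        return "reject"

readings = [7.5, 9.2, 11.8, 12.6]
print([nominal_check(m) for m in readings])  # ['pass', 'pass', 'pass', 'fail']
print([ordinal_grade(m) for m in readings])  # ['A', 'B', 'C', 'reject']
```

Note how the nominal check reports the third lot as a simple “pass,” while the ordinal grade flags it as a “C,” revealing a drift toward the limit before any lot actually fails.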
Re-assessing Risk at “Ground Level”
The overarching conversation about supply quality management considers the big picture, asking about the appropriateness of the supply quality status quo. It leads to the next tier of considerations: the processes being used, measured against the new and increased challenges coming from a changing external environment.
Moving on from a top-level consideration of measurement systems, the focus should turn to the kind of data that is being collected, i.e., what is actually being measured. Is the data being collected meaningful when assessed against the strategic concerns of the company? What is not being measured, or not measured in a meaningful way, that should be measured?
This leads to a consideration of the system for processing the data being collected. Is the system the company uses adequate for whatever higher volume or greater complexity of data may not have existed a year ago? The company may discover that a software tool or method that was adequate at one level has led to slowdowns, unnecessary labor costs, or dangerous breakdowns at another level.
If the system can manage the volume of data, then how sensitive is it to time? In other words, is it fast enough and configurable enough to transform data into timely, actionable knowledge? Or is the data merely being stockpiled?
How quickly can your management move to avert a problem, keep sub-standard material from entering your supply chain or reaching your customers?
How independent of location is your system? Is it equally fast and transparent no matter what the reach of geographic distribution is or will be in the coming year?
One of the most critical questions in assessing potential risks in the coming year is about the periodicity of testing. For example, is supply quality management being performed by using periodic audits or is it conducted through ongoing testing? In other words, is every critical material’s lot being tested by someone, using reliable measures of the appropriate elements?
To answer this question, consider an analogy from managing personal finances: would you rely on an annual financial audit without using a checkbook to track your expenditures on an ongoing basis? An audit is a snapshot of your financial condition at one moment in time, and it comes after the fact. By the time it happens, it’s too late to intervene in a real-time situation. If a supplier passes an audit, the future rests on the hope that the same positive result will occur the next time; in other words, it plays the odds. A passed audit also implies that the supplier’s performance will continue, and that the delivered materials will perform as promised.
How manageable is the data in terms of facilitating any necessary intervention in real-time? That consideration translates into these questions:
- Is the data being provided directly by suppliers using a standardized template and in electronic form, or is it in a supplier-generated form of document needing transformation from format to format to be read?
- Is the data instantly available to be processed through Statistical Process Control, which can quickly display a trend in either a positive or negative direction, and can indicate falsification?
- Can interested professionals receive alerts pertaining to unsafe conditions so that action can be taken close to real-time?
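The Statistical Process Control idea behind the second and third questions can be sketched in a few lines of Python. This is a minimal Shewhart-style control check under stated assumptions: the baseline history, the incoming lot readings, and the three-sigma limits are all illustrative, not a description of any particular product.

```python
# Minimal sketch of a Shewhart-style control check: flag any supplier
# reading that falls outside mean +/- 3 standard deviations of the
# historical baseline. All values here are illustrative assumptions.
from statistics import mean, stdev

def control_limits(baseline):
    """Compute the lower and upper 3-sigma control limits
    from historical baseline readings."""
    m, s = mean(baseline), stdev(baseline)
    return m - 3 * s, m + 3 * s

def out_of_control(readings, baseline):
    """Return the incoming readings that breach the control limits,
    i.e., the lots that should trigger an alert."""
    lo, hi = control_limits(baseline)
    return [r for r in readings if r < lo or r > hi]

baseline = [10.1, 9.8, 10.0, 10.3, 9.9, 10.2, 10.0, 9.7]
new_lots = [10.1, 9.9, 11.9]  # the last lot drifts outside the limits
print(out_of_control(new_lots, baseline))  # [11.9]
```

Run against each lot as its data arrives electronically, a check like this turns the raw numbers into a near real-time alert, rather than a pattern discovered months later in an audit.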
While some companies have realized that an electronic approach with ongoing testing is the right choice, both to protect the reputation of the company and the safety and satisfaction of the consumer, many others do not know that such a solution is available and cost-effective.
Potential breakdowns are, in reality, far costlier. Systems now exist that provide more efficient supply quality management in Software-as-a-Service form, addressing all the concerns above while greatly improving the security of a modern company’s supply chain quality management.
Our GSQA® SaaS supply chain QA management service is one of those approaches, with 20 years of accumulated know-how. See how it can help by viewing some of the modules pages … before 2017.