Quality assurance professionals know their job is to keep materials and products within specification. As a corollary, this includes responding effectively when violations occur, which requires the skills, experience, and technology to quickly distinguish natural variability from an imminent threat. Ultimately, their mission is prevention: keeping a process from going “off the rails” at any point, from the raw material stage all the way to the customer. This is a huge challenge, both technically and psychologically, because the rewards for preventing bad things that never happen are not as extravagant as the rewards for making good things happen.
Getting things right, meaning creating products within specification and on time, and responding promptly when things go awry, is a much rougher and longer road than most people suspect.
Our ability to get things right is rooted in the invention of tools that measure time and space.
Measuring time and measuring space have seen centuries of incremental improvement, but the crucial moments that have produced our breathtaking technologies have been leaps, not merely improvements.
Quality assurance has had the same progression: incremental improvements, and then a leap.
To get some perspective on how the first two (time and space) drive the third (quality assurance), let’s take a short tour of history. Often, we take for granted the ability to measure anything, including such bedrock notions as telling time or saying how large something is. Yet each of these abilities was hard fought, the product of incremental improvements as well as grand leaps.
In 1707, a British admiral, Sir Cloudesley Shovell, heading up a fleet of warships, came to a violent end in a disaster that had nothing to do with warfare. In unfamiliar waters, neither the admiral nor anyone else in his fleet could accurately calculate their longitude, and thereby determine their position. As a result, more than fifteen hundred sailors and officers were killed, including the admiral himself.
What he lacked was time, in the form of an accurate watch. Without it, the navigators had no chance to truly fix their position. By the time their mistake was discovered among the rocky islands, it was too late; there was no time to recover. They were missing two things: 1) accurate feedback and 2) time to respond.
The admiral, and all the seafarers of his day, had access to only half of the tool set: navigators had been calculating latitude for centuries using the sun at noon and the stars at night. Without an accurate clock, however, it was not possible to account for the earth’s rotation and use these same heavenly bodies to determine longitude.
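The principle the navigators lacked comes down to one line of arithmetic: the earth turns 360 degrees in 24 hours, so each hour of difference between local solar noon and the time at a reference port corresponds to 15 degrees of longitude. A minimal sketch (the function name and the worked figures are illustrative, not historical):

```python
def longitude_degrees(reference_time_at_local_noon: float) -> float:
    """Longitude from a clock keeping reference-port time, read at local
    solar noon. The earth turns 360 degrees in 24 hours: 15 degrees per hour.
    Negative means west of the reference meridian."""
    return (12.0 - reference_time_at_local_noon) * 15.0

# If the reference clock reads 16:00 when the sun peaks overhead,
# the ship is 4 hours behind the reference: 60 degrees west.
print(longitude_degrees(16.0))  # -60.0

# Why a few seconds a day matter: 3 s/day over a 40-day crossing is 120 s,
# i.e. 0.5 degrees of longitude, about 30 nautical miles at the equator.
print((120 / 3600) * 15)  # 0.5
```

This is why the clock, not the sextant, was the missing half of the tool set: the angular observation is useless for longitude without an accurate time base to compare it against.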
As a result of this disaster, there was a groundswell of support for taking on the challenge. In 1714, the Board of Longitude offered a prize of roughly $2,000,000 in today’s dollars to anyone who could come up with a reliable watch. It needed to be sturdy enough to withstand the rough life of an ocean-going warship, and it had to “resolve position at sea to within 30 nautical miles after sailing to the West Indies,” which translates into losing no more than 3 seconds a day.
John Harrison, a woodworker (and musician), won that contest. He bettered the specification, producing a clock accurate to within 1 second a day. That clock became every seafarer’s best friend.
The following centuries brought many incremental improvements to this clock, until 1960, when we leapt to something a million times more accurate. Instead of using our sun to tell time, we began basing our system on a quantum resonance within the cesium atom (the foundation of Coordinated Universal Time, or UTC). The result is that the physical clock in Greenwich is now a tourist attraction rather than the standard that determines the accuracy of clocks around the world.
Another fundamental measurement is distance, and our ability to measure it accurately comes down to something as basic as agreeing on the length of a meter, the unit used in science, and in society, in most parts of the world. It would be hard to imagine how manufacturing, with specifications in some industries refined to hundredths of a centimeter, could take place without this bedrock of common measurement.
Who would think that it would take so much effort and controversy to come to an agreement on what a meter is?
The meter is relatively young; the conversation to define it goes back to the beginning of the 18th century. One camp insisted that the new basic unit should be defined by the length of a pendulum with a half-period of one second. Others thought it should be one ten-millionth of the length of a quadrant of the earth’s meridian (one-fourth the circumference of the earth). In 1791, amid the upheaval of the French Revolution, the French Academy of Sciences settled on the meridian, rejecting the pendulum because variations in the force of gravity at different places on the earth’s surface would have made the unit inconsistent. Thus the meter became one ten-millionth of the length of the meridian through Paris (of course!) from the pole to the equator.

Even this proved challenging. The first prototype meter was short by 0.2 millimeters because of miscalculations related to the earth’s rotation (the same issue that had thwarted the calculation of longitude). In 1889, an alloy of platinum and 10% iridium was cast to form a new international prototype meter. It measured within 0.0001 of specification when checked at the melting point of ice (temperature being yet another hard-fought measure).
A generation later, in 1927, the definition was adjusted again: the meter became the distance between the axes of the two central lines marked on the platinum-iridium bar kept at 0 degrees Celsius in the keeping of the BIPM (the International Bureau of Weights and Measures), the bar being “subject to standard atmospheric pressure, and supported on two cylinders of at least one centimeter in diameter, symmetrically placed in the same horizontal plane at the distance of 571 millimeters from each other.”
It’s reasonable to ask the following question: if such basic measurement standards required this much work to arrive at one definition of something being within specification, how is an enterprise to deal with thousands of shipments of specified materials, arriving from plants all over the world, and determine whether what it is receiving is acceptable?
Quality Assurance in Supply Quality Management
The requirement for disciplined supply quality management began to grow quickly with the advent of outsourcing and globalization, and the rise of material variability.
The bedrock of supply chain quality management rests on several elements:
- careful analytical thought;
- the selection not only of measures but of measurement systems;
- the data collected from what is measured; and
- a system to process that data into meaningful, actionable knowledge,
all against the background of an intense sensitivity to time, and independent of location.
The tools used to perform these tasks have been developing along with the growing need for better and safer products. Because of the speed of this development, it is often difficult to assess whether the tools in use are the equivalent of a water clock from 11th-century China or of the atomic clock in use today. Are the tools the company uses to manage supply chain quality still centered on the sun? Or has the company made the leap to the atomic clock?
In the days when our ancestors were wrestling with creating basic measurements that could be held in common across geographical and cultural boundaries, determining whether something met specifications was relatively simple, and that held until the most recent era of outsourcing and globalization. In most cases, up until twenty years ago, simple paper documentation was adequate.

When volumes increased, material variability increased as well, along with extended lead times. Some of the documentation turned digital: scanned images of the paper documents. Because test reports (certificates of analysis, or COAs) were all generated by suppliers in their own formats, the receiving locations needed to standardize the data so that apples could be compared to apples. But translating inbound documentation into actionable data takes time on the receiving end and can introduce gaps in accuracy. As flow and complexity increase, tools created for a bygone era become unworkable.
An example of the growing challenge comes from one of our users of GSQA: Handgards, a North American leader in food service disposables, many of which must meet medical-grade standards.
“Before changing its methodology, Handgards received COAs from every supplier, as well as from an array of laboratories, by e-mail.
With hundreds of products arriving from overseas suppliers (many of them making the same items) managing the data became an overwhelming task. Working with 45 suppliers who could be making 800 products at any given time, all with different lot numbers, Handgards was getting a COA from every supplier for every product for every lot.
If the suppliers produced vinyl gloves, they were required to list each ingredient used in their manufacture. If an item required ten ingredients, the suppliers needed to produce certifications for each ingredient, not only for the final product itself. While Handgards gave suppliers a standard template, and each supplier was able to produce the documentation, they in fact adjusted the data itself based on their internal conventions. In addition, suppliers sent documents in a range of formats: some arrived as images, some as PDF files, and some as Excel spreadsheets. In the beginning, some sent their data in their own language, so each document had to be translated.”
It’s clear that in the face of today’s supply quality challenges, having suppliers use their own formats to send COAs, whether submitted on paper or as images, is not an adequate response. The complexity and volume of data is too great.
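The standardization problem described above can be illustrated with a small sketch. The field names and synonym table here are hypothetical, and real COA ingestion must also handle parsing images, PDFs, and spreadsheets; this shows only the key-mapping step that turns supplier-specific labels into one comparable template:

```python
# Hypothetical synonym table: supplier-specific labels -> canonical field names.
SYNONYMS = {
    "lot": "lot_number", "lot no": "lot_number", "batch": "lot_number",
    "result": "measured_value", "value": "measured_value",
}

def standardize(record: dict) -> dict:
    """Map one supplier's COA fields onto a standard template so that
    lots from different suppliers can be compared apples to apples."""
    out = {}
    for key, value in record.items():
        canonical = SYNONYMS.get(key.strip().lower(), key.strip().lower())
        out[canonical] = value
    return out

print(standardize({"Batch": "A-102", "Result": 4.2}))
# {'lot_number': 'A-102', 'measured_value': 4.2}
```

The point of the sketch is that standardization has to happen somewhere; the question is whether it happens automatically at submission time, or manually at the receiving dock.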
Handgards reported that onboarding suppliers during the GSQA implementation process was easier than expected. Part of that ease came from good preparation by Handgards, which included employing China-based consultants who came to thoroughly understand the requirements and acted as liaisons to help suppliers adopt the system.
“We stopped doing things in paper. Everyone is excited about transitioning to data management. Even the janitors are getting excited about going digital. One of the most valuable results has been a new ability to present a very good dashboard; we can show what we are doing on the scorecard that is part of the GSQA program. This means we can empower suppliers by giving them feedback.”
Ana Ramos, VP of Quality
The dividing line between the previous era and today is the much greater need for timely, standardized measurement. Asking the following questions can help your company determine which side of the line its supply chain quality system is on:
- Is supply quality management performed through periodic audits, or is it ongoing? In other words, is every lot being tested, using reliable measures of the appropriate elements?
- Is the data provided directly by suppliers, in electronic form and in a standardized template, or does it arrive in supplier-generated documents that must be transformed from format to format before they can be read?
- Is the data instantly available for processing through Statistical Process Control, which can quickly display a trend in either a positive or negative direction, and cannot be falsified?
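The SPC check in the last question can be sketched in a few lines. This is a minimal illustration, not any particular product’s implementation: production systems apply richer rule sets (for example, the Western Electric rules), but the core ideas are control limits around the process mean and detection of drift:

```python
import statistics

def spc_flags(values: list[float]) -> list[bool]:
    """Flag points that fall outside 3-sigma control limits around the mean."""
    mean = statistics.mean(values)
    sigma = statistics.stdev(values)
    return [abs(v - mean) > 3 * sigma for v in values]

def rising_trend(values: list[float], run: int = 7) -> bool:
    """Detect a run of consecutively increasing points: a drift signal that
    can warn of a problem before any single point breaches the limits."""
    return any(
        all(values[i + j] < values[i + j + 1] for j in range(run - 1))
        for i in range(len(values) - run + 1)
    )

# Twenty in-control measurements, then one out-of-control point.
readings = [10.0] * 20 + [10.2]
print(spc_flags(readings)[-1])  # True: the last lot breached the limits
```

Because the flags are computed from the raw lot data as it arrives, a trend is visible as soon as it begins, rather than at the next periodic audit.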
While some companies have realized that the electronic approach and ongoing testing are the right things to do, both to protect the company’s reputation and to ensure the safety and satisfaction of the consumer, many others do not know that a cost-effective solution is available.
A company need not invent the tool itself, just as it has no need to invent its own reliable clock or devise its own basic system for measuring the dimensions of an object. We can build on the centuries of work it took to invent and manufacture a reliable clock, and take inspiration from the determination of our ancestors to create a common system of measurement. We can add to that legacy by entering the era in which the data our measurement systems generate is used to create reliable and safe products.