Taguchi's Quality Engineering
Introduction to Taguchi's quality engineering
The term Taguchi Methods refers to a collection of principles which make up the framework of a continually evolving approach to quality. This system of quality engineering takes its name (at least in the United States) from Genichi Taguchi, who along with Deming, Juran and Ishikawa, is considered a pioneer of the modern quality movement.
In order to gain a fuller understanding of Taguchi's philosophy, it is beneficial to examine its roots and the conditions that led to its development, and also to look closely at what is meant by "quality".
In the 1940's and 1950's W. Edwards Deming, often referred to as "the father of the modern quality movement", proposed an innovative approach to quality management. His approach, including statistical measures, stressed the importance of the "voice of the customer", winning the confidence of co-workers, reduction of variation, and continual improvement in terms of manufacturing process and product. Deming's approach was enthusiastically studied and applied in Japan, where in 1951, the Japanese Union of Scientists and Engineers named their prestigious quality award the "Deming Prize". In the U.S., however, Deming's theories were for the most part ignored. This fact was to become very significant for manufacturing in later years.
American manufacturers ruled over U.S. markets in monopolistic fashion until roughly 1970. During the 1950's and 1960's, companies were concerned mainly with short-term profit. Selection of suppliers was based entirely on reducing cost. In this climate, high quality and low cost were not compatible concepts. Upper-level management was increasingly adversarial toward all worker levels, and companies were isolated from customers, as evidenced by the dealer networks developed by automobile manufacturers to handle sales and service. As a consequence of these developments, American manufacturers suffered substantial losses in domestic and worldwide market share in automobiles and in such profitable areas as consumer electronics. This same period, however, saw Japan make major gains in the areas lost by U.S. manufacturers. The Japanese stressed the importance of customer opinion and focused on increased communication between management, workers, vendors, and consumers.
It was this competitive crisis in manufacturing during the 1970's and 1980's that gave rise to the modern quality movement, leading to the introduction of Taguchi methods to the U.S. in the 1980's. While Deming's approach deals with management and Taguchi's is a system of design engineering, the two philosophies share a common goal: to increase quality.
It was mentioned before that Taguchi's philosophy is continually evolving. The ever-changing nature of the Taguchi methods is a natural and necessary extension of the concept called "Kaizen" by the Japanese. Simply defined, Kaizen means improvement, but it is more than that. It means ongoing improvement, collectively involving managers and workers. Taguchi methods seek to improve quality and, in light of Kaizen, are themselves subject to continual change and improvement.
This brings us to a question that is central to a discussion of Taguchi Methods. What is meant when we say "quality"? Quality can be defined many ways. For instance, one simple way to define it is by customer satisfaction. Consumers provide a gauge of a product's quality through their wallets. Another way to define a product's quality is through its performance when "rapped, overloaded, dropped, or splashed". Put another way, quality is a product's (or design's) ability to cope with variation and conditions of use in the customer's hands. This will be discussed in more detail later. Perhaps one of the best ways to define a product's quality is by the product's "fitness for use", as stated by Juran. For purposes of Taguchi Methods, quality (or lack thereof) is determined in relation to a loss suffered by society due to a product's failure.
Taguchi's methods
There has been a great deal of controversy about Genichi Taguchi's methodology since it was first introduced in the United States. This controversy has lessened considerably in recent years due to modifications and extensions of his methodology. The main controversy, however, is still about Taguchi's statistical methods, not about his philosophical concepts concerning quality or robust design. Furthermore, it is generally accepted that Taguchi's philosophy has promoted, on a worldwide scale, the design of experiments for quality improvement upstream, or at the product and process design stage.
Taguchi's philosophy and methods support, and are consistent with, the Japanese quality control approach that asserts that higher quality generally results in lower cost. This is in contrast to the widely prevailing view in the United States that asserts that quality improvement is associated with higher cost. Furthermore, Taguchi's philosophy and methods support the Japanese approach to move quality improvement upstream. Taguchi's methods help design engineers build quality into products and processes. As George Box, Soren Bisgaard, and Conrad Fung observed: "Today the ultimate goal of quality improvement is to design quality into every product and process and to follow up at every stage from design to final manufacture and sale. An important element is the extensive and innovative use of statistically designed experiments."
Taguchi's definition of quality
The traditional definition of quality states that quality is conformance to specifications. This definition was expanded by Joseph M. Juran in 1974 and then by the American Society for Quality Control (ASQC) in 1983. Juran observed that "quality is fitness for use." The ASQC defined quality as "the totality of features and characteristics of a product or service that bear on its ability to satisfy given needs."
Taguchi presented another definition of quality. His definition stressed the losses associated with a product. Taguchi stated that "quality is the loss a product causes to society after being shipped, other than losses caused by its intrinsic functions." Taguchi asserted that losses in his definition "should be restricted to two categories: (1) loss caused by variability of function, and (2) loss caused by harmful side effects." Taguchi is saying that a product or service has good quality if it "performs its intended functions without variability, and causes little loss through harmful side effects, including the cost of using it."
It must be kept in mind here that "society" includes both the manufacturer and the customer. Loss associated with function variability includes, for example, energy and time (problem fixing) and money (replacement cost of parts). Losses associated with harmful side effects could include lost market share for the manufacturer and/or physical effects on the consumer, such as those of the drug thalidomide.
Consequently, a company should provide products and services such that possible losses to society are minimized, or, "the purpose of quality improvement ... is to discover innovative ways of designing products and processes that will save society more than they cost in the long run." The concept of reliability is appropriate here. The next section will clearly show that Taguchi's loss function yields an operational definition of the term "loss to society" in his definition of quality.
Taguchi Methods of Quality Engineering design are built around three integral elements:
• the loss function
• signal-to-noise ratio
• and orthogonal arrays
which are each closely related to the definition of quality.
Taguchi's loss function
We have seen that Taguchi's quality philosophy strongly emphasizes losses or costs. W. H. Moore asserted that this is an "enlightened approach" that embodies "three important premises: for every product quality characteristic there is a target value which results in the smallest loss; deviations from target value always results in increased loss to society; [and] loss should be measured in monetary units (dollars, pesos, francs, etc.)."
Figure I depicts Taguchi's typical loss function. The figure also contrasts Taguchi's function with the traditional view that states there are no losses if specifications are met.
Taguchi's Loss Function
It can be seen that small deviations from the target value result in small losses. These losses, however, increase in a nonlinear fashion as deviations from the target value increase.
Taguchi expressed this loss as a quadratic function, L(Y) = k(Y − T)². Essentially, this equation states that the loss is proportional to the square of the deviation of the measured value, Y, from the target value, T. This implies that any deviation from the target (based on customers' desires and needs) will diminish customer satisfaction. This is in contrast to the traditional definition of quality that states that quality is conformance to specifications. It should be recognized that the constant k can be determined if a value of Y and its associated loss L(Y) are both known. Of course, under many circumstances a quadratic function is only an approximation.
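As a minimal sketch, the quadratic loss function can be put into code; the 10.0 mm target and the $20 rework cost used to fix k are hypothetical numbers chosen only for illustration.

```python
# Sketch of Taguchi's quadratic loss function, L(Y) = k * (Y - T)**2.
# The target, tolerance, and repair cost below are hypothetical.

def loss_constant(cost_at_deviation: float, deviation: float) -> float:
    """Solve L = k * d**2 for k, given one known (loss, deviation) pair."""
    return cost_at_deviation / deviation ** 2

def quadratic_loss(y: float, target: float, k: float) -> float:
    """Loss in monetary units for a measured value y."""
    return k * (y - target) ** 2

# Suppose a unit that deviates 0.5 mm from a 10.0 mm target costs $20 to rework.
k = loss_constant(cost_at_deviation=20.0, deviation=0.5)

print(quadratic_loss(10.1, target=10.0, k=k))  # small deviation, small loss
print(quadratic_loss(10.5, target=10.0, k=k))  # loss grows quadratically
```

Once k is fixed from a single known (deviation, cost) pair, the same function prices every deviation inside or outside the specification limits, which is exactly where it departs from the traditional pass/fail view.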
Since Taguchi's loss function is presented in monetary terms, it provides a common language for all the departments or components within a company. Finally, the loss function can be used to define performance measures of a quality characteristic of a product or service. This property of Taguchi's loss function will be taken up in the next section. But to anticipate the discussion of this property, Taguchi's quadratic function can be converted to:

E[L(Y)] = k[σ² + (μ − T)²]

This can be accomplished by assuming Y has some probability distribution with mean μ and variance σ². This second mathematical expression states that average or expected loss is due either to process variation or to being off target (called "bias"), or both.
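The decomposition of expected loss into a variance term and a bias term can be checked numerically; the process mean, standard deviation, and k below are hypothetical.

```python
# Numerical check of the expected-loss decomposition
# E[L(Y)] = k * (sigma**2 + (mu - T)**2), using simulated data.
# All numbers are illustrative.
import random

random.seed(1)
k, target = 80.0, 10.0
mu, sigma = 10.2, 0.3                      # process is off target and varies
ys = [random.gauss(mu, sigma) for _ in range(100_000)]

avg_loss = sum(k * (y - target) ** 2 for y in ys) / len(ys)

mean = sum(ys) / len(ys)
var = sum((y - mean) ** 2 for y in ys) / len(ys)
predicted = k * (var + (mean - target) ** 2)   # variance term + bias term

print(avg_loss, predicted)   # the two agree closely
```

The check makes the point of the formula concrete: average loss can be reduced either by shrinking variation (σ²) or by moving the process mean onto target (the bias term), and both levers are visible separately.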
SIGNAL-TO-NOISE RATIO
The signal-to-noise concept is closely related to the robustness of a product design. Robustness has to do with a product's ability to cope with variation and is based on the idea that quality is a function of good design. A robust design or product delivers a strong "signal". It performs its expected function and can cope with variations ("noise"), both internal and external.
Since a good manufacturing process will be faithful to a product design, robustness must be designed into a product before manufacturing begins. According to Taguchi, if a product is designed to avoid failure in the field, then factory defects will be simultaneously reduced. This is one aspect of Taguchi Methods that is often misunderstood. There is no attempt to reduce variation, which is assumed to be inevitable, but there is a definite focus on reducing the effect of variation. "Noise" in processes will exist, but the effect can be minimized by designing a strong "signal" into a product.
This is antithetical to the "Zero Defects" policy that has been prevalent in American manufacturing. Under Zero Defects, strict on-line controls are imposed on manufacturing processes in order to minimize losses in the factory. The idea is that an effort to minimize process failure in a factory will lead to minimization of product failure in the field. Quality losses are seen in terms of costs incurred in the factory due to products that cannot be shipped, costs of rework, and so on. A product whose components exhibit wide variations within specification, and which is shipped but then fails to perform its function properly under varied conditions in the field, is not considered a loss. For Taguchi, such a product would be a loss.
The dimensionless signal-to-noise ratio is used to measure how controllable factors affect the performance of a design in the presence of noise, and it allows those factors to be adjusted conveniently. Provided that a process is consistent, adjustments can be made using the signal-to-noise ratio to achieve the desired target.
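One widely used form of the ratio, the "nominal-the-best" form SN = 10·log10(mean²/variance), can be sketched as follows; the two sets of measurements are hypothetical.

```python
# Sketch of the "nominal-the-best" signal-to-noise ratio:
# SN = 10 * log10(mean**2 / variance). A higher SN means a stronger
# signal relative to noise. The measurements below are hypothetical.
import math

def sn_nominal_the_best(ys):
    n = len(ys)
    mean = sum(ys) / n
    var = sum((y - mean) ** 2 for y in ys) / (n - 1)   # sample variance
    return 10 * math.log10(mean ** 2 / var)

design_a = [10.1, 9.9, 10.0, 10.2, 9.8]   # tight spread -> robust
design_b = [10.5, 9.3, 10.8, 9.1, 10.3]   # wide spread  -> sensitive to noise

print(sn_nominal_the_best(design_a))   # larger SN
print(sn_nominal_the_best(design_b))   # smaller SN
```

Both designs hit the same average, but the first delivers a much stronger "signal" relative to its "noise", which is exactly the robustness the text describes.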
ORTHOGONAL ARRAYS
Given that a maximized signal-to-noise ratio is crucial, how do companies go about achieving it? Most world-class companies follow a three-step process.
1. They define and specify the objective, selecting or developing the most appropriate signal and estimating the concomitant noise.
2. They define feasible options for the critical design values, such as dimensions and electrical characteristics.
3. They select the option that provides the greatest robustness, or the greatest signal-to-noise ratio.
This process may sound simple, but it really isn't. It has been said that in order to optimize the steering mechanism of a car using this method, a set of 13 design variables, each at three levels, must be analyzed. If you used the conventional method of comparing every combination of variable settings against every other, you would have to make 3¹³ = 1,594,323 experimental iterations to observe every possible combination. Clearly this is not acceptable in today's marketplace. What, then, can be done to reduce the total number of iterations necessary? Sir Ronald Fisher developed the solution: orthogonal arrays. "The orthogonal array can be thought of as a distillation mechanism through which the engineer's experiment passes." (Ealey, 1988) The array allows the engineer to vary multiple variables at one time and obtain the effects that set of variables has on the average and the dispersion. From this the engineer can track large numbers of variables and determine:
1) The contribution of individual quality-influencing factors in the product design stage.
2) The best, or optimum, condition for a process or product, so that good quality characteristics can be sustained.
3) The approximate response of the product design parameters under the optimum conditions.
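The smallest standard orthogonal array, L4(2³), illustrates the idea: four runs cover three two-level factors, and every pair of columns contains each level combination equally often. (For the steering example above, the standard L27 array handles 13 three-level factors in just 27 runs.) The array and the check below are a sketch, not tied to any particular experiment.

```python
# The standard L4(2**3) orthogonal array: four runs for three two-level
# factors. "Orthogonal" means every pair of columns contains each
# combination of levels equally often, so factor effects can be
# separated from one another.
from itertools import combinations, product

L4 = [
    (1, 1, 1),
    (1, 2, 2),
    (2, 1, 2),
    (2, 2, 1),
]

def is_orthogonal(array):
    n_cols = len(array[0])
    for i, j in combinations(range(n_cols), 2):
        pairs = [(row[i], row[j]) for row in array]
        counts = [pairs.count(p) for p in product((1, 2), repeat=2)]
        if len(set(counts)) != 1:   # every level pair must appear equally often
            return False
    return True

print(is_orthogonal(L4))   # 4 runs instead of the full 2**3 = 8
```

The same balance property is what lets the much larger standard arrays distill an enormous full-factorial space down to a handful of informative runs.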
With the benefits abundantly clear, the next question that should come to mind is "How do I use this powerful tool?" While it is not possible to cover OA's in great detail in this paper, the key points for constructing an OA can be identified. First and foremost, one must remember the main objective: to determine the optimal condition of a system of variables.
The procedure for using OA's can be broken down into seven main steps. These are as follows.
• IDENTIFY THE MAIN FUNCTION
• IDENTIFY THE NOISE FACTORS
• IDENTIFY THE QUALITY CHARACTERISTICS TO BE OBSERVED
• IDENTIFY THE CONTROL FACTORS AND ALTERNATIVE LEVELS
• DESIGN THE MATRIX EXPERIMENT AND DEFINE THE DATA ANALYSIS
• CONDUCT THE MATRIX EXPERIMENT
• ANALYZE THE DATA TO DETERMINE THE OPTIMUM LEVELS OF CONTROL FACTORS
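The last step, determining optimum levels, amounts to averaging the responses observed at each level of each factor and keeping the level with the better average. A minimal sketch, using the hypothetical L4 array and made-up S/N responses:

```python
# Sketch of the final step: estimating factor main effects from a
# matrix experiment and picking the optimum level of each control
# factor. The L4 array and the response values are hypothetical.

L4 = [(1, 1, 1), (1, 2, 2), (2, 1, 2), (2, 2, 1)]
response = [24.0, 30.0, 26.0, 36.0]   # e.g. S/N ratios of the four runs

def main_effects(array, y):
    """Mean response at each level of each column (factor)."""
    effects = []
    for col in range(len(array[0])):
        by_level = {}
        for row, val in zip(array, y):
            by_level.setdefault(row[col], []).append(val)
        effects.append({lvl: sum(v) / len(v) for lvl, v in by_level.items()})
    return effects

effects = main_effects(L4, response)
optimum = [max(e, key=e.get) for e in effects]   # level with highest mean S/N
print(effects)
print(optimum)
```

Because the array is balanced, each level average is an unconfounded estimate of that factor's main effect, so the best level of each factor can be read off independently.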
Taguchi's rule of manufacturing
Taguchi realized that the best opportunity to eliminate variation is during the design of a product and its manufacturing process. Consequently, he developed a strategy for quality engineering that can be used in both contexts. The process has three stages:
1. System design
2. Parameter design
3. Tolerance design
• System design
This is design at the conceptual level, involving creativity and innovation.
• Parameter design
Once the concept is established, the nominal values of the various dimensions and design parameters need to be set; this is the detail design phase of conventional engineering. Taguchi's radical insight was that the exact choice of values required is under-specified by the performance requirements of the system. In many circumstances, this allows the parameters to be chosen so as to minimise the effects on performance arising from variation in manufacture, environment, and cumulative damage. This is sometimes called robustification.
• Tolerance design
With a successfully completed parameter design, and an understanding of the effect that the various parameters have on performance, resources can be focused on reducing and controlling variation in the critical few dimensions.
TAGUCHI APPROACH TO ENGINEERING DESIGN
The central idea behind Taguchi's approach to quality engineering design is that variations in a product's performance can result in poor quality and monetary losses during the life span of the product. These variations can be classified as either controllable parameters or uncontrollable (noise) parameters. Controllable parameters are parameters that can be specified and modified by the designer while noise consists mainly of environmental factors and natural laws.
The distinction between these types of parameters has been and always will be with us, although as technology advances some noise factors will become controllable. A good example of this, as well as of the distinction between the two types, can be seen in a hypothetical example of the invention of the wheel. The wheel began as a square, causing a terrible ride on the old carriages. After many complaints, an engineer began analyzing the problem. The engineer realized that the ride discomfort was caused by the variation in distance between the axle and the earth when the square wheel was on an edge versus when it was on a flat surface. This distance is shown in figure 1 as h.
The engineer deduced that h decreases as the number of sides, n, of the wheel increases. Thus, h ∝ 1 − cos(π/n).
The engineer now realized that n is a controllable parameter. As he increased n, h decreased causing a smoother ride. He now ran into two noise factors, or uncontrollable parameters. The first was that technology only allowed for straight cuts in his era. He was not able to make an infinite amount of cuts and therefore could not minimize h by making the wheel round. The second noise factor was the fact that he could not control the contour of the land that the riders chose to commute on. Eventually he, or another engineer, realized that he can achieve an infinite amount of sides with only two cuts. He could cut the wheel out of a tree.
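The wheel example can be put in numbers: for an n-sided wheel of circumradius r, the ride-height variation is h = r(1 − cos(π/n)), so the relative variation h/r shrinks rapidly as the controllable parameter n grows.

```python
# Ride-height variation of an n-sided wheel of circumradius r:
# h / r = 1 - cos(pi / n). As the controllable parameter n increases,
# h falls toward zero (a perfectly round wheel).
import math

def ride_height_variation(n_sides: int) -> float:
    """Relative variation h/r for a regular n-sided wheel."""
    return 1 - math.cos(math.pi / n_sides)

for n in (4, 8, 16, 64):
    print(n, ride_height_variation(n))
```

Running this shows why each added side helps less than the last, and why the two-cut "cut it from a tree" insight (n effectively infinite) beats any finite number of straight cuts.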
Taguchi's approach can be broken down into a few different steps. These steps include problem formulation, experimental planning, experimental results and confirmation of the improvement. This is essentially a closed loop process as shown in figure 2. If the objective is not met, the procedure must begin again with modified parameters.
Benefits of Taguchi's design
Many of the orthogonal arrays that Taguchi has advocated are saturated arrays, allowing no scope for estimation of interactions. This is a continuing topic of controversy. However, this is only true for "control factors" or factors in the "inner array". By combining an inner array of control factors with an outer array of "noise factors", Taguchi's approach provides "full information" on control-by-noise interactions, it is claimed. Taguchi argues that such interactions have the greatest importance in achieving a design that is robust to noise factor variation. The Taguchi approach provides more complete interaction information than typical fractional factorial designs, its adherents claim.
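The crossed-array idea can be sketched as follows: each inner-array (control-factor) run is evaluated under every outer-array (noise-factor) condition, and an S/N ratio summarizes how robust each control setting is. The response function and all numbers here are hypothetical.

```python
# Sketch of Taguchi's crossed inner/outer arrays: every control-factor
# setting is tested under every noise condition, and a per-setting
# S/N ratio exposes control-by-noise interactions. The response
# function below is a made-up process model.
import math

inner = [(1, 1), (1, 2), (2, 1), (2, 2)]   # control-factor settings
outer = [-1.0, 0.0, 1.0]                   # noise conditions

def response(a, b, noise):
    # hypothetical process: setting a=2 damps the effect of noise
    sensitivity = 1.0 if a == 1 else 0.2
    return 10.0 + 0.5 * b + sensitivity * noise

def sn_nominal(ys):
    m = sum(ys) / len(ys)
    v = sum((y - m) ** 2 for y in ys) / (len(ys) - 1)
    return 10 * math.log10(m ** 2 / v)

for a, b in inner:
    ys = [response(a, b, z) for z in outer]
    print((a, b), round(sn_nominal(ys), 1))   # a=2 rows score higher
```

In this toy model the control factor a interacts with the noise, and the crossed layout makes that interaction directly visible in the S/N column, which is the "full information" the text refers to.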
Inefficiencies of Taguchi's design
Statisticians in response surface methodology (RSM) advocate the "sequential assembly" of designs: in the RSM approach, a screening design is followed by a "follow-up design" that resolves only the confounded interactions judged to merit resolution. A second follow-up design may be added, time and resources allowing, to explore possible high-order univariate effects of the remaining variables, as high-order univariate effects are less likely in variables already eliminated for having no linear effect. With the economy of screening designs and the flexibility of follow-up designs, sequential designs have great statistical efficiency. The sequential designs of response surface methodology require far fewer experimental runs than would a sequence of Taguchi's designs.
Conclusion
Following the lead of their Japanese counterparts, U.S. manufacturers have only recently begun to adapt the Taguchi Method to American manufacturing. In true American fashion, the Taguchi method is often used under the guise of Total Quality Control. With its basis in the competitive manufacturing crisis of the 1970s and 1980s between the U.S. and Japan, the U.S. is now enveloped in the "modern quality movement". Turning from their "company knows what's best for the customer" attitude, GM, Ford, Chrysler, and American Motors have led the current manufacturing revolution. Manufacturers are scrapping the hierarchical approach to management in exchange for a closely networked team of laborers, managers, engineers, and sales staff. This highly versatile, interlaced working community is more adept at focusing on high quality with low cost. These two components are essential if the U.S. is going to continue to win market share in cameras, televisions, automobiles, computers, and microelectronics.
The application of the Taguchi method to the automobile industry has brought about a dramatic change. The prior mindset was a blatant disregard for design defects until the final product was developed. In fact, most engineering teams worked independently of one another, without any cross-talk, until attempts to put the final design together proved dismally unsuccessful. Only then would engineers and designers collaborate, in a less than wholehearted fashion, to identify, research, and correct design flaws. Despite this scenario being repeated again and again, U.S. manufacturers continued to attempt to fit the final product into the design specifications. Following the new thinking in quality control, manufacturers are now learning to "focus on designing with minimum loss, with the product being designed as close to optimum as is feasibly possible". New philosophy, technology, and advanced statistical tools must be employed to design high-quality products at low cost. Robust design, as this method is called, is a systematic and efficient approach for finding near-optimum combinations of design parameters. Adherence to this principle ensures that the "financial loss to society" is kept to a minimum.

What significance does this have? Try to recall what the U.S. was like twenty or thirty years ago. How many recycling bins did you put out for the garbage collector? How often did you see air quality reports in the newspaper, or hear about companies being fined for emitting pollutants? If you owned a car twenty years ago, how often did you take it in for a smog check? Did you use recycled paper back then? The point is that "a poorly designed product begins to impart losses to society from the very start of the production stage". Wastewater contamination, industrial noise, smog, and acid rain are all pollutants that result from a poor-quality product.
As U.S. manufacturers begin to understand this first part of the Taguchi method, we see more and more concern for the environment. In response, the government has passed smog certification laws and strict controls on the pollutants that may be emitted from factories. As a society, we are recycling more and more.