
Continuous Process Improvement and Optimization

J C Oliveira, University College Cork, Cork, Ireland


© 2011 Elsevier Ltd. All rights reserved.

Introduction

Origin of the Concept

'Continuous process improvement' is a term that emerged from the studies undertaken in American business science on the manufacturing strategies used by Japanese industries. As such, it is often known by its Japanese term 'kaizen'. There are many other terms that became popular and are related to managerial and operational strategies from Japanese practice, of which continuous process improvement is one element, such as quality engineering, lean manufacturing, value engineering, manufacturing excellence, and world-class manufacturing; even six sigma can be related to it. Six sigma is discussed further in the article Plant and Equipment: Quality Engineering. The methodologies developed for continuous process improvement are now regarded as crucial for ensuring competitiveness in a global market and, as such, have been permeating across all sectors of the manufacturing industry.

It is worthwhile to understand the origins of the concept. Following World War II, Japanese industry was devastated. In the 1950s, the products made were of generally poor quality, and the lack of economies of scale and of investment capacity seemed to condemn Japanese industry to a secondary role for a long time, compared to the paradigm of the day, the US industry. However, by the 1970s, American companies began recognizing that Japanese products were of superior quality and lower price, beating them in their own markets. One of the industrial sectors most strongly affected by the giant leaps in quality and efficiency of Japanese industry was car manufacturing. It was obvious by then that some time during the 1950s and 1960s Japanese manufacturing had been able to find methods to produce high-quality products at a cheaper rate, even though it did not avail (at that time) of the economies of scale that the 'Detroit giants' (Ford, Chrysler, and General Motors) had. As the Toyota Motor Company was at the forefront of these developments, and practiced a notably open philosophy about itself, it became the most widely studied by American business science researchers. Toyota was not the only company developing these approaches in Japan, nor the creator of all the methodologies used, but it was one of the pioneers, and became the best known. In a sense, the fact that various people analyzed various aspects of what was later called the 'Toyota Production System' (TPS) and then added their own perspectives and interpretations has ultimately led to the emergence of several 'westernized' concepts, such as just-in-time, total quality management, continuous process improvement, and lean manufacturing. However, the Japanese, and Toyota in particular, would not consider any of these concepts an independent methodology, but rather part of a whole.

The first studies of TPS brought to global attention a completely different way of organizing production, with just-in-time stock management, a production pull system rather than a push system, and a revolutionary approach to the role and intervention of operators in the process, which in itself sparked a new philosophy about teamwork. Studies of the TPS became known as the first Toyota paradox (how can quality be achieved at low cost when there are few economies of scale?) after another influential work addressed what it called the second Toyota paradox, which deals with Toyota's new product development approach (how can a new product development cycle be faster and cheaper by delaying decisions as much as possible?). Another landmark in the dissemination of these practices was the seminal work on lean manufacturing, The Machine That Changed the World.


From Car Manufacturing to Dairy

There is a very important conclusion to bear in mind from the historical background of continuous process improvement and all the TPS-related paraphernalia of modern management best practices and fads: they were not developed and implemented by Toyota, the biggest car manufacturer in the world (that is not what Toyota was then). They were developed and implemented by Toyota, a small company producing low-quality products at a high cost because it had no economies of scale, and which therefore had very little financial capacity for investments. What it had was the belief that it could do better, that it could solve the paradox of 'high quality and low cost even without economies of scale'. Toyota did not develop its famous TPS by hiring expensive consultants to design optimum solutions by applying best-in-class principles. It had no financial or human

resources to do so. It started with months of observation of what happened in the reality of the factory floor, what operators really did and how. Then, considering what was best in class, how could those efficiencies be emulated? And finally, finding out that they could actually be bettered. It is very informative to know what really happened: Liker provides a historical account that is well worth a read.

The relevance of this to modern industries is obvious, as emulating means that abstract principles can be applied across industrial sectors, and there is no sector that has been immune to the application of the principles of continuous process improvement, lean manufacturing, six sigma, and other improvement programs (see, for instance, the list of main clients of the Kaizen Institute – one of the many consultancy groups providing services in this area – at its website). For the dairy industry, see the cover story by Markgraf on an American dairy.

It is, however, crucial to bear in mind the main lesson of Toyota's origins: if deploying continuous process improvement (or any associated principles) requires an expensive engagement of external experts who will propose an optimum solution from their analysis, then the proposition negates the philosophy right from the start. Continuous process improvement (1) must be a transformation that comes from within (albeit guidance by external experts facilitates commencement, cuts corners, and can thus save time and money), (2) must be ardently desired by the most senior management, and (3) should require few, if any, costs.


Continuous Process Improvement or Process Optimization?

As the name suggests, continuous process improvement uses methodologies to improve a manufacturing process, so the first question should be: what is the target of improvement – costs? productivity? quality? This is a methodology to improve what? The answer is simple: everything. That is one of the reasons for using the word 'paradox' in relation to the TPS: how can everything be made better, when reality usually requires compromises and trade-offs? How can costs be reduced and quality improved? The second feature of the term is the word 'continuous': process improvement is a never-ending journey; the process is to be made better all the time, one improvement after another, because the optimum is an ideal and, as such, unreachable.

Semantics, therefore, show that the concept implies an incremental approach (one step at a time), where no improvement is too small (summing enough small improvements leads to a big improvement). This reveals a fundamental difference between continuous process improvement and its 'western' counterpart, process optimization. The practice in the United States (and Europe) had been to design an optimum process. This implies defining what the optimum target is, often accepting that the multicriteria nature of 'optimum' will require some compromise or trade-off; from that conceptual definition of what the optimum should be, the process is then designed in one single go to reach that optimum, using sophisticated methods. As the design is complex, it needs advanced knowledge, so the designer is an expert in designing, does so expensively, and is unlikely to be engaged in the actual daily operation of that design. The operators, on the other hand, are typically not involved in the design, and are supposed to do exactly as told and exactly as planned in the design. This is, of course, a gross generalization, but the underlying nature of the approach can lead to such an extreme dichotomy between the optimum abstract ideal of the design and the practical reality on the factory floor.

Therefore, continuous process improvement (kaizen) experts often like to boast of the improvements they were able to extract out of what was supposedly an 'optimum' process. Notwithstanding, process optimization methodologies are not to be discarded: some kaizen engineering design methods over-rely on simplified approaches, and 'western' engineering design has developed some more consistent and comprehensive ones. From a practical point of view, of course, semantics are irrelevant, and what matters is the efficacy of the methods deployed to improve competitiveness.


Operational Improvements

The first step in improving a manufacturing process is to analyze its operations, from procurement to sales, and not only on the factory floor. Starting with purchase orders (when are ingredients purchased and how, where are they stored and for how long?) and ending with sales (manufacture-to-stock, or manufacture-to-order?) means that stock management strategies are an integral part of process improvement. In relation to the manufacturing process itself, the whole sequence of operations needs to be considered, including not only the actual processing functions but also the associated functions, such as quality control. In essence, a factory organizes a series of operations, generically, buy → make → test → sell (not necessarily linearly). The first core principle of kaizen is to analyze the operations from the point of view of the flow of the product itself, which already brings in a difference from 'western' conventions, where processes tend to be described from the managerial perspective.

In order for a process to be improved, this combination of operations must be the most efficient. Redesigning the

entire set can be associated with the concept of business process reengineering (BPR), a management concept that gave mixed results and was therefore more popular in the 1990s than it is today. BPR takes the 'western' approach: that an optimum can be designed from scratch by applying best principles, using sophisticated methods if necessary, and isolated from real practice if need be. A BPR proposes a giant leap to an ideal world. It is not a useless concept, and it can give very good results (and has, in the reported literature), but obviously, the more reality deviates from ideality, the greater the chance that the giant leap may be toward a big hole. The literature also reports negative experiences of companies with BPR. Incremental approaches therefore have this big advantage: they may be more modest, but they will always lead to a better situation, and over several incremental steps the gains will become significant. Critics will contest that incremental solutions may not explore areas of the solution space beyond convention, while advocates prefer to explore without excessive risk and point to practical end results that are equally enviable. It is noted that BPR often shows up associated with the introduction of novel solutions from information technologies (IT) to reorganize business processes. This is a special case of BPR, where it comes associated with the digestion of new IT procedures by the company and staff, which in itself (with BPR or not) has its own success and failure factors. Introducing IT may facilitate process improvement, or it may not, and it is not a concern here. IT is a tool; it should be recognized that IT always facilitates kaizen from a conceptual (theoretical) perspective – but how will it work in practice? Kaizen in itself is not about a better business concept; it is about a better business reality.

Continuous process improvement suggests starting more modestly than reinventing from scratch, by analyzing all operations and applying what is also known, to some extent, as value engineering. Of all the business operations involved in manufacturing, some add value to the product, and thus to the end client, but some do not. They may well be necessary for some reason, or they would not be there, but if they do not add value, then they are a waste from the point of view of the product and of the client. The Japanese word for wasteful activity ('muda') is often cited in this context. A better process will be one where waste is nonexistent, so these operations should, ideally, be eliminated. Therefore, it is logical to start improving a process by eliminating non-value operations (why spend time and resources improving those that are best eliminated altogether?).

This analysis is not as straightforward as it looks, because company executives and operators are not used to thinking from the point of view of the client or of the product. A typical example is quality control: from the point of view of the product or the client, it is a wasteful operation. This might seem an excessive comment, but a more detailed analysis shows the philosophy at play here. Quality control is a problem of the manufacturing company, not of the client. The client expects that quality; indeed, he paid for it. If the company is not able to deliver that quality, that is the company's problem that becomes a client's waste. From the point of view of the product, being stored somewhere waiting for the results of a quality control test is a waste of time. The product should have quality unless the process is faulty. An improved process should not be faulty, so the product should have quality all the time. This leads to the concept of quality by design, which is discussed in detail in the article Plant and Equipment: Quality Engineering.

A second principle is that the flow of the product must be continuous. From entering the raw materials storehouse to leaving for sale, the product (ingredients/components, etc.) must not stop; if it does, it is lying idle, and so there is a waste. Hence, storage is a waste, and so the concept of just-in-time, one of the first that was immediately grasped for its financial advantages. The best way to ensure this continuous flow is to pull the product from the end: the last operation sends an order to the one before last, and so forth, so the product is pulled from the end. A pull operation (a request for a product or part needed) was originally sent with a card in Toyota, or 'kanban' in Japanese, a term that endures as a synonym of just-in-time operations by a production pull strategy.
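The pull mechanism can be made concrete with a small simulation. The sketch below is illustrative and not taken from this article: it assumes one-unit kanbans, instantaneous replenishment, and always-available raw materials, and names such as pull_line and buffer_size are hypothetical.

```python
def pull_line(n_stations, demand, buffer_size=2):
    """Minimal kanban pull-line sketch: each station keeps a small
    buffer of finished units; consuming a unit at the end releases a
    kanban (replenishment order) that propagates upstream."""
    stock = [buffer_size] * n_stations  # finished-unit buffer per station
    produced = [0] * n_stations
    for _ in range(demand):
        stock[-1] -= 1  # the customer pulls one unit from the last station
        # The kanban propagates upstream: each station makes one unit to
        # refill its buffer, pulling its input from the station before it
        # (station 0 pulls raw materials, assumed always available).
        for s in range(n_stations - 1, -1, -1):
            if s > 0:
                stock[s - 1] -= 1
            stock[s] += 1
            produced[s] += 1
    return produced, stock

produced, stock = pull_line(n_stations=4, demand=10)
print(produced)  # [10, 10, 10, 10]: production exactly matches demand
print(stock)     # [2, 2, 2, 2]: buffers return to their kanban level
```

Nothing is made that was not pulled, so work-in-progress stays constant at the kanban level instead of accumulating, which is the financial advantage the text refers to.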
A good starting point for analysis of the improvements that can be made by eliminating wasteful operations (and hence becoming 'lean') is to apply the lean questionnaire developed by MIT (LESAT – Lean Self-Assessment Tool), available for free at the website of the Lean Advancement Initiative of MIT (this site contains plenty of other free tools and studies under the 'products' menu). It may help to give a different perspective on what can be classified as waste, and on how much waste there is in a normal business operation when it is analyzed from the perspective of the value to the client, or of the product flow.

Waste is not confined to operations that should be eliminated because they do not add value; TPS deals with two other types of waste. An obvious waste comes from product variability and the consequent loss of quality, which is addressed in the article Plant and Equipment: Quality Engineering, and is often referred to by its Japanese term 'mura'. The third is waste resulting from overcomplexity ('muri' in Japanese). This means that every operation and sequence of operations should be as simple and direct as possible. Standardization of simple operations and their straightforward combination

are the ideal basis for eliminating this type of waste and hence maximizing productivity. In some industrial sectors where operations involve manual handling, eliminating muri requires appropriate ergonomic design of the workplace and workstations (the operation should be easy and simple to perform, thus minimizing energy, effort, and time).


Process Engineering Improvements

General Approach

The operational and managerial implications of kaizen leave a highly flexible, efficient, and productive factory floor, alongside an entire lean supply chain. Operations on the factory floor involve a series of equipment units where the product components are progressively incorporated and turned into the final product. Each of these operations must now be improved.

In the context of a process industry such as dairy, the raw materials undergo transformations toward a product of a different nature (e.g., fermentation in yogurts, coagulation in cheese) or transformations that hinder microbial activity to provide shelf life (e.g., ultra-high-temperature (UHT) processing, drying). These operations are controlled by a number of variables or control factors, such as temperatures, flow rates, amounts of ingredients, and time of operation. The company decides on the settings (values) of these factors from its knowledge of the system, the characteristics of the equipment, recipes, and other things. Each of these control factors can be set at any value within a range of physical interest, and there is an infinite number of combinations of settings of these factors that result in a final product, although not all combinations are possible. Some, however, result in products that are better than others, or in processes that are cheaper, more productive, or use less energy, and so on (for instance, the objective of reaching a given microbial lethality in the thermal treatment of liquid milk can be achieved equally at a higher or a lower temperature, with the processing time being longer the lower the temperature; but the quality characteristics of the product, such as nutrient retention and taste, are better at high temperature-short time than at low temperature-long time). In this context, process improvement will deploy methods to find a combination of settings of the control factors that will result in a better performance.
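The time-temperature trade-off in the parenthetical example follows from the classical thermal death time relation, F = t · 10^((T − Tref)/z). The sketch below is illustrative rather than taken from this article; the z-value and the 72 °C/15 s reference hold are assumed figures for the example.

```python
def equivalent_time(t_ref, T_ref, T_new, z):
    """Holding time at T_new that delivers the same lethality (F-value)
    as t_ref seconds at T_ref, under the first-order thermal death time
    model F = t * 10 ** ((T - T_ref) / z)."""
    return t_ref * 10 ** ((T_ref - T_new) / z)

# Example: a 15 s hold at 72 degC (a common pasteurization benchmark);
# assuming z = 10 degC, the equivalent hold at 75 degC is about half:
print(equivalent_time(t_ref=15, T_ref=72, T_new=75, z=10))  # ~7.5 s
```

Lethality is held constant along this curve, while quality attributes are not, which is exactly what gives the improvement methods room to work.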
It may be possible to estimate better combinations of these settings by developing mathematical models that mimic the operation of the equipment by applying first-principle equations (laws of conservation of mass, energy, and momentum, thermodynamics, kinetics, heat and mass transfer, etc.). However, such models will need to make assumptions that may or may not be good images of reality, and they may also be too complex. The continuous process improvement concept prefers to 'let the system speak for itself', and to obtain process data from which to infer improvements. Process optimization does the same, only it assumes that a single giant leap is possible and that the absolute optimum is identifiable. Kaizen is prepared to move stepwise.

The need for process data from which to infer directions of improvement is an issue in some industrial sectors, such as dairy. If the equipment operates with large quantities of product at any given time, and the business margins are small, there is little room to perform tests with the equipment, each of which could cost many kilograms of product that run the risk of being unsuitable for sale (when the test happens to try settings of the control factors that do not lead to a suitable product). However, process data can also be collected simply from the historical records of the process. This has the huge disadvantage of providing data that have not been planned, with many data points around the same settings, with uncontrolled variabilities, and with a scan of the solution space decided by chance and the inaccuracies of the control systems. However, it is 'free' data and may reveal useful information about the system, if handled properly.


Design of Experiments

Kaizen recommends an experimental plan, and as such it starts by giving great importance to the planning of the tests. It is obvious that if each test involves the actual process and equipment, the most information needs to be obtained from the least amount of data; therefore, it is not surprising that the area of statistics dealing with design of experiments (DoE) must be brought in.

The most widely used approach in Japan is due to Genichi Taguchi, and is generally known as the Taguchi method of robust engineering design. The method seeks to achieve improvement of performance and also consistency of performance, and as such it is the foundation of quality engineering too, discussed in the article Plant and Equipment: Quality Engineering. In the context of performance improvement or optimization, it is noted that, in order to minimize experimental requirements and take as much information as possible from the data, Taguchi chose designs based on orthogonal arrays (aka Latin squares). They are usually designated as L-4, L-8, L-9, L-12, L-16, L-27, and so on, where L stands for 'Latin squares' and the number indicates the number of rows of the array, which is also the number of tests that need to be performed. When consistency of performance is to be considered as well, the whole set must be repeated a number of times, and therefore using arrays with as few rows as possible is important. Each array will allow testing

a number of control factors, which depends on the array, with one factor associated with a column of the array. That column will contain the settings to be used for that factor. Some arrays have only two different settings (two-level designs), others have three (three-level designs), and other commonly used arrays have four, by combining columns of two-level factors (for instance, M-16 or L-16M refers to the L-16 array modified, which tests four different settings of factors, but can be used for far fewer factors than the original L-16). There are also some mixed-level designs. Table 1 shows an example, the L-8 array, which can be used to test up to seven factors, with only two settings used for each.

Table 1  L-8 orthogonal array

Test no.   1   2   3   4   5   6   7
1          1   1   1   1   1   1   1
2          1   1   1   2   2   2   2
3          1   2   2   1   1   2   2
4          1   2   2   2   2   1   1
5          2   1   2   1   2   1   2
6          2   1   2   2   1   2   1
7          2   2   1   1   2   2   1
8          2   2   1   2   1   1   2

Control factors are assigned to columns, and the rows indicate the settings of each factor for each of the eight tests that need to be performed. Each factor is tested with only one of two settings. Designs should be replicated, if possible.
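In code, such an array is simply a matrix of level indices, and a test plan is obtained by mapping each factor's candidate settings onto one column. The sketch below is illustrative only: the factor names and candidate settings are hypothetical, not taken from this article.

```python
# The L-8 orthogonal array of Table 1 (8 runs, up to 7 two-level columns).
L8 = [
    [1, 1, 1, 1, 1, 1, 1],
    [1, 1, 1, 2, 2, 2, 2],
    [1, 2, 2, 1, 1, 2, 2],
    [1, 2, 2, 2, 2, 1, 1],
    [2, 1, 2, 1, 2, 1, 2],
    [2, 1, 2, 2, 1, 2, 1],
    [2, 2, 1, 1, 2, 2, 1],
    [2, 2, 1, 2, 1, 1, 2],
]

# Hypothetical factors with their two candidate settings each:
factors = {
    "temperature_C": (70, 80),
    "flow_rate_l_per_s": (1.0, 2.0),
    "holding_time_s": (10, 20),
}

# Assign each factor to one column; the remaining columns stay free
# (they carry interactions/error in the later analysis).
for test_no, row in enumerate(L8, start=1):
    plan = {name: levels[row[col] - 1]
            for col, (name, levels) in enumerate(factors.items())}
    print(f"Test {test_no}: {plan}")
```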
These experimental designs are very good for limiting the number of tests that need to be performed, but they come at a cost: the design generates an intricate set of confoundings that may be difficult to separate. 'Confounding' is the statistical name given to a combination of terms or factors in the design that results in their effects being indistinguishable. It does not mean that the confounded factors are confounded by nature; it is a consequence of the experimental design that their impact is confounded (pooled together). For instance, if a design tests a system at 20 and 30 °C, and also considers a flow rate of 1 or 2 l s⁻¹, but in all the tests performed the flow rate of 1 l s⁻¹ was always used with 20 °C and the flow rate of 2 l s⁻¹ was used only with 30 °C, then when analyzing the data it is not possible to know whether the differences were due to (1) the temperature increase, with the flow rate change being irrelevant, (2) the flow rate increase, with the change in temperature being irrelevant, or (3) both changes. Orthogonal arrays do not produce confoundings between factors, but they do between the effects of factors and the effects of their interactions.

An 'interaction' is a basic feature of nonlinearity in a system. It means that the way a factor influences the system depends on the actual value of another factor. For instance, if increasing the flow rate has no effect at the lower temperature but is important at the higher temperature, there is an interaction between flow rate and temperature.

For this reason, 'western' statisticians prefer other experimental designs that provide a better differentiation between different effects, such as the 'central composite design'. This design uses five settings for each factor, and tests some combinations of those settings that conform to a particularly well-balanced view of the solution space, which minimizes the error regions of the mathematical models then used to interpret the data. However, it requires a larger amount of data than the orthogonal arrays.

Whatever DoE is chosen, it is noted that it can be considered as a plan for tests that will be done (the ideal scenario), or as a filter to be used to collect data from historical records in order to ensure minimum bias in the analysis of those data (even if only a small subset of data is thus collected; this is preferable to overweighing parts of the solution space).
Stepwise Approach

The Taguchi method advocates a comprehensive approach including meetings, brainstorming, and teamwork that is not detailed in this text, but it is noted that the first step must be to consider how many factors may influence the system. Usually, this analysis compiles a large number of factors that may be interesting to analyze. If the system should 'speak for itself', then one should not curtail the list by rational thought; instead, it is better to rely on Vilfredo Pareto's famous principle, the 80/20 rule, and use a first experimental plan to zoom in on the 20% of factors that may be causing 80% of the consequences. This is also where analyzing historical records could be helpful and provide 'free' information. A simple two-level orthogonal array design can be used for this purpose. An L-8 can test up to 7 factors with 8 runs, an L-16 allows one to consider up to 15 factors with only 16 runs, and an L-32 would allow for up to 31 factors with just 32 runs (and can, and should, be repeated), and so on, so that the selection can be done quite efficiently.

It is noted that the consequence of the confoundings of this design is that some factors that may be considered important in the data analysis could actually be negligible (this would be because one of the confounded interactive effects was relevant); but if a factor is judged to be negligible, then it is, and so are all effects confounded with it. That means that nothing of importance is lost by analyzing in this way which factors are more crucial for improving the performance.

There are possibilities for overlooking important issues with these two-level designs, though: if a factor has an influence that shows a maximum or minimum of its average impact on the performance, it is possible that by a stroke of bad luck the averages at the two extremes used for the settings (low and high, 1 and 2) are similar, and the
data analysis will then infer that the factor was negligible, while data from a design with three levels would conclude otherwise. Continuous process improvement is, however, prepared to be incremental – there will always be time to go back for more analysis and improvements.

Once two, three, or at most four control factors are chosen as the most crucial ones, a design with more levels (three, four, or a central composite design) can be used and the solution space explored in more detail to reveal regions of improved performance.


Data Analysis: Identifying Crucial Factors

The Taguchi method does not need to fit mathematical models to the data obtained with the orthogonal array design. Instead, it applies an analysis of variance (ANOVA), a statistical method that quantifies the variability of the data collected by its variance, and then determines how much of it can be explained by the fact that each of the factors changed its settings. If the amount of data is sufficient to have enough degrees of freedom with the DoE used, a significance test can also be applied to ascertain which factors have a statistically significant effect. Results can be shown in the form of a pie chart, which gives a very good image of what is more important and what might be neglected in a first approach.
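The core computation reduces, for each factor column, to a between-level sum of squares. The sketch below is illustrative: the measurements are invented (they are not the particle coating data of Figure 1), and the column shown is column 1 of the L-8 repeated once for the replicate.

```python
from collections import defaultdict

def anova_ss(levels, y):
    """Between-level sum of squares for one factor: the sum over levels
    of n_level * (level mean - grand mean)**2. `levels` gives the setting
    used for that factor in each run, `y` the measured performance."""
    grand = sum(y) / len(y)
    groups = defaultdict(list)
    for lv, val in zip(levels, y):
        groups[lv].append(val)
    return sum(len(g) * (sum(g) / len(g) - grand) ** 2
               for g in groups.values())

# Replicated L-8 (16 data points): the settings column for a factor is
# its array column repeated twice; the measurements are hypothetical.
col = [1, 1, 1, 1, 2, 2, 2, 2] * 2
y = [78, 80, 75, 77, 88, 90, 85, 86,
     79, 81, 74, 78, 87, 91, 86, 85]
print(round(anova_ss(col, y), 2))  # factor SS for this column
```

Repeating this for every column and subtracting the factor sums of squares from the total gives the error term, whose variance is the denominator of the F test shown in the ANOVA table of Figure 1.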
A typical result from ANOVA, with the most relevant Figure 2 gives an example of the graphs showing the
outcomes in the form of a table and pie chart for a averages of the data for each setting of each factor,
situation where the relevance of seven factors was tested known as the ‘means plots’, for a system where three
with an L-8, is shown in Figure 1 (case study regarding a factors were changed with four levels, according to a
particle coating process). In this example, the design was modified L-16. The maximum performance in that case
replicated (2 runs for each condition, totaling 16 data is suggested for the combination of settings 3-4-2 of the
points), which gives enough degrees of freedom to test three factors. The estimated performance of the system
for significance. The pie chart shows not only the relative for that combination of settings is 99.08, obtained sim-
importance of each factor, but also how much of the ply by adding to the global average of the data (81.29
variability of the data is unexplained. The unexplained in that case) the incremental benefit of choosing the
amount of the variance may be due to (1) the relevance of respective setting of each factor, as per the means plots.
interactions and the consequences of the intricate con- Estimating performance in this way may lead to phy-
founding of the design, (2) the influence of other factors sically inconsistent values in some cases (for instance,
that were not controlled and not considered in the design, performance above 100% or losses below 0%), which
and (3) the natural variability, or white noise, which may can be improved by a logarithmic transformation, such
come from variability in control factors, in the character- as the omega transformation.
istics of the materials or process, and also of the method
of analysis of the performance, which is similar to random
Data Analysis: Identifying Settings for Improved
experimental error. The example shown suggests that
Performance with Response Surface Method
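That additive estimate, and the omega transformation that keeps it within physical bounds, can be sketched as follows. The level means are invented for illustration (only the 'best setting per factor, plus increments over the grand mean' logic comes from the text), and the omega formula shown is the usual decibel form.

```python
import math

def predict_best(grand_mean, level_means_per_factor):
    """Taguchi additive estimate: grand mean plus, for each factor, the
    gain of its best level over the grand mean (interactions assumed
    negligible). Maps factor -> list of mean performance per setting."""
    y = grand_mean
    best = {}
    for factor, means in level_means_per_factor.items():
        m = max(means)
        best[factor] = means.index(m) + 1  # setting number (1-based)
        y += m - grand_mean
    return best, y

def omega_db(p):
    """Omega transformation of a fraction 0 < p < 1, in decibels:
    -10*log10(1/p - 1). Predictions made on this scale and transformed
    back cannot leave the physically meaningful 0-100% range."""
    return -10 * math.log10(1 / p - 1)

# Hypothetical level means for three factors at four settings each:
means = {"A": [78, 83, 88, 80], "B": [79, 81, 84, 92], "C": [80, 86, 82, 77]}
best, y_hat = predict_best(81.29, means)
print(best, round(y_hat, 2))  # {'A': 3, 'B': 4, 'C': 2} 103.42
```

Here the additive estimate exceeds 100%, which is exactly the kind of physically inconsistent value that working on the omega scale avoids.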
Data Analysis: Identifying Settings for Improved Performance with Response Surface Method

An alternative that has been widely applied in Europe and the United States is the response surface method (RSM; or response surface analysis (RSA)). It can be applied to any design, including the central composite design, and is based on postulating a mathematical model to describe the influence of the factors (and interactions) on the performance. It has the advantage that designs such as the central composite design, which have much fewer confounding issues, can be handled, but
(a) ANOVA table for the data:

Factor               SS        df   Variance   F
Torque               0.000798   1   0.000798    0.739589
Rotational speed     0.00158    1   0.00158     1.464292
Temperature          9.51E-05   1   9.51E-05    0.088097
Oil viscosity*       0.008145   1   0.008145    7.548277
Drip feed rate*      0.037733   1   0.037733   34.96838
Drip concentration*  0.021098   1   0.021098   19.55175
Coating*             0.037153   1   0.037153   34.43041
Error                0.008633   8   0.001079
Total                0.115234  15   0.007682

Total no. of data points: 16; F-limit: 3.457919; Ne: 2

(b) Pie chart of the corrected sums of squares: Drip feed rate 34%, Coating 32%, Drip concentration 18%, Oil viscosity 7%, Error 7%, Torque 1%, Rotational speed 1%, Temperature 0%.

Figure 1  Example of an analysis of variance (ANOVA) table (a) and pie chart of the corrected sums of squares (b) for a system potentially influenced by seven factors, tested with an L-8 design replicated once. Factors marked with an asterisk in the ANOVA table have statistical significance at a 90% confidence level. Ne is the effective number of data points, which is equal to the actual number of data points divided by 1 + the sum of the degrees of freedom of the factors used to produce the estimate. The table and graph were produced in MS Excel.

the disadvantage is that the results will assume the validity of the mathematical model, and so the lack of fit is added to the overall amount of unexplained variance. The simplest model is a quadratic multifactorial polynomial, that is, the sum of linear, interactive, and quadratic terms. A linear term is proportional to the value of the factor, an interactive term is proportional to the product of the values of one factor and another, and a quadratic term is proportional to the square of the value of the factor. All values must be normalized between a maximum and a minimum, which is called 'coding' (numerically, −1 and 1 are common, but it can also be 0 and 1, or 0 and 100%). While being useful in terms of all parameters of the model having a very clear meaning, it assumes parabolic curves for all effects, which tends to suggest points of minimum or maximum that do not really exist.

Once a model is fitted to the data and its goodness of fit accepted, it can be used to pinpoint the location of the best combination of settings (searching for the point of maximum or minimum within the constraints of the solution space). The goodness of fit is typically quantified by the coefficient of determination (designated R²), which quantifies the percentage of the variance of the data that is explained by the model; values over 90% are usually desired.
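A minimal sketch of such a fit is shown below for two coded factors. The design points and responses are invented for illustration (a small central composite layout), not the case study of the figures.

```python
import numpy as np

# Coded settings (-1 ... +1, plus axial points) of two factors, and a
# measured response; illustrative data only.
x1 = np.array([-1, -1, 1, 1, -1.41, 1.41, 0, 0, 0, 0])
x2 = np.array([-1, 1, -1, 1, 0, 0, -1.41, 1.41, 0, 0])
y = np.array([12.1, 9.8, 8.2, 7.9, 11.5, 7.5, 10.8, 8.9, 9.0, 9.2])

# Quadratic multifactorial polynomial:
# y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1**2 + b22*x2**2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# Coefficient of determination: share of the variance of the data
# explained by the model (values over ~90% are usually desired).
residuals = y - X @ b
r2 = 1 - (residuals @ residuals) / ((y - y.mean()) @ (y - y.mean()))
print(np.round(b, 3), round(float(r2), 3))
```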
[Figure 2 appears here: three means plots, one per factor (drip feed rate, coating, and drip concentration), showing the performance average at each of the four settings, with the best setting marked for each factor.]

Figure 2  Example of means plots of the performance of a system affected by three factors changed with four settings each, according to a modified L-16 array. The dotted line indicates the average of all data. For each factor, the data are divided into four subsets according to the setting of that factor, and the dots show the four averages. Calculations and plots were produced in MS Excel.

There are two useful plots: Pareto charts and surface plots. The Pareto chart represents the standardized effects as horizontal bars, and as such the limit of statistical significance can also be represented as a vertical line. The bigger the bar, the more important the effect. Standardized effects are the effects divided by their standard errors; when using a model, they are equal to the parameters of the model divided by their standard errors (the same quantities from which the confidence intervals of the parameters are computed). An example is shown in Figure 3 (a case study of the loss of an active ingredient in a pasteurization process, as affected by flow rate, inlet temperature, and length of the holding section). Surface plots can only be made for a pair of factors at a time, and they can be represented in three dimensions (3D) or two dimensions (2D) (Figure 4) to help visualize the regions of improved performance, as predicted by the model.
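Continuing the least-squares sketch above, the standardized effects plotted in a Pareto chart can be computed from the fit; this is again an illustrative sketch, assuming ordinary least-squares inference.

```python
import numpy as np
# Continuing from the fit above: the standardized effect of each term
# is its coefficient divided by its standard error; bars larger than
# the t critical value (the vertical line in Figure 3) are significant.
n, p = X.shape
s2 = (residuals @ residuals) / (n - p)           # residual variance
cov = s2 * np.linalg.inv(X.T @ X)                # covariance of b
std_effects = np.abs(b) / np.sqrt(np.diag(cov))  # |t| values
for name, t in zip(["b0", "b1", "b2", "b12", "b11", "b22"], std_effects):
    print(name, round(float(t), 2))
```

The critical value would come from a t table with n − p degrees of freedom at the chosen confidence level.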

[Figure 3 is a Pareto chart of standardized effects (horizontal axis 0-10); the values shown, largest to smallest, are:]

Term   Standardized effect
P1     9.35
P11    5.98
P2     5.27
P12    4.82
P22    2.89
P3     1.81
P13    0.95
P33    0.71
P23    0.63

Figure 3  Example of a Pareto chart for a system affected by three factors, which was tested with a central composite design fitted with a quadratic model. The vertical line represents the limit of significance at a 95% confidence level. The notation 1, 2, or 3 refers to the linear effects (linear terms of the model), 11, 22, and 33 to the quadratic effects, and 12, 13, and 23 to the interactive effects, where 1, 2, and 3 are the generic names of the factors. In this example, factor 3 and all its interactions are not statistically significant. Factor 1 is the most important, and its effect is strongly nonlinear, showing a significant curvature (quadratic effect very relevant) and an influence affected by the settings of factor 2 (interactive effect 1 × 2 significant). In this case study, the factors were inlet temperature, flow rate, and length of the holding section in a pasteurizer. The calculations and graph were produced in MS Excel.
Figure 4  Examples of three-dimensional (a) and two-dimensional (b) plots of the influence of two factors on the performance of a system predicted by a model (other influential factors are kept at a constant setting). In the example, optimum performance means minimum losses of a valuable active ingredient, and the two factors shown are inlet temperature and flow rate. The plots were made using the Statistica software (StatSoft).

See also: Plant and Equipment: Quality Engineering.


Further Reading

Bashein B, Markus M, and Riley P (1994) Preconditions for BPR success – and how to prevent failures. Information Systems Management 11(2): 7–13.
Liker J (2004) The Toyota Way – Fourteen Management Principles from the World's Greatest Manufacturer. New York: McGraw-Hill.
Markgraf S (1997) Fortified with kaizen: Superior Dairy celebrates its 75th anniversary with a 'different' approach to doing business – kaizen business concept. Dairy Foods, 1 September 1997.
Monden Y (1981a) What makes the Toyota production system really tick? Industrial Engineering 13(1): 36.
Monden Y (1981b) Adaptable kanban system helps Toyota maintain just-in-time production. Industrial Engineering 13(5): 29.
Monden Y (1981c) Smoothed production lets Toyota adapt to demand changes and reduce inventory. Industrial Engineering 13(8): 42.
Monden Y (1981d) Toyota production smoothing. 2. How Toyota shortened supply lot production time, waiting time and conveyance time. Industrial Engineering 13(9): 22–30.
Montgomery D (2009) Design and Analysis of Experiments, 7th edn. Hoboken, NJ: John Wiley & Sons.
Ross P (1988) Taguchi Techniques for Quality Engineering. New York: McGraw-Hill.
Roy R (2001) Design of Experiments Using the Taguchi Approach – 16 Steps to Product and Process Improvement. New York: Wiley Interscience.
Shingo S and Dillon AP (1989) A Study of the Toyota Production System from an Industrial Engineering Viewpoint. Norwalk, CT: Productivity Press.
Sugimori Y, Kusunoki K, Cho F, and Uchikawa S (1977) Toyota production system and kanban system: Materialization of just-in-time and respect-for-human system. International Journal of Production Research 15(6): 553–564.
Taguchi G, Chowdhury S, and Taguchi S (2000) Robust engineering implementation strategy. In: Robust Engineering – Learn How to Boost Quality While Reducing Costs and Time to Market, ch. 2, pp. 10–15. New York: McGraw-Hill.
Ward A, Liker JK, Cristiano JJ, and Sobek DK (1995) The second Toyota paradox: How delaying decisions can make better cars faster. Sloan Management Review 36(3): 43–61.
Womack J, Jones D, and Roos D (1990) The Machine That Changed the World – The Story of Lean Manufacturing. New York: Macmillan Pub. Co.


Relevant Websites

http://kaizen.com – Kaizen Institute.
http://lean.mit.edu – Lean Advancement Initiative of MIT.
