Case Studies


Published: March 2016
Authors: Mark Anderson, Wayne Adams, Patrick Whitcomb

By sizing experiment designs properly, test and evaluation (T&E) engineers can ensure that they specify a sufficient number of runs to reveal any important effects on the system. For factorial designs laid out in an orthogonal matrix, this can be done by calculating statistical power (Anderson and Whitcomb, 2014). However, when a defense system behaves in a nonlinear fashion, response surface method (RSM) experiment designs must be employed (Anderson and Whitcomb, 2005). Because RSM test matrices generally do not exhibit orthogonality, the effect estimates become correlated, which degrades statistical power and inflates the number of test runs needed to detect important performance differences. A generally accepted alternative for sizing such designs makes use of fraction of design space (FDS) plots. This article details the FDS approach and explains why it best serves the purpose of RSM experiments done for T&E.
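
The FDS idea described above can be sketched numerically: evaluate the scaled prediction variance at many random points in the factor region, sort the values, and plot them against the cumulative fraction of the space. The design, model, and sample size below are invented for illustration and are not the article's implementation.

```python
# Sketch of a fraction of design space (FDS) computation for a two-factor
# quadratic model fitted on a face-centered central composite design.
import numpy as np

def quadratic_model_row(x1, x2):
    """Expand a point into the model terms: 1, x1, x2, x1*x2, x1^2, x2^2."""
    return np.array([1.0, x1, x2, x1 * x2, x1**2, x2**2])

# Face-centered CCD: 4 factorial points, 4 axial points, 1 center point
design = [(-1, -1), (1, -1), (-1, 1), (1, 1),
          (-1, 0), (1, 0), (0, -1), (0, 1), (0, 0)]
X = np.array([quadratic_model_row(*pt) for pt in design])
XtX_inv = np.linalg.inv(X.T @ X)

# Scaled prediction variance at random points across the design space
rng = np.random.default_rng(0)
points = rng.uniform(-1, 1, size=(5000, 2))
spv = np.array([quadratic_model_row(*p) @ XtX_inv @ quadratic_model_row(*p)
                for p in points])

# FDS curve: fraction of the space at or below each prediction-variance value
spv.sort()
fraction = np.arange(1, len(spv) + 1) / len(spv)
# e.g. the prediction variance attained over 80% of the space:
print(round(spv[int(0.8 * len(spv))], 3))
```

Plotting `fraction` against `spv` gives the FDS curve; a design that keeps the curve low and flat predicts with near-uniform precision across the region.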

Publication: ITEA Journal

Employing Power to "Right-Size" Design of Experiments

Published: March 2014
Authors: Mark Anderson, Patrick Whitcomb

This article provides insights on how many runs are required to make it very likely that a test will reveal any important effects. Due to the mathematical complexities of multifactor design of experiments (DOE) matrices, the calculations for adequate power and precision (Oehlert and Whitcomb, 2002) are not practical to do by hand, so the focus is kept at a high level: scoping out the forest rather than detailing all the trees. By example, readers will learn the price that must be paid for an adequately sized experiment and the penalty incurred by conveniently grouping hard-to-change factors. (The article is not available on the ITEA Journal web site without membership. Click on the "Download" link to view the manuscript.)
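
Although the article keeps the power calculations at a high level, a rough version can be sketched for the simplest case: detecting one effect in a two-level factorial. The signal-to-noise ratio, term count, and alpha below are assumptions for illustration, not the Oehlert-Whitcomb procedure itself.

```python
# Back-of-envelope power for detecting a main effect in a 2-level factorial,
# assuming unit error standard deviation (so "signal" is effect/sigma).
from scipy import stats

def factorial_power(n_runs, n_model_terms, signal_to_noise=2.0, alpha=0.05):
    """Approximate power to detect one effect in a 2-level factorial."""
    df_error = n_runs - n_model_terms - 1          # residual degrees of freedom
    se_effect = 2.0 / n_runs**0.5                  # std error of an effect, sigma = 1
    ncp = signal_to_noise / se_effect              # noncentrality parameter
    t_crit = stats.t.ppf(1 - alpha / 2, df_error)  # two-sided critical value
    # power = P(|T| > t_crit) under the noncentral t distribution
    return (1 - stats.nct.cdf(t_crit, df_error, ncp)
            + stats.nct.cdf(-t_crit, df_error, ncp))

for n in (8, 16):
    print(n, round(factorial_power(n, n_model_terms=4), 2))
```

Doubling the runs both shrinks the standard error of the effect and adds error degrees of freedom, which is why power climbs quickly with design size.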

Publication: The ITEA Journal

Published: March 2014
Authors: Mark Anderson, Patrick Whitcomb

Due to operational or physical considerations, standard factorial and response surface method (RSM) design of experiments (DOE) often prove unsuitable. In such cases a computer-generated, statistically optimal design fills the breach. This article explores vital mathematical properties for evaluating alternative designs, with a focus on what is really important for industrial experimenters. To assess “goodness of design,” such evaluations must consider the model choice, specific optimality criteria (in particular D and I), precision of estimation based on the fraction of design space (FDS), the number of runs needed to achieve the required precision, lack-of-fit testing, and so forth. With a focus on RSM, all of these issues are considered at a practical level, keeping engineers and scientists in mind. This brings to the forefront such considerations as subject-matter knowledge from first principles and experience, factor choice, and the feasibility of the experiment design.
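
The D and I criteria mentioned above can be illustrated with a toy comparison: D-optimality maximizes the determinant of the information matrix, while I-optimality minimizes the average prediction variance over the region. The two candidate one-factor designs below are made up for the sketch.

```python
# Comparing two candidate designs on D- and I-criteria for a quadratic
# model in one factor. Purely illustrative, not an optimal-design search.
import numpy as np

def model_matrix(points):
    """Quadratic model in one factor: 1, x, x^2."""
    x = np.asarray(points, dtype=float)
    return np.column_stack([np.ones_like(x), x, x**2])

def d_criterion(X):
    """Determinant of the information matrix X'X (bigger is better)."""
    return np.linalg.det(X.T @ X)

def i_criterion(X, grid=np.linspace(-1, 1, 201)):
    """Average prediction variance over the region (smaller is better)."""
    XtX_inv = np.linalg.inv(X.T @ X)
    F = model_matrix(grid)
    return np.mean(np.sum(F @ XtX_inv * F, axis=1))

design_a = [-1, -1, 0, 1, 1]      # replicated endpoints plus a center point
design_b = [-1, -0.5, 0, 0.5, 1]  # evenly spaced
for name, d in (("A", design_a), ("B", design_b)):
    X = model_matrix(d)
    print(name, round(d_criterion(X), 2), round(i_criterion(X), 3))
```

Pushing runs to the extremes and center (design A) raises the determinant, which is why D-optimal point sets cluster at informative locations rather than spreading evenly.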

Publication: Journal of Statistical Science and Application

Published: March 2012
Authors: Patrick Whitcomb, Mark Anderson

Statistical methods are becoming increasingly vital for pharmaceutical manufacturers. Design of experiments (DOE) is a primary tool for determining the relationship between the factors that have an effect on a process and the response of that process.

Publication: Stat-Ease, Inc.

Published: September 2010
Authors: Mark Anderson, Patrick Whitcomb

The statistical design of experiments is an essential ingredient of successful product development and improvement, and provides an efficient and scientific approach to obtaining meaningful information. In contrast to traditional one-factor-at-a-time (OFAT) experimentation, variables are changed together, permitting evaluation of interactions. Standard texts give details about the construction of specific test plans, such as full and fractional factorial and response surface designs, and the analysis of the resulting data. This article gives a brief overview. The focus here is on the fundamental elements of experimental design: defining the purpose and scope of the experiment, differentiating between alternative types of experimental variables, understanding the underlying environment and constraints, and conducting stage-wise experimentation. Brief discussions dealing with the statistical analysis tools, multiple response variables, and some historical background are also provided.
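
The OFAT-versus-factorial contrast drawn above can be made concrete with a minimal sketch (factor layouts invented for illustration): an OFAT plan never visits all sign combinations, so an interaction contrast cannot be estimated, while the full factorial covers them all.

```python
# Run layouts for two coded factors: OFAT vs. a 2^2 full factorial.
from itertools import product

ofat = [(-1, -1), (1, -1), (-1, 1)]           # vary one factor at a time
factorial = list(product((-1, 1), repeat=2))  # all four combinations

# The A*B interaction contrast needs both +1 and -1 products of the columns;
# the factorial supplies them, the OFAT plan omits the (+1, +1) run.
ab_signs = sorted({a * b for a, b in factorial})
print(len(factorial), ab_signs)
```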

Publication: Kirk-Othmer Encyclopedia of Chemical Technology

Making Use of Mixture Design to Optimize Olive Oil - A Case Study

Published: August 2009
Authors: Mark Anderson, Patrick Whitcomb

Olive oil, an important commodity of the Mediterranean region and a main ingredient of their world-renowned diet (see sidebar), must meet stringent European guidelines to achieve the coveted status of "extra virgin." Oils made from single cultivars (a particular cultivated variety of the olive tree) will at times fall into the lower "virgin" category due to seasonal variation. Then it becomes advantageous to blend in one or more superior oils based on a mixture design for optimal formulation.
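
A blending study like the one described above typically starts from a standard mixture layout such as a simplex-lattice design, in which component proportions sum to one. The sketch below generates a generic {3, 2} lattice; it is illustrative only and does not reproduce the article's actual olive-oil blends.

```python
# Generate a {q, m} simplex-lattice mixture design: all blends of q
# components whose proportions are multiples of 1/m and sum to 1.
from itertools import product

def simplex_lattice(q, m):
    """Return the lattice points as tuples of proportions."""
    levels = range(m + 1)
    return [tuple(k / m for k in combo)
            for combo in product(levels, repeat=q)
            if sum(combo) == m]

for blend in simplex_lattice(q=3, m=2):
    print(blend)  # each row is a candidate blend; proportions sum to 1
```

For three oils this yields the three pure blends plus the three 50/50 binary blends, enough to fit a second-order mixture model.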

Publication: ASQ Chemical and Process Industries Division Newsletter

Automated Optimization of a Multiplex PCR Using Sagian AAO Software for the Biomek FX Liquid Handling System

Published: October 2007
Authors: Dana Campbell, Lisa Fan, Keith Roby, Graham Threadgill, Patrick Whitcomb

Optimizing biological assay conditions is often a challenge for scientists, who are expected to produce quality, robust assays that work across a range of biological conditions within a short development timeframe. In addition, automated systems are often required to enable scientists to screen in a high-throughput environment.

Publication: Beckman Coulter

Graphical Selection of Effects in General Factorials

Published: October 2007
Author: Patrick Whitcomb

This PowerPoint presentation demonstrates the equivalency of a new method with Daniel's half-normal plot of effects for two-level factorials. It then demonstrates the general method on two replicates of a 3x2 factorial, two replicates of a 3x2x2 factorial, and a single replicate of a 3x4x4 factorial.
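
For the two-level baseline that the presentation compares against, Daniel's half-normal plot ranks the absolute effects and pairs them with half-normal quantiles; effects that break away from the near-zero line are judged active. The responses below are invented, and this sketches only the standard two-level computation, not the presentation's general method.

```python
# Half-normal plot coordinates for an unreplicated 2^3 factorial.
import numpy as np
from scipy import stats

# 2^3 design in standard order; columns A, B, C coded -1/+1 (A varies fastest)
levels = np.array([[a, b, c] for c in (-1, 1) for b in (-1, 1) for a in (-1, 1)])
y = np.array([45, 71, 48, 65, 68, 60, 80, 65], dtype=float)  # invented data

# Effect columns for A, B, C and all interactions
cols = {"A": levels[:, 0], "B": levels[:, 1], "C": levels[:, 2]}
cols["AB"] = cols["A"] * cols["B"]
cols["AC"] = cols["A"] * cols["C"]
cols["BC"] = cols["B"] * cols["C"]
cols["ABC"] = cols["A"] * cols["B"] * cols["C"]
effects = {k: float(v @ y) / 4 for k, v in cols.items()}  # contrast / (n/2)

# Pair |effects|, smallest first, with half-normal quantiles for plotting
ranked = sorted(effects.items(), key=lambda kv: abs(kv[1]))
n = len(ranked)
for i, (name, eff) in enumerate(ranked):
    q = stats.halfnorm.ppf((i + 0.5) / n)
    print(f"{name:>3}  |effect|={abs(eff):6.2f}  quantile={q:.2f}")
```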

Publication: 2007 Fall Technical Conference

Published: October 2007
Authors: Mark Anderson, Patrick Whitcomb

This article starts with the basics of RSM before introducing two enhancements that focus on robust operating conditions: modeling the process variance as a function of the input factors, and propagation of error (POE) transmitted from input factor variation. It discusses how to find the flat high plateaus that maximize yield and the broad valleys that minimize defects. Proceedings of the International SEMATECH Manufacturing Initiative (ISMI) Symposium on Manufacturing Effectiveness.
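
The POE idea above can be sketched numerically: variation in an input factor transmits to the response in proportion to the local gradient of the fitted surface, so flat plateaus transmit the least. The response model and standard deviations below are invented for illustration.

```python
# Numerical propagation of error (POE) for a one-factor response surface.
import numpy as np

def response(x):
    """Hypothetical fitted surface with a flat plateau near x = 2."""
    return 80 - 3.0 * (x - 2.0)**2

def poe(x, sigma_x, sigma_resid, h=1e-6):
    """POE = sqrt((dy/dx)^2 * sigma_x^2 + residual variance)."""
    dydx = (response(x + h) - response(x - h)) / (2 * h)  # central difference
    return np.sqrt(dydx**2 * sigma_x**2 + sigma_resid**2)

# POE is smallest on the plateau, where factor variation barely transmits
print(round(poe(2.0, sigma_x=0.3, sigma_resid=1.0), 2))  # on the flat top
print(round(poe(0.5, sigma_x=0.3, sigma_resid=1.0), 2))  # on the slope
```

Minimizing POE over the factor range is what steers the optimization toward the "flats" rather than a sharp peak.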

Publication: https://cdnm.statease.com/pubs/RSM_for_peak_performance.pdf

Published: March 2007
Authors: Mark Anderson, Patrick Whitcomb

This article deals with thorny issues that confront every experimenter: how to handle individual results that do not appear to fit with the rest of the data, whether damaging outliers or a need for transformation. The trick is to maintain a reasonable balance between two types of errors: (1) deleting data that vary only due to common causes, thus introducing bias into the conclusions, and (2) failing to detect true outliers that occur due to special causes. Such outliers can obscure real effects or lead to false conclusions. Furthermore, an opportunity may be lost to learn about preventable causes of failure or reproducible conditions leading to breakthrough improvements (making discoveries more or less by accident).
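
One common diagnostic for striking the balance described above is the externally studentized residual, which scales each residual by an error estimate that excludes the point in question. The data below are simulated with one planted special-cause outlier; this is a generic sketch, not the article's own procedure.

```python
# Flagging a potential outlier with externally studentized residuals.
import numpy as np

def externally_studentized(y, X):
    """Residual for each run, scaled by a sigma estimate that excludes it."""
    n, p = X.shape
    H = X @ np.linalg.inv(X.T @ X) @ X.T          # hat matrix
    h = np.diag(H)
    resid = y - H @ y
    sse = resid @ resid
    # deletion-based error variance for each point: SSE_(i) / (n - p - 1)
    s2_i = (sse - resid**2 / (1 - h)) / (n - p - 1)
    return resid / np.sqrt(s2_i * (1 - h))

rng = np.random.default_rng(1)
x = np.arange(10, dtype=float)
y = 2 * x + 1 + rng.normal(0, 0.5, size=10)      # simulated straight-line data
y[4] += 8                                        # planted special-cause outlier
X = np.column_stack([np.ones_like(x), x])
t = externally_studentized(y, X)
print(int(np.argmax(np.abs(t))))                 # index of the planted outlier
```

Points with |t| beyond roughly 3 deserve investigation before any deletion, which guards against discarding common-cause variation.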

Publication: Quality Engineering