Case Studies and White Papers


Published: January 2009
Authors: Mark Anderson, Zivorad Lazic

A statistically based design of experiments (DOE) approach developed specifically for mixtures was used to formulate a blend of rayon fibers that produced maximal tampon absorbency.

Design of Experiments Demonstrates Robustness of Biopharmaceutical Process

Published: January 2009
Author: Mark Anderson

Diasorin used DOE to evaluate the robustness of its process for manufacturing an alpha-1-antitrypsin (AAT) assay. The results provided considerable confidence that existing in-process quality control criteria suffice to ensure that finished-product requirements are met. This case study is an excellent example of how DOE can shorten a latitude study while delivering statistical analysis that increases confidence in its conclusions.

Published: October 2008
Author: Mark Anderson

Engineers at a major medical device manufacturer used response surface methods (RSM) to successfully model a key process for their flagship product. The RSM model then became the foundation for developing robust specifications that ensure quality at Six Sigma levels.

Design of Experiments Reduces Rubber Scrap by 90%

Published: September 2008
Author: Mark Anderson

A custom rubber molder used DOE to pinpoint the combination of material selection and manufacturing protocol that was creating unacceptable results. Armed with this process knowledge, the company achieved breakthrough quality improvements.

Publication: Rubber & Plastics News

Published: August 2008
Author: Mark Anderson

This article demonstrates how to uncover "sweet spots" where multiple fab-process specifications can all be met in the most desirable way.

Publication: Fab Engineering & Operations

Statistical Design of Experiments on Fabrication of Starch Nanoparticles - A Case Study for Application of Response Surface Methods

Published: April 2008
Authors: Nadeem Irfan Bukhari, Simran Kaur, Saringat H. Bai, Yuen Kah Hay, Abu Bakar Abdul Majeed, Yeow Beng Kang, Mark Anderson

This paper details the fabrication of starch nanoparticles as an example of a statistically rigorous approach to the design and analysis of pharmaceutical experiments.

Published: April 2008
Author: Mark Anderson

In this mini-paper, Mark Anderson details an in-class experiment illustrating the power of two-level factorial design. Also learn how to shoot a wicked slap shot!

Publication: Statistics Division Newsletter
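For readers new to the method described above, here is a minimal sketch of how main effects are estimated in a two-level factorial design. The three factors and the response values are hypothetical, not taken from the paper:

```python
import itertools

# Hypothetical 2^3 factorial: three factors coded -1/+1, one response per run.
factors = ["A", "B", "C"]
runs = list(itertools.product([-1, 1], repeat=3))
# Hypothetical responses for the eight runs, listed in the same order as `runs`.
y = [45, 71, 48, 65, 68, 60, 80, 65]

# Main effect of a factor = mean response at its high (+1) level
# minus mean response at its low (-1) level.
effects = {}
for i, name in enumerate(factors):
    hi = [r for r, levels in zip(y, runs) if levels[i] == 1]
    lo = [r for r, levels in zip(y, runs) if levels[i] == -1]
    effects[name] = sum(hi) / len(hi) - sum(lo) / len(lo)

print(effects)  # largest magnitude flags the most influential factor
```

Because every run contributes to every effect estimate, each effect is averaged over half the design, which is what gives two-level factorials their statistical power relative to one-factor-at-a-time testing.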

Published: October 2007
Authors: Mark Anderson, Patrick Whitcomb

This article starts with the basics of RSM before introducing two enhancements that focus on robust operating conditions: modeling the process variance as a function of the input factors, and propagation of error (POE) transmitted from input-factor variation. It discusses how to find the high plateaus that maximize yield and the broad valleys that minimize defects. Proceedings of the International SEMATECH Manufacturing Initiative (ISMI) Symposium on Manufacturing Effectiveness.

Publication: https://cdnm.statease.com/pubs/RSM_for_peak_performance.pdf
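As a rough illustration of the POE idea mentioned above, the sketch below computes the response variation transmitted from noisy inputs through a fitted surface. The quadratic model and all the numbers are hypothetical, not from the article:

```python
import math

def y_hat(x1, x2):
    # Hypothetical fitted quadratic response surface (coded factors).
    return 80 + 4 * x1 - 6 * x2 - 3 * x1 ** 2 + 2 * x1 * x2

def poe(x1, x2, sd1, sd2, sd_resid, h=1e-5):
    """Propagation of error: std. dev. of the response at (x1, x2)
    given input-factor standard deviations sd1, sd2 and residual noise."""
    # Central-difference partial derivatives of the model at (x1, x2).
    d1 = (y_hat(x1 + h, x2) - y_hat(x1 - h, x2)) / (2 * h)
    d2 = (y_hat(x1, x2 + h) - y_hat(x1, x2 - h)) / (2 * h)
    # Variance transmitted from each input, plus residual variance.
    return math.sqrt((d1 * sd1) ** 2 + (d2 * sd2) ** 2 + sd_resid ** 2)

# Flat regions of the surface transmit less input variation:
print(poe(0.0, 0.0, 0.2, 0.2, 1.0))    # steeper point
print(poe(0.667, 0.0, 0.2, 0.2, 1.0))  # near-flat point along x1
```

Minimizing POE alongside the mean response is what steers the optimization toward the "high plateaus" rather than sharp peaks, since a plateau's small gradients transmit little of the input variation.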

Response Surface Methods for Peak Process Performance

Published: August 2007
Author: Mark Anderson

This is the third article in a series on design of experiments (DOE). The first provided tools for process breakthroughs via two-level factorial designs. The second illustrated how to reformulate rubbers or plastics using powerful statistical methods for mixture design and analysis. Via two case studies, the author now brings the focus back to process improvement. The key is in-depth DOE aimed at producing statistically validated predictive models. Response maps made from these models point the way to pinnacles of process performance: sweet spots where high yields of in-specification product are made at the lowest possible cost.

Publication: Rubber & Plastics News

Published: March 2007
Authors: Mark Anderson, Patrick Whitcomb

This article deals with thorny issues that confront every experimenter: how to handle individual results that do not appear to fit with the rest of the data, whether damaging outliers or data in need of transformation. The trick is to maintain a reasonable balance between two types of errors: (1) deleting data that vary only due to common causes, thus biasing the conclusions, and (2) failing to detect true outliers that arise from special causes. Such outliers can obscure real effects or lead to false conclusions. Furthermore, an opportunity may be lost to learn about preventable causes of failure, or about reproducible conditions that lead to breakthrough improvements (discoveries made more or less by accident).

Publication: Quality Engineering