Issue: Volume 4, Number 12
Date: December 2004
From: Mark J. Anderson, Stat-Ease, Inc.

Dear Experimenter,

Here's another set of frequently asked questions (FAQs) about doing design of experiments (DOE), plus alerts to timely information and free software updates. If you missed previous DOE FAQ Alerts, please click on the links at the bottom of this page. If you have a question that needs answering, click the Search tab and enter the key words. This finds not only answers from previous Alerts, but also other documents posted to the Stat-Ease web site.

Feel free to forward this newsletter to your colleagues. They can subscribe via the Stat-Ease web site. If this newsletter prompts you to ask your own questions about DOE, please address them via e-mail to:

Here's an appetizer to get this Alert off to a good start:

Several weeks ago I enjoyed a stunning display of shimmering northern lights (Aurora Borealis) that would be impossible to describe, but you can get some idea of what it was like from the photos posted online. Work your way back to the home page for that site and follow the links for cool pictures of the tornadoes that hit the Midwestern USA this past summer. Sky-watching keeps us folks entertained all year round, but I will take northern lights over tornadoes any time! The National Oceanic and Atmospheric Administration offers an awesome site on space weather. For details on the aurora, both north and south, see:

Here's what I cover in the body text of this DOE FAQ Alert (topics that delve into statistical detail are designated "Expert"):

1. Info alert: "RSM Simplified" published—response surface methods made easier and more fun than ever before
2. FAQ: Experimentation on computer simulations
3. Reader reply: Strategy of experimentation
4. Events alert: Stat-Ease appearing at Six Sigma conference
5. Workshop alert: "Experiment Design Made Easy" in San Jose

PS. Quote for the month: Something incredible.


1. Info alert: "RSM Simplified" published—response surface
methods made easier and more fun than ever before

OK, I admit being biased in making the new book "RSM Simplified" (Anderson and Whitcomb, Productivity, Inc., New York, NY, copyright 2005) sound so enlightening and enjoyable to read, but after working on it for the last few years, I cannot help myself. As you will see in my responses to the questions noted below, it has already come in very handy for addressing various topics related to response surface methods and DOE in general.  Together with "DOE Simplified," the previous book co-authored by Pat and me, the full range of tools for breakthrough and optimization of processes is now presented in a non-academic manner.

Whereas the first book in the series comes with Design-Ease® software, the new "RSM Simplified" book provides the more advanced Design-Expert® program.  Both programs are fully functional, but limited to 180 days of use.  However, you may be very interested to hear that the Design-Expert included with "RSM Simplified" is a beta release of version 7, which offers an impressive upgrade in designs, tools for analysis, and graphics.  Thus, by buying this new book you will also get a sneak preview of the upcoming new software release (expected mid-2005 or so, depending on how long beta testing takes).

For more details on "RSM Simplified" and how to purchase it, see the Stat-Ease web site.

(Learn more about RSM by attending the three-day computer-intensive workshop "Response Surface Methods for Process Optimization."  See the Stat-Ease web site for a complete description.  Link from that page to the course outline and schedule.  Then, if you like, enroll online.)


2. FAQ: Experimentation on computer simulations

-----Original Question-----
From: New York

"I read your DOE FAQ Alert every month and am always fascinated by your quotes and references. That said, I'm really disappointed that there are no books out there that understand/discuss experimentation when doing computer simulations. We and other aerospace, defense, and automotive companies have developed a wealth of knowledge on the differences between classical physical experimentation and analytical experimentation. If you have a paper or book that you could refer me to that has a compilation of the latest and greatest, I'd really appreciate the reference."

I posted an excerpt from the original manuscript for "RSM Simplified" that addresses this topic; see the Stat-Ease web site.  I expect to see further developments in this relatively new application for DOE.  If any of you readers have suggestions, please e-mail me.


3. Reader reply: Strategy of experimentation

-----Original Question-----
From: Dick DeLoach, Senior Research Scientist, NASA Langley Research Center

"Not to pick nits in an otherwise wonderful (as usual) issue of your newsletter, but the seven-step process for designing an experiment that you presented in your September 2004 issue [see FAQ #1 by Stat-Ease consultant Shari Kraber] contained one recommendation that we generally try to discourage among our experiment-design neophytes here at NASA Langley Research Center.  Your recipe advises the practitioner to prioritize factors in order to examine a manageable subset, suggesting that 5-8 variables is about all that is typically practical.  In your example, you suggest holding raw material constant while examining process variables.

As I know you are aware, this approach implicitly assumes no significant interactions between material and process, which may or may not be true.  Would it not be better advice in general to recommend at least considering a highly fractionated initial screening design featuring as many candidate variables as possible (recognizing that even in this case there might be some that would have to be held constant)?

We try to teach our 'newbies' that factor effects wrought through subtle interactions are often more important to understanding a process than the main effects of those factors alone, so it is useful to try to confirm through screening experiments that interactions are unimportant (small or non-existent) before treating certain factor effects as independent.  This process often leads to other useful surprises, such as a discovery that factors forecast on the basis of intuition to have relatively small or relatively large effects sometimes have the opposite effect on response variables.  While there is only anecdotal evidence to support this assertion, I can say that this kind of surprise is more nearly the norm than the exception in certain types of experiments we do here (configuration aerodynamics experiments, for example).

We try to withhold factor selection decisions until an objective preliminary screening design suggests which factors are important, which factors act independently, and which factors interact significantly.  We then proceed to a higher-resolution follow-on experiment featuring factors shown objectively to be important, trying to hold constant only those factors that we are reasonably confident exhibit no interactions (or only weak ones) with the factors we select for closer examination.

There is of course always a tradeoff between experimental objectives and resource constraints, and some variables might have to be excluded even when a preliminary screening strategy is employed.  We simply counsel our first-timers to take at least a preliminary look at whatever might reasonably be expected to influence system response.  We have found that there can be (usually are!) some interesting surprises lurking!"

Dick, you make very good points regarding the need to screen as many variables as possible at the earliest stages of process development.  FYI, take a look at the write-up on strategy of experimentation excerpted from the manuscript for "RSM Simplified" and posted to the Stat-Ease web site.

I think you will agree that we're all on the same page regarding the need for screening and being open to the possibility of interactions.

What Shari did not mention, in the interest of brevity, is that we often suggest that clients who get overwhelmed by dealing with too many factors in one experiment consider screening several 'batches' of 5-8 variables each.  Ideally these batches will be worked through systematically.  For example, being a professional chemical engineer, I think in terms of unit operations and want to start doing screening designs furthest upstream in a series of process steps.

In any case, it's always tricky deciding where to start with the first DOE, how many factors to include, the choice of design, etc., etc., etc.  However, my hope is that by educating process experts on the tools of DOE, the odds of discovering an optimal configuration are greatly enhanced, regardless of the precise path taken for the improvement project.

Shari adds these comments:
"My concern is that Dick's suggestion to run a highly-fractionated initial screening design with 'as many candidate variables as possible' leads experimenters to extremely low resolution designs that confuse interactions with main effects (resolution III).  As Figure 1-1 of the linked excerpt from "RSM Simplified" indicates, Stat-Ease advocates resolution IV designs for screening.  However, even these medium-resolution designs will not identify which factors are interacting—they remain confounded with each other.  The proper identification of interactions is best done by following up with resolution V or better designs as shown in the 'breakthrough' phase in the flowchart for strategy of experimentation."

(Learn more about strategy of experimentation by attending the three-day computer-intensive workshop "Experiment Design Made Easy."  See the Stat-Ease web site for a course description.  Link from that page to the course outline and schedule.  Then, if you like, enroll online.)


4. Events alert: Stat-Ease appearing at Six Sigma conference

The American Society for Quality (ASQ) is sponsoring a "Six Sigma Conference" on February 7-8, 2005, in Palm Springs, California.  For information, see the ASQ web site (update 3/07: link no longer available).  I will be there to staff a booth for Stat-Ease and present a talk titled "Cost-Effective and Information-Efficient Robust Design for Optimizing Processes."

See the Stat-Ease web site for a list of appearances by Stat-Ease professionals.  We hope to see you sometime in the near future!


5. Workshop alert: "Experiment Design Made Easy" in San Jose

"Experiment Design Made Easy," our three-day computer-intensive workshop on the basics of DOE, will be presented twice in California this winter:

—December 7-9, 2004 in Anaheim
—January 25-27, 2005 in San Jose

See the Stat-Ease web site for schedule and site information on all Stat-Ease workshops open to the public.  To enroll, click the "register online" link on our web site or call Stat-Ease at 1.612.378.9449.  If spots remain available, bring along several colleagues and take advantage of quantity discounts on tuition, or consider bringing in an expert from Stat-Ease to teach a private class at your site.  Call us to get a quote.


I hope you learned something from this issue. Address your general questions and comments to me at:



Mark J. Anderson, PE, CQE
Principal, Stat-Ease, Inc.
2021 East Hennepin Avenue, Suite 480
Minneapolis, Minnesota 55413 USA

PS. Quote for the month: Something incredible.

"Somewhere, something incredible is waiting to be known."
—Carl Sagan

[For something else incredible, but funny, see the link posted online.]
Trademarks: Design-Ease, Design-Expert and Stat-Ease are registered trademarks of Stat-Ease, Inc.

Acknowledgements to contributors:
—Students of Stat-Ease training and users of Stat-Ease software
—Fellow Stat-Ease consultants Pat Whitcomb and Shari Kraber (see the Stat-Ease web site for resumes)
—Statistical advisor to Stat-Ease: Dr. Gary Oehlert
—Stat-Ease programmers, especially Tryg Helseth
—Heidi Hansel, Stat-Ease marketing director, and all the remaining staff

DOE FAQ Alert—Copyright 2004
Stat-Ease, Inc.
All rights reserved.


Interested in previous FAQ DOE Alert e-mail newsletters?
To view a past issue, choose it below.

#1 Mar 01, #2 Apr 01, #3 May 01, #4 Jun 01, #5 Jul 01, #6 Aug 01, #7 Sep 01, #8 Oct 01, #9 Nov 01, #10 Dec 01, #2-1 Jan 02, #2-2 Feb 02, #2-3 Mar 02, #2-4 Apr 02, #2-5 May 02, #2-6 Jun 02, #2-7 Jul 02, #2-8 Aug 02, #2-9 Sep 02, #2-10 Oct 02, #2-11 Nov 02, #2-12 Dec 02, #3-1 Jan 03, #3-2 Feb 03, #3-3 Mar 03, #3-4 Apr 03, #3-5 May 03, #3-6 Jun 03, #3-7 Jul 03, #3-8 Aug 03, #3-9 Sep 03, #3-10 Oct 03, #3-11 Nov 03, #3-12 Dec 03, #4-1 Jan 04, #4-2 Feb 04, #4-3 Mar 04, #4-4 Apr 04, #4-5 May 04, #4-6 Jun 04, #4-7 Jul 04, #4-8 Aug 04, #4-9 Sep 04, #4-10 Oct 04, #4-11 Nov 04, #4-12 Dec 04 (see above)

