Issue: Volume 3, Number 11
Date: November 2003
From: Mark J. Anderson, Stat-Ease, Inc.

Dear Experimenter,

Here's another set of frequently asked questions (FAQs) about doing design of experiments (DOE), plus alerts to timely information and free software updates.  If you missed previous DOE FAQ Alerts, please click on the links at the bottom of this page.  Feel free to forward this newsletter to your colleagues.  They can subscribe by going to  If this newsletter prompts you to ask your own questions about DOE, please address them to

Here's an appetizer to get this Alert off to a good start: Link to and check out the growing gallery of aurora pictures taken during the geomagnetic storms last week.  These colorful atmospheric displays stemmed from two of the largest solar flares ever recorded.  Unfortunately we saw nothing here in the Minneapolis area due to a long stretch of cloudy (and unseasonably cool) weather.  I hope those of you in other regions of the Northern Hemisphere had better luck.  For forecasts of future phenomena, see the maps of Aurora Borealis at

Here's what I cover in the body text of this DOE FAQ Alert (topics that delve into statistical detail are designated "Expert"):  

1. FAQ: How to deal with DOE data generated by a number of manufacturing stations, each making multiple parts  
2. Expert-FAQ (mixtures): Accounting for inert ingredients; and how to deal with amount as well as composition  
3. Expert-FAQ: Primers on propagation of error (POE), an advanced DOE tool especially suitable for Six Sigma  
4. Reader response: How to assess effects from a split plot  
5. Reader response: A 'heads-up' on a web site devoted to power  
6. Info alert: A case study on the application of DOE to defect reduction in powder coatings (links are provided to the article)  
7. Events alert: Link to a schedule of appearances by Stat-Ease  
8. Workshop alert: An Anaheim workshop is coming soon; also, the 2004 schedule is now posted to our web site

PS. Quote for the month: Lyrics for people (or pirates?) who like math


1. FAQ: How to deal with DOE data generated by a number of manufacturing stations, each making multiple parts

-----Original Question-----
From: New York

"I love the monthly DOE FAQ Alert.  The questions are usually very helpful and the quotes are great, too.

This is an "FAQ" around here that I have discussed with you previously.  I would like to suggest that you address it in your monthly newsletter since I have to believe it would come up at other manufacturing sites as well.  Your putting it in print would also give me something that I could hand off to others as well! References to literature supporting your conclusions would also be useful.

The problem is that in manufacturing "widgets" there may be multiple "stations" all assembling "identical" widgets.  However, we know that not all stations are truly making "identical" widgets, either due to process differences at the station or variability of the parts coming to the station. When we have an experimental design, we have a set of design parameters and can set up the "widget-making equipment" for those design parameters. From a practical standpoint, when we make the test widgets, each one will have been made on a different station.  When we do the design analysis, should duplicate lines be entered in the design, one for each widget station, or should the results be averaged and put in once?  These widgets are not really replicates.  When we go to the next set of design parameters, we may not get product from the same set of widget stations as the first set.  Obviously, if we could ensure getting product from the same set of stations, we might be able to include the station as one of the design parameters and determine performance as a function of widget station.

Let's throw in one more item...  Measurement devices are not all perfect. If we measure one of the widgets -- say 5 times -- and get 5 slightly different answers, should all responses be used in the fitting program, or should only the average be used?

Final part of the question...  What is the effect of handling the data in the several different ways?  If all the data is entered individually, more variance is put in, making it harder, it seems, to come up with a significant fit.  On the other hand, if all response values for a parameter setting are averaged, then the repeatability understanding is lost, unless you reset the equipment and run the cell again."


By the way, did you know that a widget is a real device used to make canned beer taste like it's been freshly drafted out of a keg?  Supposedly a widget will make any old swill taste like Guinness straight out of the tap. For details on this marvelous device, see

All frivolity aside, I recommend setting up a blocked two-level factorial design where each "station" represents a block, within which the engineer varies fixed factors.  If he or she only wants to investigate two or three factors, the design can be fully replicated on each station.  However, if many factors need to be screened, consider splitting a much larger design into two, four or more blocks and running each block on a separate station in parallel.  In our "Experiment Design Made Easy" workshop we show an example of this involving seven factors varied over two injection molding stations, run in two blocks with half the runs each.  Obviously, during a specified run, a number of parts will be made and measured some number of times.  By averaging all the measurements over all the parts, the resulting response (only one number is entered) exhibits a much reduced variance (versus one measurement from one part).  Doesn't that make sense?
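The variance reduction from averaging can be checked with a quick simulation.  This is a generic sketch, not anything specific to the widget process: the distribution, sample sizes, and seed are made up for illustration, and the measurements are assumed independent with a common variance.

```python
import random
import statistics

random.seed(42)

SIGMA = 1.0       # assumed standard deviation of a single measurement
N_PER_RUN = 10    # hypothetical number of measurements averaged per run
N_RUNS = 2000     # number of simulated runs

# Draw individual measurements, then collapse each run to its average.
raw = [random.gauss(0.0, SIGMA) for _ in range(N_RUNS * N_PER_RUN)]
means = [statistics.fmean(raw[i * N_PER_RUN:(i + 1) * N_PER_RUN])
         for i in range(N_RUNS)]

var_raw = statistics.variance(raw)
var_means = statistics.variance(means)

# Theory: Var(mean of n) = Var(single) / n, so the ratio should be near n.
print(f"single-measurement variance: {var_raw:.3f}")
print(f"run-average variance:        {var_means:.3f}")
print(f"ratio (expect about {N_PER_RUN}):      {var_raw / var_means:.1f}")
```

With independent measurements the run average is roughly ten times less variable than any single reading, which is why entering the one averaged number per run sharpens the factorial analysis.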

Here's a trick for dissecting the results from a completely replicated design (Warning - do NOT do this with data from fractional designs!): Selectively ignore all but any given block. Then you can break things down station-by-station and look for any aberrant behavior.  In Design-Ease® and Design-Expert® software this can be done very easily via a right-click menu option that toggles runs in or out for analytical purposes.

PS. If the objective were to quantify the components of variance (equipment-to-equipment versus part-to-part versus test-to-test), then:

1. The DOE must be set up differently (with restrictions in randomization); and
2. Before doing the analysis of variance (ANOVA) you'd need to deal with the split plot structure and identify factors that are fixed versus random.*  (I am not sure how the stations should be treated since you were not sure if they would be fixed or subject to random selection.)
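The variance-components idea in the PS can be sketched with standard nested-ANOVA sums of squares for a balanced study (stations, parts within stations, repeat tests within parts).  Everything here is hypothetical: the counts, the true sigmas, and the simulated data are made up, and stations are treated as random purely for illustration.

```python
import random
import statistics

random.seed(7)

# Hypothetical balanced nested study: stations -> parts -> repeat tests.
S, P, M = 6, 4, 3                            # stations, parts/station, tests/part
SD_STATION, SD_PART, SD_TEST = 2.0, 1.0, 0.5  # assumed true SDs (made up)

data = {}  # (station, part) -> list of test results
for i in range(S):
    st = random.gauss(0.0, SD_STATION)
    for j in range(P):
        pt = random.gauss(0.0, SD_PART)
        data[i, j] = [50.0 + st + pt + random.gauss(0.0, SD_TEST)
                      for _ in range(M)]

grand = statistics.fmean(v for tests in data.values() for v in tests)
st_mean = {i: statistics.fmean(v for j in range(P) for v in data[i, j])
           for i in range(S)}
pt_mean = {k: statistics.fmean(tests) for k, tests in data.items()}

# Nested-ANOVA sums of squares (balanced case).
ss_station = P * M * sum((st_mean[i] - grand) ** 2 for i in range(S))
ss_part = M * sum((pt_mean[i, j] - st_mean[i]) ** 2 for i, j in data)
ss_test = sum((v - pt_mean[i, j]) ** 2
              for (i, j), tests in data.items() for v in tests)

ms_station = ss_station / (S - 1)
ms_part = ss_part / (S * (P - 1))
ms_test = ss_test / (S * P * (M - 1))

# Method-of-moments estimates of the three variance components.
var_test = ms_test
var_part = max(0.0, (ms_part - ms_test) / M)
var_station = max(0.0, (ms_station - ms_part) / (P * M))

print(f"test-to-test variance:       {var_test:.2f}")
print(f"part-to-part variance:       {var_part:.2f}")
print(f"station-to-station variance: {var_station:.2f}")
```

Note how each mean square is "contaminated" by the components nested below it; that is exactly why the estimates subtract the lower-level mean square before dividing, and why the randomization restriction matters when the DOE is set up.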

A typical industrial (widgets!) scenario on this can be viewed at (link no longer works).  Stat-Ease software can be set up to properly analyze data from designs with nested factors, either fixed or random (but only for very simple cases), as detailed in the Design-Expert User Guide, Section 4 (posted at), page 4-11 and beyond.

*(In studies affecting people it's common to perform "repeated measures" involving random effects. For a web-based primer on this, see, or refer to the aptly named "Analysis of Messy Data, Volume 1: Designed Experiments" by Milliken and Johnson (CRC Press; reprint edition, 1993).)

(Learn more about blocking and replication by attending the three-day computer-intensive workshop "Experiment Design Made Easy." See for a complete description.  Link from this page to the course outline and schedule.  Then, if you like, enroll online.)


2. Expert-FAQ (mixtures): Accounting for inert ingredients; and how to deal with amount as well as composition

-----Original Question-----
From: Atlanta

"I took the mixture class last November. My question may be two-fold.  I am designing a 4-component D-optimal mixture design with water as a possible 5th component. The response that will be tested is % kill of an organism. While finding the optimal ratio  is a priority, the optimal dose is also important. Is this where I perform two DOEs, one for ratios of components and one for dose? How should I consider an inert ingredient, such as water, that would be a variable but not necessarily important for performance? Any advice would be appreciated!"

Answer (Shari Kraber, Stat-Ease Statistical Consultant):

"You can treat water as the 5th component if it is needed to make the mixture add to a specific total. In class we sometimes had  the last component simply be "the rest", meaning that the rest of the ingredients were reapportioned so that the total was 100% (or whatever number you need.)

It appears that you could do this in one combined experiment. The mixture components are crossed with the amount (dose) used. Check in section 5, towards the end, and look for the mixture amount case study on the ibuprofen tablets. I think this may be similar to what you want to do."
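Shari's "the rest is water" idea is just a slack-variable calculation.  Here is a minimal sketch; the ingredient names, amounts, and the `fill_with_water` helper are all made up for illustration.

```python
def fill_with_water(actives, total=100.0):
    """Treat water as the slack component: it absorbs whatever
    amount remains after the active ingredients, so the blend
    always sums to the required total."""
    used = sum(actives.values())
    if used > total:
        raise ValueError(f"actives exceed the total: {used} > {total}")
    blend = dict(actives)
    blend["water"] = total - used
    return blend

# Hypothetical 4-component blend (percent by weight), water as the 5th.
blend = fill_with_water({"A": 12.5, "B": 30.0, "C": 5.0, "D": 7.5})
print(blend)  # water picks up the remaining 45.0 percent
```

Because water soaks up the remainder, varying the active components automatically varies water, so it need not be modeled as an independent factor unless it affects performance in its own right.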

Shari is referring to experiments where the response depends on both composition AND amount of the mixture, for example:

- Spraying different types of fertilizer in varying quantities onto plots of land
- Applying a varying thickness of paint made with varying ingredients.

Normally, by definition, in a true mixture experiment the response must be a function of proportions, not amount.  The ibuprofen case study (from section 5 of our mixture workshop) shows how to design an experiment that reveals the effects of both composition AND amount of coating on the rate of drug release.  In this case, the problem is solved via the "crossed" D-optimal design option in Design-Expert software, which offers users the choice of mixture and/or numerical and/or categorical factors.
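The "crossed" idea, every candidate blend run at every dose, can be sketched in a few lines.  This is only a generic illustration of crossing a candidate set with an amount factor, not the D-optimal point selection that the software performs; the lattice choice and dose levels are hypothetical.

```python
from itertools import product

def simplex_lattice(q, m):
    """All q-component blends whose proportions are multiples of 1/m
    and sum to 1 -- a {q, m} simplex-lattice candidate set."""
    return [tuple(c / m for c in combo)
            for combo in product(range(m + 1), repeat=q)
            if sum(combo) == m]

blends = simplex_lattice(3, 2)   # {3,2} lattice: 6 candidate blends
amounts = [1.0, 2.0]             # hypothetical low/high dose levels

# Cross composition with amount: every candidate blend at every dose.
design = [blend + (amount,) for blend in blends for amount in amounts]

for run in design:
    print(run)
```

A D-optimal algorithm would then pick the subset of these candidate runs that best supports the chosen model, rather than running the full cross.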

(Learn more about doing mixture-amount designs by attending the three-day computer-intensive workshop "Mixture Design for Optimal Formulations."  See for a complete description.  Link from this page to the course outline and schedule. Then, if you like, enroll online.)


3. Expert-FAQ: Primers on propagation of error (POE), an advanced DOE tool especially suitable for Six Sigma

-----Original Question-----
From: Belgium

"I would like to read some articles where people used propagation of error during optimization with experimental design. Do you have references of articles where this is described?"


For the technicalities of POE, see.  A very high-powered application of POE for DOE is detailed at.  Also see "DOE FAQ Alert" Volume 1, Number 3 (, item 3 on the topic.  Finally, for the funniest write-up on POE (as you will see, this is a very biased opinion), link to
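The core of first-order POE can be shown numerically: the standard deviation transmitted to the response is approximately the model's slope times the factor's standard deviation, so flat spots on the response surface make robust operating points.  This is a minimal sketch with a made-up quadratic model; the function names and numbers are illustrative only.

```python
import math

def poe(f, x, sd_x, sd_resid=0.0, h=1e-6):
    """First-order propagation of error for a one-factor model:
    sd(y) ~= sqrt((df/dx)^2 * sd_x^2 + residual variance)."""
    slope = (f(x + h) - f(x - h)) / (2 * h)  # central-difference derivative
    return math.sqrt(slope ** 2 * sd_x ** 2 + sd_resid ** 2)

# Hypothetical fitted model: y = (x - 3)^2 + 1, flat at x = 3.
def model(x):
    return (x - 3.0) ** 2 + 1.0

print(poe(model, x=5.0, sd_x=0.1))  # steep region: variation is transmitted
print(poe(model, x=3.0, sd_x=0.1))  # flat region: almost nothing transmitted
```

Running the same factor variation through the flat part of the curve transmits far less noise to the response, which is precisely what a POE optimization hunts for.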

(Learn more about POE from the "Robust Design, DOE Tools for Reducing Variability" workshop.  For course content, see.  This class requires proficiency in RSM, which can be gained by attending the "Response Surface Methods for Process Optimization" workshop.)


4. Reader response: How to assess effects from a split plot

-----Original Question-----
From: North Carolina

"As usual, I found your DOE FAQ alert to be interesting and informative. But I have a question about how the analysis was done for the split plot trial described in item 4 of DOE FAQ Alert, Volume 3, Number 10 - October, 2003 [see].  How do you set up the analysis to get the two half normal plots? I think I understand setting up the trial design (the whole plot factors are 1/2 factorial with three factors while the two subplot factors are run as a full factorial + center point, correct?) And after taking several of your courses, I definitely know how to interpret the half normal plots. But I can't figure out how to get from the trial design to the two half normal plots."


I am glad you are getting some good out of the DOE FAQ Alert.  For details on how to get separate half-normal graphs for the two families (whole-plot and subplot) of effects from a two-level split plot design, link to section 6 of the Design-Expert software User Guide via and go to page 13 for a write-up entitled "Neat Tricks, Two-Level Factorial Analyzed as a Split Plot."  There you will find a detailed case study from George Box et al.  It's very cool!

(If you want to hone your skills on factorial design, bring a Stat-Ease consultant in for a private presentation of the "Real-Life DOE: Tricks of the Trade" (in-house only) workshop.  For a description, see.  Link from this page to the course outline.  Call 1.612.378.9449 and ask for a quote to bring this workshop to your site.)


5. Reader response: A 'heads-up' on a web site devoted to power

-----Original Question-----
From: Paul James of National Starch (a regular correspondent)

"Have you seen Russ Lenth's Power Analysis Site at"


This is a great 'heads-up' from Paul.  I recommend you click the link that Professor Lenth provides to an early draft of his publication "Some Practical Guidelines for Effective Sample Size Determination" (The American Statistician, 55, 187-193).  On page 3 of this article you will find an example illustrating a simple comparative study on how to treat blood pressure, something near and dear to me because a close relation suffers from this malady.  Lenth shows how you can make use of his two-sample t test calculator (Java applet) to determine what sample size is needed to achieve the desired power, for which the 'sweet spot' is 0.8 to 0.95.*  Watch out, though: the interface has changed a bit from what's depicted in Figure 1 of the paper.  There's now a field labeled "Allocation" that you must change via a drop-down list to "Optimal."  If you slide the bars to the proper levels for sigma, the true difference in means (required effect size) and power as detailed in the example, you can reproduce Lenth's recommended sample size.  It's fun!

*(For more detail on power, see "Sizing Fixed Effects for Computing Power in Experimental Designs," by Stat-Ease consultant Pat Whitcomb and Advisor Gary Oehlert, posted at  This is an early draft of an article published in "Quality and Reliability Engineering International.")
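The logic behind Lenth's calculator can be approximated in a few lines using the normal approximation for a two-sided, equal-allocation two-sample t test: per-group n is roughly 2((z_crit + z_power) * sigma / delta)^2.  The blood-pressure numbers below are illustrative stand-ins, not the ones from Lenth's paper, and the approximation runs slightly below an exact t-based answer.

```python
import math
from statistics import NormalDist

def n_per_group(delta, sigma, alpha=0.05, power=0.80):
    """Normal-approximation sample size per group for a two-sided,
    equal-allocation two-sample t test that detects a true mean
    difference of `delta` when the common SD is `sigma`."""
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)   # e.g. 1.96 for alpha 0.05
    z_power = NormalDist().inv_cdf(power)          # e.g. 0.84 for power 0.80
    return math.ceil(2 * ((z_crit + z_power) * sigma / delta) ** 2)

# Illustrative only: detect a 5 mmHg difference when sigma = 10 mmHg.
print(n_per_group(delta=5, sigma=10, power=0.80))  # prints 63 per group
print(n_per_group(delta=5, sigma=10, power=0.90))  # prints 85 per group
```

Notice how pushing power from 0.80 to 0.90 costs roughly a third more runs per group, which is why the 0.8-to-0.95 'sweet spot' involves a real trade-off.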


6. Info alert: A case study on the application of DOE to defect reduction in powder coatings (links are provided to the article)

The April 2003 issue of "Powder Coating" magazine (Vol 14, #3, pp 12-15) features the article "DOE Software Paints Picture of Powder Coating Defects," which details how Morton Powder Coatings in Reading, Pennsylvania, used design of experiments to deal with a situation that defied conventional problem-solving techniques.  An abstract is at, with an option to purchase the complete article.  A draft version is available for free at.


7. Events alert: Link to a schedule of appearances by Stat-Ease

Click on for a list of appearances by Stat-Ease professionals.  We hope to see you sometime in the near future!


8. Workshop alert: An Anaheim workshop is coming soon; also, the 2004 schedule is now posted to our web site

"Experiment Design Made Easy" will be presented in Anaheim, California this month (November) on the 18th through the 20th. It's only a few weeks away, so call now if you'd like to enroll.

See for schedule and site information on all Stat-Ease workshops open to the public, including the coming new year of 2004.  To enroll, click the "register online" link on our web site or call Stat-Ease at 1.612.378.9449.  If spots remain available, bring along several colleagues and take advantage of quantity discounts in tuition, or consider bringing in an expert from Stat-Ease to teach a private class at your site.  Call us to get a quote.


I hope you learned something from this issue. Address your general questions and comments to me at:



Mark J. Anderson, PE, CQE
Principal, Stat-Ease, Inc. (
Minneapolis, Minnesota USA

PS. Quote for the month -- Lyrics for people who like math:

"I'm very well acquainted, too, with matters mathematical,
I understand equations, both the simple and quadratical,
About binomial theorem I'm teeming with a lot o' news,
With many cheerful facts about the square of the hypotenuse.
I'm very good at integral and differential calculus;
I know the scientific names of beings animalculous."

—Excerpt of the song "I Am the Very Model of a Modern Major General" from The Pirates of Penzance by Gilbert and Sullivan

PPS. Shari Kraber says "I saw George Box sing this at the 2001 Joint Statistical Meetings (JSM).  Quite a sight!"

Trademarks: Design-Ease, Design-Expert and Stat-Ease are registered trademarks of Stat-Ease, Inc.

Acknowledgements to contributors:

—Students of Stat-Ease training and users of Stat-Ease software
—Fellow Stat-Ease consultants Pat Whitcomb and Shari Kraber (see for resumes)
—Statistical advisor to Stat-Ease: Dr. Gary Oehlert (
—Stat-Ease programmers, especially Tryg Helseth (
—Heidi Hansel, Stat-Ease marketing director, and all the remaining staff


Interested in previous FAQ DOE Alert e-mail newsletters?
To view a past issue, choose it below.

#1 Mar 01, #2 Apr 01, #3 May 01, #4 Jun 01, #5 Jul 01, #6 Aug 01, #7 Sep 01, #8 Oct 01, #9 Nov 01, #10 Dec 01, #2-1 Jan 02, #2-2 Feb 02, #2-3 Mar 02, #2-4 Apr 02, #2-5 May 02, #2-6 Jun 02, #2-7 Jul 02, #2-8 Aug 02, #2-9 Sep 02, #2-10 Oct 02, #2-11 Nov 02, #2-12 Dec 02, #3-1 Jan 03, #3-2 Feb 03, #3-3 Mar 03, #3-4 Apr 03, #3-5 May 03, #3-6 Jun 03, #3-7 Jul 03, #3-8 Aug 03, #3-9 Sep 03, #3-10 Oct 03, #3-11 Nov 03 (see above)

Click here to add your name to the FAQ DOE Alert newsletter list server.

Statistics Made Easy™

DOE FAQ Alert ©2003 Stat-Ease, Inc.
All rights reserved.



Stat-Ease, Inc.
2021 E. Hennepin Avenue, Ste 480
Minneapolis, MN 55413-2726
p: 612.378.9449, f: 612.378.2152