DOE FAQ Alert Electronic Newsletter

Issue: Volume 2, Number 1
January 2002
Mark J. Anderson, Stat-Ease, Inc.

Here's another set of frequently asked questions (FAQs) about doing design of experiments (DOE), plus alerts to timely information and free software updates. If you missed previous DOE FAQ Alerts, see the archive listed at the end of this issue.
Feel free to forward this newsletter to your colleagues. They can subscribe via the sign-up link at the bottom of this issue.

As a light appetizer, I offer a link to the site that reveals the world's funniest joke, according to over 100,000 participants in an experiment on humor. It's not too late for you to put in your vote or contribute a new joke. Have fun!

Here's what I cover in the body text of this DOE FAQ Alert:

1. FAQ: Teaching principles of DOE by making microwave popcorn
2. X-FAQ*: Why reported powers look poor for highly constrained mixture designs
3. Simulation Alert: Try your hand at knocking down a castle
4. User feedback: Design-Expert® software earns superior rating
5. Events alert: A heads-up on DOE talks and demos
6. Workshop alert: Stat-Ease does San Jose, Philadelphia and Dallas

PS. Quote for the month - biology vs. physics vs. statistics.

*(Reminder: topics that delve into statistical detail are rated "X" for eXpert. Read these only if you dare to eXpand the eXtent of your knowledge of statistics for eXperimenters.)

1 - FAQ: Teaching principles of DOE by making microwave popcorn

-----Original Question-----

From: Pennsylvania

"I took your "DOE Simplified" course in Philadelphia last Summer, and thought it was great. [In conjunction] with the ACS Kids and Chemistry program,* I am planning to do a science experiment for classes in the local elementary school and thought the microwave popcorn experiment would be fun and educational. I [looked up] the articles on your web site. [Note from MJA: See my 1993 study at (done for 5th grade science project), and for a recent study done at my office and published in the "Stat-Teaser."] Do you have any suggestions for doing this in 4th through 6th grade classrooms with around 25 students? The whole thing should take up only about an hour, so we can't spend more than 30 minutes popping the corn (5 runs at an average of 6 minutes between). I'm counting on Design-Ease(R) to make the analysis go quickly so we will have some time to discuss what we are doing and how it turned out."

*[For details on this American Chemical Society program, go to the ACS web site and click on the "Kids and Chemistry" link in the K-12 section.]


Answer:

This will be fun and educational for your students. I suggest that you set up a simple 2^2 factorial design, fully replicated and blocked by classroom. Solicit ideas from your students on what might affect the taste of popcorn. This would be a good time to make some popcorn, taste it and talk about how to rate it. (I think a 1-worst to 10-best scale would be simplest.) The Ishikawa diagram works well as a template for collecting ideas on cause and effect. See a helpful description of this diagram, which I link to with permission from the author - Doris Quinn, Director of Quality Education and Measurement at Vanderbilt University Medical Center in Nashville, TN. Also, see my note below* on a good reference text that describes the Ishikawa diagram and other quality tools.

My partner Pat and I suggest this choice for factors:

A. Brand of popcorn: One versus another (the top two found at your local supermarket)
B. Type of popcorn: Light versus regular (less versus more butter)

When you analyze the results (averaged taste ratings per run), I expect that you will see a much bigger effect from B than A. You may see an interaction of factors, particularly if the competing brands are not consistent in what they label as "light" versus "regular." As you say, Design-Ease (or the Factorial portion of Design-Expert) software will make the analysis easy. I suggest you use a spreadsheet such as Excel to do the averaging and then enter these numbers in the DOE software.
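To illustrate the arithmetic behind that analysis, here is a minimal sketch using made-up taste ratings (the numbers are hypothetical, not from any actual study): in a 2^2 factorial, each effect is simply the average response at the factor's high level minus the average at its low level.

```python
# Sketch of 2^2 factorial effect calculations with HYPOTHETICAL averaged
# taste ratings (1 = worst, 10 = best); one averaged rating per run.
runs = [
    (-1, -1, 6.2),  # brand 1, light
    (+1, -1, 6.0),  # brand 2, light
    (-1, +1, 8.0),  # brand 1, regular
    (+1, +1, 8.6),  # brand 2, regular
]

def effect(contrast):
    """Average rating where the contrast is +1 minus average where it is -1."""
    hi = [y for *x, y in runs if contrast(x) > 0]
    lo = [y for *x, y in runs if contrast(x) < 0]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

A = effect(lambda x: x[0])           # brand main effect
B = effect(lambda x: x[1])           # type (light vs regular) main effect
AB = effect(lambda x: x[0] * x[1])   # brand-by-type interaction
print(f"A = {A:.1f}, B = {B:.1f}, AB = {AB:.1f}")
```

With these invented numbers, the type-of-popcorn effect (B) dwarfs the brand effect (A) - the pattern predicted above.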

Let me know how it goes!

(Stat-Ease provides an overview of the 1993 popcorn case in its one-day "DOE Simplified" (DOES) presentation, based on the book of the same name. See our web site for details on the presentation. It will be done next in Dallas on February 21.)

Although "DOE Simplified" is fun and informative, it's only intended to get people started on the path to more effective experimentation. We hope that participants will then be motivated to take the next step by attending our "Experiment Design Made Easy" (EDME) workshop, which will be presented on February 5-7, 2002 in San Jose, California. We then go through the popcorn more thoroughly and ask students to do some of the calculations so they know what's going on.)

* Free book(s)! While cleaning house I came across several paperback copies of "Statistical Methods for Quality Improvement" by Hitoshi Kume. This classic text and training manual focuses on real-world applications of statistical tools for improving quality. It's highly recommended by Dr. Ishikawa. I will give these books away on a first-come, first-served basis upon request. If you're not one of the lucky "winners," you can purchase the book (or an equivalent text) online from Productivity Inc.

2 - X-FAQ: Why reported powers look poor for highly constrained mixture designs

-----Original Question-----

From: Michigan

"I haven't worked much with mixture designs and am trying to put one together. I am using the D-optimal procedure because the components do not have the same range: A 45-65%, B 25-45%, C 0-1% (total A+B+C is 90%). I have a design that appears to be nice, but the evaluation indicates that the power across the board (reported at 1/2, 1 and 2 standard deviations (SD) in Design-Expert software) is only 5% for all but one of the effects. Do you have any suggestions to improve this design?"

Answer (with assistance from Pat Whitcomb and Gary Oehlert):

I'm afraid this is the nature of the beast for inherently non-orthogonal mixture designs, particularly when you further impose tight constraints on one or more components. Two factors drive the power down. First, the mixture nature of the design and the tight constraints combine to make the standard deviations of the estimated effects large. (The fixed sum constraint, and the additional constraints on other components, make the model predictors correlated, which in turn gives the estimated coefficients for those predictors higher standard deviations.) Second, the 2 SD effect is measured across the entire simplex, but we only get to see a small part of that for highly constrained components. For example, C ranges between 0 and .01, so for even a 2 SD-sized C effect, the observed range of the C effect in the design is only .02 SD. These two features together make the signal-to-noise ratio very low, which in turn makes our power (our probability of detecting an effect) low as well. [Note: the floor for power is whatever threshold risk level (alpha value) you've established (5% by default in Design-Expert).]
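To see numerically why the power collapses, here is a rough sketch in Python. It uses an illustrative set of six design points (not the questioner's actual D-optimal design) and a simple two-sided normal approximation, not Design-Expert's exact power calculation, to show how the narrow range of C inflates its coefficient's standard error and drags its power down toward the 5% alpha floor.

```python
# Rough sketch of why power collapses for a tightly constrained mixture
# component. Normal approximation with illustrative points - NOT the exact
# Design-Expert algorithm or the questioner's actual design.
import numpy as np
from scipy import stats

# Six points in the constrained region A 45-65%, B 25-45%, C 0-1%
# (A + B + C = 90% of the formulation), rescaled so each row sums to 1.
X = np.array([
    [0.65, 0.25, 0.00],
    [0.65, 0.24, 0.01],
    [0.45, 0.45, 0.00],
    [0.45, 0.44, 0.01],
    [0.55, 0.35, 0.00],
    [0.55, 0.34, 0.01],
]) / 0.90  # Scheffe linear mixture model: y = bA*A + bB*B + bC*C

# Coefficient standard errors in units of sigma: because the components sum
# to a constant and C barely moves, (X'X)^-1 blows up for the C coefficient.
rel_se = np.sqrt(np.diag(np.linalg.inv(X.T @ X)))

# Two-sided normal-approximation power to detect a 2-sigma (2 SD) effect.
alpha = 0.05
z = stats.norm.ppf(1 - alpha / 2)
power = {}
for name, se in zip("ABC", rel_se):
    snr = 2.0 / se  # signal-to-noise ratio for a 2 SD effect
    power[name] = (1 - stats.norm.cdf(z - snr)) + stats.norm.cdf(-z - snr)
    print(f"{name}: relative SE = {se:8.1f}, power = {power[name]:.2f}")
```

Under these assumptions the power for C comes out essentially at the 5% alpha floor - the "nature of the beast" described above. The exact numbers depend on the design points chosen.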

The good news is that a design like yours is likely to produce useful response surface plots despite the low power to resolve individual model terms. Unfortunately, so far as we know, nobody has come up with a way to calculate the overall power of a mixture model as a whole, which as a practical matter is what really counts for industrial experimenters like you. One good sign in your case is that you've got fairly good power (about 70% for a 2 standard deviation effect) for the AB term. To assess overall power, perhaps it's reasonable to look at interaction terms made up of parent components that are relatively unconstrained. I'm hoping that somebody from academia will research this whole issue and provide some better rules of thumb.

(Learn more about these issues by attending the 3-day computer-intensive workshop "Mixture Design for Optimal Formulation." See our web site for a description and links to the course outline, schedule and on-line registration.)

3 - Simulation alert: Try your hand at knocking down a castle

Based on the continued popularity of movies such as "Lord of the Rings," it seems safe to say that there's something quite fascinating about medieval cultures. Maybe you'd like to try your hand at bombarding a castle with an ancient weapon of war called a trebuchet. To see how a group of modern-day scientists reconstructed one of these medieval war-machines, search the web for "trebuchet" and look for the article entitled "Ready, Aim, Fire." However, you need not go to the trouble of building your own trebuchet. Just try the on-line trebuchet simulation (no longer available for public access), read the instructions and simulate the process of knocking down a castle.

I recently reviewed this nifty educational tool for DOE. It was developed by Bill Hathaway, whose company provides Six Sigma education over the Internet. He uses our "DOE Simplified" book and Design-Ease software for the design of experiments portion of his training. Bill kindly consented to my providing a link to his simulation, which he will leave up on his web site for a limited time. Let me know how you like it and I will forward your comments to Bill. I got lucky and hit the castle after only a few pre-trials (my motto: Ready, Fire, Aim!). Then I laid out a two-part DOE that put me on target in only 24 runs. See if you can do better applying a systematic approach rather than blind luck.

4 - User feedback: Design-Expert software earns superior rating

Jack Reece, who retired from SEMATECH at the rank of Fellow in the Statistical Methods Group in 1996, recently completed the DOE portion of a "Critical Review of Statistical Software Used in Industry" for the "Handbook of Statistics: Statistics in Industry" (Editors: C. R. Rao and R. Khattree). This will be volume 23 in a series started by the late Professor P. R. Krishnaiah. Jack put in an impressive amount of effort on the DOE aspects of statistical software, more than anyone else since Chris Nachtsheim's landmark review of 1987.* Chris rated Stat-Ease software of that time (Design-Ease V1.1) very highly compared to more than a dozen competing packages, saying that it was "incredibly easy to learn and use." Since then, many of the packages reviewed by Nachtsheim fell by the wayside, but we've steadily improved our DOE software, culminating in Design-Expert version 6 (DX6), our current release. We are proud that Jack Reece rated DX6 outstanding in its class - those products that concentrate on experimental design only. It will be nice to see this in print.

(If you don't already use Design-Expert V6, get a free, fully functional trial version of the software from our web site.)

*"Tools for Computer-Aided Design of Experiments," Journal of Quality Technology (JQT), Vol. 19, No. 3, July '87, pp 132-160.

5 - Events alert: A heads-up on DOE talks and demos

As mentioned in last month's DOE FAQ Alert, DOES Institute will be representing Stat-Ease at the American Institute of Aeronautics and Astronautics (AIAA) Aerospace Sciences Meeting and Exhibit on January 14-17 in Reno. Stop by the DOES booth (#500B) to see our Design-Expert software.

See our web site for a listing of where Stat-Ease consultants will be giving talks and doing DOE demos. We hope to see you sometime in the near future!

6 - Workshop alert: Stat-Ease does San Jose, Philadelphia and Dallas

There's still time to sign up for upcoming workshops near you:
- Experiment Design Made Easy, February 5-7 in San Jose, CA
- Mixture Design for Optimal Formulations, February 5-7 in Philadelphia, PA
- DOE Simplified, February 21, Dallas, TX
We're hibernating here in Minneapolis until March, when we present "Robust Design: DOE Tools for Reducing Variation" on the 12th through 14th of that month.

See our web site for schedule and site information on all Stat-Ease workshops open to the public. To enroll, call Stat-Ease at 1.612.378.9449. If spots remain available, bring along several colleagues and take advantage of quantity discounts in tuition, or consider bringing in an expert from Stat-Ease to teach a private class at your site. Call us to get a quote.

I hope you learned something from this issue. Address your questions and comments to me at:

Mark J. Anderson, PE, CQE
Principal, Stat-Ease, Inc.
Minneapolis, Minnesota USA

PS. Quote for the month - biology vs. physics vs. statistics.

"If it moves, it's biology; if it stinks it's chemistry; if it doesn't work it's physics; and if it puts you to sleep, it's statistics."
- Sign posted at Purdue University

Trademarks: Design-Ease, Design-Expert and Stat-Ease are registered trademarks of Stat-Ease, Inc.

Acknowledgements to contributors:

- Students of Stat-Ease training and users of Stat-Ease software
- Fellow Stat-Ease consultants Pat Whitcomb and Shari Kraber (see our web site for resumes)
- Statistical advisor to Stat-Ease: Dr. Gary Oehlert
- Stat-Ease programmers, especially Tryg Helseth
- Heidi Hansel, Stat-Ease marketing director, and all the remaining staff

Interested in previous DOE FAQ Alert e-mail newsletters? To view a past issue, choose it below.

#1 - Mar 01, #2 - Apr 01, #3 - May 01, #4 - Jun 01, #5 - Jul 01, #6 - Aug 01, #7 - Sep 01, #8 - Oct 01, #9 - Nov 01, #10 - Dec 01, #2-1 - Jan 02 (see above)

Click here to add your name to the FAQ DOE Alert newsletter list server.

Statistics Made Easy

©2001 Stat-Ease, Inc. All rights reserved.