Issue: Volume 6, Number 1
Date: January 2006
From: Mark J. Anderson, Stat-Ease, Inc.

Dear Experimenter,

Here's another set of frequently asked questions (FAQs) about doing design of experiments (DOE), plus alerts to timely information and free software updates. If you missed the previous DOE FAQ Alert, please click on the links at the bottom of this page. If you have a question that needs answering, click the Search tab and enter the key words. This finds not only answers from previous Alerts, but also other documents posted to the Stat-Ease web site.

Feel free to forward this newsletter to your colleagues. They can subscribe via the Stat-Ease web site. If this newsletter prompts you to ask your own questions about DOE, please address them to me via e-mail.

This plain-text message may be easier to read if you restore the line breaks. If you use Microsoft Outlook, press the bar labeled "Extra line breaks in this message were removed." In newer versions, select "Restore line breaks."

Happy New Year! Take a second of your time to check out this appetizer aimed at getting this first 2006 Alert off to a good start. Up until recently, I calibrated my watch against the US Naval Observatory time given by telephone at 202.762.1401, but lately I've slacked off by using the less-precise time posted on the web. The famous clockmaker John Harrison must be spinning in his grave over these gyrations in time. After reading Dava Sobel's book "Longitude: The True Story of a Lone Genius Who Solved the Greatest Scientific Problem of His Time," I traveled to London and took a boat ride down the Thames to see the Harrison clocks at the Royal Observatory in Greenwich, England. I highly recommend the book and the museum for anyone with a temporal bent.

Here's what I cover in the body text of this DOE FAQ Alert (topics that delve into statistical detail are designated "Expert"):

1. Free books (US/Canada only): Enter drawing for two Montgomery DOE texts plus autographed set of DOE/RSM "Simplified" series
2. Expert-FAQ: New capabilities for mixture design in just-released Design-Expert® version 7 software (a free trial is available from the Stat-Ease web site)
a. Mixture in mixture ('Mix-Mix') capability
b. Ability to invert simplexes or other mixture geometries
3. Reader response: Follow-up to "Struggle for Power vs. Resolution vs. Simplicity in an ASTM Standard"
4. Events alert: Design for Six Sigma (DFSS) conference (Second Notice)
5. Workshop alert: See when and where to learn about DOE

PS. Quote for the month: "The Modern Design of Experiments in Three Laws" by Dick De Loach of NASA Langley Research Center.


1. Free books (US/Canada only): Enter drawing for two Montgomery DOE texts plus autographed set of DOE/RSM "Simplified" series

(Sorry, due to the high cost of shipping, this offer applies only to residents of the United States and Canada.) Simply reply to this e-mail before February 1st if you'd like a chance at one of two free copies of "Design and Analysis of Experiments" by Douglas Montgomery, 5th Edition (Wiley, 2000). These books, in near-mint condition, became surplus when Stat-Ease stocked up on Dr. Montgomery's newer 6th edition.

A drawing will also be held for free autographed copies of "DOE Simplified: Practical Tools for Effective Experimentation" and "RSM Simplified: Optimizing Processes Using Response Surface Methods for Design of Experiments." These two books will be signed by the authors (myself and Stat-Ease consultant Patrick Whitcomb).

In your e-mail, feel free to specify which of these three free book(s) you would like. (Reminder: If you reside outside the US or Canada, you are NOT eligible for the drawing because it costs too much to ship the books.)

For a complete list of books offered for sale by Stat-Ease, go to its e-commerce site.


2. Expert-FAQ: New capabilities for mixture design in just-released Design-Expert® version 7 software (a free trial is available from the Stat-Ease web site)

a. Mixture in mixture ('Mix-Mix') capability

-----Original Question-----
From: Cleveland, Ohio
"I want to evaluate five alternatives each to two reactive chemicals that will be kept to 2% each in my formulation. This stoichiometry is needed for proper curing."

I advise you to use a new design type offered by Design-Expert called "mix-mix." Via D-optimal point selection, it creates an alternative design that requires 135 blends versus the 235 combinations you laid out using older technology. I created the more efficient mix-mix design by editing the model to a combined order of cubic, thus eliminating the fourth-order terms (unlikely to be needed for predictive modeling) that crop up when crossing the two quadratic mixture (Scheffé) polynomials.
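To see where the savings come from, count the model terms. Here is a minimal sketch in Python, assuming five candidate components in each of the two sub-mixtures (the component labels are placeholders): crossing two quadratic Scheffé polynomials yields 15 x 15 = 225 terms, and dropping the quartic crossings leaves 125.

```python
from itertools import combinations

def scheffe_quadratic_terms(components):
    """Terms of a quadratic Scheffé mixture polynomial:
    linear blending terms x_i plus nonlinear blending terms x_i*x_j."""
    linear = [(c,) for c in components]
    nonlinear = list(combinations(components, 2))
    return linear + nonlinear

# Hypothetical labels: five candidate components per reactive chemical
mix_a = ["a1", "a2", "a3", "a4", "a5"]
mix_b = ["b1", "b2", "b3", "b4", "b5"]

terms_a = scheffe_quadratic_terms(mix_a)   # 5 linear + 10 nonlinear = 15
terms_b = scheffe_quadratic_terms(mix_b)   # likewise 15

# Crossing the two polynomials multiplies every A-term by every B-term
crossed = [a + b for a in terms_a for b in terms_b]

# Editing the combined model down to cubic drops the quartic
# (quadratic-by-quadratic) crossings
cubic = [t for t in crossed if len(t) <= 3]

print(len(crossed))  # 225 terms in the full crossed model
print(len(cubic))    # 125 terms at combined order cubic
```

A D-optimal algorithm must then pick at least 125 distinct blends to fit these coefficients; the extra ten runs in the 135-blend design presumably cover lack-of-fit and replicate estimates.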

Stat-Ease Consultant Pat Whitcomb suggests:
"An example Cornell* uses to illustrate a mix by mix is a two-layer film. Each layer is a separate mixture and is a film on its own. The performance of the two-layer film depends on the composition of each layer, possibly the thickness of each layer and also how the two mixtures interact."

*("Experiments with Mixtures: Designs, Models, and the Analysis of Mixture Data," 3rd Edition, Wiley, 2002.)

b. Ability to invert simplexes or other mixture geometries

-----Original Question-----
From: Cairo, Egypt
"We know that as water increases so does the workability of concrete. We have done many designed experiments in many countries, and the trace plots and the model equations provided by Design-Expert version 6 indicated the expected behavior. However, in this new experiment, we have something that I cannot explain: the final equation in terms of actual components exhibits positive terms for water, which makes sense, but the trace plot displays water having a negative effect on workability."

Version 7 of Design-Expert recognizes inverted simplexes or constrained non-simplex spaces that are better handled using the upper ("U"), rather than lower ("L"), coding for design and analysis. In this case, though, I think the advantage is not that big. The downside of U-coding is that the axes flip on trace, contour and 3D plots, so everything goes opposite to what you would expect.

It turns out that when you built your mixture design, Design-Expert version 7 detected that upper-bounded pseudo coding provided a larger design space. Upper-bounded (U) coding means that instead of 0 being low and 1 being high, 0 is assigned to the high level and 1 to the low level. Therefore, what goes down on the trace plot actually goes up in reality, which is what you expected for the water component. Also, on the triangular plots the vertices (tips) are the lows and the opposing bases the highs for each component, in other words the opposite of what you normally see on mixture graphs.
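The two pseudo codings can be written out explicitly. Below is a minimal sketch, assuming made-up component bounds rather than your actual concrete recipe: L-pseudo codes each component as (x_i - L_i)/(1 - sum of lows), so 0 sits at the lower bound, while U-pseudo codes it as (U_i - x_i)/(sum of highs - 1), so 0 sits at the upper bound and every axis flips.

```python
def l_pseudo(x, lows):
    # L-pseudo coding: 0 at each component's LOWER bound
    denom = 1.0 - sum(lows)
    return [(xi - li) / denom for xi, li in zip(x, lows)]

def u_pseudo(x, highs):
    # U-pseudo coding: 0 at each component's UPPER bound, so the
    # axes flip relative to what you normally see on mixture graphs
    denom = sum(highs) - 1.0
    return [(hi - xi) / denom for xi, hi in zip(x, highs)]

# Hypothetical three-component blend (say water, cement, aggregate)
lows  = [0.2, 0.2, 0.2]        # lower bounds; sum must be under 1
highs = [0.5, 0.5, 0.4]        # upper bounds; sum must exceed 1
less_water = [0.3, 0.4, 0.3]   # actual proportions, each summing to 1
more_water = [0.4, 0.3, 0.3]

l1, l2 = l_pseudo(less_water, lows), l_pseudo(more_water, lows)
u1, u2 = u_pseudo(less_water, highs), u_pseudo(more_water, highs)

# Both codings still sum to 1, but raising water (first component)
# RAISES its L-pseudo value and LOWERS its U-pseudo value:
print(l2[0] > l1[0])  # True
print(u2[0] < u1[0])  # True
```

This is why a positive actual-equation coefficient for water can appear as a downward trace under U coding: the plotted axis runs backwards, not the chemistry.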

For what it's worth, when you perform numerical optimization, it will provide the expected solution: water increases the workability of concrete.

(Learn more about inverted mixture designs by attending the three-day computer-intensive workshop "Mixture Design for Optimal Formulations." See the Stat-Ease web site for a complete description, the course outline and the schedule. Then, if you like, enroll online.)


3. Reader response: Follow-up to "Struggle for Power vs. Resolution vs. Simplicity in an ASTM Standard"

The article titled "Struggle for Power vs. Resolution vs. Simplicity in an ASTM Standard" in the December 2005 Stat-Teaser (posted on the Stat-Ease web site) tells how I helped upgrade the strategy of experimentation specified in the E1169 Standard for Ruggedness Testing. It prompted this response.

-----Original Comment-----
From: Paul N. Sheldon, Independent Consultant, Tonawanda, New York
"I read about your volunteering for ASTM E1169 service with some amusement. As you say, be prepared to back up your recommendations with participation. But this committee is very important. I find many chemists (at least those not in analytical chemistry roles) have difficulty embracing DOE principles. Having organizations such as ASTM and committee E1169 use appropriate techniques can help with their skepticism. I hope you are successful in getting the committee to accept the graphical analysis as well. As we say in Six Sigma, apply "PGA": Practical --> Graphical --> Analytical, the right order for examining data.

I retired a year ago and now do a bit of consulting. Helping people with planning experiments and analyzing data has always been something I enjoyed, especially experiments with mixtures (apologies to John Cornell). I look forward to acquiring the latest release of Design-Expert. You guys are the best in this field."

PS. I am happy to report that my article catalyzed a reader, Arved Harding, to join the E1169 committee. I've been corresponding with Arved about DOE for at least a decade and visited with him and his colleagues at Eastman Chemical in Kingsport, Tennessee. He will not only be a great addition to the group for his statistical ability, but also for being such a good fellow to work with. Arved, who signs off on e-mail as "Your friendly, neighborhood statistician," will provide valuable perspective from the process industry. Way to go Arved! — Mark


4. Events alert: Design for Six Sigma (DFSS) conference (Second Notice)

Stat-Ease will attend and display our software at the Design for Six Sigma conference, January 16-18, 2006, in Memphis, Tennessee. Among the speakers are several clients of Stat-Ease software and training resources (and probably more whom I missed or don't know of):
—(Keynoter) Martha Gardner, Global Quality Leader, Global Research, General Electric Company (GE),
—Kevin Ward, Director R&D Design Systems, and Dan Kussman, Master Black Belt, both from Boston Scientific.
The lineup of top people and their companies is quite impressive, as you will see at the conference web site.

See the Stat-Ease web site for a more complete list of planned appearances by Stat-Ease professionals. We hope to see you sometime in the near future!


5. Workshop alert: See when and where to learn about DOE

See the Stat-Ease web site for schedule and site information on all Stat-Ease workshops open to the public. To enroll, click the "register online" link on our web site or call Stat-Ease at 1-612-378-9449. If spots remain available, bring along several colleagues and take advantage of quantity discounts on tuition, or consider bringing in an expert from Stat-Ease to teach a private class at your site. Call us to get a quote.


I hope you learned something from this issue. Address your general questions and comments to me at:




Mark J. Anderson, PE, CQE
Principal, Stat-Ease, Inc.
2021 East Hennepin Avenue, Suite 480
Minneapolis, Minnesota 55413 USA

PS. Quote for the month—"The Modern Design of Experiments in Three Laws" by Dick De Loach of NASA Langley Research Center:

Dick, a frequent correspondent with Stat-Ease, teaches Modern Design of Experiments (MDOE) for Langley's Wind Tunnel University (WTU). He preaches these three Laws for MDOE:

1. First Law (Key to Quality): "Random variations occur in experimental data about mean values that change systematically with time." You already know this, of course, as it is the basis for the randomization and temporal blocking imperative. But most of the experimental aeronautics community still blithely assumes that randomization and blocking are completely unnecessary (and that replication is generally not the most productive use of limited resources, which might be better spent maximizing the number of unique independent-variable combinations that are set). So I describe this as a "law" of experimentation, and firmly believe it to be one.
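The First Law is easy to demonstrate with a toy simulation (every number below is invented for illustration): impose a steady drift on an otherwise noise-free response and compare the factor-effect estimate from a sorted run order against one randomized order.

```python
# Toy demonstration of the First Law (all values are made up):
# a steady drift in the response biases a sorted run order but is
# balanced out by a randomized run order.
TRUE_EFFECT = 2.0   # assumed high-vs-low shift in the mean response
DRIFT = 0.5         # assumed systematic drift added on each run

def effect_estimate(run_order):
    """Estimate the factor effect (mean at +1 minus mean at -1)
    for a noise-free response riding on a linear time trend."""
    ys = [TRUE_EFFECT * x + DRIFT * t for t, x in enumerate(run_order)]
    hi = [y for x, y in zip(run_order, ys) if x > 0]
    lo = [y for x, y in zip(run_order, ys) if x < 0]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

sorted_order = [-1, -1, -1, -1, +1, +1, +1, +1]  # all lows, then all highs
shuffled     = [+1, -1, -1, +1, -1, +1, +1, -1]  # one randomized order

print(effect_estimate(sorted_order))  # 6.0: drift inflates the true 4.0
print(effect_estimate(shuffled))      # 4.0: this shuffle balances the drift
```

Any single shuffle can still land unluckily, but averaged over shuffles the drift bias cancels, and temporal blocking handles the part of the trend you can anticipate.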

2. Second Law (Key to Productivity): "Each new data point adds value as a monotonically decreasing function of the volume of data already acquired." This flows from defining the "value" of an additional data point in terms of its potential for reducing inference error risk, which decreases monotonically with data volume. But as I often say to students in the short course I teach on experiment design, "a million and one points is always better than a million. But not much better." And since it takes just as much time and effort to acquire another point no matter how many we already have, costs keep going up with data volume even when benefits have for practical purposes flattened. The second law implies a point of diminishing returns for data volume, which suggests we can be more productive (do more experiments per year) if we simply know when we're done with any particular experiment.
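The diminishing returns in the Second Law follow directly from the standard error of a mean shrinking as one over the square root of n. A minimal sketch (the unit standard deviation is an arbitrary assumption):

```python
import math

def std_error(n, sigma=1.0):
    # Standard error of the mean of n independent observations
    return sigma / math.sqrt(n)

# Marginal value of one more point, early versus late in the experiment
gain_early = std_error(10) - std_error(11)
gain_late  = std_error(1_000_000) - std_error(1_000_001)

print(std_error(4))            # 0.5
print(gain_early > gain_late)  # True: point one million and one helps,
                               # "but not much"
```

The monotone decline of that marginal gain is exactly why knowing when an experiment is done is a productivity lever.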

3. Third Law (Heuristic): "The results of well-designed experiments are easy to explain but hard to forecast." I have no proof of this, except to say that it is a reliable outcome.

Dick goes on to say:
"Conventional DOE has an unjustifiably bad rap among experimental aerodynamicists who have heard only enough about it to associate DOE with relatively simple response surface shapes (mostly first-order and factor-interaction, with a little second-order for optimization.) Unfortunately, our wind tunnel response surfaces do not look like smooth hills, and certainly not like flat planes (or even twisted planes). Your basic yawing moment as a function of Mach number, angle of attack, and aileron deflection angle, for example, can more closely resemble the ridge-line of the Swiss Alps between Mürren and Zermatt connecting the Eiger with the Matterhorn, than anything you'd ever see in Kansas!

To combat this unfortunate stereotyping of DOE as a low-order tool more suitable for product and process improvement than for hairy-chested aerodynamics, we have used the term "MODERN Design of Experiments" to suggest an emphasis on extensions of conventional DOE that are relevant in aerospace research. The "M" in "MDOE" implies higher-order response functions, alternatives to polynomial forms, inference space truncation methods to generate piecewise continuous response surfaces comprised of multiple relatively simple models, an emphasis on "scaling" (a priori estimates of adequate data volume) and both quality assurance (via not only randomization of set-point order but temporal blocking) and quality assessment (routine estimates of precision intervals).

MDOE also entails a focus on quantifying a contribution to the unexplained variance that is not as often relevant in conventional DOE; namely, that component attributable to the time-varying bias errors that are caused by such persisting effects as temperature gradients and instrumentation drift. DOE practitioners seek to eliminate this component from the unexplained variance in order to improve the precision of their hypothesis tests, but MDOE practitioners (who also want to isolate it for the same reason) have a further need—to quantify it so as to more accurately reflect the true reproducibility of results acquired in real-world facilities such as wind tunnels. Such complex and highly energetic facilities cannot and do not maintain states of statistical control to within the parts per million of unexplained variance implied by our precision requirements. We must therefore quantify these subtle departures from statistical control when we report our results to avoid overstating our precision as we would do if we accounted only for the relatively small pure error in our uncertainty estimates. Again, this is a nuance that frequently distinguishes high-precision scientific experimentation (particularly in aerospace) from industrial product or process improvement applications.

None of the features of designed experiments I have mentioned is strictly outside the scope of conventional DOE. However, the combined effect of these differences in emphasis for aerospace (vs industrial) engineering applications is significant enough that we try to explicitly recognize them through a modest extension of the name. The intent is to unambiguously recognize this art form as "DOE", but to also recognize the potential bias against DOE as "too simple" for our purposes, by signaling that MDOE features the focus required for our special extended needs.

Perhaps more than you wanted to hear from a battered veteran of the "culture wars" waged to establish professional experiment design as the default approach to empirical aerospace research, but I thought you might find this background interesting. In any case, if you have no objection to calling it "The Modern Design of Experiments in Three Laws", I certainly have no objection to your passing these along. Thanks!

PS. I have attached a slide from the introductory lesson of the short course I teach at aerospace conferences titled "Concepts in the Modern Design of Experiments". (I believe you know that I use Doug Montgomery's textbook for this course, bundled with the student version of Design-Expert software from John Wiley & Sons, and spend some time introducing the students to DX as part of the course.)

Four Eras in the History of Designed Experiments (Credit: Douglas Montgomery)
I. The agricultural origins, 1918 to 1940s
- R. A. Fisher & his co-workers
- Profound impact on agricultural science
- Factorial designs, ANOVA
II. The first industrial era, 1951 to late 1970s
- George Box and others
- Response surface methods
- Applications in the chemical & process industries
III. The second industrial era, late 1970s to 1990
- Quality improvement initiatives in many companies
- Taguchi and robust parameter design, process robustness
IV. The modern era, beginning circa 1990"

Trademarks: Design-Ease, Design-Expert and Stat-Ease are registered trademarks of Stat-Ease, Inc.

Acknowledgements to contributors:
—Students of Stat-Ease training and users of Stat-Ease software
—Fellow Stat-Ease consultants Pat Whitcomb and Shari Kraber (see the Stat-Ease web site for resumes)
—Statistical advisor to Stat-Ease: Dr. Gary Oehlert
—Stat-Ease programmers, especially Tryg Helseth
—Heidi Hansel, Stat-Ease marketing director, and all the remaining staff


Interested in previous FAQ DOE Alert e-mail newsletters?
To view a past issue, choose it below.

#1 Mar 01, #2 Apr 01, #3 May 01, #4 Jun 01, #5 Jul 01, #6 Aug 01, #7 Sep 01, #8 Oct 01, #9 Nov 01, #10 Dec 01, #2-1 Jan 02, #2-2 Feb 02, #2-3 Mar 02, #2-4 Apr 02, #2-5 May 02, #2-6 Jun 02, #2-7 Jul 02, #2-8 Aug 02, #2-9 Sep 02, #2-10 Oct 02, #2-11 Nov 02, #2-12 Dec 02, #3-1 Jan 03, #3-2 Feb 03, #3-3 Mar 03, #3-4 Apr 03, #3-5 May 03, #3-6 Jun 03, #3-7 Jul 03, #3-8 Aug 03, #3-9 Sep 03, #3-10 Oct 03, #3-11 Nov 03, #3-12 Dec 03, #4-1 Jan 04, #4-2 Feb 04, #4-3 Mar 04, #4-4 Apr 04, #4-5 May 04, #4-6 Jun 04, #4-7 Jul 04, #4-8 Aug 04, #4-9 Sep 04, #4-10 Oct 04, #4-11 Nov 04, #4-12 Dec 04, #5-1 Jan 05, #5-2 Feb 05, #5-3 Mar 05, #5-4 Apr 05, #5-5 May 05, #5-6 Jun 05, #5-7 Jul 05, #5-8 Aug 05, #5-9 Sep 05, #5-10 Oct 05, #5-11 Nov 05, #5-12 Dec 05, #6-01 Jan 06 (see above)

Click here to add your name to the FAQ DOE Alert newsletter list server.

Statistics Made Easy™

DOE FAQ Alert ©2005 Stat-Ease, Inc.
All rights reserved.



Stat-Ease, Inc.
2021 E. Hennepin Avenue, Ste 480
Minneapolis, MN 55413-2726
p: 612.378.9449, f: 612.378.2152