Re-open the data file from the previous step by going to Help, Tutorial Data and selecting Chemical Conversion (Analyzed). Then under the Analysis branch click the R1:Conversion node and go to Model Graphs to bring up the contour plot. Let’s quickly try some things here that you may find useful when making a presentation.
In the vacant region of the AB contour plot, right-click and select Add contour. Then drag the contour around (it becomes highlighted). You may get two contours from a single click, both with the same response value. (This pattern indicates a shallow valley, which becomes apparent when we get to the 3D view later.)
Click the new contour line to highlight it. Then drag it (place the mouse cursor on the contour and hold down the left button while moving the mouse) to as near to 81 as you can. Now to obtain the precise contour level, right-click the contour you just dragged, choose Set contour value and enter 81.
Another way to set contour values: Right-click over the plot and choose Edit contours. Now, for Mode, select the Incremental option and fill in Start at 66, Step at 3, and Levels at 8.
If you go this route, be sure to look over the Min and Max values first. They give you a clue about where to start and how large a step to use for the contour values.
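The incremental settings expand into an evenly spaced list of contour values. As a quick sanity check (plain Python, nothing specific to Design-Expert), Start 66 / Step 3 / Levels 8 produces:

```python
# Contour values generated by Start = 66, Step = 3, Levels = 8
start, step, levels = 66, 3, 8
values = [start + step * k for k in range(levels)]
print(values)  # [66, 69, 72, 75, 78, 81, 84, 87]
```

The last contour lands at start + step * (levels - 1), which is why checking the Min and Max values first pays off.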
To zoom in on the area around the center point (the red dot labeled “6”), position the crosshairs and, while holding down the left mouse button, drag over (rope-off) your desired region of interest.
Notice how the graph coordinates change. Obviously you would now want to add more contours using the tools you learned earlier in this tutorial. However, do not spend time on this now: Right-click over the graph and select Default View Window.
That’s enough on the contour plot for now. On the Graphs Toolbar, go to the 3D Surface view. Modify the color range by clicking the color scale gradient in the graph legend, which brings up the Edit Legend dialog box. Change the Low to 80 and the High to 90. Notice how this makes the graph far more colorful, and thus more informative about relative heights.
Now click the design point sticking up in the middle. See how this is identified in the legend at the left by run number and conditions.
Now try a handy feature for pulling up the right plot for any given run. On the Factors Tool, select number 1 from the Run # dropdown list. The 3D view shifts to the correct ‘slice’ on factor C (catalyst). However, the colors are no longer ideal, so right-click the gradient and, in the Edit Legend dialog box, press the Defaults button. Your graph should now match the one shown below.
By the way, if you entered comments on a particular run, they appear in this view when that point is selected.
Much more can be done for your show-and-tell. Spend time beforehand to try different things that Design-Expert can do. Take advantage of default buttons to put things back the way they were.
Details about the variation in your input factors can be fed into Design-Expert® software. Then you can generate propagation of error (POE) plots showing how that error is transmitted to the response. Look for conditions that minimize transmitted variation, thus creating a process that’s robust to factor settings. This tutorial shows how to generate POE from an experiment designed by response surface methods (RSM).
Propagation of error is covered in the One Factor RSM tutorial in a way that is far easier to see, so be sure to review this if you want to develop a fuller understanding of this mathematical tool.
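To see what POE actually computes, here is a minimal sketch in plain Python. The quadratic model coefficients and the residual standard deviation are hypothetical (not taken from the tutorial data), but the factor standard deviation of 0.5 matches what this tutorial enters for time; POE combines the error transmitted through the model’s slope with the model’s residual error:

```python
import math

# Hypothetical one-factor quadratic model y = 80 + 2A - 3A^2
# (coefficients are illustrative, not from the tutorial file)
def conversion(A):
    return 80.0 + 2.0 * A - 3.0 * A ** 2

def dconversion_dA(A):
    return 2.0 - 6.0 * A  # analytic slope dy/dA

sigma_A = 0.5       # std dev of factor A (time), as entered in this tutorial
sigma_resid = 1.5   # hypothetical residual std dev of the fitted model

def poe(A):
    """Propagation of error: sqrt((dy/dA)^2 * sigma_A^2 + sigma_resid^2)."""
    slope = dconversion_dA(A)
    return math.sqrt(slope ** 2 * sigma_A ** 2 + sigma_resid ** 2)

# POE bottoms out where the surface is flat (slope = 0, at A = 1/3 here),
# leaving only the residual error
print(poe(1 / 3.0))  # 1.5
print(poe(1.0))      # 2.5 -- a steeper slope transmits more factor error
```

This is why you hunt for flat spots on the response surface: at those settings, wiggle in the factors barely moves the response.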
To be sure we start from the same stage of analysis, re-open the file named RSMa.dxpx. Then click the Design node on the left side of the screen to get back to the design layout. Next select View, Column Info Sheet. Enter the following information into the Std. Dev. column: time: 0.5, temperature: 1.0, catalyst: 0.05, as shown below.
Notice the software already entered the standard deviation for the analyzed response, Conversion (4.1…). Because you haven’t changed any other data, Design-Expert remembers your previous analysis choices — you can simply click through the analysis buttons.
Right-click the buttons to the left of the response names to specify a different standard deviation. Otherwise the field is protected, that is, you cannot alter it.
Under the Analysis branch click the Conversion node. Then jump past the intermediate buttons for analysis and click the Model Graphs tab. Select View, Propagation of Error. (This option was previously grayed out — unavailable — because the standard deviations for the factors had not yet been entered.)
Now on the Graphs Toolbar select 3D Surface.
The lower the POE the better, because less of the error in control factors will be transmitted to the selected response, which results in a more robust process.
What happens if you try to generate POE on a response that’s fitted to a linear model? When the surface is linear, such as that for Activity, error is transmitted equally throughout the region. Therefore, Design-Expert software ‘grays out’ the option for propagation of error.
Now that you’ve generated POE for Conversion, let’s go back and add it to the optimization criteria. Under the Optimization branch click the Numerical node. For the POE (Conversion) set the Goal to minimize with a Lower Limit of 4 and an Upper Limit of 5 as shown below.
You may also have to go back and set the goals for Conversion (maximize; Lower Limit 80, Upper Limit 100) and Activity (Target of 63; Lower Limit 60, Upper Limit 66).
Now click the Solutions button atop the screen to generate new solutions with additional criteria. On the Solutions Toolbar click Ramps. (Note: Due to random starting points for the searches, you may see slight differences on your screen versus the shot below.)
The above optimal solution represents the formulation that best maximizes conversion and achieves a target value of 63 for activity, while at the same time finds the spot with the minimum error transmitted to the responses. So, this should represent process conditions that are robust to slight variations in factor settings. In this case it does not make much of a difference whether POE is accounted for or not (go back and check this out for yourself). However, in some situations it may matter, so do not overlook the angle of POE.
Design-Expert offers powerful tools to evaluate RSM designs. Design evaluation ought to be accomplished prior to collecting response data, but it can be done after the fact. For example, you may find it necessary to change some factor levels to reflect significant deviations from the planned set point. Or you may miss runs entirely – at least for some responses. Then it would be well worthwhile to reevaluate your design to see the damage.
For a re-cap of what’s been done so far, go to the Design branch and click the Summary node.
The summary reports that the experimenter planned a central composite design (CCD) in two blocks, which was geared to fit a quadratic model. Click the Evaluation node and notice Design-Expert assumes you want details on this designed-for order of model.
Click the Results tab for an initial report, with annotations shown by default.
Explore the different panes here and note the results look very good – as you’d expect from a standard design for RSM.
For a design that produces a far worse evaluation, take a look at the Historical Data RSM Tutorial.
Press ahead to the Graphs tab atop the screen. It defaults to the FDS Graph that depicts standard error versus the fraction of design space. Click the curve you see depicted. Design-Expert now provides coordinate lines for easy reading.
Based on extensive sampling of the experimental region (150,000 points by default as noted in the legend), the “y” axis on the FDS graph quantifies the maximum prediction variability at any given fraction of the total space. For example, as noted in the legend at the left of the screen, 80 percent of this response surface method (RSM) design falls at or below ~0.5 units of standard error (SE). Due to the random sampling algorithm, your FDS may vary a bit. When you evaluate alternative designs, favor those with lower and flatter FDS curves.
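The FDS curve can be emulated with a quick Monte Carlo sketch. This toy version uses a deliberately simple design (a 2² factorial plus two center points fitting a linear model), not the tutorial’s CCD, so the numbers differ, but the mechanics are the same: sample random points in the design space, compute each point’s standard error of prediction, and report the fraction falling at or below a threshold:

```python
import bisect
import random

# Toy FDS for a 2x2 factorial plus 2 center points fitting y = b0 + b1*A + b2*B.
# For this design X'X is diagonal, diag(6, 4, 4), so the unit standard error
# of prediction at (A, B) has a closed form:
def std_error(A, B):
    return (1 / 6 + A ** 2 / 4 + B ** 2 / 4) ** 0.5

random.seed(0)  # fixed seed so the sketch is repeatable
samples = sorted(std_error(random.uniform(-1, 1), random.uniform(-1, 1))
                 for _ in range(10_000))

def fds(threshold):
    """Fraction of the sampled design space with SE at or below threshold."""
    return bisect.bisect_right(samples, threshold) / len(samples)

print(fds(std_error(1, 1)))  # 1.0 -- SE peaks at the corners of the space
```

A lower, flatter FDS curve means more of the space is predicted precisely, which is exactly why you favor designs whose curves hug the bottom of the plot.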
FDS provides the mechanism for sizing RSM designs based on how precisely you need to predict your response. Stat-Ease teaches how to do this in its workshop on RSM. For statistical details, see the pdf of the talk on “Practical Aspects of Algorithmic Design of Physical Experiments” posted to the webinar page at our website, or just click the link above.
The FDS provides insights on prediction capabilities. To view design ‘rotatability’ criteria, select View, Contour. Design-Expert then displays the standard error plot, which shows how variance associated with prediction changes over your design space.
You can see the central composite design (CCD) provides relatively precise predictions over a broad area around the 6 center points. Also, notice the circular contours. This indicates the desirable property of rotatability – equally precise predictive power at equal distances from the center point of this RSM design. For standard error plots, Design-Expert defaults to black and white shading. The graduated shading that makes normal response contour plots so colorful will not work when displaying standard error. Look closely at the corners of this graph and notice they are gray, thus indicating regions where the response cannot be predicted as precisely.
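Rotatability can also be verified numerically. The sketch below builds a two-factor rotatable CCD in pure Python (axial distance alpha = sqrt(2); the five center points are an illustrative choice, not read from the tutorial file) and confirms that the standard error of prediction is the same in different directions at equal distance from the center:

```python
from math import sqrt

alpha = sqrt(2.0)  # rotatable alpha for 4 factorial points: 4 ** 0.25
runs = ([(a, b) for a in (-1, 1) for b in (-1, 1)]            # factorial
        + [(alpha, 0), (-alpha, 0), (0, alpha), (0, -alpha)]  # axial
        + [(0.0, 0.0)] * 5)                                   # center points

def model_row(A, B):
    """Quadratic model terms: 1, A, B, AB, A^2, B^2."""
    return [1.0, A, B, A * B, A * A, B * B]

X = [model_row(a, b) for a, b in runs]
p = len(X[0])
XtX = [[sum(row[i] * row[j] for row in X) for j in range(p)]
       for i in range(p)]

def invert(M):
    """Gauss-Jordan inversion with partial pivoting."""
    n = len(M)
    aug = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
           for i, row in enumerate(M)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(aug[r][c]))
        aug[c], aug[piv] = aug[piv], aug[c]
        d = aug[c][c]
        aug[c] = [v / d for v in aug[c]]
        for r in range(n):
            if r != c:
                f = aug[r][c]
                aug[r] = [v - f * w for v, w in zip(aug[r], aug[c])]
    return [row[n:] for row in aug]

inv = invert(XtX)

def std_error(A, B):
    """Unit standard error of prediction: sqrt(x' (X'X)^-1 x)."""
    x = model_row(A, B)
    return sqrt(sum(x[i] * inv[i][j] * x[j]
                    for i in range(p) for j in range(p)))

# Same radius from the center, different directions -> same standard error
se_axis = std_error(1.0, 0.0)
se_diag = std_error(1 / sqrt(2), 1 / sqrt(2))
print(abs(se_axis - se_diag) < 1e-9)  # True: this is rotatability
```

Move the test point out past the axial distance and the standard error climbs quickly, which mirrors the gray corners on the contour plot.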
See what happens when you extrapolate beyond experimentation regions by following these steps. First select Display Options, Process Factors, Coded. Then right-click over the graph and select Edit axes…. Change the default X1 Axis values for Low to -2 and High to 2. Next, click the X2 Axis tab and change Low value to -2 and High value to 2. After completing these changes, press OK. You now should see a plot like that shown below.
As shown in the key, shading begins at one-half standard deviation and increases linearly up to 1.5 times standard deviation. So long as you stay within specified factorial ranges (plus/minus 1), shading remains relatively light — beyond that the plot darkens. Be wary of predictions in these nether regions! Before leaving this sidebar exploration, go back to Graph Preferences and reset both axes to their defaults. Also, change the Process Factors back to their actual levels.
Now on the Graphs Tool click 3D Surface.
Notice the flat bottom in this bowl-shaped surface of standard error (this bottom plot was created by changing the Y-axis low to 0.4 and high to 0.8). That’s very desirable for an RSM design. It doesn’t get any better than this!