Experimentation informs every part of the biologic life cycle. But it is costly and time-consuming, especially when you are using outdated methods. As you push for more efficiency from your scientists and engineers, can you streamline your work processes to gain more learning from less experimentation?
Both multivariate analysis (MVA) and design of experiments (DOE) methods have numerous applications for simplifying the learning gained from large data sets and experimentation. However, many scientists and engineers still perceive these methods as complex. Today's intuitive software applications make the techniques user-friendly even for non-statisticians. Early adopters are seeing decreased time to market, reduced development and production costs, and improved quality and reliability.
A pilot plant usually plays a key role in process development by providing essential data related to operation, safety, scale-up and other issues. The value of the pilot plant depends on the validity of the data captured. Planned experimentation is crucial to gathering meaningful data, and design of experiments (DOE) is the gold standard for finding results that are statistically significant.
A generic pharmaceutical manufacturer recently hired VerGo Pharma Research Laboratories Pvt. Ltd. to develop a bioequivalent with different polymorphic forms for an antidepressant drug that had previously been patented only in crystalline form. Bioequivalence requires that a drug be pharmaceutically equivalent and delivered at the same rate and level of bioavailability, so that its efficacy and safety can be expected to match those of the original product. By using optimal response surface methods (RSM) to reduce the number of tests required to determine the effects of inactive ingredients on bioavailability in both fed and fasting conditions, VerGo cut the development process from several years to only four months.
This article offers a five-step method for finding the optimum or "best" weld. The method detailed below uses a statistical tool known as the two-level factorial approach. This well-tested method helps an investigator identify what is to be optimized, choose inputs for evaluation, run the tests so that statistically significant data are generated, analyze the data, and finally determine the settings for the significant inputs that produce the optimum weld.
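As a rough illustration of the two-level approach (a sketch only, not the article's own example: the weld factor names and response values below are invented), a full factorial in coded -1/+1 levels lets each main effect be estimated as the average response at the high setting minus the average at the low setting:

```python
from itertools import product

# Hypothetical factors for a welding study (assumed names, coded -1/+1).
factors = ["power", "speed", "gas_flow"]
design = list(product([-1, +1], repeat=len(factors)))  # 2^3 = 8 runs

# Synthetic weld-strength responses, one per run, purely for illustration.
strength = [52, 61, 55, 67, 50, 63, 54, 70]

def main_effect(col):
    # Main effect = mean response at +1 minus mean response at -1.
    hi = [y for run, y in zip(design, strength) if run[col] == +1]
    lo = [y for run, y in zip(design, strength) if run[col] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effects = {name: main_effect(i) for i, name in enumerate(factors)}
# With these synthetic data, gas_flow shows the largest effect (12.5),
# speed a moderate one (5.0), and power a negligible one (0.5).
```

Ranking the effects this way is what lets the investigator separate the few significant inputs from the trivial many before optimizing their settings.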
Learn how DOE and response surface methods (RSM) catalyze process development and optimization (requires membership to view).
An industrial equipment supplier wanted to find the best operating conditions, as well as determine what performance its product could deliver for ethanol producers, before putting the device on the market. A DOE was run to successfully identify and validate a measurement method that has enabled the supplier to accurately evaluate the performance of the new product in a large number of plants under a wide range of operating conditions.
OMG Borchers paint chemists were challenged to find a second source for an associative thickener. They set up a mixture design of experiments to screen the effects of four candidate additives. Aided by Design-Expert software, they accomplished their mission in a timely fashion. When the new formulation was prepared and tested, its performance in every application could not be distinguished from that of the incumbent.
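Mixture designs differ from ordinary factorials in that the components are proportions constrained to sum to one. As a hedged sketch (the lattice choice is an assumption for illustration, not taken from the article), a {4, 2} simplex-lattice design for four candidate additives contains every blend whose proportions are multiples of 1/2:

```python
from itertools import combinations_with_replacement

def simplex_lattice(q, m):
    """Generate a {q, m} simplex-lattice mixture design: all blends of
    q components whose proportions are multiples of 1/m and sum to 1."""
    points = set()
    for combo in combinations_with_replacement(range(q), m):
        blend = [0.0] * q
        for component in combo:
            blend[component] += 1.0 / m
        points.add(tuple(round(x, 6) for x in blend))
    return sorted(points)

# Four candidate additives: 4 pure blends plus 6 binary 50/50 blends.
blends = simplex_lattice(q=4, m=2)  # 10 design points in total
```

The sum-to-one constraint is exactly why mixture DOEs need their own design types rather than standard factorials.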
Due to operational or physical constraints, standard factorial and response surface method (RSM) designs of experiments (DOE) often prove unsuitable. In such cases a computer-generated, statistically optimal design fills the breach. This article explores the vital mathematical properties for evaluating alternative designs, with a focus on what really matters to industrial experimenters. To assess "goodness of design," such evaluations must consider the model choice, specific optimality criteria (in particular D and I), precision of estimation based on the fraction of design space (FDS), the number of runs needed to achieve the required precision, lack-of-fit testing, and so forth. With a focus on RSM, all of these issues are considered at a practical level, with engineers and scientists in mind. This brings to the forefront considerations such as subject-matter knowledge from first principles and experience, factor choice, and the feasibility of the experiment design.
This article provides insights on how many runs are required to make it very likely that a test will reveal any important effects. Because of the mathematical complexity of multifactor design of experiments (DOE) matrices, the calculations for adequate power and precision are not practical to do by hand, so the focus is kept at a high level: scoping out the forest rather than detailing every tree. Through examples, readers will learn the price that must be paid for an adequately sized experiment and the penalty incurred by conveniently grouping hard-to-change factors.
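To hint at the underlying calculation (a normal-approximation sketch with assumed effect size and noise, not the article's own numbers), the power of a two-level factorial to detect an effect of size delta follows from the standard error of an effect estimate, which is 2σ/√n for n total runs:

```python
from math import sqrt
from statistics import NormalDist

def factorial_power(delta, sigma, n, alpha=0.05):
    """Approximate power of a two-level factorial to detect a factor
    effect of size delta, given noise sigma and n total runs, using a
    normal approximation to the two-sided test at level alpha."""
    z = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided critical value
    se = 2 * sigma / sqrt(n)                 # std error of an effect
    return NormalDist().cdf(delta / se - z)

# Assumed signal (delta=2.0) and noise (sigma=1.5): power climbs with
# run count, and a common rule of thumb targets at least 0.8.
for n in (8, 16, 32):
    print(n, factorial_power(delta=2.0, sigma=1.5, n=n))
```

Doubling the runs from 8 to 16, and again to 32, buys a steep increase in the chance of detecting the effect: that is the "price" of an adequately sized experiment.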
In an effort to recover additional copper and gold at KGHM International's Robinson Mine near Ruth, Nevada, an in-plant study was undertaken to quantify potential flotation recoveries from the concentrator's final tailings stream. Tests were conducted by passing a small continuous sample of final tailings through a single 1.5 m³ FLSmidth XCELL™ demonstration flotation machine. This paper reviews the results obtained from the in-plant testing with the single 1.5 m³ flotation cell and compares them to the subsequent operational performance of multiple 160 m³ flotation machines. The DOE test campaign produced a highly reliable Copper Tailings Grade Model. Actual operational data validated the performance of the predictive model and the pilot cell testing. The full-scale flotation plant achieved a 27.1% recovery over a three-year period, increasing copper production by 5.95 million kg annually and gold production by a significant amount. (Proceedings of the 2013 Copper International Conference, Santiago, Chile, Dec 1-4, Session MP44.)