Organizations face increasingly complex problems that are critical to their operation and, in some cases, to their survival. Such problems require the proper use of data and its interpretation. A major issue is how to develop appropriate strategies that yield good solutions efficiently and effectively.
Historically, data analysis has focused on tools: design of experiments (DOE), regression analysis, statistical process control, and modeling. More recent tools form the foundations of analytics and data science. Tools are important for creating good solutions to complex problems. However, it is crucial to understand “the right tool, for the right job, at the right time, correctly applied.” Today, too many people claim that their tool is the universal solution. The reality is more complex.
This talk outlines the new discipline that is devoted to the art and science of creating good solutions to complex problems using data. The paradigm for this new discipline is chemical engineering. Chemical engineering builds upon both chemistry and mechanical engineering to create new chemical processes and to improve existing chemical processes efficiently and effectively. Crucial to these solutions is the concept of “unit operations”. Chemical engineering theory focuses on understanding these core operations and on developing proper strategies to deploy them. Our new discipline, statistical engineering, takes such an approach to the complex problems facing organizations today.
This talk introduces this new discipline at a high level. It then outlines the important roles that both DOE and analytics play in solving complex problems. In the process, it emphasizes the importance of strategy and of understanding exactly what the tools can and cannot do.
Geoff Vining is a Professor of Statistics at Virginia Tech. He is an Honorary Member of the ASQ, a Fellow of the ASQ, a Fellow of the American Statistical Association (ASA), and an Elected Member of the International Statistical Institute. Dr. Vining is the author of three textbooks. He is the Founding Chair of the International Statistical Engineering Association (ISEA).
Dr. Vining served as Editor of the Journal of Quality Technology from 1998 to 2000 and as Editor-in-Chief of Quality Engineering from 2008 to 2009. He received the 2010 ASQ Shewhart Medal, the 2015 ENBIS Medal, the 2011 William G. Hunter Award from the ASQ Statistics Division, the 1990 ASQ Brumbaugh Award, and a 2013 Engineering Excellence Award from the NASA Engineering and Safety Center. He also received an Honorary Doctor of Technology from Luleå University of Technology (Sweden).
Partial least squares (PLS) regression, which has been around for about four decades, is a dimension-reduction algorithm for fitting linear regression models without requiring that the sample size be larger than the number of predictors. It was developed primarily within the chemometrics community, where it is now ingrained as a core method, and it is apparently used across the applied sciences.
And yet it seems fair to conclude that PLS regression has not been embraced by some, even as a serviceable method that might be useful occasionally. Nor does there seem to be a common understanding as to why this rather enigmatic method should not be used, although bumptious discussions of PLS failings can be found in some applied areas. Perhaps this is as it should be — perhaps not.
This talk is intended as a relatively informal overview of PLS regression from a statistical perspective, including historical context, personal encounters, methodology, relationship to envelopes and, near the end, a few recent asymptotic results for high-dimensional regressions.
Dennis Cook is a Full Professor and Director of the School of Statistics, University of Minnesota. He received his BS degree in Mathematics from Northern Montana College, and MS and PhD degrees in Statistics from Kansas State University. He served a ten-year term as Chair of the Department of Applied Statistics and a three-year term as Director of the Statistical Center, both at the University of Minnesota.
His research areas include dimension reduction, linear and nonlinear regression, experimental design, statistical diagnostics, statistical graphics, and population genetics. He has authored over 200 research articles and is author or co-author of two textbooks – An Introduction to Regression Graphics, and Applied Regression Including Computing and Graphics – and two research monographs, Influence and Residuals in Regression, and Regression Graphics: Ideas for Studying Regressions through Graphics. Background on these works can be found at http://www.stat.umn.edu/~dennis/.
He has served as Associate Editor of the Journal of the American Statistical Association, The Journal of Quality Technology, Biometrika, Journal of the Royal Statistical Society and Statistica Sinica. He is a four-time recipient of the Jack Youden Prize for Best Expository Paper in Technometrics as well as the Frank Wilcoxon Award for Best Technical Paper. He received the 2005 COPSS Fisher Lecture and Award, the highest honor conferred by the statistics profession. He is a Fellow of the American Statistical Association and the Institute of Mathematical Statistics, and an elected member of the International Statistical Institute.
At a time when digital manufacturing transforms the flow of goods into a flow of data and raw materials, brand owners need effective ways to secure quality and protect their products against counterfeiting. New technologies increase speed and quality in distributed manufacturing, but they add complexity to the supply chain, along with possible security challenges. Intellectual property protection requires strategies to ensure that both the data and the goods are secure.
Non-destructive, speedy, and user-friendly field testing strengthens both security and quality monitoring. New handheld instruments are cost-effective, especially when their capabilities are augmented with strong analytics. A blend of classic methods and new technologies, using chemical tags, spectrometers, and analytics, provides protection for products threatened by counterfeiting, saving money and reputation.
This talk outlines the elements of the solution, from the chemical “fingerprint” to field authentication using spectroscopy. The approach can be applied to pharmaceuticals, cosmetics, spare parts, electronics, wine, and additive manufacturing; the talk will include use cases and real-life examples. It will also touch on blockchain: when it is helpful, and when it is just expensive hype.
Dr. Sharon Flank, InfraTrac CEO, is a recognized expert in anti-counterfeiting, across several industries, and a leader in intellectual property protection for additive manufacturing. She has a decade of experience creating AI solutions on the emerging technologies team at a defense contractor, followed by entrepreneurial experience in a fast-growing venture-backed company before founding InfraTrac in 2006. InfraTrac develops product protection solutions based on spectroscopy, including incorporating sensor data into cyberphysical security via blockchain.
Dr. Flank worked with defense contractor SRA (now the CSRA Division of General Dynamics IT) to spin out its first technologies and create companies acquired by AOL (Navisoft), Chicago Tribune (Picture Network International), Kodak (eMotion), and CA (Assentor). She has authored numerous journal articles, including refereed publications in several unrelated fields, on anti-counterfeiting, medication errors, artificial intelligence, and 3D printing. Dr. Flank holds ten patents. She received her A.B. from Cornell and her Ph.D. from Harvard.
"Know the SCOR for Multifactor Strategy of Experimentation": Mark Anderson (Stat-Ease)
"Developing an Assay for Screening Alzheimer's Inhibitors using RSM": Noah Johnson (U of Colorado)
"Optimizing the Vapor Deposition of Bioactive Films using Sequential Experimentation": Lou Johnson (JMC Data Experts)
"Analyzing Experiments Involving an Amount Factor with a Zero Setting": Howard Rauch (Eastman Chemical)
"Practical Considerations in the Design of Experiments for Binary Data": Martin Bezener (Stat-Ease)
"Text Mining: Discovering Themes in Text Records": Paul Prew (Ecolab)
"Improving Collapsibility Robustness of an EPS-CD by means of Simulation and Six Sigma": Michal Majzel (ZF)
"Design of Experiments in Chemistry: the Pitfalls": James Cawse (Cawse and Effect, LLC)
"DOE: A Formulator's Perspective": William Arendt (Arendt Consulting)
"How to Increase Design of Experiments Success": Carol Parendo (Collins Aerospace)
"Using Experimental Design and Statistical Software to Investigate the Impact of Amines on Metalworking Fluid Lubricity": Jason Pandolfo (Quaker Chemical)
"Optimization of the Process Parameters for PLGA Microparticle Formulation Based on Taguchi Design": Rosemond Mensah (U of Hertfordshire, UK)
"Using Multi-Sensor Data Fusion for Process Analysis and Control": Geir Rune Flåten (Camo Analytics)
"PAT in Pharmaceutical Formulation Manufacturing": Angela Spangenberg (DisperSol)
"The Importance of Flexibility of Multivariate PAT Techniques": Eric Jayjock (Patheon)
"PAT Best Practices": Chuck Miller (Camo Analytics)
"Introduction of Hyperspectral Image Analysis for Quality Control (new Camo Analytics product)": Geir Rune Flåten (Camo Analytics)
"Delivering Automated Process and Release Analytics to Clients: The Benefits of the CDMO": Pankaj Sinha (Lonza)
"MVA and DOE: Throughout the Product Lifecycle": Chuck Miller (Camo Analytics)
"Multivariate Analysis: From Chemometrics Modeling to Process Analytics": Sylvie Roussel (Ondalys)
Abstracts to come