Wolberg J. Designing Quantitative Experiments. Prediction Analysis

  • PDF file, 5.51 MB
Berlin, Springer, 2010, 208 pages.
Prof. Emeritus John Wolberg, Faculty of Mechanical Engineering, Technion – Israel Institute of Technology, Haifa, Israel.
The common denominator in all this work is the similarity in the analysis phase of the experimental process. If one can assume that the measurement errors in the obtained data are normally distributed, the method of least squares is usually used to "fit" the data. The assumption of normality is usually reasonable, so for this very broad class of experiments the method of least squares is the "best" method of analysis. The word "best" implies that the estimated parameters are determined with the smallest estimated uncertainty.

Actually, the theoretically best solution to the minimization of estimated uncertainty is achieved by applying the method of maximum likelihood. This method was proposed as a general method of estimation by the renowned statistician R. A. Fisher in the early part of the 20th century. The method can be applied when the uncertainties associated with the observed or calculated data exhibit any type of distribution. However, when the uncertainties are normally distributed, or when the normal distribution is a reasonable approximation, the method of maximum likelihood reduces to the method of least squares.

The assumption of normally distributed random errors is reasonable for most situations, and thus the method of least squares is applicable to the analysis of most quantitative experiments. For problems in which the method of least squares is applicable for analysis of the data, the method of prediction analysis is applicable for designing the proposed experiments.
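As a minimal illustration of the idea above (not taken from the book), the sketch below fits a straight line by least squares to synthetic data with normally distributed errors and reports the estimated parameter uncertainties; the data, noise level, and variable names are all assumptions for the example.

```python
import numpy as np

# Hypothetical example data: y = a*x + b plus normally distributed noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
true_a, true_b = 2.0, 1.0
y = true_a * x + true_b + rng.normal(scale=0.5, size=x.size)

# Least-squares solution of the overdetermined system A @ [a, b] = y.
A = np.column_stack([x, np.ones_like(x)])
params, residuals, _, _ = np.linalg.lstsq(A, y, rcond=None)
a_hat, b_hat = params

# Estimated parameter uncertainties from the covariance matrix
# sigma^2 * (A^T A)^(-1), with sigma^2 estimated from the residuals.
dof = x.size - 2
sigma2 = residuals[0] / dof
cov = sigma2 * np.linalg.inv(A.T @ A)
a_err, b_err = np.sqrt(np.diag(cov))

print(f"a = {a_hat:.3f} +/- {a_err:.3f}, b = {b_hat:.3f} +/- {b_err:.3f}")
```

The estimated standard errors are what prediction analysis targets at the design stage: before any data are taken, one can predict how these uncertainties shrink with the number of points and the noise level.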
Contents:
The Experimental Method
  Quantitative Experiments
  Dealing with Uncertainty
  Parametric Models
  Basic Assumptions
  Treatment of Systematic Errors
  Nonparametric Models
  Statistical Learning
Statistical Background
  Experimental Variables
  Measures of Location
  Measures of Variation
  Statistical Distributions
  Functions of Several Variables
The Method of Least Squares
  The Objective Function
  Data Weighting
  Obtaining the Least Squares Solution
  Uncertainty in the Model Parameters
  Uncertainty in the Model Predictions
  Treatment of Prior Estimates
  Applying Least Squares to Classification Problems
  Goodness-of-Fit
  The REGRESS Program
Prediction Analysis
  Linking Prediction Analysis and Least Squares
  Prediction Analysis of a Straight Line Experiment
  Prediction Analysis of an Exponential Experiment
  Dimensionless Groups
  Simulating Experiments
  Predicting Calculational Complexity
  Predicting the Effects of Systematic Errors
  Prediction Analysis with Uncertainty in the Independent Variables
  Multiple Linear Regression
Separation Experiments
  Exponential Separation Experiments
  Gaussian Peak Separation Experiments
  Sine Wave Separation Experiments
  Bivariate Separation
Initial Value Experiments
  A Nonlinear First Order Differential Equation
  First Order ODE with an Analytical Solution
  Simultaneous First Order Differential Equations
  The Chemostat
  Astronomical Observations Using Kepler's Laws
Random Distributions
  Revisiting Multiple Linear Regression
  Bivariate Normal Distribution
  Orthogonality