NONLINEAR REGRESSION

Nonlinear regression refers to the estimation method used for functional forms that are inherently nonlinear, i.e., those that cannot be transformed to linear form by natural logs or other mathematical operators. Because these equations are intrinsically nonlinear, linear procedures such as ordinary least squares (OLS) cannot be used to estimate their parameters. In the linear case, obtaining least squares estimates is computationally straightforward; for nonlinear estimation, there are several alternative computational methods for obtaining parameter estimates.

Approach #1: Direct Search - the sum of squared errors (SSE) function is evaluated for alternative sets of coefficient values, and the values that yield the minimum are chosen as the final estimates. When many parameters must be estimated, this method is slow and is seldom used.

Approach #2: Direct Optimization - parameter estimates are obtained by differentiating the SSE function with respect to each coefficient, setting the derivatives equal to zero (the condition defining the minimum), and solving the resulting set of nonlinear equations, called the normal equations. This is often accomplished by the method of "steepest descent," an iterative process for finding the minimum.

Approach #3: Iterative Linearization - the nonlinear equation is linearized (using a Taylor series expansion) around some initial set of coefficient values, and OLS is performed on this linear equation, generating a new set of coefficients. The nonlinear equation is then relinearized around these new values, and OLS is used again to recompute them. This process is repeated until some type of convergence is attained, i.e., the coefficient values change very little between iterations. This is perhaps the most widely applied technique in econometric software today.
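The direct search of Approach #1 can be sketched as follows. This is a minimal illustration, not a production routine: the model y = a*exp(b*x), the true values a = 2, b = 0.5, and the grid bounds are all assumptions chosen for the example.

```python
import numpy as np

# Synthetic data from a hypothetical nonlinear model y = a*exp(b*x),
# with assumed true values a = 2.0, b = 0.5 (for illustration only).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 50)
y = 2.0 * np.exp(0.5 * x) + rng.normal(0.0, 0.05, x.size)

def sse(a, b):
    # Sum of squared errors for one candidate pair of coefficients.
    resid = y - a * np.exp(b * x)
    return float(np.sum(resid ** 2))

# Direct search: evaluate the SSE over a grid of candidate coefficient
# values and keep the pair that minimizes it.
a_grid = np.linspace(1.0, 3.0, 101)
b_grid = np.linspace(0.0, 1.0, 101)
best_sse, a_hat, b_hat = min(
    (sse(a, b), a, b) for a in a_grid for b in b_grid
)
```

With only two coefficients, the grid has about 10,000 points; the cost grows exponentially with the number of parameters, which is why the method is seldom used when many estimates are needed.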
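The steepest descent idea behind Approach #2 can be sketched as below: step repeatedly in the direction of the negative gradient of the SSE. The model, starting values, step size, and iteration count are all assumptions for illustration.

```python
import numpy as np

# Same hypothetical model y = a*exp(b*x), assumed for illustration.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 50)
y = 2.0 * np.exp(0.5 * x) + rng.normal(0.0, 0.05, x.size)

# Steepest descent on the SSE with respect to (a, b):
# grad_a = -2 * sum(resid * exp(b*x)), grad_b = -2 * sum(resid * a*x*exp(b*x)).
a, b = 1.5, 0.3          # assumed starting values
lr = 2e-4                # step size, chosen small for stability
for _ in range(20000):
    resid = y - a * np.exp(b * x)
    grad_a = -2.0 * np.sum(resid * np.exp(b * x))
    grad_b = -2.0 * np.sum(resid * a * x * np.exp(b * x))
    a -= lr * grad_a
    b -= lr * grad_b
```

Note how many iterations a fixed small step requires here; this slowness is one reason the linearization approach below tends to dominate in practice.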
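The iterative linearization of Approach #3 (often called Gauss-Newton) can be sketched as follows. Again, the model and starting values are assumptions; the Jacobian columns are the partial derivatives of the fitted values with respect to each coefficient, which is exactly the first-order Taylor expansion the text describes.

```python
import numpy as np

# Same hypothetical model y = a*exp(b*x), assumed for illustration.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 50)
y = 2.0 * np.exp(0.5 * x) + rng.normal(0.0, 0.05, x.size)

a, b = 1.5, 0.3                       # assumed starting values
for _ in range(100):
    resid = y - a * np.exp(b * x)
    # Linearize around the current (a, b): Jacobian of fitted values,
    # columns are d(fit)/da = exp(b*x) and d(fit)/db = a*x*exp(b*x).
    J = np.column_stack([np.exp(b * x), a * x * np.exp(b * x)])
    # OLS on the linearized equation gives the coefficient update.
    delta, *_ = np.linalg.lstsq(J, resid, rcond=None)
    a, b = a + delta[0], b + delta[1]
    if np.max(np.abs(delta)) < 1e-10:  # convergence: values barely change
        break
```

From reasonable starting values this typically converges in a handful of iterations, versus the thousands a fixed-step gradient method can need.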
Reasons for its popularity are that (1) it is fast and (2) the guidelines for its use align closely with those of linear regression. However, nonlinear estimation has drawbacks. There is no guarantee that the procedure will converge to the global minimum of the sum of squared errors rather than a local one. The model should therefore be reestimated with different starting values whenever possible, to verify that the global minimum has probably been reached. Since computing time can be great using nonlinear methods, it is best to begin with good starting values for the initial coefficients - often found by using OLS as a starting point.
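One common way OLS supplies starting values is sketched below: if the model can be roughly log-linearized (here the assumed example y = a*exp(b*x) with positive y), OLS on the transformed equation gives cheap initial guesses for the nonlinear routine. The model and data are assumptions for illustration.

```python
import numpy as np

# Hypothetical example: for y = a*exp(b*x) with positive y, taking logs
# gives ln(y) = ln(a) + b*x, which OLS can fit directly. The OLS
# estimates then serve as starting values for the nonlinear routine.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 50)
y = 2.0 * np.exp(0.5 * x) + rng.normal(0.0, 0.05, x.size)

# OLS of ln(y) on x: the slope estimates b, the intercept estimates ln(a).
slope, intercept = np.polyfit(x, np.log(y), 1)
a_start, b_start = np.exp(intercept), slope
```

These values are not the final nonlinear least squares estimates (the log transform changes how the errors enter the model), but they are usually close enough to make the iterative procedure converge quickly and to the intended minimum.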