Continuation approaches for nonlinear least squares
Output error (OE) minimization is a commonly used method for parameter estimation. It has the virtues that, under reasonable conditions, the parameter estimates it produces are good (often maximum likelihood) and that only a simulation model is required to compute the predictions, making it equally applicable to linear and nonlinear systems. Unfortunately, computing the OE parameter estimates involves minimizing a nonlinear cost function, which requires an iterative nonlinear least-squares (NLLS) algorithm such as Gauss-Newton. The present work begins with the observation that the one-step linear prediction error (PE) method, whose parameter estimates can be computed via linear least squares but suffer from noise-related bias, can be viewed as a particular transformation of the OE residuals. Further, this transformation can be parameterized so that a continuation of residual sequences is available that combines the properties of PE and OE. Minimizing the sequence of cost functions associated with this continuation is shown to be a reliable method for finding the OE global minimum. It is then shown that the continuation need not begin with PE: in principle, any linear transformation of the OE residuals may be used, provided the resulting sequence of cost-function global minima meets certain criteria. These criteria are stated as a theorem which, in turn, relies on the Implicit Function Theorem, an elementary result from analysis. The result is a method for guaranteeing that the continuation procedure has converged to the OE global minimum once a linear transformation has been chosen. No closed-form solution is given for the design of the transformation, but two designs are provided and shown to be successful on test problems.
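The PE-to-OE continuation described above can be illustrated with a minimal sketch. The example below is an assumption-laden toy, not the paper's actual algorithm: it fits a first-order model y[k] = a·y[k-1] + b·u[k-1] by blending the one-step predictor's state between the measured output (PE, λ = 0) and the simulated output (OE, λ = 1), warm-starting each NLLS solve from the previous stage's minimizer. All variable names and the blending scheme are illustrative.

```python
# Toy PE-to-OE continuation for a first-order model (illustrative only).
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
a_true, b_true = 0.8, 1.0
N = 200
u = rng.standard_normal(N)
y = np.zeros(N)
for k in range(1, N):
    y[k] = a_true * y[k - 1] + b_true * u[k - 1]
y += 0.1 * rng.standard_normal(N)     # additive measurement noise (OE structure)

def residuals(theta, lam):
    """Blended residuals: lam=0 gives one-step PE, lam=1 gives pure-simulation OE."""
    a, b = theta
    x = np.zeros(N)                    # predictor/simulator state
    r = np.zeros(N - 1)
    for k in range(1, N):
        # mix measured output (PE ingredient) with simulated output (OE ingredient)
        prev = (1 - lam) * y[k - 1] + lam * x[k - 1]
        x[k] = a * prev + b * u[k - 1]
        r[k - 1] = y[k] - x[k]
    return r

theta = np.array([0.0, 0.0])           # deliberately poor starting guess
for lam in np.linspace(0.0, 1.0, 6):   # continuation schedule from PE to OE
    sol = least_squares(residuals, theta, args=(lam,))
    theta = sol.x                      # warm-start the next stage

print(theta)                           # final OE estimate of (a, b)
```

At λ = 0 the residuals are linear in (a, b), so the first solve is easy regardless of the starting point; each subsequent stage then tracks the minimizer as the cost function deforms toward the OE objective.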