estimating uncertainties in fitted parameters using lsqcurvefit

user1331843 · Feb 27, 2013 · Viewed 7.2k times

I am using lsqcurvefit to fit a function of the form a.*x.^b; it gives me a, b, and resnorm. I would like to know how I can get the uncertainty in a and b. Is it possible to use the 'jacobian' output like this?

[x,resnorm,residual,exitflag,output,lambda,jacobian] = lsqcurvefit(...)

Then I get an array with two columns, which I think is because I have two fitting parameters, but I don't know how to interpret it or use it to estimate the errors in a and b.

Answer

Dan · Feb 28, 2013

It looks like this is best achieved using functions in the Statistics Toolbox. See http://www.mathworks.com/support/solutions/en/data/1-18QY1/?solution=1-18QY1 and http://www.mathworks.com/matlabcentral/answers/56734 for examples of how to get the standard deviations of the fitted parameters, but only if you have access to MATLAB's Statistics Toolbox.
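For concreteness, here is a minimal sketch of that toolbox route, assuming you have data vectors xdata and ydata and the Statistics Toolbox; the model handle and starting point p0 are illustrative:

    % Fit y = a*x^b with lsqcurvefit; p(1) = a, p(2) = b
    model = @(p, x) p(1) .* x.^p(2);
    p0 = [1 1];                                    % starting guess (placeholder)
    [p, resnorm, residual, ~, ~, ~, J] = lsqcurvefit(model, p0, xdata, ydata);

    % 95% confidence intervals for a and b from the residuals and Jacobian
    ci = nlparci(p, residual, 'jacobian', full(J));

    % Equivalent standard errors "by hand" from the Jacobian
    dof  = numel(ydata) - numel(p);                % degrees of freedom
    covp = resnorm / dof * inv(full(J)' * full(J));
    se   = sqrt(diag(covp));                       % standard errors of a and b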

If you don't have that toolbox, then from the Wikipedia article on simple linear regression you can find the standard error of the slope parameter using the formula:

SE(slope) = sqrt( (sum of squared residuals / (n - 2)) / sum((X - mean(X)).^2) )

The sum of squared residuals in the numerator is just resnorm (divided by n - 2), and the denominator is straightforward to compute:

sum((X - mean(X)).^2)

where X is the vector of independent-variable values used in the fit.
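As a toolbox-free sketch of that formula for a straight-line fit (X and Y are assumed data vectors):

    % Straight-line least-squares fit: Y ~ c(1)*X + c(2)
    c    = polyfit(X, Y, 1);
    Yhat = polyval(c, X);

    n       = numel(X);
    resnorm = sum((Y - Yhat).^2);                 % sum of squared residuals
    Sxx     = sum((X - mean(X)).^2);              % the denominator above

    se_slope     = sqrt(resnorm / (n - 2) / Sxx); % standard error of the slope
    se_intercept = se_slope * sqrt(mean(X.^2));   % standard error of the intercept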

So you could convert your fit to a linear one by taking logs: use Yln = log(Y) and Xln = log(X) to get the new model

Yln = log(a) + b*Xln

and then use the formulas for the standard errors of the simple linear regression parameters, as in the sketch below.
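Here is a rough sketch of that linearization, with X and Y again standing in for your (positive) data; propagating the error in log(a) back to a is a first-order approximation:

    % Linearize y = a*x^b by taking logs: log(y) = log(a) + b*log(x)
    Xln = log(X);
    Yln = log(Y);

    c = polyfit(Xln, Yln, 1);                     % c(1) = b, c(2) = log(a)
    b = c(1);
    a = exp(c(2));

    % Standard errors of the linear-fit parameters (formula above)
    n       = numel(Xln);
    res     = Yln - polyval(c, Xln);
    Sxx     = sum((Xln - mean(Xln)).^2);
    se_b    = sqrt(sum(res.^2) / (n - 2) / Sxx);
    se_loga = se_b * sqrt(mean(Xln.^2));
    se_a    = a * se_loga;                        % first-order propagation to a

Keep in mind that least squares on the logged data weights the points differently from a direct fit of a.*x.^b, so a, b, and their standard errors from this route will generally differ slightly from the lsqcurvefit result.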