Orthogonal regression fitting in scipy least squares method

Vladimir · Feb 21, 2012 · Viewed 9.6k times

The leastsq method in the scipy library fits a curve to some data. This method assumes that the Y values in the data depend on some X argument, and it minimizes the distance between the curve and each data point along the Y axis only (dy).

But what if I need to minimize the distance along both axes (dy and dx)?

Is there some way to implement this calculation?

Here is a code sample using the one-axis calculation:

import numpy as np
from scipy.optimize import leastsq

xData = np.asarray([...])  # some data
yData = np.asarray([...])  # some data

def mFunc(p, x, y):
    return y - (p[0]*x**p[1])  # residuals are measured along the y axis only

plsq, ier = leastsq(mFunc, [1, 1], args=(xData, yData))  # returns (solution, status flag)
print(plsq)

I recently tried the scipy.odr library, and it returns proper results only for linear functions. For other functions, such as y = a*x^b, it returns wrong results. This is how I use it:

from scipy.odr import Model, Data, ODR

def f(p, x):
    return p[0]*x**p[1]

myModel = Model(f)
myData = Data(xData, yData)
myOdr = ODR(myData, myModel, beta0=[1, 1])
myOdr.set_job(fit_type=0)  # with fit_type=2 it returns the same as leastsq
out = myOdr.run()
out.pprint()

This returns wrong results, far from the desired ones, and for some input data not even close to the real values. Maybe there is some special way of using it; what am I doing wrong?

Answer

Robert Kern · Feb 21, 2012

scipy.odr implements Orthogonal Distance Regression. See the instructions for basic use in the docstring and documentation.
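
To make this concrete, below is a minimal sketch of fitting the power-law model y = a*x^b with scipy.odr. The synthetic data, the noise levels, and the idea of seeding beta0 from a straight-line fit in log-log space are illustrative assumptions, not something stated in the original answer; a weak starting point such as beta0=[1, 1] is one plausible reason the iterative ODR solver can converge to a poor result for a nonlinear model.

import numpy as np
from scipy.odr import Model, Data, ODR

# Hypothetical synthetic data from y = 2.5 * x**1.7, noisy in both coordinates
rng = np.random.default_rng(0)
xTrue = np.linspace(1.0, 10.0, 50)
xData = xTrue + rng.normal(scale=0.05, size=xTrue.size)
yData = 2.5 * xTrue**1.7 + rng.normal(scale=0.5, size=xTrue.size)

def f(p, x):
    # Explicit model y = a * x**b, with p = [a, b]
    return p[0] * x**p[1]

# Assumption: seed beta0 from a linear fit in log-log space, since the
# iterative solver may not recover from a poor guess like beta0=[1, 1]
b0, logA0 = np.polyfit(np.log(xData), np.log(yData), 1)

myModel = Model(f)
myData = Data(xData, yData)
myOdr = ODR(myData, myModel, beta0=[np.exp(logA0), b0])
myOdr.set_job(fit_type=0)  # 0 = explicit orthogonal distance regression
out = myOdr.run()
print(out.beta)     # fitted [a, b]
print(out.sd_beta)  # standard errors of the estimates

With sensible starting values, out.beta should come out near the true [2.5, 1.7]. Switching to myOdr.set_job(fit_type=2) reproduces the ordinary least-squares fit, which makes a convenient cross-check against the leastsq result above.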