How do I calculate r-squared using Python and Numpy?

Travis Beale · May 21, 2009 · Viewed 225.4k times

I'm using Python and Numpy to calculate a best fit polynomial of arbitrary degree. I pass a list of x values, y values, and the degree of the polynomial I want to fit (linear, quadratic, etc.).

This much works, but I also want to calculate r (the correlation coefficient) and r-squared (the coefficient of determination). I am comparing my results with Excel's best-fit trendline capability and the r-squared value it calculates. Using this, I know I am calculating r-squared correctly for a linear best fit (degree equals 1). However, my function does not work for polynomials with degree greater than 1.

Excel is able to do this. How do I calculate r-squared for higher-order polynomials using Numpy?

Here's my function:

import numpy

# Polynomial Regression
def polyfit(x, y, degree):
    results = {}

    coeffs = numpy.polyfit(x, y, degree)

    # Polynomial coefficients
    results['polynomial'] = coeffs.tolist()

    correlation = numpy.corrcoef(x, y)[0, 1]

    # r
    results['correlation'] = correlation
    # r-squared
    results['determination'] = correlation**2

    return results
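
For reference, a quick usage sketch of the function above (the sample x and y values here are invented purely for illustration):

x = [1, 2, 3, 4, 5]              # made-up sample data
y = [1.1, 3.9, 9.2, 15.8, 25.3]  # made-up sample data

results = polyfit(x, y, 2)
print(results['polynomial'])     # fitted coefficients, highest degree first
print(results['determination'])  # the r-squared value in question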

Answer

Gökhan Sever · Oct 4, 2009

A very late reply, but just in case someone needs a ready function for this:

scipy.stats.linregress

i.e.

slope, intercept, r_value, p_value, std_err = scipy.stats.linregress(x, y)

as in @Adam Marples's answer.
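
For completeness, a minimal self-contained sketch of that call (the sample data are invented for illustration; note that linregress handles only the linear, degree-1 case):

from scipy import stats

x = [1, 2, 3, 4, 5]              # made-up sample data
y = [2.1, 3.9, 6.2, 7.8, 10.1]   # made-up sample data

slope, intercept, r_value, p_value, std_err = stats.linregress(x, y)
print("r-squared:", r_value**2)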