Python: Variance of a list of defined numbers

GiamPy · May 21, 2013 · Viewed 51.8k times

I am trying to make a function that prints the variance of a list of defined numbers:

grades = [100, 100, 90, 40, 80, 100, 85, 70, 90, 65, 90, 85, 50.5]

So far, I have written these three functions:

def grades_sum(my_list):
    total = 0
    for grade in my_list: 
        total += grade
    return total

def grades_average(my_list):
    sum_of_grades = grades_sum(my_list)
    average = sum_of_grades / len(my_list)
    return average

def grades_variance(my_list, average):
    variance = 0
    for i in my_list:
        variance += (average - my_list[i]) ** 2
    return variance / len(my_list)

When I execute the code, however, it gives me the following error at this line:

Line: variance += (average - my_list[i]) ** 2
Error: list index out of range

Apologies if my current Python knowledge is limited, but I am still learning, so if you wish to help solve this issue, please try not to suggest extremely complicated solutions. Thank you very much.

Answer

robinfang · Nov 16, 2014

Try numpy.

import numpy as np

# np.var computes the population variance by default (it divides by len(grades))
variance = np.var(grades)
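
If you want the sample variance instead, np.var accepts a ddof argument: np.var(grades, ddof=1) divides by len(grades) - 1 rather than len(grades).

For reference, the original IndexError happens because the loop iterates over the grade values themselves and then uses each value as a list index (my_list[100] is out of range for a 13-element list). One minimal fix, reusing the grades list and the grades_average helper from the question, is to iterate over the values directly:

def grades_variance(my_list, average):
    variance = 0
    for grade in my_list:  # iterate over the values, not indices
        variance += (average - grade) ** 2
    return variance / len(my_list)

print(grades_variance(grades, grades_average(grades)))  # same result as np.var(grades)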