Let's say we're calculating the average of a set of test scores:
Starting Test Scores: 75, 80, 92, 64, 83, 99, 79
Average = 572 / 7 = 81.714...
Now given 81.714, is there a way to add a new set of test scores to "extend" this average if you don't know the initial test scores?
New Test Scores: 66, 89, 71
Average = 226 / 3 = 75.333...
The combined average of all ten scores would be: 798 / 10 = 79.8
I've tried:
Avg = (OldAvg + sumOfNewScores) / (numOfNewScores + 1)
(81.714 + 226) / (3 + 1) = 76.9285
Avg = (OldAvg + NewAvg) / 2
(81.714 + 75.333) / 2 = 78.52
Neither comes out to the exact average it "should" be (79.8). Is it mathematically possible to do this when you don't know the initial values?
You have to know the number of test scores in the original set and the old average:
newAve = (oldAve*oldNumPoints + x) / (oldNumPoints + 1)

That folds in one new score x at a time. To add a whole batch of new scores at once, the same idea applies: multiply the old average back out into a sum, add the sum of the new scores, and divide by the new total count:

newAve = (oldAve*oldNumPoints + sumOfNewScores) / (oldNumPoints + numOfNewScores)
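Here's a minimal Python sketch of the batch version, using the numbers from the question as a check (the function and variable names are just illustrative, not from the original post):

    def extend_average(old_avg, old_count, new_scores):
        """Extend a running average with a batch of new scores,
        knowing only the old average and the old count."""
        new_sum = sum(new_scores)
        new_count = old_count + len(new_scores)
        # Recover the old total (old_avg * old_count), add the new total,
        # and divide by the combined number of scores.
        return (old_avg * old_count + new_sum) / new_count

    old_avg = 572 / 7            # 81.714..., average of the 7 original scores
    new_scores = [66, 89, 71]

    print(extend_average(old_avg, 7, new_scores))
    # ~79.8 (matches 798 / 10, up to floating-point rounding)

The single-score formula above is just the special case where the batch contains one element.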