I'm trying to learn Swift and I made a simple average function:
func average(numbers: Int...) -> Float {
    var sum = 0
    for number in numbers {
        sum += number
    }
    return Float(sum) / Float(numbers.count)
}
average(1, 2, 3, 4, 5, 6)
This gives me the correct result: 3.5. However, I am wondering why I have to cast both sum and numbers.count to Float. I tried casting this way:
return Float(sum/numbers.count)
but it gives me just 3.0.
First, you should use Double and not Float. Float gives you very limited precision; it is hard to see why you would want roughly 6 decimal digits of precision with Float when you could have roughly 15 with Double.
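As a sketch, the function from the question could be rewritten with Double while keeping the original variadic signature (names taken from the question):

func average(numbers: Int...) -> Double {
    var sum = 0
    for number in numbers {
        sum += number
    }
    // Convert both operands before dividing, same idea as in the question
    return Double(sum) / Double(numbers.count)
}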
Second, the compiler does exactly what you tell it. The expression

Float(sum) / Float(numbers.count)

takes the integer sum, takes the integer numbers.count, converts both to Float, and then divides. Division of two Floats keeps the fractional part, which in this case gives 3.5.
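For example, with the values implied by the call in the question (sum ends up as 21 and numbers.count is 6), this is equivalent to the following sketch:

let sum = 21
let count = 6
let result = Float(sum) / Float(count)   // 3.5, because both operands are already Float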
Float(sum / numbers.count)

on the other hand divides the integer sum by the integer numbers.count first. Division of two integers gives an integer result: the quotient, with any remainder discarded. Here 21 / 6 equals 3 with a remainder of 3, so the result of the division is 3, which is only then converted to the Float 3.0.
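A matching sketch with the same assumed values shows where the fractional part is lost:

let sum = 21
let count = 6
let quotient = sum / count        // integer division: 3 (the remainder 3 is discarded)
let result = Float(quotient)      // 3.0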