Sure, you could divide the remaining file size by the current download speed, but if your download speed fluctuates (and it will), the resulting estimate jumps around too much to make a nice countdown. What's a better algorithm for producing smoother countdowns?
An exponential moving average is great for this. It provides a way to smooth your average so that each time you add a new sample, the older samples count for progressively less. They are still considered, but their importance drops off exponentially--hence the name. And since it's a "moving" average, you only have to keep a single number around.
In the context of measuring download speed, the formula looks like this:
averageSpeed = SMOOTHING_FACTOR * lastSpeed + (1-SMOOTHING_FACTOR) * averageSpeed;
SMOOTHING_FACTOR is a number between 0 and 1. The higher this number, the faster older samples are discarded. As you can see in the formula, when SMOOTHING_FACTOR is 1 you are simply using the value of your last observation. When SMOOTHING_FACTOR is 0, averageSpeed never changes. So, you want something in between, and usually a low value to get decent smoothing. I've found that 0.005 provides a pretty good smoothing value for an average download speed.
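For intuition: with a factor of 0.005, each sample's weight decays by a factor of 0.995 per subsequent sample, so it falls below half its original weight after ln(0.5)/ln(0.995) ≈ 138 samples, a bit over two minutes at one sample per second.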
lastSpeed is the last measured download speed. You can get this value by running a timer every second or so to calculate how many bytes have downloaded since the last time you ran it.
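For concreteness, here is a minimal sketch of that timer in Java. The totalBytesDownloaded counter is a hypothetical stand-in for however your download code tracks bytes received:

    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;
    import java.util.concurrent.atomic.AtomicLong;

    class SpeedSampler {
        // Hypothetical counter; your download code increments this as bytes arrive.
        static final AtomicLong totalBytesDownloaded = new AtomicLong();
        static long previousBytes = 0;

        public static void main(String[] args) {
            Executors.newSingleThreadScheduledExecutor().scheduleAtFixedRate(() -> {
                long currentBytes = totalBytesDownloaded.get();
                long lastSpeed = currentBytes - previousBytes; // bytes received in the last second
                previousBytes = currentBytes;
                // Feed lastSpeed into the averaging formula above.
                System.out.println("lastSpeed = " + lastSpeed + " B/s");
            }, 1, 1, TimeUnit.SECONDS);
        }
    }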
averageSpeed is, obviously, the number that you want to use to calculate your estimated time remaining. Initialize this to the first lastSpeed measurement you get.
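Putting the pieces together, here is a minimal sketch under the same assumptions (totalBytesDownloaded and totalFileSize are hypothetical placeholders for your own download bookkeeping):

    import java.util.concurrent.atomic.AtomicLong;

    class DownloadEta {
        static final double SMOOTHING_FACTOR = 0.005;

        // Hypothetical progress counter, incremented by your download code.
        final AtomicLong totalBytesDownloaded = new AtomicLong();
        // Hypothetical known content length, in bytes.
        final long totalFileSize;

        long previousBytes = 0;
        double averageSpeed = -1; // -1 means "no sample taken yet"

        DownloadEta(long totalFileSize) {
            this.totalFileSize = totalFileSize;
        }

        // Call this once per second from a timer.
        void sample() {
            long currentBytes = totalBytesDownloaded.get();
            double lastSpeed = currentBytes - previousBytes; // bytes per second
            previousBytes = currentBytes;

            if (averageSpeed < 0) {
                averageSpeed = lastSpeed; // initialize with the first measurement
            } else {
                averageSpeed = SMOOTHING_FACTOR * lastSpeed
                             + (1 - SMOOTHING_FACTOR) * averageSpeed;
            }
        }

        // Estimated seconds remaining, or -1 if no estimate is possible yet.
        long secondsRemaining() {
            if (averageSpeed <= 0) return -1;
            long remainingBytes = totalFileSize - totalBytesDownloaded.get();
            return (long) (remainingBytes / averageSpeed);
        }
    }

Driving sample() from the same once-per-second timer described above gives a countdown that drifts smoothly instead of jumping with every burst of traffic.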