I always get confused about this. Is there a "standard" conversion of Megabits to bytes?
Is it:
1 Megabit == 1,000,000 bits == 125,000 bytes
Or:
1 Megabit == 2^20 bits == 1,048,576 bits == 131,072 bytes
The megabit is a unit of measure that comes from TELECOM, not CS, so it is:
1 Megabit == 1,000,000 bits == 125,000 bytes
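A quick sketch of that decimal (telecom) conversion in Python; the constant and function names here are just illustrative:

```python
# Decimal (telecom/SI) definition: 1 megabit = 1,000,000 bits.
MEGABIT_DECIMAL_BITS = 1_000_000
BITS_PER_BYTE = 8

def megabits_to_bytes_decimal(megabits: float) -> float:
    """Convert megabits to bytes using the decimal definition."""
    return megabits * MEGABIT_DECIMAL_BITS / BITS_PER_BYTE

print(megabits_to_bytes_decimal(1))  # 125000.0
```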
When it's a CS-based unit of measure, the 1024 rules usually apply:
1 Megabyte = 1,024 Kilobytes = 1,024 x 1,024 bytes = 1,048,576 bytes
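And the same kind of sketch assuming the binary (1024-based) convention, again with made-up names:

```python
# Binary (1024-based) convention: 1 megabyte = 1,024 * 1,024 bytes.
BYTES_PER_KILOBYTE = 1_024

def megabytes_to_bytes_binary(megabytes: float) -> float:
    """Convert megabytes to bytes using the 1024-based convention."""
    return megabytes * BYTES_PER_KILOBYTE * BYTES_PER_KILOBYTE

print(megabytes_to_bytes_binary(1))  # 1048576.0
```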