Calculating bits required to store decimal number

user379888 · Aug 22, 2011 · Viewed 132.9k times

This is a homework question that I am stuck with.

Consider unsigned integer representation. How many bits will be required to store a decimal number containing:

i) 3 digits ii) 4 digits iii) 6 digits iv) n digits

I know that the range of an n-bit unsigned integer is 0 to 2^n - 1, but I don't get how the number of bits required to represent a number follows from that. Please help me out.

Thanks in advance.

Answer

guardianpt · Aug 22, 2011

Well, you just have to calculate how many values each case covers and find the lowest power of 2 that is at least that large.

For instance, in i), 3 decimal digits -> 10^3 = 1000 possible numbers, so you have to find the lowest power of 2 that is at least 1000, which in this case is 2^10 = 1024 (10 bits).

Edit: Basically you need to find how many possible numbers you can write with the number of digits you have, and then find how many digits in the other base (in this case base 2, binary) give at least as many possibilities as the decimal one.

To calculate the number of possibilities given the number of digits: possibilities = base^ndigits

So, if you have 3 digits in decimal (base 10) you have 10^3 = 1000 possibilities. Then you have to find a number of digits in binary (bits, base 2) so that the number of possibilities is at least 1000, which in this case is 2^10 = 1024 (9 bits aren't enough because 2^9 = 512, which is less than 1000).
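To make this concrete, here is a minimal Python sketch (my own illustration, not part of the original answer) that counts the possibilities for a given number of decimal digits and keeps adding bits until 2^bits covers them all:

    # Brute-force sketch: possibilities = base^ndigits, then add bits one at a
    # time until 2^bits is at least that large.
    def bits_needed(decimal_digits):
        possibilities = 10 ** decimal_digits
        bits = 0
        while 2 ** bits < possibilities:
            bits += 1
        return bits

    for d in (3, 4, 6):
        print(d, "decimal digits ->", bits_needed(d), "bits")
    # 3 digits -> 10 bits, 4 digits -> 14 bits, 6 digits -> 20 bits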

If you generalize this, you have: 2^nbits >= possibilities <=> nbits >= log2(possibilities)

Applied to i), this gives log2(1000), which is roughly 9.97, and since the number of bits has to be an integer, you have to round it up to 10.
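If you want the generalized formula as code, here is a short sketch (again mine, assuming Python's standard math module): nbits = ceil(log2(10^n)) = ceil(n * log2(10)).

    import math

    # Number of bits needed to store any n-digit decimal number:
    # nbits = ceil(log2(10^n)) = ceil(n * log2(10))
    def bits_for_digits(n):
        return math.ceil(n * math.log2(10))

    print(bits_for_digits(3))   # 10  (log2(1000) is roughly 9.97, rounded up)
    print(bits_for_digits(4))   # 14
    print(bits_for_digits(6))   # 20

For very large n you could avoid floating-point rounding entirely with integer arithmetic, e.g. (10**n - 1).bit_length(), which gives the same result; for homework-sized inputs the log form is fine.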