I have picked up this example, which converts a BitSet to a byte array.
public static byte[] toByteArray(BitSet bits) {
    byte[] bytes = new byte[bits.length() / 8 + 1];
    for (int i = 0; i < bits.length(); i++) {
        if (bits.get(i)) {
            // Bit i goes into byte (bytes.length - i/8 - 1), at position i % 8,
            // so the array is filled from its last element backwards.
            bytes[bytes.length - i / 8 - 1] |= 1 << (i % 8);
        }
    }
    return bytes;
}
But in the discussion forums I have seen it claimed that with this method we won't get all the bits, because we will be losing one bit per calculation. Is this true? Do we need to modify the above method?
No, that's fine. The comment on the post was referring to the other piece of code in the post, which converts from a byte array to a BitSet. I'd use rather more whitespace, admittedly.
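For reference, a reverse conversion in that style might look something like the sketch below. This is only an illustration of the kind of code that comment was about, not the actual code from that post; it simply mirrors the bit layout used by toByteArray above:

import java.util.BitSet;

public static BitSet fromByteArray(byte[] bytes) {
    BitSet bits = new BitSet();
    // Read bits back in the same order toByteArray wrote them:
    // bit i lives in byte (bytes.length - i/8 - 1), at position i % 8.
    for (int i = 0; i < bytes.length * 8; i++) {
        if ((bytes[bytes.length - i / 8 - 1] & (1 << (i % 8))) != 0) {
            bits.set(i);
        }
    }
    return bits;
}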
Also, this can end up with an array that is longer than it needs to be. The array creation expression could be:
    byte[] bytes = new byte[(bits.length() + 7) / 8];
This gives room for as many bits as are required, but no more. Basically, it's equivalent to "divide by 8, but always round up". For example, a BitSet of length 8 needs exactly one byte: (8 + 7) / 8 is 1, whereas 8 / 8 + 1 would allocate two.
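Putting that together, the whole method would read as follows; the loop is unchanged from the original, only the allocation differs:

public static byte[] toByteArray(BitSet bits) {
    // Allocate exactly ceil(bits.length() / 8) bytes.
    byte[] bytes = new byte[(bits.length() + 7) / 8];
    for (int i = 0; i < bits.length(); i++) {
        if (bits.get(i)) {
            bytes[bytes.length - i / 8 - 1] |= 1 << (i % 8);
        }
    }
    return bytes;
}

One side effect worth noting: for an empty BitSet this version returns a zero-length array, where the original returned a single zero byte.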