Looking quickly at the code, the method appears to convert a String of bits representing a byte into an actual byte. For example, it would translate "1101" to the byte value 0x0D (decimal 13).
It does this by examining each character in the string, starting from the least significant one (i.e. str.charAt(str.length() - 1 - i)), converting it to a byte (which in theory should be 0 or 1), and then multiplying it by a power of two determined by its position in the string (i.e. Math.pow(2, i)), where the least significant position has an exponent of 0 (a factor of 2^0 = 1).
It sums up all the “bits” and gives you a final answer.
Note that there seems to be no error checking on the individual bits it parses; the string should contain only 0s and 1s.
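To make the description concrete, here is a minimal sketch of what such a method likely looks like (the class and method names are my own assumptions, not from the original code):

```java
public class BitStringDemo {

    // Sketch of the described method: walk the string from its least
    // significant character, parse each character as a bit, and weight
    // it by the matching power of two. No validation of the input,
    // mirroring the lack of error checking noted above.
    static byte bitStringToByte(String str) {
        byte result = 0;
        for (int i = 0; i < str.length(); i++) {
            // The character at index length - 1 - i is the bit worth 2^i.
            byte bit = Byte.parseByte(String.valueOf(str.charAt(str.length() - 1 - i)));
            result += bit * Math.pow(2, i);
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(bitStringToByte("1101")); // 13, i.e. 0x0D
    }
}
```

Note the compound assignment `result += ...` silently narrows the double returned by Math.pow back to byte, which is why the code compiles despite mixing types; using integer shifts (1 << i) would be the more idiomatic choice.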