In terms of n, where n = |Σ| denotes the alphabet size, what is the maximum number of bits that Huffman's greedy algorithm might use to encode a single symbol? Explain your answer.
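
One way to build intuition before answering is to run Huffman's greedy algorithm on small, highly skewed inputs and watch how deep the tree gets. The sketch below is such an experiment, assuming Fibonacci-like frequencies as the illustrative input (the exercise does not specify any frequencies, and the function and symbol names here are placeholders, not part of the exercise).

```python
# Minimal sketch for exploring the exercise empirically: build a Huffman code
# with a binary heap and report each symbol's codeword length.
import heapq
from itertools import count

def huffman_code_lengths(freqs):
    """Return a dict mapping each symbol to its Huffman codeword length."""
    tiebreak = count()  # keeps heap entries comparable when frequencies tie
    heap = [(f, next(tiebreak), {sym: 0}) for sym, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, d1 = heapq.heappop(heap)   # pop the two least-frequent subtrees...
        f2, _, d2 = heapq.heappop(heap)
        # ...and merge them, which deepens every leaf in both subtrees by one bit
        merged = {s: depth + 1 for s, depth in {**d1, **d2}.items()}
        heapq.heappush(heap, (f1 + f2, next(tiebreak), merged))
    return heap[0][2]

if __name__ == "__main__":
    n = 10
    # Fibonacci-like frequencies (an assumed worst-case-style input): each merged
    # subtree's total frequency stays small, so successive merges keep folding in
    # just one new leaf and the tree grows into a long path.
    fib = [1, 1]
    while len(fib) < n:
        fib.append(fib[-1] + fib[-2])
    freqs = {f"s{i}": f for i, f in enumerate(fib)}
    lengths = huffman_code_lengths(freqs)
    print("longest codeword:", max(lengths.values()), "bits for n =", n)
```

Trying several values of n with this kind of skewed input, and comparing against more balanced frequencies, suggests how the longest codeword length scales with n; the tiebreak counter only serves to make heap entries comparable and does not affect the lengths produced.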