Can someone explain why Brian Kernighan's algorithm takes O(log N) to count the set bits (1s) in an integer? A simple implementation of the algorithm is below (in Java):
int countSetBits(int n) {
    int count = 0;
    while (n != 0) {
        n &= (n - 1); // clears the lowest set bit of n
        count++;
    }
    return count;
}
I understand how it works: each pass of the loop clears the rightmost set bit until n becomes 0. What I don't see is how that gives O(log N).
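In case it helps frame the question, here is a small trace harness (the class name BitTrace and the sample value 0b10110100 are my own, purely for illustration) that prints n in binary after every n &= (n - 1) step, so you can watch one set bit disappear per iteration:

public class BitTrace {
    public static void main(String[] args) {
        int n = 0b10110100; // 180, which has four set bits
        System.out.println(Integer.toBinaryString(n));
        while (n != 0) {
            n &= (n - 1); // clear the lowest set bit
            System.out.println(Integer.toBinaryString(n));
        }
    }
}

For this input it prints 10110100, 10110000, 10100000, 10000000 and finally 0, i.e. the loop body runs exactly once per set bit rather than once per bit position.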