I know that iterating over all subsets of a set of size n is a performance nightmare and will take O(2^n) time.
How about iterating over all subsets of size k (for 0 <= k <= n)? Is that a performance nightmare? I know there are C(n, k) = n! / (k! (n - k)!) possibilities. I know that if k is very close to 0 or very close to n, this is a small number.
What is the worst-case performance in terms of n and k? Is there a simpler way of saying it than O(n! / (k! (n - k)!))? Is this asymptotically smaller than O(n!), or the same?
Answer:
You want Gosper's hack:
int c = (1<<k)-1;
while (c < (1<<n)) {
    dostuff(c);
    int a = c & -c, b = c + a;
    c = (c ^ b) / 4 / a | b;
}

Explanation:
Finding the next number with as many bits set basically reduces to the case of numbers with exactly one "block of ones" --- numbers having a bunch of zeroes, then a bunch of ones, then a bunch of zeroes again in their binary expansions.
The way to deal with such a "block of ones" number is to move the highest bit left by one and slam all the others as low as possible. Gosper's hack works by finding the lowest set bit (a), finding the "high part" comprising the bits we don't touch and the "carry bit" (b), then producing a block of ones of the appropriate size that begins at the least-significant bit.