Time Complexity of a Recursive Function

Updated: 2024-10-25


Need help proving the time complexity of a recursive function. Supposedly it's 2^n. I need to prove that this is the case.

def F(n):
    if n == 0:
        return 0
    else:
        result = 0
        for i in range(n):
            result += F(i)
        return n*result+n

Here's another version that does the same thing. The assignment said to use an array to store values in an attempt to reduce the time complexity so what I did was this

def F2(n, array):
    if n < len(array):
        answer = array[n]
    elif n == 0:
        answer = 0
        array.append(answer)
    else:
        result = 0
        for i in range(n):
            result += F2(i, array)
        answer = n*result+n
        array.append(answer)
    return answer

Again what I am looking for is the explanation of how to find the complexities of the two snippets of code, not interested in just knowing the answer. All and any help greatly appreciated.

PS: for some reason, I can't get "def F2" to stay in the code block...sorry about that

Accepted Answer


Okay, the first function you wrote is an example of Exhaustive Search: you are exploring every possible branch that can be formed from the set of whole numbers up to n (the n you passed as the argument; the for loop does the branching). To explain the time complexity, I am going to treat the recursion call stack as a tree (to represent a recursive function call stack you can use either a stack or an n-ary tree).

Let's call your first function F1:

F1(3): three branches will now be formed, one for each number in the set S (the whole numbers up to n). I have taken n = 3 because it is easy to draw the diagram for it. You can try other, larger numbers and observe the recursion call stack.

          3
        / | \
       0  1  2    ----> the leftmost node returns 0, because n == 0 (the base case)
          |  / \
          0 0   1
                |
                0    ----> returns 0

So here you have explored every possibility branches. If you try to write the recursive equation for the above problem then:

T(n) = 1;                                       n is 0
     = T(n-1) + T(n-2) + T(n-3) + ... + T(1);   otherwise

Here,

T(n-1) = T(n-2) + T(n-3) + ... + T(1).

So, T(n-1) + T(n-2) + T(n-3) + ... + T(1) = T(n-1) + T(n-1)

So, the Recursive equation becomes:

T(n) = 1;            n is 0
     = 2*T(n-1);     otherwise

Now you can easily solve this recurrence relation (or you can use the Master theorem for a fast solution). You will get the time complexity as O(2^n).
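As an empirical sanity check (not part of the original answer), you can count the calls F actually makes with a small wrapper; the counter confirms that F(n) triggers exactly 2^n calls, matching the recurrence. The helper name count_calls is my own:

```python
def count_calls(n):
    """Return (F(n), number of F calls made while computing it)."""
    calls = [0]

    def F(k):
        calls[0] += 1          # count every invocation, including the base case
        if k == 0:
            return 0
        result = sum(F(i) for i in range(k))
        return k * result + k

    return F(n), calls[0]

for n in range(6):
    value, c = count_calls(n)
    print(n, c, 2 ** n)        # the call count matches 2**n exactly
```

Since C(n) = 1 + C(0) + C(1) + ... + C(n-1) and C(0) = 1, induction gives C(n) = 1 + (2^n - 1) = 2^n, which is what the table prints.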

Solving the recurrence relation:

T(n) = 2T(n-1)
     = 2(2T(n-2)) = 4T(n-2)
     = 4(2T(n-3)) = 8T(n-3)
     ...
     = 2^k T(n-k), for some integer k    ----> equation 1

Now we are given the base case where n is 0, so let,

n-k = 0 , i.e. k = n;

Put k = n in equation 1,

T(n) = 2^n * T(n-n)
     = 2^n * T(0)
     = 2^n * 1        // as T(0) is 1
     = 2^n

So the time complexity is O(2^n).

So this is how you can get the time complexity for your first function. Next, if you observe the recursion tree formed above (each node in the tree is a subproblem of the main problem), you will see that nodes are repeating (i.e. the subproblems are repeating). So in your second function F2 you use extra memory to store the already-computed values, and whenever a subproblem occurs again you reuse the precomputed value (this saves the time of computing the same subproblem again and again). This approach is also known as Dynamic Programming.
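A dict can serve the same role as the array cache. The following sketch is illustrative only (the name F_memo and the dict are my additions, not part of the assignment); it computes the same values as F but, like F2, evaluates each subproblem once:

```python
def F_memo(n, memo=None):
    """Dict-based memoized version of F; same caching idea as the array in F2."""
    if memo is None:
        memo = {}                  # fresh cache per top-level call
    if n in memo:
        return memo[n]             # repeated subproblem: reuse stored value
    if n == 0:
        answer = 0
    else:
        answer = n * sum(F_memo(i, memo) for i in range(n)) + n
    memo[n] = answer
    return answer

print(F_memo(4))  # 96, the same value F(4) returns
```

A dict works even if the subproblems were not computed in increasing order of n, whereas the append-based array in F2 relies on value k being stored at index k.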

Let's now look at the second function. Here you return answer, but notice that along the way the function builds up the list named array, and that is where the main cost lies. Calculating its time complexity is simple because there is effectively only one extra level of recursion involved (you could casually say no real recursion is involved): every number i in range(n) is less than n, so once its value has been stored, the first if condition executes and control returns from F2 immediately. Hence no call goes more than two levels deep in the call stack.

So,

Time complexity of the second function
    = time taken to build the array
    = 1 + 1 + 2 + ... + (n-1) comparisons
    = O(n^2).
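You can verify the quadratic bound empirically as well. The helper count_f2_calls below is my own; the closed form 1 + n + n(n-1)/2 counts the top call, its n loop calls, and the k loop calls made while computing each value k fresh:

```python
def count_f2_calls(n):
    """Return (F2(n), number of F2 calls made), starting from an empty cache."""
    calls = [0]

    def F2(k, array):
        calls[0] += 1
        if k < len(array):         # cached: value k is already at index k
            return array[k]
        if k == 0:
            answer = 0
        else:
            answer = k * sum(F2(i, array) for i in range(k)) + k
        array.append(answer)
        return answer

    return F2(n, []), calls[0]

for n in range(6):
    value, c = count_f2_calls(n)
    print(n, c, 1 + n + n * (n - 1) // 2)   # measured calls vs. closed form
```

The call count grows like n^2/2, confirming the O(n^2) analysis above, in contrast to the 2^n calls of the uncached version.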

Let me give you a simple way to observe such recursions more deeply. You can print the recursion stack on the console and observe how the function calls are being made. Below I have written your code where I am printing the function calls.

Code:

def indent(n):
    for i in range(n):
        print(' ' * i, end=' ')

# second argument rec_cnt is just taken to print the indented function properly
def F(n, rec_cnt):
    indent(rec_cnt)
    print('F(' + str(n) + ')')
    if n == 0:
        return 0
    else:
        result = 0
        for i in range(n):
            result += F(i, rec_cnt + 1)
        return n * result + n

# third argument is just taken to print the indented function properly
def F2(n, array, rec_cnt):
    indent(rec_cnt)
    print('F2(' + str(n) + ')')
    if n < len(array):
        answer = array[n]
    elif n == 0:
        answer = 0
        array.append(answer)
    else:
        result = 0
        for i in range(n):
            result += F2(i, array, rec_cnt + 1)
        answer = n * result + n
        array.append(answer)
    return answer

print(F(4, 1))
lis = []
print(F2(4, lis, 1))

Now observe the output:

F(4)
  F(0)
  F(1)
    F(0)
  F(2)
    F(0)
    F(1)
      F(0)
  F(3)
    F(0)
    F(1)
      F(0)
    F(2)
      F(0)
      F(1)
        F(0)
96
F2(4)
  F2(0)
  F2(1)
    F2(0)
  F2(2)
    F2(0)
    F2(1)
  F2(3)
    F2(0)
    F2(1)
    F2(2)
96

In the first function's call stack, i.e. F1's, you can see that each call is explored all the way down to 0, i.e. we explore every possible branch down to the base case; that is why we call it Exhaustive Search.

In the second function's call stack, you can see that the calls only get two levels deep, i.e. they use the precomputed values to solve the repeated subproblems. Thus, its time complexity is lower than that of F1.

Published: 2023-07-28