Question:
I am having trouble reasoning about the time complexity of a backtracking function I was writing. To simplify, say I have a list of size n and I am allowed to place a 0 or a 1 into each element of the list. After trying all combinations, I return. That alone is clearly 2^n. However, what if during each recursive call I also have a constant amount of work to do? I am stuck on what the complexity is then. Can you point me to sources? From my undergrad studies all I remember is the Master theorem, but it does not apply here (the subproblem size is reduced by subtraction rather than division).
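To make it concrete, here is a minimal sketch of what I mean (Python; the `work` callback is just a stand-in for the constant per-call work):

```python
def backtrack(slots, i, work):
    """Try every 0/1 assignment of slots[i:]. `work` stands in for the
    constant amount of work done in each recursive call."""
    work(slots, i)                     # the constant work per call
    if i == len(slots):
        return                         # one complete assignment reached
    for value in (0, 1):               # two branches per position
        slots[i] = value
        backtrack(slots, i + 1, work)

# Usage: enumerate all 2^3 assignments of a list of size 3.
calls = []
backtrack([0, 0, 0], 0, lambda s, i: calls.append(1))
print(len(calls))   # 15 = 2^(3+1) - 1 nodes in the recursion tree
```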
Answer:
In your case, the time complexity satisfies the recurrence T(n) = m + 2T(n - 1), where n is the list size and m is the constant work done in each call. The Master theorem doesn't apply here, but solving by substitution gives roughly m·2^n, i.e. Θ(2^n).
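Unrolling the recurrence makes that explicit:

```
T(n) = m + 2T(n - 1)
     = m + 2m + 4T(n - 2)
     = m(1 + 2 + 4 + ... + 2^(n-1)) + 2^n · T(0)
     = m(2^n - 1) + 2^n · T(0)
     = Θ(2^n)        (since m and T(0) are constants)
```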
Recursion doesn't buy you anything here beyond readability. You could see savings if you combined it with memoization (caching results you've already evaluated), but in that case the time-complexity analysis becomes more… complex, because it then depends on how many distinct subproblems there are.
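Whether memoization actually helps depends on the recursion having overlapping subproblems that can be keyed by a small amount of state; it does nothing if every one of the 2^n assignments genuinely has to be visited. As a hypothetical illustration (not your exact problem): counting 0/1 lists of length n with no two adjacent 1s has the same branch-on-0-or-1 shape, and caching on (index, previous value) collapses the exponential tree to O(n) distinct calls:

```python
from functools import lru_cache

def count_no_adjacent_ones(n):
    # Hypothetical example of how memoization changes the analysis:
    # without the cache the branching is exponential; with it there are
    # only O(n) distinct (i, prev) subproblems.
    @lru_cache(maxsize=None)
    def go(i, prev):
        if i == n:
            return 1                  # one complete valid list
        total = go(i + 1, 0)          # placing a 0 is always allowed
        if prev == 0:
            total += go(i + 1, 1)     # a 1 is only allowed after a 0
        return total
    return go(0, 0)

print(count_no_adjacent_ones(5))      # 13
```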
If you have a better answer, please add a comment about this, thank you!