I have a theoretical question. Suppose I have a computer that doubles its speed every time a function is called recursively, i.e. each recursive call runs in half the time the previous call took. How would I find the complexity of an algorithm on such a machine?
For example, binary search is usually O(log n). I'm guessing that in this case binary search would be O(log(log n))... am I right?
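To make the model concrete, here's a rough Python sketch of what I mean (all the names and the unit of time are just mine): the first call costs 1 time unit, and each deeper recursive call costs half as much, so the total time is a sum over the call depth.

```python
import math

def model_time(depth):
    """Total time for `depth` calls when the depth-0 call costs
    1 unit and each deeper call costs half the previous one."""
    return sum(1 / 2**d for d in range(depth))

def binary_search_model_time(n):
    """Binary search on n elements makes about log2(n) + 1 calls;
    this returns its total time under the speed-doubling model."""
    calls = math.floor(math.log2(n)) + 1 if n > 0 else 0
    return model_time(calls)

# Print the modelled running time for a few input sizes.
for n in (10, 1000, 10**6, 10**9):
    print(n, binary_search_model_time(n))
```

Running this for growing n is basically how I'd like to check my O(log(log n)) guess.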
Thanks in advance
... does so in half the time it took on the previous recursion. ... e.g. binary search is usually O. ... Also be catious about using words like 'complexity' to refer to an execution time estimate on a non-conventional computer. ...