This notebook contains an excerpt from Python Programming and Numerical Methods - A Guide for Engineers and Scientists. The content is also available at Berkeley Python Numerical Methods.
The copyright of the book belongs to Elsevier. We also have this interactive book online for a better learning experience. The code is released under the MIT license. If you find this content useful, please consider supporting the work on Elsevier or Amazon!
Summary
- The complexity of an algorithm is the relationship between the size of the input problem and the time it takes the algorithm to terminate.
- Big-O notation is a standard method of classifying algorithmic complexity in a way that is computer- and operating-system-independent.
- Algorithms with log complexity are faster than algorithms with polynomial complexity, which are faster than algorithms with exponential complexity.
- The Profiler is a useful tool for determining where your code is running slowly so that you can improve its performance.
Problems
How would you define the size of the following tasks?
- Solving a jigsaw puzzle.
- Passing a handout to a class.
- Walking to class.
- Finding a name in a dictionary.
For the tasks given in the previous problem, what is the Big-O complexity of each task in terms of the size definitions you gave?
You may be surprised to know that there is a \(\log\) time algorithm for finding a word in an \(n\)-word dictionary. Instead of starting at the beginning of the list, you go to the middle. If this is the word you are looking for, then you are done. If the current word comes after the word you are looking for, then look halfway between the first word and the current word. If it comes before the word you are looking for, then look halfway between the current word and the last word. Keep repeating this process until you find the word. This algorithm is known as a binary search, and it is \(\log\) time because the search space is cut in half at each iteration and therefore requires at most \(\log_2(n)\) iterations to find the word. Hence the run time grows only logarithmically in the length of the list.
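To make the procedure concrete, here is a minimal sketch of binary search over a sorted list of words; the function name and the sample list are our own illustration:

```python
def binary_search(words, target):
    # Return the index of target in the sorted list words, or -1 if absent
    lo, hi = 0, len(words) - 1
    while lo <= hi:
        mid = (lo + hi) // 2        # jump to the middle of the current range
        if words[mid] == target:
            return mid
        elif words[mid] < target:   # middle word comes before the target
            lo = mid + 1            # search the upper half
        else:                       # middle word comes after the target
            hi = mid - 1            # search the lower half
    return -1

binary_search(['ant', 'bat', 'cat', 'dog', 'emu'], 'cat')  # returns 2
```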
There is a way to look up a word in \(O(1)\) or constant time. This means that no matter how long the list is, it takes the same amount of time! Can you think of how this is done? Hint: Research hash functions.
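As a concrete starting point for the hint, Python's own dict type is implemented as a hash table, so key lookups take constant time on average; the toy dictionary below is our own example:

```python
# The key is hashed to locate its bucket directly, so the lookup cost
# does not grow with the number of entries in the dictionary.
definitions = {'cat': 'a small domesticated feline',
               'dog': 'a domesticated canine'}
definitions['dog']  # O(1) on average, regardless of len(definitions)
```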
What is the complexity of the algorithms that compute the following recursive relationships? Classify each as log time, polynomial time, or exponential time in terms of \(n\), given that the implementation is (a) recursive and (b) iterative.
Tribonacci, \(T(n)\): \(T(n) = T(n-1) + T(n-2) + T(n-3)\) if \(n > 3\), and \(T(n) = 1\) if \(n \le 3\).
Timmynacci, \(t(n)\): \(t(n) = t(n/2) + t(n/4)\) if \(n > 1\), and \(t(n) = 1\) if \(n \le 1\).
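A direct transcription of these recurrences into recursive Python functions is sketched below; the function names are our own, and the base cases follow the definitions above:

```python
def tribonacci(n):
    # T(n) = T(n-1) + T(n-2) + T(n-3) for n > 3; T(n) = 1 otherwise
    if n <= 3:
        return 1
    return tribonacci(n - 1) + tribonacci(n - 2) + tribonacci(n - 3)

def timmynacci(n):
    # t(n) = t(n/2) + t(n/4) for n > 1; t(n) = 1 otherwise
    if n <= 1:
        return 1
    return timmynacci(n / 2) + timmynacci(n / 4)
```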
What is the Big-O complexity of the Towers of Hanoi problem given in Chapter 6? Is the complexity an upper bound or is it exact?
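If you want to experiment, a standard recursive formulation of the Towers of Hanoi is sketched below; this is our own transcription, not the Chapter 6 code. Counting the moves it prints for increasing \(n\) is a good starting point:

```python
def my_tower_of_hanoi(n, source='A', target='C', spare='B'):
    # Moving n disks takes two moves of n-1 disks plus one direct move,
    # i.e., M(n) = 2*M(n-1) + 1 moves in total
    if n == 0:
        return
    my_tower_of_hanoi(n - 1, source, spare, target)
    print(f'Move disk {n} from {source} to {target}')
    my_tower_of_hanoi(n - 1, spare, target, source)
```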
What is the Big-O complexity of the quicksort algorithm?
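If you do not have an implementation at hand, a minimal (not in-place) quicksort sketch such as the one below can be used to reason about the best and worst cases; the function name and the choice of the first element as pivot are our own:

```python
def my_quicksort(lst):
    # Pick the first element as pivot and recurse on the two partitions
    if len(lst) <= 1:
        return lst
    pivot, rest = lst[0], lst[1:]
    smaller = [x for x in rest if x < pivot]
    larger = [x for x in rest if x >= pivot]
    return my_quicksort(smaller) + [pivot] + my_quicksort(larger)
```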
Run the following two iterative implementations of finding Fibonacci numbers in the line_profiler, and use the %timeit magic command to get the repeated run time. The first implementation preallocates memory to an array that stores all the Fibonacci numbers. The second implementation expands the list at each iteration of the for-loop.
```python
import numpy as np

def my_fib_iter1(n):
    # Preallocate an array for all n Fibonacci numbers
    out = np.zeros(n)
    out[:2] = 1
    for i in range(2, n):
        out[i] = out[i-1] + out[i-2]
    return out

def my_fib_iter2(n):
    # Start from the first two Fibonacci numbers and grow the list
    out = [1, 1]
    for i in range(2, n):
        out.append(out[i-1] + out[i-2])
    return np.array(out)
```
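In a notebook, a session along the following lines can be used; this is a sketch assuming the line_profiler extension is installed, and n = 1000 is an arbitrary test size:

```python
%load_ext line_profiler

# Line-by-line profile of each implementation
%lprun -f my_fib_iter1 my_fib_iter1(1000)
%lprun -f my_fib_iter2 my_fib_iter2(1000)

# Repeated run time of each implementation
%timeit my_fib_iter1(1000)
%timeit my_fib_iter2(1000)
```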