Analyzing how fast algorithms run

We care about how fast an algorithm runs, and we look for clever ways to make our algorithms faster.

We use a notation called big O notation to analyze how fast an algorithm runs and how it scales as its input grows. We often focus on the worst-case scenario, i.e. what happens with an input that makes the algorithm run as slowly as possible? (E.g. we saw this in binary search when we analyzed the maximum number of steps it takes us to guess a number.)

O(1) constant time

In [ ]:
# Assigning a variable or indexing into a list takes a constant
# number of steps, no matter how big the list is
L = list(range(1000))
L[0] = 5   # doesn't matter how big L is
L[10] = 10

O(n) linear time

In [28]:
# E.g. a single for loop: finding the minimum visits each element once,
# so the work grows linearly with the length of the list
x = [1, 5, 3, 2, 30, 30, 22]
min_val = x[0]
for i in x:
    if i < min_val:   # one comparison per element
        min_val = i
print(min_val)
1

O($n^2$) quadratic time

In [6]:
# Two nested for loops, e.g. intersecting two lists L1 and L2:
# for each of the n elements of L1 we scan all n elements of L2,
# so the work grows as n*n = n^2
L1 = [3, 5, 6, 7, 9, 10, 8]      # 7 elements
L2 = [9, 7, 6, 30, 20, 10, 15]   # 7 elements
intersection = []

for i in L1:
    for j in L2:
        if i == j:
            intersection += [i]
print(intersection)
[6, 7, 9, 10]

O($n^3$) cubic time

In [8]:
# Three nested for loops, e.g. intersecting three lists L1, L2, and L3:
# n*n*n = n^3 comparisons
L1 = [3, 5, 6, 7, 9, 10, 8]
L2 = [9, 7, 6, 20, 11, 7, 15]
L3 = [6, 8, 10, 20, 9, 20, 40]
intersection = []

for i in L1:
    for j in L2:
        for k in L3:
            if i == j == k:
                intersection += [i]
print(intersection)
[6, 9]

O(log(n)) logarithmic time

In [21]:
# E.g. binary search: searching through 1 billion numbers (roughly the
# number of Facebook users) takes only about 30 steps, because each
# step halves the remaining range
from math import log
log(1e9, 2)
Out[21]:
29.897352853986263
With linear search it would take up to 1 billion steps.
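
For reference, here is a minimal sketch of binary search on a sorted list (the function name and test list are my own illustration, not the notebook's original code). Each comparison discards half of the remaining range, which is where the log(n) comes from.

In [ ]:
# Binary search sketch (illustrative; assumes the input list is sorted)
def binary_search(sorted_list, target):
    lo, hi = 0, len(sorted_list) - 1
    while lo <= hi:
        mid = (lo + hi) // 2               # look at the middle element
        if sorted_list[mid] == target:
            return mid                     # found it
        elif sorted_list[mid] < target:
            lo = mid + 1                   # discard the left half
        else:
            hi = mid - 1                   # discard the right half
    return -1                              # target is not in the list

binary_search([3, 5, 6, 7, 8, 9, 10], 9)   # returns index 5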

O(n log(n)) log-linear time

e.g. merge sort

In [18]:
# n*log2(n) steps for n = 1 billion: about 30 billion steps
1e9 * log(1e9, 2)
Out[18]:
29897352853.986263
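
A minimal sketch of merge sort (my own illustration, not the notebook's original code): split the list in half, sort each half recursively, then merge the two sorted halves. There are about log2(n) levels of splitting and each level does O(n) work of merging, which gives O(n log(n)) overall.

In [ ]:
def merge_sort(lst):
    if len(lst) <= 1:                      # base case: already sorted
        return lst
    mid = len(lst) // 2
    left = merge_sort(lst[:mid])           # sort each half recursively
    right = merge_sort(lst[mid:])
    merged = []                            # merge the two sorted halves
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]   # append whatever is left over

print(merge_sort([3, 5, 6, 7, 9, 10, 8]))
[3, 5, 6, 7, 8, 9, 10]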

Compare with selection sort, which is O($n^2$):

In [25]:
# n^2 steps for n = 1 billion: 10^18 steps, vastly more than n*log(n)
1e9 ** 2
Out[25]:
1e+18
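
And a minimal sketch of selection sort (again my own illustration): on every pass, find the smallest remaining element and swap it to the front of the unsorted part. Scanning the unsorted tail n times gives O($n^2$) total work.

In [ ]:
def selection_sort(lst):
    for i in range(len(lst)):
        min_idx = i
        for j in range(i + 1, len(lst)):   # scan the unsorted tail
            if lst[j] < lst[min_idx]:
                min_idx = j
        lst[i], lst[min_idx] = lst[min_idx], lst[i]   # swap into place
    return lst

print(selection_sort([3, 5, 6, 7, 9, 10, 8]))
[3, 5, 6, 7, 8, 9, 10]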