# Think Bayes

This notebook presents example code and exercise solutions for Think Bayes.

In [7]:
# Configure Jupyter so figures appear in the notebook
%matplotlib inline

# Configure Jupyter to display the assigned value after an assignment
%config InteractiveShell.ast_node_interactivity='last_expr_or_assign'

# import classes from thinkbayes2
from thinkbayes2 import Hist, Pmf, Suite


Exercise: Let's consider a more general version of the Monty Hall problem where Monty is more unpredictable. As before, Monty never opens the door you chose (let's call it A) and never opens the door with the prize. So if you choose the door with the prize, Monty has to decide which door to open. Suppose he opens B with probability p and C with probability 1-p.

1. If you choose A and Monty opens B, what is the probability that the prize is behind A, in terms of p?

2. What if Monty opens C?

Hint: you might want to use SymPy to do the algebra for you.
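Before reaching for `thinkbayes2`, it can help to see the Bayes table written out by hand. The following sketch (not part of the original notebook) uses only SymPy: a uniform prior over the three doors, and the likelihood of the data "Monty opens B" under each hypothesis.

```python
from sympy import symbols, simplify, Rational

p = symbols('p')

# Prior: the prize is equally likely to be behind A, B, or C
prior = Rational(1, 3)

# Likelihood of "Monty opens B" under each hypothesis:
#   prize behind A -> Monty chooses B with probability p
#   prize behind B -> Monty never opens the prize door, probability 0
#   prize behind C -> Monty must open B, probability 1
like = {'A': p, 'B': 0, 'C': 1}

# Unnormalized posteriors and the normalizing constant
unnorm = {h: prior * like[h] for h in 'ABC'}
total = sum(unnorm.values())

posterior_A = simplify(unnorm['A'] / total)
```

This reproduces the algebra that `Pmf.Normalize` does below, and `posterior_A` reduces to p/(p + 1).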

In [8]:
from sympy import symbols
p = symbols('p')

Out[8]:
p
In [9]:
# Solution

# Here's the solution if Monty opens B.

pmf = Pmf('ABC')
pmf['A'] *= p
pmf['B'] *= 0
pmf['C'] *= 1
pmf.Normalize()
pmf['A'].simplify()

Out[9]:
1.0*p/(p + 1)
In [10]:
# Solution

# When p=0.5, the result is what we saw before

pmf['A'].evalf(subs={p:0.5})

Out[10]:
0.333333333333333
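As an independent sanity check (not part of the original notebook), a quick Monte Carlo simulation of the generalized game should agree with the analytic answer p/(p + 1); for p = 0.5 that is 1/3. The function name and trial count below are my own choices.

```python
import random

def simulate(p_b, trials=100_000, seed=1):
    """Estimate P(prize behind A | you pick A, Monty opens B) by simulation."""
    random.seed(seed)
    opened_b = 0
    prize_a_given_b = 0
    for _ in range(trials):
        prize = random.choice('ABC')
        if prize == 'A':
            # Monty chooses between B and C with probability p_b for B
            door = 'B' if random.random() < p_b else 'C'
        elif prize == 'B':
            door = 'C'  # Monty never opens the prize door
        else:
            door = 'B'
        if door == 'B':
            opened_b += 1
            if prize == 'A':
                prize_a_given_b += 1
    return prize_a_given_b / opened_b

est = simulate(0.5)
```

With 100,000 trials the estimate lands close to 1/3, matching the evalf result above.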
In [11]:
# Solution

# When p=0.0, we know for sure that the prize is behind C

pmf['C'].evalf(subs={p:0.0})

Out[11]:
1.00000000000000
In [12]:
# Solution

# And here's the solution if Monty opens C.

pmf = Pmf('ABC')
pmf['A'] *= 1-p
pmf['B'] *= 1
pmf['C'] *= 0
pmf.Normalize()
pmf['A'].simplify()

Out[12]:
0.333333333333333*(p - 1)/(0.333333333333333*p - 0.666666666666667)
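The float coefficients in Out[12] obscure the answer (they arise because `Normalize` returns a float). Redoing the same update with exact rationals in SymPy, a sketch not in the original notebook, shows the posterior reduces to the tidy form (1 - p)/(2 - p), which gives the familiar 1/3 at p = 1/2.

```python
from sympy import symbols, simplify, Rational

p = symbols('p')

# Same update as above, with exact arithmetic:
# prior 1/3 each; likelihoods of "Monty opens C" are 1-p, 1, 0 for A, B, C
unnorm_A = Rational(1, 3) * (1 - p)
total = Rational(1, 3) * (1 - p) + Rational(1, 3) * 1

posterior_A_given_C = simplify(unnorm_A / total)
```

Substituting p = 1/2 into (1 - p)/(2 - p) recovers 1/3, the standard Monty Hall result.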