Implement the Golden Section Search algorithm
Input parameters:
function $f$;
initial interval $[a,b]$;
tolerance $tol$ (the program terminates when the interval length is $< tol$)
For each iteration, print: the iteration number, $x_1$, $f(x_1)$, $x_2$, and $f(x_2)$.
import numpy as np
from scipy.optimize import minimize_scalar
import matplotlib.pyplot as plt
def golden_section_search(f, a, b, tol):
    pass  # TODO: fill in, following the spec above
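One way to fill in the stub (a minimal sketch; the shrink ratio $\rho = (\sqrt{5}-1)/2$ and returning the final interval's midpoint are choices the handout leaves open):

```python
import numpy as np

def golden_section_search(f, a, b, tol):
    rho = (np.sqrt(5) - 1) / 2          # inverse golden ratio, ~0.618
    x1 = b - rho * (b - a)              # interior points dividing [a, b]
    x2 = a + rho * (b - a)
    f1, f2 = f(x1), f(x2)
    k = 0
    while b - a >= tol:                 # stop once interval length < tol
        k += 1
        print(k, x1, f1, x2, f2)        # iteration number, x1, f(x1), x2, f(x2)
        if f1 < f2:                     # minimizer lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - rho * (b - a)
            f1 = f(x1)
        else:                           # minimizer lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + rho * (b - a)
            f2 = f(x2)
    return (a + b) / 2
```

For Exercise 1, `golden_section_search(lambda x: 0.5 - x*np.exp(-x**2), 0, 2, 1e-3)` converges to $x \approx 0.7071$ (the true minimizer is $1/\sqrt{2}$).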
Exercise 1
Use your implemented algorithm to solve:
$\text{min } f(x)=0.5-xe^{-x^2}$ in interval $[0,2]$, with tolerance $0.001$.
Before running the code, can we determine in advance the number of iterations needed for the algorithm to terminate?
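As a hint toward this question: each golden section iteration shrinks the bracket by the constant factor $\rho = (\sqrt{5}-1)/2 \approx 0.618$, so the iteration count can be bounded before running anything (a quick check, assuming the loop stops once the interval length drops below $tol$):

```python
import math

a, b, tol = 0.0, 2.0, 1e-3
rho = (math.sqrt(5) - 1) / 2   # per-iteration shrink factor, ~0.618
# smallest n with (b - a) * rho**n < tol
n = math.ceil(math.log(tol / (b - a)) / math.log(rho))
print(n)  # -> 16
```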
Exercise 2
Use the library functions for Brent's method (MATLAB: fminbnd; Python: scipy.optimize.minimize_scalar) to solve the problem in exercise 1:
$\text{min } f(x)=0.5-xe^{-x^2}$ in interval $[0,2]$, with tolerance $0.001$.
How many iterations does each method use? Which one is faster?
minimize_scalar(f, bracket=(a, b), method='brent', tol=tol)
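A self-contained call for Exercise 2 might look like the following (the objective here is my transcription of the problem; `nit` and `nfev` are the iteration and function-evaluation counts SciPy's result object exposes for Brent's method):

```python
import numpy as np
from scipy.optimize import minimize_scalar

f = lambda x: 0.5 - x * np.exp(-x**2)
res = minimize_scalar(f, bracket=(0, 2), method='brent', tol=1e-3)
print(res.x, res.fun, res.nit, res.nfev)  # minimizer, minimum, iterations, evals
```

Comparing `res.nit` against the golden section iteration count answers the question directly.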
Take home exercise 1
$\text{min } f(x) = e^{-x} - \cos(x)$ in interval $[0,1]$, with tolerance $0.01$.
Prove $f(x)$ is unimodal on interval $[0,1]$;
Prove there is a unique global minimizer in $(0,1)$;
Calculate by hand the number of iterations needed to reach the tolerance;
Run your golden section search algorithm and report the results $x_1, f(x_1), x_2, f(x_2)$ for each iteration.
Run Brent's method and report the number of iterations needed. Compare the result with golden section search.
Take home exercise 2
$\text{min } f(x)=\sin\left(\frac{1}{x}\right)+\cos\left(\frac{1}{\sqrt{x}}\right)$
Using your favorite method, find out how many local minima $f(x)$ has on the interval $[0.005, 0.2]$.
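One possible approach is a dense grid scan (a sketch, not the only valid method; the grid size is an arbitrary choice, picked fine enough to resolve the oscillations of $\sin(1/x)$ near $x = 0.005$, where the shortest period is roughly $2\pi x^2 \approx 1.6\times 10^{-4}$):

```python
import numpy as np

f = lambda x: np.sin(1 / x) + np.cos(1 / np.sqrt(x))
x = np.linspace(0.005, 0.2, 200_001)  # spacing ~1e-6, far below the shortest period
y = f(x)
# interior local minima: points strictly below both neighbours
minima = (y[1:-1] < y[:-2]) & (y[1:-1] < y[2:])
print(int(minima.sum()))
```

Each flagged grid point could then be refined with a bracketed minimizer if precise locations are needed.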