Inpainting using Sparse Regularization

Important: Please read the installation page for details about how to install the toolboxes. $\newcommand{\dotp}[2]{\langle #1, #2 \rangle}$ $\newcommand{\enscond}[2]{\lbrace #1, #2 \rbrace}$ $\newcommand{\pd}[2]{ \frac{ \partial #1}{\partial #2} }$ $\newcommand{\umin}[1]{\underset{#1}{\min}\;}$ $\newcommand{\umax}[1]{\underset{#1}{\max}\;}$ $\newcommand{\umin}[1]{\underset{#1}{\min}\;}$ $\newcommand{\uargmin}[1]{\underset{#1}{argmin}\;}$ $\newcommand{\norm}[1]{\|#1\|}$ $\newcommand{\abs}[1]{\left|#1\right|}$ $\newcommand{\choice}[1]{ \left\{ \begin{array}{l} #1 \end{array} \right. }$ $\newcommand{\pa}[1]{\left(#1\right)}$ $\newcommand{\diag}[1]{{diag}\left( #1 \right)}$ $\newcommand{\qandq}{\quad\text{and}\quad}$ $\newcommand{\qwhereq}{\quad\text{where}\quad}$ $\newcommand{\qifq}{ \quad \text{if} \quad }$ $\newcommand{\qarrq}{ \quad \Longrightarrow \quad }$ $\newcommand{\ZZ}{\mathbb{Z}}$ $\newcommand{\CC}{\mathbb{C}}$ $\newcommand{\RR}{\mathbb{R}}$ $\newcommand{\EE}{\mathbb{E}}$ $\newcommand{\Zz}{\mathcal{Z}}$ $\newcommand{\Ww}{\mathcal{W}}$ $\newcommand{\Vv}{\mathcal{V}}$ $\newcommand{\Nn}{\mathcal{N}}$ $\newcommand{\NN}{\mathcal{N}}$ $\newcommand{\Hh}{\mathcal{H}}$ $\newcommand{\Bb}{\mathcal{B}}$ $\newcommand{\Ee}{\mathcal{E}}$ $\newcommand{\Cc}{\mathcal{C}}$ $\newcommand{\Gg}{\mathcal{G}}$ $\newcommand{\Ss}{\mathcal{S}}$ $\newcommand{\Pp}{\mathcal{P}}$ $\newcommand{\Ff}{\mathcal{F}}$ $\newcommand{\Xx}{\mathcal{X}}$ $\newcommand{\Mm}{\mathcal{M}}$ $\newcommand{\Ii}{\mathcal{I}}$ $\newcommand{\Dd}{\mathcal{D}}$ $\newcommand{\Ll}{\mathcal{L}}$ $\newcommand{\Tt}{\mathcal{T}}$ $\newcommand{\si}{\sigma}$ $\newcommand{\al}{\alpha}$ $\newcommand{\la}{\lambda}$ $\newcommand{\ga}{\gamma}$ $\newcommand{\Ga}{\Gamma}$ $\newcommand{\La}{\Lambda}$ $\newcommand{\si}{\sigma}$ $\newcommand{\Si}{\Sigma}$ $\newcommand{\be}{\beta}$ $\newcommand{\de}{\delta}$ $\newcommand{\De}{\Delta}$ $\newcommand{\phi}{\varphi}$ $\newcommand{\th}{\theta}$ $\newcommand{\om}{\omega}$ $\newcommand{\Om}{\Omega}$

This numerical tour explores the use of sparse energies to regularize the image inpainting problem.

In [2]:
using PyPlot
using NtToolBox
# using Autoreload
# arequire("NtToolBox")

Here we consider inpainting of damaged observations without noise.

Sparse Regularization

This tour considers measurements $y=\Phi f_0 + w$ where $\Phi$ is a masking operator and $w$ is an additive noise.

This tour focuses on using sparsity to recover an image from the measurements $y$. It considers a synthesis-based regularization, which computes a sparse set of coefficients $ (a_m^{\star})_m $ in a frame $\Psi = (\psi_m)_m$ that solves $$a^{\star} \in \text{argmin}_a \: \frac{1}{2}\|y-\Phi \Psi a\|^2 + \lambda J(a)$$

where $\lambda$ should be adapted to the noise level $\|w\|$. Since in this tour we consider damaged observations without noise, i.e. $w=0$, we use either a very small value of $\lambda$, or we decay its value through the iterations of the recovery process.

Here we use the notation $$\Psi a = \sum_m a_m \psi_m$$ to indicate the reconstruction operator, and $J(a)$ is the $\ell^1$ sparsity prior $$J(a)=\sum_m |a_m|.$$

Missing Pixels and Inpainting

Inpainting corresponds to filling holes in images. This is a linear ill-posed inverse problem.

You might want to first do the numerical tour Variational image inpainting, which uses Sobolev and TV priors to perform the inpainting.

First we load the image to be inpainted.

In [3]:
n = 128
f0 = load_image("NtToolBox/src/data/lena.png")
f0 = rescale(f0[256 - Base.div(n, 2) + 1 : 256 + Base.div(n, 2), 256 - Base.div(n, 2) + 1 : 256 + Base.div(n, 2)]);

Display it.

In [39]:
figure(figsize = (6, 6))
imageplot(f0, L"Image $f_0$")

Amount of removed pixels.

In [40]:
rho = .7;

Then we construct a mask $\Omega$ made of random pixel locations.

In [41]:
Omega = zeros(n, n)
sel = randperm(n^2)
Omega[sel[1:Int(round(rho*n^2))]] = 1;

The damaging operator sets to zero the pixel locations $x$ for which $\Omega(x)=1$.

In [42]:
Phi = (f, Omega) -> f.*(1 - Omega);

The damaged observations read $y = \Phi f_0$.

In [43]:
y = Phi(f0, Omega);
Out[43]:
128×128 Array{Float64,2} (output truncated)

Display the observations.

In [44]:
figure(figsize = (6, 6))
imageplot(y, "Observations y")

Soft Thresholding in a Basis

The soft thresholding operator is at the heart of $\ell^1$ minimization schemes. It can be applied to coefficients $a$, or to an image $f$ in an ortho-basis.

The soft thresholding is a 1-D functional that shrinks the value of coefficients. $$ s_T(u)=\max(0,1-T/|u|)u $$

Define a shortcut for this soft thresholding 1-D functional.

In [45]:
SoftThresh = (x,T) -> x.*max( 0, 1 - T./max(abs(x), 1e-10) );

Display a curve of the 1D soft thresholding.

In [46]:
x = linspace(-1, 1, 1000)

figure(figsize = (7, 5))
plot(x, SoftThresh(x, .5))
show()

Note that the function SoftThresh can also be applied to a vector, which defines an operator on coefficients: $$ S_T(a) = ( s_T(a_m) )_m. $$

In the next section, we use an orthogonal wavelet basis $\Psi$.

We set the parameters of the wavelet transform.

In [47]:
Jmax = log2(n) - 1
Jmin = (Jmax - 3);

Shortcut for $\Psi$ and $\Psi^*$ in the orthogonal case.

In [48]:
Psi = a -> NtToolBox.perform_wavelet_transf(a, Jmin, -1, "9-7", 0, 0)
PsiS = f -> NtToolBox.perform_wavelet_transf(f, Jmin, +1, "9-7", 0, 0);

The soft thresholding operator in the basis $\Psi$ is defined as $$S_T^\Psi(f) = \sum_m s_T( \langle f,\psi_m \rangle ) \psi_m.$$

It thus corresponds to applying the transform $\Psi^*$, thresholding the coefficients using $S_T$ and then undoing the transform using $\Psi$. $$ S_T^\Psi(f) = \Psi \circ S_T \circ \Psi^*$$

In [49]:
SoftThreshPsi = (f, T) -> Psi(SoftThresh(PsiS(f), T));

This soft thresholding corresponds to a denoising operator.

In [50]:
figure(figsize = (6, 6))
imageplot(clamP(SoftThreshPsi(f0, 0.1)))

Inpainting using Orthogonal Wavelet Sparsity

If $\Psi$ is an orthogonal basis, the change of variable $a = \Psi^* f$ shows that the synthesis prior is also an analysis prior, which reads $$f^{\star} \in \text{argmin}_f \: E(f) = \frac{1}{2}\|y-\Phi f\|^2 + \lambda \sum_m |\langle f,\psi_m \rangle|. $$

To solve this non-smooth optimization problem, one can use forward-backward splitting, also known as iterative soft thresholding.

It computes a series of images $f^{(\ell)}$ defined as $$ f^{(\ell+1)} = S_{\tau\lambda}^{\Psi}( f^{(\ell)} - \tau \Phi^{*} (\Phi f^{(\ell)} - y) ) $$

Set up the value of the threshold.

In [52]:
lambd = .03;

In our setting, we have $ \Phi^* = \Phi $, which is an operator of norm $1$.

For $f^{(\ell)}$ to converge to a solution of the problem, the gradient step size should be chosen as $$\tau < \frac{2}{\|\Phi^* \Phi\|} = 2$$

In the following we use: $$\tau = 1$$

Since we use $ \tau=1 $ and $ \Phi = \Phi^* = \text{diag}(1-\Omega) $, the gradient descent step is a projection onto the inpainting constraint set $$ C = \{ f : \forall x, \; \Omega(x)=0 \Rightarrow f(x)=y(x) \}. $$ One thus has $$ f - \tau \Phi^{*} (\Phi f - y) = \text{Proj}_C(f). $$

For the sake of simplicity, we define a shortcut for this projection operator.

In [53]:
ProjC = (f, Omega) -> Omega.*f + (1 - Omega).*y;

Each iteration of the forward-backward (iterative thresholding) algorithm thus reads: $$ f^{(\ell+1)} = S_{\lambda}^\Psi( \text{Proj}_C(f^{(\ell)}) ). $$

Initialize the iterations.

In [54]:
fSpars = y;

First step: gradient descent.

In [55]:
fSpars = ProjC(fSpars, Omega);

Second step: denoise the solution by thresholding.

In [56]:
fSpars = SoftThreshPsi(fSpars, lambd);

Exercise 1

Perform the iterative soft thresholding. Monitor the decay of the energy $E$ you are minimizing.

In [57]:
include("NtSolutions/inverse_5_inpainting_sparsity/exo1.jl")
In [21]:
## Insert your code here.
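
A minimal sketch of one possible solution (assuming the shortcuts ProjC, SoftThreshPsi, PsiS and the parameters defined above; the number of iterations is arbitrary):

niter = 300
fSpars = copy(y)
E = []
for i in 1 : niter
    # gradient step = projection on the inpainting constraint
    fSpars = ProjC(fSpars, Omega)
    # backward step = soft thresholding in the wavelet basis
    fSpars = SoftThreshPsi(fSpars, lambd)
    # energy E(f) = 1/2*|y - Phi f|^2 + lambda * sum_m |<f, psi_m>|
    E = [E; 1/2*vecnorm(y - Phi(fSpars, Omega))^2 + lambd*sum(abs(PsiS(fSpars)))]
end

figure(figsize = (7, 5))
plot(E)
show()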

Display the result.

In [58]:
figure(figsize = (6, 6))
imageplot(clamP(fSpars))

Exercise 2

Since there is no noise, one should in theory take $\lambda \rightarrow 0$. To do this, decay the value of $\lambda$ through the iterations.

In [59]:
include("NtSolutions/inverse_5_inpainting_sparsity/exo2.jl")
In [24]:
## Insert your code here.
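
A minimal sketch of one possible strategy (an assumption, not the only choice): decay $\lambda$ linearly from its initial value down to zero along the iterations.

niter = 1000
lambda_list = linspace(.03, 0, niter)
fSpars = copy(y)
for i in 1 : niter
    # projection on the constraint, then thresholding with the current lambda
    fSpars = SoftThreshPsi(ProjC(fSpars, Omega), lambda_list[i])
end

figure(figsize = (6, 6))
imageplot(clamP(fSpars))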

Inpainting using Translation Invariant Wavelet Sparsity

Orthogonal sparsity gives a poor regularization because of the lack of translation invariance. The regularization is improved by considering $\Psi$ as a redundant tight frame of translation invariant wavelets.

One thus looks for the optimal coefficients $a^\star$ solving $$a^{\star} \in \text{argmin}_a \: E(a) = \frac{1}{2}\|y-\Phi \Psi a\|^2 + \lambda J(a)$$

Important: The operator $\Psi^*$ is the forward translation invariant wavelet transform. It computes the inner product with the unit norm wavelet atoms: $$ (\Psi^* f)_m = \langle f,\psi_m \rangle \quad \text{with} \quad \|\psi_m\|=1. $$

The reconstruction operator $\Xi$ satisfies $ \Xi \Psi^* f = f $: it is the pseudo-inverse of the analysis operator, $ \Xi = (\Psi^*)^+ $.

For our algorithm, we will need to use $\Psi$ and not $\Xi$. Luckily, for the wavelet transform, one has $$ \Xi = \Psi \, \text{diag}(U) $$ where $U_m$ accounts for the redundancy of the scale of the atom $\psi_m$.

Compute the scaling factor (inverse of the redundancy).

In [68]:
J = Jmax - Jmin + 1;

u = reshape([4^(-J); 4.^(-floor(J+2/3:-1/3:1))], (1, 1, 13))

U = repeat(u, inner = [n, n, 1])
Out[68]:
128×128×13 Array{Float64,3}: each slice is constant, ranging from $4^{-J} = 0.00390625$ at the coarsest scale up to $0.25$ at the finest scales (output truncated)

Choose a value of the regularization parameter.

In [70]:
lambd = .01;

Shortcut for the wavelet transform and the reconstruction.

In [71]:
Xi = a -> NtToolBox.perform_wavelet_transf(a, Jmin, -1, "9-7", 0, 1)
PsiS = f -> NtToolBox.perform_wavelet_transf(f, Jmin, + 1, "9-7", 0, 1)
Psi = a -> Xi(a./U);
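
As a quick sanity check (a sketch, not part of the original tour; it assumes the definitions above), one can verify numerically that $\Xi$ is a left inverse of $\Psi^*$, i.e. that $\Xi \Psi^* f_0 = f_0$ up to numerical precision:

# relative error of Xi(PsiS(f0)) against f0; should be near machine precision
println(vecnorm(Xi(PsiS(f0)) - f0)/vecnorm(f0))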

The forward-backward algorithm now computes a series of wavelet coefficients $a^{(\ell)}$ defined as $$a^{(\ell+1)} = S_{\tau\lambda}\big( a^{(\ell)} + \tau \, \Psi^*\Phi( y - \Phi\Psi a^{(\ell)} ) \big). $$

The soft thresholding is defined as: $$\forall m, \quad S_T(a)_m = \max(0, 1-T/|a_m|)\, a_m. $$

The step size should satisfy: $$\tau < \frac{2}{\|\Phi\Psi\|^2}. $$ Since $\|\Phi\| \leq 1$ and $\|\Psi\|^2 \leq 1/\min_m(U_m)$, this is ensured by taking $\tau < 2 \min(u)$.

In [72]:
tau = 1.9*minimum(u);

Initialize the wavelet coefficients with those of the previous reconstruction.

In [119]:
a = U.*PsiS(fSpars) ;
Out[119]:
128×128×13 Array{Float64,3} (output truncated)

Gradient descent.

In [120]:
fTI = Psi(a)
a = a + tau.*PsiS(Phi(y-Phi(fTI, Omega), Omega));
Out[120]:
128×128×13 Array{Float64,3} (output truncated)

Soft threshold.

In [121]:
a = SoftThresh(a, lambd*tau);
Out[121]:
128×128×13 Array{Float64,3} (output truncated)

Exercise 3

Perform the iterative soft thresholding. Monitor the decay of the energy $E$.

In [122]:
include("NtSolutions/inverse_5_inpainting_sparsity/exo3.jl")

# niter = 1000
# a = U.*PsiS(fSpars)
# E = []
# for i in 1 : niter
#     fTI = Psi(a)
#     d = y - Phi(fTI, Omega)
#     E = [E; 1/2.*vecnorm(d )^2 + lambd*sum(abs(a))]
#     # step 
#     a = SoftThresh(a + tau.*PsiS(Phi(d, Omega)), lambd*tau)
# end

# figure(figsize = (7, 5))    
# plot(E)
# show()
In [33]:
## Insert your code here.

Perform the reconstruction.

In [123]:
fTI = Psi(a);
Out[123]:
128×128 Array{Float64,2} (numeric values truncated)

Display the result.

In [124]:
figure(figsize = (6, 6))
imageplot(clamP(fTI))

Exercise 4

Perform the iteration with a decaying value of $\lambda$.

In [125]:
include("NtSolutions/inverse_5_inpainting_sparsity/exo4.jl")
In [37]:
## Insert your code here.
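
For reference, here is a minimal sketch of what exo4.jl might compute, modeled on the exo3 solution above (the initial threshold value and the result name fSpars_decay are assumptions on my part, not taken from the solution file):

niter = 1000
lambda0 = .03                                # assumed initial threshold
lambda_decay = linspace(lambda0, 0, niter)   # linear decay to 0
a = U.*PsiS(fSpars)
for i in 1 : niter
    fTI = Psi(a)
    d = y - Phi(fTI, Omega)
    # gradient step on the data fidelity, then soft thresholding
    # at the current (decaying) level
    a = SoftThresh(a + tau.*PsiS(Phi(d, Omega)), lambda_decay[i]*tau)
end
fSpars_decay = Psi(a)                        # hypothetical name for the result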

Inpainting using Iterative Hard Thresholding

To improve the sparsity of the solution, it is possible to replace the soft thresholding by a hard thresholding. In this case, the resulting algorithm no longer performs a variational minimization of an energy.

The hard thresholding is defined as $h_T(x)=0$ if $-T < x < T$ and $h_T(x)=x$ otherwise. It thus defines a thresholding operator of wavelet coefficients as $H_T(a)_m = h_T(a_m)$.

Define a shortcut for this vectorized hard thresholding.


In [126]:
HardThresh = (x, t) -> x.*(abs(x) .> t)
Out[126]:
(::#103) (generic function with 1 method)
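
As a quick sanity check on arbitrary values (only entries with magnitude above the threshold survive):

HardThresh([-.8 -.3 0. .2 .6], .5)
# returns [-0.8 0.0 0.0 0.0 0.6]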

Display the curve of the 1-D hard thresholding.

In [127]:
x = linspace(-1, 1, 1000)

figure(figsize = (7, 5))
plot(x, HardThresh(x, .5))
show()

The hard thresholding in the translation invariant wavelet frame $\Psi$ reads $$ H_T^\Psi(f) = \Xi \circ H_T \circ \Psi^* (f) $$ where $\Xi = (\Psi^*)^+$ is the reconstruction operator.
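
This composition can be coded directly as a one-liner (a sketch; HardThreshPsi is a name introduced here for illustration, built from the Xi and PsiS shortcuts assumed to be defined earlier in the tour):

# analysis in the TI frame, hard thresholding, then reconstruction
HardThreshPsi = (f, T) -> Xi(HardThresh(PsiS(f), T))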

We follow the MCA paradigm of Jean-Luc Starck, which alternates between a gradient descent step and a hard thresholding denoising, using a decaying threshold: $$f^{(\ell+1)} = H_{\tau\lambda_\ell}^\Psi( f^{(\ell)} - \tau \Phi^*(\Phi f^{(\ell)} - y) ). $$

Number of iterations.

In [128]:
niter = 500
Out[128]:
500

List of thresholds. One must start with a large enough initial threshold.

In [129]:
lambda_list = linspace(1, 0, niter)
Out[129]:
500-element LinSpace{Float64}:
 1.0,0.997996,0.995992,0.993988,…,0.00601202,0.00400802,0.00200401,0.0
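
The linear decay used here is one common choice; MCA implementations often use a geometric decay instead. A sketch with assumed bounds, for illustration only (not what this tour uses):

lambda_max, lambda_min = 1., 1e-3   # assumed bounds
lambda_geo = [lambda_max*(lambda_min/lambda_max)^(i/(niter - 1)) for i in 0 : niter - 1]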

Initialization.

In [130]:
fHard = y
Out[130]:
128×128 Array{Float64,2} (the masked observations; numeric values truncated)

Gradient descent step. Since $\Phi$ is a masking operator, the gradient step $f - \Phi^*(\Phi f - y)$ (i.e. with step size 1) simply re-imposes the observed values: it is the projection ProjC on the measurement constraint.

In [131]:
fHard = ProjC(fHard, Omega)
Out[131]:
128×128 Array{Float64,2} (identical to the array above: y already satisfies the constraint, so the projection leaves it unchanged; numeric values truncated)

Hard thresholding step (here $\lambda=\lambda_0$ is used).

In [132]:
fHard = Xi(HardThresh(PsiS(fHard), tau*lambda_list[1]));
Out[132]:
128×128 Array{Float64,2} (numeric values truncated)
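
Putting the two steps together, one iteration of the scheme above is a single composite update (a sketch; mca_step is a hypothetical name introduced here for illustration, reusing the tau of the cell above):

# Hypothetical helper: project onto the observed pixels, then hard
# threshold the TI wavelet coefficients at level tau*lam.
mca_step = (f, lam) -> Xi(HardThresh(PsiS(ProjC(f, Omega)), tau*lam))

With it, the loop of the next exercise reduces to fHard = mca_step(fHard, lambda_list[i]).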

Exercise 5

Perform the iteration with a decaying value of $\lambda$.

In [133]:
include("NtSolutions/inverse_5_inpainting_sparsity/exo5.jl")

# lambda_list = linspace(1, 0, niter)
# fHard = y

# for i in 1 : niter
#     # project on the observed pixels, then hard threshold at the current level
#     fHard = Xi(HardThresh(PsiS(ProjC(fHard, Omega)), lambda_list[i]))
# end

# figure(figsize = (6, 6))
# imageplot(clamP(fHard), @sprintf("Inpainting hard thresh., SNR = %.1f dB", snr(f0, fHard)))
In [46]:
## Insert your code here.