BROADCASTING
An extremely useful operation is the "period" or "dot" syntax, pronounced "broadcast." We mention one use here; full details, if you are interested, may be found in the documentation on broadcast. In its full generality, broadcasting is one of the things that makes Julia faster than languages such as Python. (If interested in such performance issues, see Steven Johnson's blog post on broadcast.)
If there is one argument, the dot simply applies the operation to every element. Thus
sqrt.(v)
computes the square root of each element of v.
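For instance (a tiny REPL check; the vector is our own sample data):

```julia
v = [1.0, 4.0, 9.0]

# the dot applies sqrt to each element; sqrt(v) without the dot would error
sqrt.(v)   # returns [1.0, 2.0, 3.0]
```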
Another case is where $A$ is an m×n matrix and $v$ is an m-vector: $$ (A.*v)_{ij} = A_{ij}*v_i $$ and $$ (A./v)_{ij} = A_{ij}/v_i. $$ In general, $f.(A,v)$ is the matrix whose $(i,j)$ entry is $f(A_{ij},v_i)$. Many people like to imagine that $v$ is replaced with $V=[v \ v \ v \ \ldots \ v]$ (copy $v$ as many times as needed to match the size of $A$) and then everything is elementwise.
For example:
A = [1 2 3 4;5 6 7 8;9 10 11 12;13 14 15 16]
4×4 Array{Int64,2}:
  1   2   3   4
  5   6   7   8
  9  10  11  12
 13  14  15  16
A.*[1, 10, 100, 1000]
4×4 Array{Int64,2}:
     1      2      3      4
    50     60     70     80
   900   1000   1100   1200
 13000  14000  15000  16000
A./[1, 10, 100, 1000]
4×4 Array{Float64,2}:
 1.0    2.0    3.0    4.0
 0.5    0.6    0.7    0.8
 0.09   0.1    0.11   0.12
 0.013  0.014  0.015  0.016
Finally, there is a row-vector version: $f.(A,v')$ has $(i,j)$ element $f(A_{ij},v_j)$, which may be thought of as creating a matrix $V$ whose rows are each $v'$, and then applying $f$ elementwise.
A=[.8 .3;.2 .7]
2×2 Array{Float64,2}:
 0.8  0.3
 0.2  0.7
has steady state
v = [.6,.4]
2-element Array{Float64,1}:
 0.6
 0.4
A*v # checking the steady state with Julia
2-element Array{Float64,1}:
 0.6
 0.4
A./v
2×2 Array{Float64,2}:
 1.33333  0.5
 0.5      1.75
A'./v'
2×2 Array{Float64,2}:
 1.33333  0.5
 0.5      1.75
1a) Verify that if v is the steady-state eigenvector of any 2×2 Markov matrix A, then
A./v
and
A'./v'
are equal.
Below is an example:
A
2×2 Array{Float64,2}:
 0.619677  0.232379
 0.126491  0.442719
A./v
2×2 Array{Float64,2}:
 1.33333  0.5
 0.5      1.75
A'./v'
2×2 Array{Float64,2}:
 1.33333  0.5
 0.5      1.75
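A minimal sketch of such a check, assuming we generate a random column-stochastic 2×2 matrix and take its steady state from the eigenvector for the eigenvalue 1 (the construction and names are our own):

```julia
using LinearAlgebra

# build a random 2×2 Markov matrix: nonnegative entries, columns sum to 1
A = rand(2, 2)
A = A ./ sum(A, dims=1)      # broadcast divides each column by its sum

# steady state: eigenvector for the eigenvalue closest to 1, scaled to sum to 1
λ, X = eigen(A)
k = argmin(abs.(λ .- 1))
v = real(X[:, k])
v = v / sum(v)

@show A * v ≈ v              # v is the steady state
@show A ./ v ≈ A' ./ v'      # the identity of problem 1a
```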
Markov matrices show up in many fields, most recently in the machine learning area of MCMC (Markov chain Monte Carlo).
2a) Show that if A is an n×n reversible Markov matrix, then it is diagonally similar to its transpose. (This means the similarity matrix may be chosen to be diagonal.) The example below gives a hint for n = 2.
Diagonal(v)\A*Diagonal(v)
2×2 Array{Float64,2}:
 0.8  0.2
 0.3  0.7
A'
2×2 Array{Float64,2}:
 0.8  0.2
 0.3  0.7
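The same check can be sketched for n = 3, assuming one standard way of building a reversible Markov matrix with a prescribed steady state via detailed balance $A_{ij}v_j = A_{ji}v_i$ (the symmetric matrix K and all numbers are our own sample data):

```julia
using LinearAlgebra

v = [0.5, 0.3, 0.2]
K = [0.0 0.4 0.6;
     0.4 0.0 0.5;
     0.6 0.5 0.0]            # symmetric, zero diagonal

A = K .* v                   # broadcast: A[i,j] = K[i,j] * v[i]
A = A + Diagonal(1 .- vec(sum(A, dims=1)))  # fix diagonal so columns sum to 1

@show A * v ≈ v                              # v is the steady state
@show Diagonal(v) \ A * Diagonal(v) ≈ A'     # diagonally similar to transpose
```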
3a) Show that if a matrix is diagonally similar to its transpose, where the diagonal is positive, then the matrix has real eigenvalues. (Hint: use w = sqrt.(v) to show that the matrix is similar to a symmetric matrix. See the example below.)
w = sqrt.(v) # elementwise sqrt
Diagonal(w)\A*Diagonal(w)
2×2 Array{Float64,2}:
 0.8       0.244949
 0.244949  0.7
We will not discuss it here, but the famous Metropolis algorithm starts with the vector v, first finds a non-Markov matrix that is diagonally similar to its transpose, and then iterates until the matrix is Markov with steady state v. Thus the Metropolis algorithm is an "inverse problem": it begins with the eigenvector of a Markov matrix and constructs a Markov matrix with that eigenvector.
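A toy sketch of that construction, assuming the standard Metropolis acceptance rule min(1, v[i]/v[j]) with a symmetric proposal matrix Q (the sample numbers and names are our own):

```julia
using LinearAlgebra

v = [0.5, 0.3, 0.2]          # target steady state
Q = [0.0 0.5 0.5;
     0.5 0.0 0.5;
     0.5 0.5 0.0]            # symmetric proposal, zero diagonal

# accept a proposed move j -> i with probability min(1, v[i]/v[j])
n = length(v)
P = [i == j ? 0.0 : Q[i, j] * min(1, v[i] / v[j]) for i in 1:n, j in 1:n]
P += Diagonal(1 .- vec(sum(P, dims=1)))   # rejected moves stay put

@show P * v ≈ v                           # v is the steady state
@show Diagonal(v) \ P * Diagonal(v) ≈ P'  # reversible: diagonally similar to P'
```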
This matrix $M$ is antisymmetric and also _______: $$ M = \frac{1}{\sqrt 3} \begin{pmatrix} 0 & 1 & 1 & 1 \\ -1 & 0 & -1 & 1 \\ -1 & 1 & 0 & -1 \\ -1 & -1 & 1 & 0 \end{pmatrix} $$ Then all its eigenvalues are pure imaginary, and they also have $|\lambda|=1$ ($\|Mx\|=\|x\|$ for every $x$), so $M$ can only have eigenvalues $i$ or $-i$. Find all four eigenvalues from the trace of $M$.
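A quick numerical check of those two properties (not a substitute for the trace argument):

```julia
using LinearAlgebra

M = (1 / sqrt(3)) * [ 0   1   1   1;
                     -1   0  -1   1;
                     -1   1   0  -1;
                     -1  -1   1   0]

@show M' == -M                    # antisymmetric
@show M' * M ≈ Matrix(1.0I, 4, 4) # orthonormal columns, so |λ| = 1
eigvals(M)                        # four eigenvalues on the imaginary axis
```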
For which numbers b and c are these matrices positive definite?
$$ S = \begin{pmatrix} 1 & b \\ b & 9 \end{pmatrix} \ \ S = \begin{pmatrix} 2 & 4 \\ 4 & c \end{pmatrix} \ \ S = \begin{pmatrix} c & b \\ b & c \end{pmatrix} .$$For which s and t do S and T have all λ>0 (therefore positive definite)?
$$ S = \begin{pmatrix} s & -4 & -4 \\ -4 & s & -4 \\ -4 & -4 & s \end{pmatrix} \ \ T = \begin{pmatrix} t & 3 & 0 \\ 3 & t & 4 \\ 0 & 4 & t \end{pmatrix} $$Find the nonzero singular values of $A$ and $B$ without forming any matrices times their transposes. You can use the fact that the sum of the squares of the singular values equals the sum of the squares of the elements of a matrix.
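Julia's isposdef can test particular parameter values numerically (it attempts a Cholesky factorization); the sample values below are our own and do not replace the algebra:

```julia
using LinearAlgebra

# positive definiteness check for sample parameter values (values are ours)
S(b) = [1 b; b 9]
@show isposdef(S(2))
@show isposdef(S(4))

T(t) = [t 3 0; 3 t 4; 0 4 t]
@show isposdef(T(6))
@show isposdef(T(4))
```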
$$ A = \begin{pmatrix} 1 & 2 & 3 & 4 \\ 2 & 4 & 6 & 8 \\ 3 & 6 & 9 & 12 \\ 4 & 8 & 12 & 16 \end{pmatrix} \ \text{and} \ B = \begin{pmatrix} 2 & 3 & 4 & 5 \\ 3 & 4 & 5 & 6 \\ 4 & 5 & 6 & 7 \\ 5 & 6 & 7 & 8 \end{pmatrix} . $$Hint for B: How many singular values are non-zero? What is the sum of the eigenvalues of B? What is the sum of the squares of the eigenvalues of B? Can you use these last two questions to get the product of the eigenvalues, hence the eigenvalues themselves? Can you then figure out the singular values of B?
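You can check a hand computation numerically with svdvals, and the identity from the hint is easy to verify as well:

```julia
using LinearAlgebra

A = [1 2 3 4; 2 4 6 8; 3 6 9 12; 4 8 12 16]
B = [2 3 4 5; 3 4 5 6; 4 5 6 7; 5 6 7 8]

# sum of squared singular values equals sum of squared entries
@show sum(svdvals(A) .^ 2) ≈ sum(A .^ 2)
@show sum(svdvals(B) .^ 2) ≈ sum(B .^ 2)

svdvals(B)   # compare with your hand computation
```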