(From Strang, section 6.4, problem 18.)
Let $A$ be some rectangular (possibly non-square, possibly complex) matrix. Form the block matrix $$S = \begin{pmatrix} 0 & A \\ A^H & 0 \end{pmatrix} $$ where "0" denotes a block of zeros of the appropriate size and $A^H$ is the conjugate transpose of $A$. Consider the eigenvalues $\lambda$ and eigenvectors $x = (y,z)$ of $S$, satisfying $Sx = \lambda x$:
$$ Sx = \underbrace{\begin{pmatrix} 0 & A \\ A^H & 0 \end{pmatrix}}_S \underbrace{\begin{pmatrix} y \\ z \end{pmatrix}}_x = \lambda \begin{pmatrix} y \\ z \end{pmatrix} = \lambda x $$
(a) The eigenvalues of $S$ must be real because ...............
(b) If $A$ is $m\times n$, how big are the vectors $y$ and $z$, and how big are the two blocks of 0's in $S$?
(c) Show that $-\lambda$ is also an eigenvalue, with eigenvector $(y,-z)$.
Check this for a random $3\times 4$ matrix `A = rand(Complex{Float64},3,4)`, with `S = [0I A; A' 0I]`. Compute `eigvals(S)`: does it match your prediction?
(d) Show (for the same $z$) that $A^HAz = \lambda^2 z$, so that $\lambda^2$ is an eigenvalue of $A^H A$. Check this via `eigvals(A'*A)`.
(e) If $A = I$ ($2 \times 2$), find all four eigenvectors and eigenvalues of $S$.
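The numerical checks in parts (c) and (d) can be sketched as follows. This is a Python/NumPy translation of the Julia commands above (a random complex $3\times 4$ matrix, the block matrix $S$, and the eigenvalues of $S$ and of $A^H A$); the seed is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((3, 4)) + 1j * rng.random((3, 4))   # random complex 3x4

# S = [0 A; A^H 0], the Julia [0I A; A' 0I]
S = np.block([[np.zeros((3, 3)), A],
              [A.conj().T, np.zeros((4, 4))]])

evals = np.linalg.eigvals(S)
print(np.max(np.abs(evals.imag)))      # ~0: the eigenvalues are real

# The eigenvalues of S come in +/- pairs: +/- each singular value of A,
# plus one extra zero (S is 7x7 but A has only 3 singular values).
svals = np.linalg.svd(A, compute_uv=False)
print(np.sort(evals.real))

# Part (d): the nonzero eigenvalues of A^H A are the squared singular
# values, i.e. the lambda^2 from the eigenvalues of S.
print(np.sort(np.linalg.eigvals(A.conj().T @ A).real))
```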
(From Strang, section 6.4, problem 33.)
Suppose $A^T = -A$, a real antisymmetric matrix (also called skew-symmetric). Form a random real antisymmetric $5\times 5$ matrix in Julia via `A = randn(5,5); A = A - A'`.
Explain the following facts about $A$, and check each fact numerically for your random `A` matrix:
(a) $x^T A x = 0$ for every real vector $x$. (Try `x'*A*x` in Julia with `x = randn(5)`.)
(b) The eigenvalues of $A$ (`eigvals(A)`) are purely imaginary. (There are multiple ways to show this, but it is a good excuse to review the proof from class that Hermitian matrices have real eigenvalues… almost the same proof works here.)
(c) The determinant of $A$ (`det(A)`) is positive or zero (not negative).
(d) The matrix $e^{A}$ is unitary (check `Q'*Q - I` for `Q = expm(A)`). Why?
(e) If you solve $dx/dt = Ax$ for any initial condition $x(0)$, then the length of $x$ is conserved: $\Vert x(t) \Vert = \Vert x(0) \Vert$ for all $t$. (In Julia, compare `norm(expm(A*t)*x)` to `norm(x)` for various `t`.)
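All five checks can be run together; here is a Python/NumPy sketch mirroring the Julia commands above. The small Taylor-series helper stands in for Julia's `expm` and is adequate only for matrices of modest norm like this one.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))
A = A - A.T                            # real antisymmetric: A^T == -A

def expm_taylor(M, terms=60):
    """Matrix exponential via its Taylor series (fine for small ||M||)."""
    out = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k            # M^k / k!
        out = out + term
    return out

x = rng.standard_normal(5)
print(x @ A @ x)                       # (a): ~0
print(np.linalg.eigvals(A))            # (b): real parts ~0
print(np.linalg.det(A))                # (c): >= 0 (exactly 0 here: odd size)
Q = expm_taylor(A)                     # (d): Q'Q ~ I, so e^A is unitary
print(np.max(np.abs(Q.T @ Q - np.eye(5))))
t = 0.7                                # (e): length is conserved
print(np.linalg.norm(expm_taylor(A * t) @ x) - np.linalg.norm(x))
```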
(From Strang, section 6.3, problem 31.)
The cosine of a matrix can be defined like $e^A$, by copying the Taylor series for $\cos t$:
$$ \cos A = I - \frac{A^2}{2!} + \frac{A^4}{4!} - \frac{A^6}{6!} + \cdots . $$
(a) If $Ax = \lambda x$, multiply each term in the series by $x$ to find an eigenvalue of $\cos A$.
(b) Explain, using the series, why $\frac{d^2}{dt^2} \cos(At) = -A^2 \cos(At)$.
(c) Explain why $u(t) = \cos(At) u(0)$ solves $\frac{d^2 u}{dt^2} = -A^2 u$, given the initial conditions $u(0)$ and $\left . \frac{du}{dt} \right|_{t=0} = 0$.
(d) If $A = \begin{pmatrix} \pi & \pi \\ \pi & \pi \end{pmatrix}$, it has eigenvectors $v_1 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$ and $v_2 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}$. Find the corresponding eigenvalues, and use them to compute the matrix $\cos A$.
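As a sanity check of the series definition (using a random symmetric matrix, not the specific $A$ of part (d)), one can compare the truncated Taylor series against $\cos$ applied to the eigenvalues, which is the diagonalization idea behind part (a). A Python/NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((3, 3))
A = (B + B.T) / 2                      # random real-symmetric 3x3

# Taylor series: I - A^2/2! + A^4/4! - ...
cosA_series = np.eye(3)
term = np.eye(3)
for k in range(1, 30):
    term = term @ (A @ A) / ((2 * k - 1) * (2 * k))   # A^{2k} / (2k)!
    cosA_series = cosA_series + (-1) ** k * term

# Diagonalization: cos(A) = Q cos(Lambda) Q^T
lam, Q = np.linalg.eigh(A)
cosA_eig = Q @ np.diag(np.cos(lam)) @ Q.T

print(np.max(np.abs(cosA_series - cosA_eig)))   # ~0: the two routes agree
```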
Suppose that $A$ is a $3\times 3$ real-symmetric matrix with eigenvalues $\lambda_1 = 0$, $\lambda_2 = -1$, and $\lambda_3 = -2$. You are given two corresponding eigenvectors $v_1 = \begin{pmatrix} 1 \\ 0 \\ 1 \end{pmatrix}$ (for $\lambda_1$) and $v_3 = \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix}$ (for $\lambda_3$).
(a) Give an approximate solution at $t = 100$ to $\frac{dx}{dt} = Ax$ for $x(0) = \begin{pmatrix} 1 \\ 1 \\ 0 \end{pmatrix}$. (Give the numerical value of a specific vector — no unknown coefficients or symbolic expressions!)
(b) Give an eigenvector $v_2$ for $\lambda_2$ and write the matrix $A$ as a product of three matrices. (You shouldn't need your answer here to answer part a!)
(c) Suppose that we compute the sequence $x_0, x_1, x_2, \ldots$ given by the recurrence $x_{n+1} = \alpha A x_n$ for some scalar $\alpha$. For what value(s) of $\alpha$ would you expect $x_n$ to approach oscillating (not exponentially growing, decaying, or constant) solutions for large $n$, for most initial $x_0$?
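To build intuition for part (a), here is a hedged numerical sketch: it constructs *some* real-symmetric matrix with eigenvalues $0, -1, -2$ (random orthonormal eigenvectors, not the specific $v_1, v_3$ given above) and confirms that $e^{At}x(0)$ approaches the projection of $x(0)$ onto the $\lambda = 0$ eigenvector as $t$ grows.

```python
import numpy as np

rng = np.random.default_rng(3)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # random orthonormal eigenvectors
lam = np.array([0.0, -1.0, -2.0])
A = Q @ np.diag(lam) @ Q.T                        # symmetric, eigenvalues 0,-1,-2

x0 = rng.standard_normal(3)
t = 100.0
x_t = Q @ np.diag(np.exp(lam * t)) @ Q.T @ x0     # e^{At} x0 via eigenvectors

v1 = Q[:, 0]                                      # the lambda = 0 eigenvector
proj = (v1 @ x0) * v1                             # projection of x0 onto v1
print(np.max(np.abs(x_t - proj)))                 # ~0: decaying modes are gone
```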