With real vectors and matrices, the transpose operation is simple and familiar, and it corresponds to what is called the **adjoint** mathematically. In the complex case, one must also conjugate the entries to keep the mathematical structure intact. We call this operation the **hermitian** of a matrix and denote it with a star superscript.

In [1]:

```
A = rand(2,4) + 1i*rand(2,4)
```

In [2]:

```
Aadjoint = A'
```

To get the plain transpose (no conjugation), use the `.'` operator.

In [3]:

```
Atrans = A.'
```
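
The hermitian and the plain transpose differ only by conjugation of the entries. As a quick sanity check (a sketch; the variable name `difference` is ours), conjugating the transpose should reproduce the hermitian exactly:

```
A = rand(2,4) + 1i*rand(2,4);
difference = A' - conj(A.')     % all zeros: hermitian = conjugate of transpose
```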

If **u** and **v** are column vectors of the same length, then their **inner product** is $\mathbf{u}^*\mathbf{v}$. The result is a scalar.

In [4]:

```
u = [ 4; -1; 2+2i ], v = [ -1; 1i; 1 ],
innerprod = u'*v
```
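
The same inner product can be computed entrywise, which makes the conjugation explicit. A small check (sketch; `innerprod_check` is our name):

```
u = [ 4; -1; 2+2i ];  v = [ -1; 1i; 1 ];
innerprod_check = sum( conj(u).*v )   % agrees with u'*v
```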

The inner product has geometric significance. It defines length through the 2-norm, via $\|\mathbf{u}\|_2^2 = \mathbf{u}^*\mathbf{u}$.

In [6]:

```
length_u_squared = u'*u
```

In [8]:

```
sum( abs(u).^2 )
```

In [9]:

```
norm_u = norm(u)
```

It also defines the angle between two vectors as a generalization of the familiar dot product.

In [10]:

```
cos_theta = (u'*v) / ( norm(u)*norm(v) )
```

The angle may be complex when the vectors are complex!

In [14]:

```
theta = acos(cos_theta)
```

The operations of inverse and hermitian commute.

In [17]:

```
A = rand(4,4)+1i*rand(4,4); (inv(A))'
```

In [18]:

```
inv(A')
```

So we just write $\mathbf{A}^{-*}$ for either case.
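
Numerically the two computations agree to rounding error. A quick check (sketch; `discrepancy` is our name):

```
A = rand(4,4) + 1i*rand(4,4);
discrepancy = norm( inv(A') - (inv(A))' )   % tiny, at rounding-error level
```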

Orthogonality, which is the multidimensional extension of perpendicularity, means that $\cos \theta=0$, i.e., that the inner product between vectors is zero. A collection of vectors is orthogonal if they are all pairwise orthogonal.

Don't worry about how we are creating the vectors here for now.

In [22]:

```
[Q,~] = qr(rand(5,3),0)
```

Since $\mathbf{Q}^*\mathbf{Q}$ is a matrix of all inner products between columns of $\mathbf{Q}$, those columns are orthogonal if and only if that matrix is diagonal.

In [23]:

```
QhQ = Q'*Q
```

In fact we have a stronger condition here: the columns are **orthonormal**, meaning that they are orthogonal and each has 2-norm equal to 1.
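
One way to confirm orthonormality at a glance (a sketch; `orth_err` is our name) is to measure the distance from $\mathbf{Q}^*\mathbf{Q}$ to the identity:

```
[Q,~] = qr(rand(5,3),0);
orth_err = norm( Q'*Q - eye(3) )   % rounding-error level for orthonormal columns
```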

Given any other vector of length 5, we can compute its inner product with each of the columns of $\mathbf{Q}$.

In [25]:

```
u = rand(5,1); c = Q'*u
```

We can then use these coefficients to find a vector in the column space of $\mathbf{Q}$.

In [26]:

```
v = Q*c
```

As explained in the text, $\mathbf{r} = \mathbf{u}-\mathbf{v}$ is orthogonal to all of the columns of $\mathbf{Q}$.

In [27]:

```
r = u-v; Q'*r
```

Consequently, we have decomposed $\mathbf{u}=\mathbf{v}+\mathbf{r}$ into the sum of two orthogonal parts, one lying in the range of $\mathbf{Q}$ and the other orthogonal to that range.

In [28]:

```
v'*r
```
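
Because $\mathbf{v}$ and $\mathbf{r}$ are orthogonal, the decomposition obeys a Pythagorean identity, $\|\mathbf{u}\|^2 = \|\mathbf{v}\|^2 + \|\mathbf{r}\|^2$. A quick numerical check (sketch; `pythag_gap` is our name):

```
[Q,~] = qr(rand(5,3),0);
u = rand(5,1);
v = Q*(Q'*u);  r = u - v;
pythag_gap = norm(u)^2 - ( norm(v)^2 + norm(r)^2 )   % near zero
```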

We just saw that a matrix whose columns are orthonormal is pretty special. It becomes even more special if the matrix is also square, in which case we call it **unitary**. (In the real case, such matrices are confusingly called *orthogonal*. Ugh.) Say $\mathbf{Q}$ is unitary and $m\times m$. Then $\mathbf{Q}^*\mathbf{Q}$ is an $m\times m$ identity matrix---that is, $\mathbf{Q}^*=\mathbf{Q}^{-1}$! It can't get much easier in terms of finding the inverse of a matrix.

In [31]:

```
[Q,~] = qr(rand(5,5)+1i*rand(5,5));
abs( inv(Q) - Q' )
```
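
A consequence worth noting: multiplication by a unitary matrix preserves the 2-norm, since $\|\mathbf{Q}\mathbf{x}\|^2 = \mathbf{x}^*\mathbf{Q}^*\mathbf{Q}\mathbf{x} = \|\mathbf{x}\|^2$. A quick check (sketch; `norm_gap` is our name):

```
[Q,~] = qr(rand(5,5)+1i*rand(5,5));
x = rand(5,1) + 1i*rand(5,1);
norm_gap = norm(Q*x) - norm(x)   % near zero
```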

The rank of $\mathbf{Q}$ is $m$, so continuing the discussion above, the original vector $\mathbf{u}$ lies in its column space. Hence the remainder $\mathbf{r}=\boldsymbol{0}$.

In [32]:

```
c = Q'*u;
v = Q*c;
r = u - v
```

This is another way to arrive at a fact we already knew: Multiplication by $\mathbf{Q}^*=\mathbf{Q}^{-1}$ changes the basis to the columns of $\mathbf{Q}$.
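
Since this change of basis is by a unitary matrix, the coefficient vector has the same length as the original vector. A final check (sketch; `len_gap` is our name):

```
[Q,~] = qr(rand(5,5)+1i*rand(5,5));
u = rand(5,1);
c = Q'*u;                     % coefficients of u in the basis of columns of Q
len_gap = norm(c) - norm(u)   % near zero: unitary change of basis preserves length
```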