Machine Learning Foundations

 Week-5 Revision

Arun Prakash A

Complex vectors

x \in \mathbb{C}^n
y \in \mathbb{C}^n
x = \begin{bmatrix}3-2i \\-2+i \\-4 - 3i \ \end{bmatrix} \in \mathbb{C}^3
y=\begin{bmatrix}-2+4i\\5-i\\-2i\\ \end{bmatrix} \in \mathbb{C}^3
z = x+y \in \mathbb{C}^n
z=\begin{bmatrix}1+2i\\3\\-4-5i\\ \end{bmatrix} \in \mathbb{C}^3

Operations: addition (shown above) and conjugation.

Conjugate:

\overline{z}=\begin{bmatrix}1-2i\\3\\-4+5i\\ \end{bmatrix}

Inner product:

x \cdot y = \overline{x}^Ty \in \mathbb{C}
x = \begin{bmatrix}3-2i \\-2+i \\-4 - 3i \ \end{bmatrix}
y=\begin{bmatrix}-2+4i\\5-i\\-2i\\ \end{bmatrix}
\overline{x}^Ty = \begin{bmatrix}3+2i & -2-i & -4+3i\end{bmatrix}\begin{bmatrix}-2+4i\\5-i\\-2i\\ \end{bmatrix} = -19+13i
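As a quick numerical check (a sketch using NumPy; `np.vdot` conjugates its first argument, matching the convention \(x \cdot y = \overline{x}^Ty\)):

```python
import numpy as np

# Vectors from the example above
x = np.array([3 - 2j, -2 + 1j, -4 - 3j])
y = np.array([-2 + 4j, 5 - 1j, 0 - 2j])

# np.vdot(x, y) computes conj(x)^T y
dot = np.vdot(x, y)
print(dot)  # (-19+13j)
```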

Properties

1. \(x \cdot y = \overline{y \cdot x}\)

2. \((x+y) \cdot z = x \cdot z + y \cdot z \)

 

3. \(x \cdot cy = c (x \cdot y)\)

4.  \(cx \cdot y = \overline{c} (x \cdot y)\)


 

5. \(x \cdot x = ||x||^2\)

 

\(cx \cdot cy = |c|(x \cdot y)\): true or false?
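The listed properties can be spot-checked numerically (a minimal sketch on the example vectors; the scalar \(c\) below is an arbitrary choice):

```python
import numpy as np

x = np.array([3 - 2j, -2 + 1j, -4 - 3j])
y = np.array([-2 + 4j, 5 - 1j, 0 - 2j])
z = x + y
c = 2 - 1j  # arbitrary scalar for the scaling properties

d = np.vdot  # d(a, b) = conj(a)^T b

assert np.isclose(d(x, y), np.conj(d(y, x)))          # 1. x·y = conj(y·x)
assert np.isclose(d(x + y, z), d(x, z) + d(y, z))     # 2. additivity
assert np.isclose(d(x, c * y), c * d(x, y))           # 3. scalar in second slot
assert np.isclose(d(c * x, y), np.conj(c) * d(x, y))  # 4. conjugate in first slot
assert np.isclose(d(x, x), np.linalg.norm(x) ** 2)    # 5. x·x = ||x||^2
```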

Complex Matrices

A = \begin{bmatrix}2& 3-3i \\ 3+3i & 5 \\ \end{bmatrix}
B = \begin{bmatrix}2i& 3+3i \\ 3+3i & 5i \\ \end{bmatrix}

Hermitian if \(A = A^*\), where:

A^* = \overline{A}^T =\overline{A^T}

(In this notation, the inner product is \(x \cdot y = x^*y \in \mathbb{C}\).)

A = \begin{bmatrix}2& 3-3i \\ 3+3i & 5 \\ \end{bmatrix}

is Hermitian

B = \begin{bmatrix}2i& 3+3i \\ 3+3i & 5i \\ \end{bmatrix}

is not Hermitian

C = \begin{bmatrix}2i& 3+3i \\ 3-3i & 5i \\ \end{bmatrix}

is not Hermitian either.

Think: Is there a matrix \(D\) such that when multiplied with the matrix \(C\), the resulting matrix is Hermitian?
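The test \(A = A^*\) is easy to check numerically; a small sketch for the three matrices above:

```python
import numpy as np

def is_hermitian(M):
    """True if M equals its conjugate transpose M* (within float tolerance)."""
    return np.allclose(M, M.conj().T)

A = np.array([[2, 3 - 3j], [3 + 3j, 5]])
B = np.array([[2j, 3 + 3j], [3 + 3j, 5j]])
C = np.array([[2j, 3 + 3j], [3 - 3j, 5j]])

# A has real diagonal and conjugate-symmetric off-diagonal entries;
# B and C have imaginary diagonal entries, so they fail the test.
print(is_hermitian(A), is_hermitian(B), is_hermitian(C))  # True False False
```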

Properties of Hermitian Matrices

1. All Eigenvalues \( \lambda_i\) are real.

2. Eigenvectors are orthogonal if \( \lambda_i \neq \lambda_j\) for \(i \neq j\)

Finding complex eigenvectors:

Consider  the matrix \(A = \begin{bmatrix}2& 3-3i \\ 3+3i & 5 \\ \end{bmatrix} \). Find the complex eigenvector for the eigenvalue \(\lambda=8\)

A-8I = \begin{bmatrix}-6& 3-3i \\ 3+3i & -3 \\ \end{bmatrix}

Applying the row operation \(R_2 \to R_2+\frac{1}{2}(1+i)R_1\):

= \begin{bmatrix}-6& 3-3i \\ 0 & 0 \\ \end{bmatrix}

Solving for the null space:

\begin{bmatrix}-6& 3-3i \\ 0 & 0 \\ \end{bmatrix}\begin{bmatrix}x_1 \\x_2 \\ \end{bmatrix} = \begin{bmatrix} 0 \\0 \\ \end{bmatrix}
-6x_1+(3-3i)x_2=0
-2x_1+(1-i)x_2=0
2x_1=(1-i)x_2
\therefore x = c\begin{bmatrix}1 \\ 1+i \\ \end{bmatrix}, \text{ e.g. with } c = 1-i:
x = (1-i)\begin{bmatrix}1 \\ 1+i \\ \end{bmatrix} = \begin{bmatrix}1-i \\ 2\\ \end{bmatrix}
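We can verify the eigenvector numerically, and also confirm property 1 (real eigenvalues) for this Hermitian matrix; `np.linalg.eigvalsh` is the routine specialized for Hermitian inputs:

```python
import numpy as np

A = np.array([[2, 3 - 3j], [3 + 3j, 5]])

# Check A x = 8 x for the eigenvector found above
x = np.array([1, 1 + 1j])
assert np.allclose(A @ x, 8 * x)

# eigvalsh assumes a Hermitian input and returns real eigenvalues (ascending)
print(np.linalg.eigvalsh(A))  # eigenvalues: -1 and 8, both real
```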

Unitary Matrices

Real Case:

\(Q^TQ=I\)

Complex Case: \(U^*U=I\)

U = \begin{bmatrix} \cos t& -\sin t \\ \sin t & \cos t \end{bmatrix}
U^T= \begin{bmatrix} \cos t& \sin t \\ -\sin t & \cos t \end{bmatrix}
UU^T= \begin{bmatrix} \cos^2 t+\sin^2 t& \cos t\sin t -\sin t\cos t\\ \sin t\cos t-\cos t\sin t & \sin^2 t+\cos^2 t \end{bmatrix}
UU^T= \begin{bmatrix} 1& 0\\0 & 1 \end{bmatrix} = I

(Here \(U\) is real, so \(U^* = U^T\).)

It preserves the length and angle of vectors!

Unitary matrices need not be Hermitian.

Since a unitary matrix preserves lengths, its eigenvalues satisfy \( |\lambda_i |=1\).

Eigenvectors are orthogonal if \( \lambda_i \neq \lambda_j\) for \(i \neq j\)
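These facts can be checked on the rotation matrix above (a sketch with an arbitrary angle \(t\)):

```python
import numpy as np

t = 0.7  # an arbitrary angle
U = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])

# U^T U = I: the columns are orthonormal
assert np.allclose(U.T @ U, np.eye(2))

# Length is preserved: ||U v|| = ||v||
v = np.array([3.0, -4.0])
assert np.isclose(np.linalg.norm(U @ v), np.linalg.norm(v))

# Every eigenvalue has modulus 1 (here they are e^{±it})
lam = np.linalg.eigvals(U)
print(np.abs(lam))  # [1. 1.]
```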

Properties:

There exists a unitary matrix that diagonalizes any Hermitian matrix.

Diagonalization of Hermitian Matrices

Schur's Theorem

Any \( n \times n\) matrix is unitarily similar to an upper triangular matrix \(T\); that is, \( A = UTU^*\) for some unitary \(U\).

Example:

\( A =  \begin{bmatrix} 5 & 7 \\ -2 & -4 \end{bmatrix} \)

\( \lambda_1 = -2, \lambda_2 = 3 \)

\( x_1 = \begin{bmatrix}1 \\ -1 \end{bmatrix}\)

\( x_2 = \begin{bmatrix}7 \\ -2 \end{bmatrix}\)

\( U =  \begin{bmatrix} 1 & 7 \\ -1 & -2 \end{bmatrix} \)

If we compute \( U^*AU\), will it be a triangular or a diagonal matrix?

No. Why?

Because \(U\) is not unitary: its columns are not orthonormal!

So how do we find an orthogonal matrix?

Gram-Schmidt process:

\( A =  \begin{bmatrix} 5 & 7 \\ -2 & -4 \end{bmatrix} \)

\( \lambda_1 = -2, \lambda_2 = 3 \)

\( x_1 = \begin{bmatrix}1 \\ -1 \end{bmatrix}\)

\( x_2 = \begin{bmatrix}7 \\ -2 \end{bmatrix}\)

Find a vector orthogonal to \( x_1 = \begin{bmatrix}1 \\ -1 \end{bmatrix}\) (you could have picked \(x_2\) as well).

\( q_1 = \begin{bmatrix}1 \\ -1 \end{bmatrix}\)

\( q_2 =  x_2 - (x_2 \cdot q_1)\frac{q_1}{q_1 \cdot q_1}\)

q_2 = \begin{bmatrix}2.5 \\ 2.5 \end{bmatrix} \propto \begin{bmatrix}1 \\ 1 \end{bmatrix}
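The projection step can be checked numerically; \(q_2\) is \(x_2\) minus its projection onto \(q_1\):

```python
import numpy as np

x2 = np.array([7.0, -2.0])
q1 = np.array([1.0, -1.0])

# Subtract the projection of x2 onto q1
q2 = x2 - (np.dot(x2, q1) / np.dot(q1, q1)) * q1
print(q2)  # [2.5 2.5]

# By construction, q2 is orthogonal to q1
assert np.isclose(np.dot(q1, q2), 0)
```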

\( U = \frac{1}{\sqrt{2}}\begin{bmatrix} 1 & 1 \\-1&1 \end{bmatrix} \)

If we compute \( U^*AU\), will it be a triangular or a diagonal matrix?

\( U^*AU =\begin{bmatrix} -2 & 9 \\ 0 & 3 \\ \end{bmatrix} \)

It is upper triangular, not diagonal: since \(A\) is not Hermitian, Schur's theorem only guarantees a triangular \(T\).
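Assembling \(U\) from the normalized \(q_1, q_2\) and multiplying out reproduces this triangular matrix (a quick NumPy check; \(U\) is real, so \(U^* = U^T\)):

```python
import numpy as np

A = np.array([[5.0, 7.0], [-2.0, -4.0]])

# Orthonormal columns from the Gram-Schmidt step
U = (1 / np.sqrt(2)) * np.array([[1.0, 1.0],
                                 [-1.0, 1.0]])

T = U.T @ A @ U
print(T)  # upper triangular: [[-2, 9], [0, 3]]
```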

What if we have considered different vector instead of \(x_2\) during orthogonalization?


For the \(2 \times 2\) case, the direction orthogonal to a given vector \(x_1\) is unique (any two orthogonal vectors differ only by scaling)!

So, it doesn't matter which vector you use as \(x_2\)

This is my claim!

For a matrix of size \(3 \times 3\), if we have only one eigenvector \(x_1\), then there are many possible orthogonal vectors, depending on the direction chosen for \(x_2\).


Could you reason, why? (I hope, no need for geogebra :-))

Therefore, the Schur decomposition is not unique!

Spectral Theorem

Any Hermitian matrix is unitarily similar to a real diagonal matrix \(D\); that is, \( A = UDU^*\) with \(U\) unitary.
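For the Hermitian matrix used earlier, `np.linalg.eigh` produces exactly this factorization (a minimal sketch):

```python
import numpy as np

A = np.array([[2, 3 - 3j], [3 + 3j, 5]])

# eigh returns real eigenvalues and a unitary matrix of eigenvectors
eigvals, U = np.linalg.eigh(A)
D = np.diag(eigvals)

assert np.allclose(U @ U.conj().T, np.eye(2))  # U is unitary
assert np.allclose(U @ D @ U.conj().T, A)      # A = U D U*
print(eigvals)  # real eigenvalues: -1 and 8
```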

Singular Value Decomposition (SVD)

If SVD is used for PCA, then the singular values represent the variance of the data: the higher the singular value, the higher the variance. (Watch the image compression tutorial in week-5 again, keeping this in mind.)

No problem arises in the computation steps as long as none of the singular values is zero.

If any singular value is zero, we need to use the Gram-Schmidt process to complete the unitary matrices.

Add-on

Any matrix \(A\) can be factorized as \( A = Q_1 \Sigma Q_2^T \), where the columns of \( Q_1\) are eigenvectors of \(AA^T\) and the columns of \(Q_2\) are eigenvectors of \(A^TA\).
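As a numerical check on the example matrix used earlier (the squared singular values equal the eigenvalues of \(AA^T\), whose eigenvectors form \(Q_1\)):

```python
import numpy as np

A = np.array([[5.0, 7.0], [-2.0, -4.0]])

Q1, sing, Q2T = np.linalg.svd(A)  # returns Q1, singular values, Q2^T
Sigma = np.diag(sing)

# The factorization reconstructs A
assert np.allclose(Q1 @ Sigma @ Q2T, A)

# Eigenvalues of A A^T are the squared singular values
assert np.allclose(np.sort(np.linalg.eigvalsh(A @ A.T)), np.sort(sing**2))
```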
