Image Compression & Image Denoising with SVD
- Explain Vector
- Explain Eigenvector
- Explain Eigenvalue
- Combination of Eigenvector & Eigenvalue
- Explain SVD
- A vector is an object that has both a magnitude and a direction.
- A vector can be represented as a matrix with a single column or a single row.
- An eigenvector v is a non-zero vector that satisfies the equation Av = λv.
- Vector v is called an eigenvector of matrix A if multiplying matrix A by vector v produces a new vector λv that does not change direction under the transformation.
- The eigenvalue tells us how much the eigenvector changes in size when multiplied by the matrix.
- The new column vector λv has the same direction as the eigenvector.
- The new column vector λv may be longer or shorter than the eigenvector, depending on the eigenvalue λ.
- The eigenvalue λ may be negative: the direction of the new vector is reversed, but it still lies on the same line.
- A matrix A may have many eigenvectors.
- All non-zero vectors along the same direction as an eigenvector are also eigenvectors of matrix A.
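The eigenvector/eigenvalue relationship above can be checked numerically. A minimal sketch with numpy, using an illustrative 2×2 matrix not taken from the original notes:

```python
import numpy as np

# An illustrative 2x2 matrix (assumed example, not from the notes)
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify the defining equation A v = lambda v for each pair
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)

# Any non-zero scalar multiple of an eigenvector is also an
# eigenvector with the same eigenvalue (same line, same direction)
v = eigenvectors[:, 0]
assert np.allclose(A @ (3 * v), eigenvalues[0] * (3 * v))
```

The second assertion illustrates the last bullet: scaling an eigenvector keeps it on the same line, so it remains an eigenvector of A.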
Since mathematics is just the art of assigning different names to the same concept, SVD is nothing more than decomposing vectors onto orthogonal axes.
- Suppose u and v are vectors
- Vector u decomposed into orthogonal components w1 and w2
- Want to decompose u as: u = w1 + w2
- w1 is parallel to vector v and w1 is perpendicular/orthogonal to w2
- The vector component w1 is also called the projection of vector u onto vector v:
- w1 = proj(v)(u) (projection of vector u onto vector v).
- w2 = u - w1
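The decomposition u = w1 + w2 can be computed directly with the standard projection formula proj_v(u) = (u·v / v·v) v. A short sketch with assumed example vectors:

```python
import numpy as np

# Assumed example vectors for illustration
u = np.array([3.0, 4.0])
v = np.array([1.0, 0.0])

# w1 = proj_v(u): the component of u parallel to v
w1 = (u @ v) / (v @ v) * v
# w2: the remaining component, perpendicular to v
w2 = u - w1

assert np.allclose(w1 + w2, u)   # u decomposes as w1 + w2
assert np.isclose(w1 @ w2, 0.0)  # w1 is orthogonal to w2
```

Here w1 = [3, 0] lies along v and w2 = [0, 4] is orthogonal to it, matching the bullets above.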
- Extend to more than one vector.
- Generalize to any number of points and dimensions
- Any set of vectors (A) can be expressed in terms of the lengths of their projections (S) onto some set of orthogonal axes (V).
- Normalize these column vectors to make them unit length
- This amounts to dividing each column vector by its magnitude, done in matrix form
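Putting the pieces together: numpy's SVD factors a matrix into orthonormal axes and projection lengths, and truncating the small singular values is the compression/denoising step the title refers to. A sketch with an assumed random matrix standing in for an image's pixel matrix:

```python
import numpy as np

# Stand-in for an image's pixel matrix (assumed example data)
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))

# SVD: A = U @ diag(S) @ Vt, where the columns of U and the rows
# of Vt are orthonormal axes and S holds the projection lengths
U, S, Vt = np.linalg.svd(A, full_matrices=False)
assert np.allclose(U @ np.diag(S) @ Vt, A)

# Rank-k approximation: keep only the k largest singular values.
# For an image, this is the compression step; small singular values
# mostly carry fine detail and noise, so truncating also denoises.
k = 2
A_k = U[:, :k] @ np.diag(S[:k]) @ Vt[:k, :]
assert A_k.shape == A.shape
```

Storing U[:, :k], S[:k], and Vt[:k, :] instead of A is what saves space: for an m×n image, the rank-k form needs k(m + n + 1) numbers instead of mn.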