Monday 2 April 2018

Importance of eigenvalues and eigenvectors

Hello there,
In this post, I am going to talk about the importance of eigenvalues and eigenvectors. For those who don't know what eigenvalues and eigenvectors are:
Suppose you have a square matrix A of size n x n, let λ be an eigenvalue of A and X the corresponding eigenvector of size n x 1. Then the equation:
AX = λX      ---(1)
holds true.
How eigenvalues and eigenvectors are calculated isn't the aim of this post. If you want to learn how they are calculated, go to this link.

So, coming back to the main topic of this post, i.e. the importance of eigenvalues and eigenvectors, I would like to define them in a different way.
An eigenvalue is a number and an eigenvector is a vector hiding inside a matrix. Both are useful for solving systems of linear equations and systems of linear differential equations. In equation (1) above, X is a vector.
Suppose you have λ as well as X for a particular matrix A. Can you find an eigenvalue and eigenvector of the matrix A^2?
The answer is:
The eigenvector of A^2 is the same as the eigenvector of A, and the corresponding eigenvalue of A^2 is λ^2, since A^2 X = A(λX) = λ(AX) = λ^2 X. The same goes for the nth power of the matrix: A^n has eigenvalue λ^n.

OK, enough of formulae and definitions. The question still remains the same: what is the use of all this? How does it work in the real world?
For answering these questions, let's dive deeper.

Let's understand what a transformation is.
A transformation is a change in the current shape of an object. It can be in a single dimension or in multiple dimensions.

[Picture 1]
[Picture 2]
[Picture 3]
Obviously, I am not talking about the transformation in picture number 1 😅😅😅. 
In the transformation in picture number 2, you can see that the direction of the blue arrow remains the same while the direction of the red arrow changes. The length of the blue arrow increases by some constant factor.
Eigenvectors are those vectors whose direction does not change even after the transformation; they only change their length. The constant factor by which the length of such an eigenvector changes is called its eigenvalue.
Similarly, in picture number 3, you can see that the direction of the red vector remains the same while that of the blue vector changes.

Thursday 8 March 2018

Faster Matrix Algebra for ATLAS

Hello Everyone,
In this and the coming posts, I am going to discuss the GSoC 2018 project I am applying for. To get a glimpse of the idea behind the project, go through the link.
I have already started working on this project, and the work I have done till now is on my GitHub repo. Go to this link for an overview of the work and the details about the project, its limitations, and the part currently being implemented.
I will be updating this blog from now on with my progress on this project and the issues related to it. Any suggestions and feedback are most welcome.


Update 1:
The current issue is the speed of multiplication of two matrices of size 100 by 100, which takes around 317.748 milliseconds with the SymMat class, compared to 24.72 milliseconds with the Eigen::Matrix class.
The Eigen Matrix class uses a method similar to BLAS GEMM (general matrix-matrix multiply), so the current plan is to implement something similar.


Update 2:
I have updated the symmetric.cpp file to use multithreading for the multiplication of one SymMat and one Eigen::Matrix, both of size 100 by 100.
By doing so I was able to reduce the initial running time by about 27%, i.e. the time taken is now 232.278 milliseconds.