Covariance and Correlation

Rahull Trehan
Jun 5, 2021


Covariance and correlation are terms you will come across frequently while building machine learning models, and it is important to understand what they mean conceptually. In this article we will see:

  • What these two terms are
  • How covariance and correlation differ from each other
  • Why these terms fail for non-linear relationships between variables

Let us first understand the basic definitions of covariance and correlation.

Covariance is a measure of how much two random variables vary together. For example, years of experience and salary have a positive covariance: as experience increases, salary (in general) also increases. Weight and height likewise have a positive covariance, since weight tends to increase with height. Altitude and temperature illustrate a negative covariance: as altitude increases, temperature goes down.
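As a quick numerical sketch of the first example (made-up experience and salary figures; `np.cov` returns the covariance matrix, so the covariance itself is the off-diagonal entry):

```python
import numpy as np

# Hypothetical data: years of experience and salary (in thousands)
experience = np.array([1., 3., 5., 7., 9.])
salary = np.array([40., 55., 65., 80., 95.])

# Sample covariance: average product of deviations from the means
cov_xy = np.cov(experience, salary)[0, 1]
print(cov_xy)  # positive: salary tends to rise with experience
```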

All of the examples above show how one variable is related to another, with either a positive or a negative covariance, and each example has a different unit or scale of measurement: height (in cm/inches), weight (in kilograms), altitude (in feet), salary (in currency units), years of experience (in months/years), and so on. If we change the scale or unit of measurement of any of the variables, the covariance changes.

So what happens if we want to compare the covariances of multiple independent variables with the target variable when they all have different units? We need a way to normalize covariance so that the scale is removed.
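The unit-dependence is easy to demonstrate with a small sketch (invented height/weight numbers): expressing the same heights in metres instead of centimetres divides the covariance by 100, even though the relationship between the variables is unchanged.

```python
import numpy as np

height_cm = np.array([150., 160., 170., 180., 190.])
weight_kg = np.array([50., 58., 66., 74., 82.])

cov_cm = np.cov(height_cm, weight_kg)[0, 1]
# Same data, heights now in metres: covariance shrinks by a factor of 100
cov_m = np.cov(height_cm / 100, weight_kg)[0, 1]
print(cov_cm, cov_m)
```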

This scale-free version of covariance is called correlation.

So, correlation is the covariance of the standardized variables. It is dimensionless (since it is a ratio), and if the correlation is positive, one variable increases as the other increases; if the correlation is negative, one variable decreases as the other increases.
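A minimal sketch of this (with arbitrary example data): dividing the covariance by the product of the standard deviations gives the Pearson correlation, which matches NumPy's `np.corrcoef` and, unlike covariance, does not change when a variable is rescaled.

```python
import numpy as np

x = np.array([1., 2., 3., 4., 5.])
y = np.array([2., 4., 5., 8., 10.])

# Correlation = covariance divided by the product of standard deviations
r = np.cov(x, y)[0, 1] / (np.std(x, ddof=1) * np.std(y, ddof=1))
print(r)                               # close to 1
print(np.corrcoef(x * 1000, y)[0, 1])  # unchanged by rescaling x
```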

If two variables are independent, then the covariance between them will be zero. However, the converse does not hold: a covariance of zero does not imply that the two variables are independent.

I know it may sound a bit confusing, but for now just hold onto this thought: "if two variables are independent then their covariance is zero; however, if the covariance between two variables is zero, the variables are not necessarily independent."

A very important thing to note is that both correlation and covariance measure only the linear relationship between two variables.

To understand the above statement, take two variables X and Y, and assume they have a quadratic relationship: Y = X^2

Now let us calculate the covariance between X and Y.
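We can check this numerically, using values of X symmetric about zero (a sketch with NumPy; the symmetry is what makes the covariance vanish exactly):

```python
import numpy as np

# X symmetric around zero, Y depends on X quadratically
x = np.array([-2., -1., 0., 1., 2.])
y = x ** 2

# Covariance is zero, even though Y is completely determined by X
print(np.cov(x, y)[0, 1])
```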

We would see that the covariance is zero. However, we cannot say that X and Y are independent, since there is a quadratic relationship between them. This justifies the earlier point that covariance and correlation measure only the linear relationship between two variables, and it also proves that a covariance of zero does not necessarily mean the variables are independent.

Hope you liked this article. Thank you for reading.
