Given a random variable X with standard deviation x, and a random variable Y = a + bX, where a and b are constants, show that the correlation coefficient between X and Y is -1 if b < 0 and +1 if b > 0.
Does anyone know how to prove this?
I really need to learn more statistics myself, but I think this is how you should approach it:
Definition of the correlation coefficient:
\rho_{X,Y} = \frac{V(X,Y)}{\sigma_X \, \sigma_Y}

V is the covariance, and I'm using the sigmas in place of the lower case letters for the standard deviations here - it can get confusing otherwise. Now try to express everything in terms of X. For example, if you write out the definition of covariance you get Mean(XY) - Mean(X)*Mean(Y). You can plug in for Y in terms of X here. You should find that the covariance works out to b times the variance of X.
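In case it helps, here is a sketch of that covariance step (this is my own working, not part of the original problem; E[ ] denotes the mean):

\begin{aligned}
V(X,Y) &= E[XY] - E[X]\,E[Y] \\
&= E[X(a+bX)] - E[X]\,E[a+bX] \\
&= a\,E[X] + b\,E[X^2] - a\,E[X] - b\,(E[X])^2 \\
&= b\left(E[X^2] - (E[X])^2\right) = b\,\sigma_X^2
\end{aligned}

So the covariance carries the sign of b.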
Then, also express the standard deviation of Y in terms of the standard deviation of X. If you plug everything into your formula, all the X dependence should cancel out - if not, you know you made a mistake. Also, remember that the standard deviation has to be positive, but the covariance can be negative. That's how you will get a negative answer for the case b < 0 - see the sketch below. Please ask again if that doesn't make sense!
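Finishing the plug-in as a sketch, with the same notation and assuming σ_X > 0:

\sigma_Y = \sqrt{\operatorname{Var}(a+bX)} = \sqrt{b^2\,\operatorname{Var}(X)} = |b|\,\sigma_X

\rho_{X,Y} = \frac{V(X,Y)}{\sigma_X\,\sigma_Y} = \frac{b\,\sigma_X^2}{\sigma_X \cdot |b|\,\sigma_X} = \frac{b}{|b|}

which is +1 when b > 0 and -1 when b < 0 (and undefined when b = 0, since then Y is constant).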