\[
\begin{aligned}
\mbox{Cov}(X,Y) &= E\Big[\big(X-E(X)\big)\big(Y-E(Y)\big)\Big] \\
&= E[XY] - E(X)E(Y)
\end{aligned}
\]
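The shortcut form in the second line also holds exactly for the sample analogues, where expectations become sample averages. A quick numerical check (a sketch with simulated data; the variable names are illustrative):

```r
# Check Cov(X,Y) = E[XY] - E[X]E[Y] using sample averages in place of expectations
set.seed(1)
n <- 1e5
x <- rnorm(n, mean = 2)
y <- 0.5 * x + rnorm(n)                     # y is correlated with x
lhs <- mean((x - mean(x)) * (y - mean(y)))  # definition form
rhs <- mean(x * y) - mean(x) * mean(y)      # shortcut form
all.equal(lhs, rhs)                         # TRUE (equal up to rounding)
```

The two forms agree to floating-point precision because the identity is algebraic, not approximate.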
Correlation is defined as: \[ \rho = \frac{\mbox{Cov}(X,Y) }{ \sigma_X \sigma_Y} \]
This normalization ensures \(-1 \leq \rho \leq 1\).
This follows from the Cauchy–Schwarz inequality, \[
E\big[ \big| (X-\mu_X)(Y-\mu_Y) \big| \big]
\leq \sqrt{ E\big[(X-\mu_X)^2\big] \, E\big[(Y-\mu_Y)^2\big] }
\]
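The bound \(|\mbox{Cov}(X,Y)| \leq \sigma_X \sigma_Y\) is what keeps \(\rho\) inside \([-1, 1]\), and it holds for sample statistics as well. A small sketch with simulated data:

```r
# Sample analogue of the Cauchy-Schwarz bound: |cov(x,y)| <= sd(x) * sd(y),
# which forces the sample correlation into [-1, 1]
set.seed(2)
x <- rnorm(1000)
y <- -3 * x + rnorm(1000)          # strongly negatively related to x
bound_holds <- abs(cov(x, y)) <= sd(x) * sd(y)
bound_holds                        # TRUE
cor(x, y)                          # close to -1, but never below it
```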
library(mvtnorm)   # provides rmvnorm()
set.seed(363)
X <- rmvnorm(100, mean = c(5, 5), sigma = matrix(c(1, .7, .7, 1), 2, 2))  # draw a sample from a bivariate normal
plot(X)
cov(X)   # sample covariance matrix
##           [,1]      [,2]
## [1,] 1.1585022 0.7897293
## [2,] 0.7897293 1.1826087
cor(X)   # sample correlation matrix
##           [,1]      [,2]
## [1,] 1.0000000 0.6746978
## [2,] 0.6746978 1.0000000
For any constant \(a\), \(\mbox{Cov}(a,Y) = 0\).
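A constant never deviates from its mean, so every term in the covariance sum is zero. This can be seen directly in R (a sketch; the value 4 is arbitrary):

```r
# Covariance of a constant with any random variable is exactly 0
set.seed(3)
y <- rnorm(50)
a <- rep(4, 50)   # a "random variable" that is always 4
cov(a, y)         # 0: every deviation (a - mean(a)) is zero
```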
Example: \(\mbox{Var}(X+Y) = \mbox{Var}(X) + \mbox{Var}(Y) + 2\,\mbox{Cov}(X,Y)\)
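Expanding \(\mbox{Var}(X+Y)\) gives \(\mbox{Var}(X) + \mbox{Var}(Y) + 2\,\mbox{Cov}(X,Y)\), and the same identity holds exactly for the sample versions, since `var` and `cov` use the same \(n-1\) denominator. A quick check:

```r
# Sample check of Var(X+Y) = Var(X) + Var(Y) + 2*Cov(X,Y)
set.seed(4)
x <- rnorm(200)
y <- rnorm(200, sd = 2)
sum_ok <- all.equal(var(x + y), var(x) + var(y) + 2 * cov(x, y))
sum_ok   # TRUE: the identity is exact, not approximate
```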
\[
\epsilon_i \overset{iid}{\sim} N(0,\sigma)
\] This means that each observation is independent of the others. Independence guarantees no correlation (theoretically).
Each observation comes from the same normal distribution.
Together, these say the \(\epsilon_i\) form a random sample from a normal distribution.
\[
\epsilon_i \sim WN(0,\sigma)
\] This means that the series is uncorrelated. This is a weaker assumption than independence, because no correlation does not imply independence.
Also, it says nothing about the distribution.
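A classic illustration of why zero correlation is weaker than independence: if \(Z\) is standard normal, then \(W = Z^2 - 1\) is uncorrelated with \(Z\) (since \(\mbox{Cov}(Z, Z^2) = E[Z^3] = 0\) by symmetry), yet \(W\) is completely determined by \(Z\). A sketch:

```r
# Uncorrelated but not independent: W = Z^2 - 1 vs Z
set.seed(5)
z <- rnorm(1e5)
w <- z^2 - 1
cor(z, w)     # near 0: uncorrelated
cor(z^2, w)   # exactly 1: w is a deterministic function of z
```

White noise only rules out the first kind of relationship, not the second.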