Addition, Multiplication Theorem of Expectation and Covariance


Data Science and A.I. Lecture Series

By Bindeshwar Singh Kushwaha

PostNetwork Academy

Outline

  • Introduction
  • Addition Theorem of Expectation
  • Proof of Addition Theorem
  • Multiplication Theorem of Expectation
  • Proof of Multiplication Theorem
  • Covariance

Introduction

Expectation (or expected value) is a fundamental concept in probability and statistics. It provides a measure of the central tendency of a random variable. We discuss two key theorems: Addition and Multiplication Theorems of Expectation.

Addition Theorem of Expectation

Theorem: If \( X \) and \( Y \) are two random variables, then:

\[ E(X + Y) = E(X) + E(Y) \]

The result extends to any finite number of random variables:

\[ E(X_1 + X_2 + \dots + X_n) = E(X_1) + E(X_2) + \dots + E(X_n) \]

The theorem holds regardless of whether the variables are independent or dependent.
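A quick simulation illustrates this last point. Here `Y` is deliberately made dependent on `X` (it copies `X` half the time), yet the estimated means still satisfy the addition theorem; the setup with two dice is an illustrative assumption, not part of the theorem:

```python
import random

random.seed(0)

# Two *dependent* dice: Y copies X half the time.
# The addition theorem predicts E(X + Y) = E(X) + E(Y) regardless.
n = 200_000
xs, ys, sums = [], [], []
for _ in range(n):
    x = random.randint(1, 6)
    y = x if random.random() < 0.5 else random.randint(1, 6)
    xs.append(x)
    ys.append(y)
    sums.append(x + y)

ex = sum(xs) / n      # estimate of E(X), true value 3.5
ey = sum(ys) / n      # estimate of E(Y), true value 3.5
exy = sum(sums) / n   # estimate of E(X + Y), true value 7.0
print(ex + ey, exy)   # the two estimates agree
```

Dependence changes the joint behaviour of the dice, but not the expectation of their sum.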

Proof of Addition Theorem

By definition, expectation is computed as:

\[ E(X) = \sum_{x} x P(X = x) \]

Similarly,

\[ E(Y) = \sum_{y} y P(Y = y) \]

For two discrete random variables:

\[ E(X + Y) = \sum_{x,y} (x + y) P(X = x, Y = y) \]

Distributing the sum,

\[ E(X + Y) = \sum_{x,y} x P(X = x, Y = y) + \sum_{x,y} y P(X = x, Y = y) \]

Separating terms,

\[ E(X + Y) = \sum_{x} x P(X = x) + \sum_{y} y P(Y = y) = E(X) + E(Y) \]
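The separation of sums in the proof can be checked exactly on a small joint distribution. The joint pmf below is an illustrative assumption, chosen so that X and Y are dependent; exact fractions avoid any floating-point doubt:

```python
from fractions import Fraction as F

# A small joint pmf P(X = x, Y = y); X and Y are deliberately dependent.
joint = {
    (0, 0): F(1, 4), (0, 1): F(1, 4),
    (1, 0): F(1, 8), (1, 1): F(3, 8),
}

# E(X + Y) computed from the joint distribution, as in the proof
lhs = sum((x + y) * p for (x, y), p in joint.items())

# Marginal expectations E(X) and E(Y)
ex = sum(x * p for (x, y), p in joint.items())
ey = sum(y * p for (x, y), p in joint.items())

print(lhs, ex + ey)  # both equal 9/8
```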

Multiplication Theorem of Expectation

Theorem: If \( X \) and \( Y \) are two independent random variables, then:

\[ E(XY) = E(X)E(Y) \]

The expected value of the product of two independent random variables is the product of their expected values. This does not necessarily hold if \( X \) and \( Y \) are dependent.
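A standard counterexample shows why independence matters; the distribution below (X uniform on {-1, +1} with Y = X) is an illustrative assumption:

```python
from fractions import Fraction as F

# Dependent counterexample: Y = X, with X = -1 or +1 each with probability 1/2.
pmf = {-1: F(1, 2), 1: F(1, 2)}

ex = sum(x * p for x, p in pmf.items())       # E(X) = 0
ey = ex                                       # Y = X, so E(Y) = 0
exy = sum(x * x * p for x, p in pmf.items())  # E(XY) = E(X^2) = 1

print(exy, ex * ey)  # 1 vs 0: the product rule fails under dependence
```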

Proof of Multiplication Theorem

By definition,

\[ E(XY) = \sum_{x,y} xy P(X = x, Y = y) \]

Since \( X \) and \( Y \) are independent, we can write:

\[ P(X = x, Y = y) = P(X = x) P(Y = y) \]

Substituting this,

\[ E(XY) = \sum_{x,y} xy P(X = x) P(Y = y) \]

Separating sums,

\[ E(XY) = \left( \sum_{x} x P(X = x) \right) \left( \sum_{y} y P(Y = y) \right) = E(X)E(Y) \]
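The factorization step of the proof can be verified numerically for independent marginals. The two marginal pmfs below are illustrative assumptions; the joint pmf is built as the product `px[x] * py[y]`, exactly as independence allows:

```python
from fractions import Fraction as F
from itertools import product

# Independent marginals (illustrative): a fair coin and a three-valued spinner
px = {0: F(1, 2), 1: F(1, 2)}
py = {1: F(1, 3), 2: F(1, 3), 3: F(1, 3)}

# Under independence the joint pmf factorizes: P(X = x, Y = y) = P(X = x) P(Y = y)
exy = sum(x * y * px[x] * py[y] for x, y in product(px, py))

ex = sum(x * p for x, p in px.items())  # E(X) = 1/2
ey = sum(y * p for y, p in py.items())  # E(Y) = 2
print(exy, ex * ey)                     # both equal 1
```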

Covariance

For a bivariate frequency distribution, covariance between two variables \( X \) and \( Y \) is defined as:

\[ \text{Cov}(X,Y) = \frac{\sum f_i (x_i - \bar{X})(y_i - \bar{Y})}{\sum f_i} \]

For a bivariate probability distribution:

\[ \text{Cov}(X,Y) = \begin{cases}
\sum_{i}\sum_{j} (x_i - \mathbb{E}[X])(y_j - \mathbb{E}[Y])\, p_{ij}, & \text{discrete case} \\
\displaystyle\iint (x - \mathbb{E}[X])(y - \mathbb{E}[Y])\, f(x,y)\, dx\, dy, & \text{continuous case}
\end{cases} \]

Using expectation:

\[ \text{Cov}(X, Y) = \mathbb{E}[XY] - \mathbb{E}[X] \mathbb{E}[Y] \]

If \( X \) and \( Y \) are independent, then \( \mathbb{E}[XY] = \mathbb{E}[X] \mathbb{E}[Y] \), hence \( \text{Cov}(X,Y) = 0 \).
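The identity \( \text{Cov}(X, Y) = \mathbb{E}[XY] - \mathbb{E}[X]\mathbb{E}[Y] \) can be estimated directly by simulation. The Gaussian variables below are an illustrative assumption: `ys` is independent of `xs`, while `zs = xs + noise` is dependent on it:

```python
import random

random.seed(1)

# Estimate Cov(X, Y) = E[XY] - E[X]E[Y] from samples.
n = 100_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [random.gauss(0, 1) for _ in range(n)]  # independent of X
zs = [x + random.gauss(0, 1) for x in xs]    # dependent on X

def cov(a, b):
    """Sample covariance via E[AB] - E[A]E[B]."""
    m = len(a)
    eab = sum(x * y for x, y in zip(a, b)) / m
    return eab - (sum(a) / m) * (sum(b) / m)

print(cov(xs, ys))  # near 0: independent variables are uncorrelated
print(cov(xs, zs))  # near 1: Cov(X, X + noise) = Var(X) = 1
```

Note the converse is false in general: zero covariance does not imply independence.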


Thank You!

 

© PostNetwork. All rights reserved.