## 1.

Let $X_n$ be the number of dollars after the $n$-th trial. Since the fortune is doubled or halved with equal probability,

$$\mathbb{E}[X_{n + 1} \mid X_n] = \frac{1}{2} \cdot 2 X_n + \frac{1}{2} \cdot \frac{X_n}{2} = \frac{5}{4} X_n.$$

By the rule of iterated expectations, $\mathbb{E} X_{n + 1} = (5 / 4) \mathbb{E} X_n$. By induction, $\mathbb{E} X_n = (5 / 4)^n c$.
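
The recursion above is easy to sanity-check numerically. A minimal Monte Carlo sketch (the function name and parameters are illustrative, not part of the exercise):

```python
import random

def expected_fortune(c=1.0, n=5, trials=100_000, seed=0):
    """Monte Carlo estimate of E[X_n] when the fortune is doubled or halved
    with probability 1/2 at each trial; the theory above gives (5/4)^n * c."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        x = c
        for _ in range(n):
            x = 2 * x if rng.random() < 0.5 else x / 2
        total += x
    return total / trials
```

With the defaults, the estimate should land near $(5/4)^5 \approx 3.05$.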

## 2.

If $\mathbb{P}(X = c) = 1$, then $\mathbb{E}[X^2] = (\mathbb{E} X)^2 = c^2$ and hence $\mathbb{V}(X) = 0$.

The converse is more complicated. We claim that whenever $Y$ is a nonnegative random variable, $\mathbb{E}Y=0$ implies that $\mathbb{P}(Y=0)=1$. In this case, it is sufficient to take $Y=(X-\mathbb{E}X)^{2}$ to conclude that $\mathbb{P}(X=\mathbb{E}X)=1$.

To substantiate the claim, suppose $\mathbb{E}Y=0$. Take $A_{n}=\{Y\geq1/n\}$. Then,

$$0 = \mathbb{E}Y \geq \mathbb{E}[Y I_{A_n}] \geq \frac{1}{n} \mathbb{P}(A_n) \geq 0.$$

It follows that $\mathbb{P}(A_{n})=0$ for all $n$. Since the sets $A_n$ are increasing, continuity of probability gives

$$\mathbb{P}(Y > 0) = \mathbb{P}\left(\bigcup_{n = 1}^{\infty} A_n\right) = \lim_{n \rightarrow \infty} \mathbb{P}(A_n) = 0.$$

## 3.

Since $F_{Y_n}(y) = \mathbb{P}(X_1 \leq y)^n = y^n$ for $y \in (0, 1)$, it follows that $f_{Y_n}(y) = n y^{n - 1}$. Therefore,

$$\mathbb{E} Y_n = \int_0^1 y \cdot n y^{n - 1} \, dy = \frac{n}{n + 1}.$$
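
A quick Monte Carlo check of the value $n / (n + 1)$ (function name and defaults are illustrative):

```python
import random

def mean_max_uniform(n, trials=100_000, seed=0):
    """Monte Carlo estimate of E[Y_n] = E[max(X_1, ..., X_n)] for IID
    Unif(0, 1) draws; the theory above gives n / (n + 1)."""
    rng = random.Random(seed)
    total = sum(max(rng.random() for _ in range(n)) for _ in range(trials))
    return total / trials
```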

## 4.

Note that $X_n = \sum_{i = 1}^n (1 - 2B_i) = n - 2 \sum_i B_i$ where $B_1, \ldots, B_n \sim \operatorname{Bernoulli}(p)$ are IID. It follows that $\mathbb{E} X_n = n - 2 n \mathbb{E} B_1 = n - 2np$ and $\mathbb{V}(X_n) = 4 n \mathbb{V}(B_1) = 4 n p (1 - p)$.
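
The two formulas can be checked by simulating the walk directly; a minimal sketch (names and parameter values are illustrative):

```python
import random

def walk_moments(n=50, p=0.3, trials=50_000, seed=0):
    """Monte Carlo estimate of the mean and variance of X_n = n - 2 * sum(B_i)
    with B_1, ..., B_n IID Bernoulli(p); theory: n - 2np and 4np(1 - p)."""
    rng = random.Random(seed)
    xs = []
    for _ in range(trials):
        heads = sum(rng.random() < p for _ in range(n))
        xs.append(n - 2 * heads)
    mean = sum(xs) / trials
    var = sum((x - mean) ** 2 for x in xs) / trials
    return mean, var
```

With `n = 50` and `p = 0.3`, the estimates should land near $20$ and $42$.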

## 5.

Let $\tau$ be the number of tosses until a heads is observed. Let $C$ denote the result of the first toss. Then,

$$\mathbb{E} \tau = \mathbb{E}[\mathbb{E}[\tau \mid C]] = \frac{1}{2} \cdot 1 + \frac{1}{2} \left(1 + \mathbb{E} \tau\right).$$

Solving for $\mathbb{E} \tau$ yields $2$.
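
The answer $\mathbb{E}\tau = 2$ is easy to confirm by simulation; a minimal sketch:

```python
import random

def mean_tosses(trials=100_000, seed=0):
    """Monte Carlo estimate of the expected number of fair-coin tosses
    until the first head; theory: 2."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        tosses = 1
        while rng.random() >= 0.5:  # keep tossing until a head appears
            tosses += 1
        total += tosses
    return total / trials
```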

## 7.

Integration by parts yields

$$\int_0^y x \, dF_X(x) = y F_X(y) - \int_0^y F_X(x) \, dx = \int_0^y \left(F_X(y) - F_X(x)\right) dx.$$

Define $G_y(x)=(F_{X}(y)-F_{X}(x))I_{(0,y)}(x)$. Note that $G_y$ converges pointwise to $1-F_{X}$ as $y\rightarrow\infty$ and that $y \mapsto G_y$ is monotone increasing. The desired result, $\mathbb{E}X = \int_0^\infty (1 - F_X(x)) \, dx$, follows by Lebesgue's monotone convergence theorem.

## 8.

The first two claims follow from

$$\mathbb{E} \overline{X} = \frac{1}{n} \sum_{i = 1}^n \mathbb{E} X_i = \mu$$

and

$$\mathbb{V}(\overline{X}) = \frac{1}{n^2} \sum_{i = 1}^n \mathbb{V}(X_i) = \frac{\sigma^2}{n}.$$

As for the final claim, note that

$$S_n^2 = \frac{1}{n - 1} \sum_{i = 1}^n \left(X_i - \overline{X}\right)^2 = \frac{1}{n - 1} \sum_{i = 1}^n \left(X_i^2 - 2 X_i \overline{X} + \overline{X}^2\right)$$

and hence, by symmetry,

$$\mathbb{E}[S_n^2] = \frac{n}{n - 1} \left(\mathbb{E}[X_1^2] - 2 \mathbb{E}[X_1 \overline{X}] + \mathbb{E}[\overline{X}^2]\right).$$

Next, note that $\mathbb{E}[X_{1}^{2}]=\sigma^{2}+\mu^{2}$ and $\mathbb{E}[\overline{X}^{2}]=\mathbb{V}(\overline{X})+(\mathbb{E}\overline{X})^{2}=\sigma^{2}/n+\mu^{2}$. Moreover, by independence,

$$\mathbb{E}[X_1 \overline{X}] = \frac{1}{n} \sum_{j = 1}^n \mathbb{E}[X_1 X_j] = \frac{1}{n} \left(\sigma^2 + \mu^2 + (n - 1) \mu^2\right)$$

and hence $\mathbb{E}[X_{1}\overline{X}]=\sigma^{2}/n+\mu^{2}$. Substituting these findings into the equation above yields $\mathbb{E}[S_{n}^{2}]=\sigma^{2}$, as desired.
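
The unbiasedness of $S_n^2$ can be checked empirically; a minimal sketch using normal samples (the distribution choice and parameter values are illustrative):

```python
import random
import statistics

def mean_sample_variance(n=5, mu=1.0, sigma=2.0, trials=50_000, seed=0):
    """Average the sample variance S_n^2 (with the n - 1 denominator) over
    many N(mu, sigma^2) samples; unbiasedness predicts a value near sigma^2."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        sample = [rng.gauss(mu, sigma) for _ in range(n)]
        total += statistics.variance(sample)  # divides by n - 1
    return total / trials
```

Even with only `n = 5` observations per sample, the average sits near $\sigma^2 = 4$.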

## 9.

TODO (Computer Experiment)

## 10.

The MGF of a standard normal random variable is $\psi(t) = \exp(t^2 / 2)$. Therefore, $\mathbb{E} \exp(X) = \psi(1) = \sqrt{e}$ and

$$\mathbb{V}(\exp(X)) = \mathbb{E}[\exp(2X)] - \left(\mathbb{E} \exp(X)\right)^2 = \psi(2) - e = e^2 - e.$$
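
Both moments are easy to confirm by simulation; a minimal sketch:

```python
import math
import random

def exp_normal_moments(trials=200_000, seed=0):
    """Monte Carlo estimate of the mean and variance of exp(X) for
    X ~ N(0, 1); theory: sqrt(e) and e^2 - e."""
    rng = random.Random(seed)
    ys = [math.exp(rng.gauss(0.0, 1.0)) for _ in range(trials)]
    mean = sum(ys) / trials
    var = sum((y - mean) ** 2 for y in ys) / trials
    return mean, var
```

The variance estimate converges slowly because $\exp(X)$ is heavy-tailed, so a generous tolerance is appropriate.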

## 11.

### a)

This was already solved in Question 4.

### b)

TODO (Computer Experiment)

TODO

## 13.

### a)

Let $C$ denote the result of the coin toss. Then,

$$\mathbb{E} X = \mathbb{E}[\mathbb{E}[X \mid C]] = \frac{1}{2} \cdot \frac{1}{2} + \frac{1}{2} \cdot \frac{7}{2} = 2.$$

### b)

Similarly to Part (a),

$$\mathbb{E}[X^2] = \mathbb{E}[\mathbb{E}[X^2 \mid C]] = \frac{1}{2} \int_0^1 x^2 \, dx + \frac{1}{2} \int_3^4 x^2 \, dx = \frac{1}{2} \cdot \frac{1}{3} + \frac{1}{2} \cdot \frac{37}{3} = \frac{19}{3}.$$

Therefore, $\mathbb{V}(X) = 19 / 3 - 4 = 7 / 3$.
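
The computation above is consistent with a fair coin that draws $X$ from $\operatorname{Unif}(0, 1)$ on heads and from $\operatorname{Unif}(3, 4)$ on tails; under that assumption, a quick Monte Carlo check:

```python
import random

def mixture_moments(trials=200_000, seed=0):
    """Monte Carlo estimate of the mean and variance of X when a fair coin
    selects between Unif(0, 1) and Unif(3, 4); theory: 2 and 7/3."""
    rng = random.Random(seed)
    xs = []
    for _ in range(trials):
        if rng.random() < 0.5:
            xs.append(rng.random())           # heads: Unif(0, 1)
        else:
            xs.append(rng.uniform(3.0, 4.0))  # tails: Unif(3, 4)
    mean = sum(xs) / trials
    var = sum((x - mean) ** 2 for x in xs) / trials
    return mean, var
```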

## 14.

The result follows from

$$\mathbb{V}\left(\sum_i a_i X_i\right) = \mathbb{E}\left[\left(\sum_i a_i \left(X_i - \mathbb{E} X_i\right)\right)^2\right] = \sum_i \sum_j a_i a_j \, \mathbb{E}\left[\left(X_i - \mathbb{E} X_i\right)\left(X_j - \mathbb{E} X_j\right)\right] = \sum_i \sum_j a_i a_j \operatorname{Cov}(X_i, X_j).$$

## 15.

First, note that $\mathbb{V}(2X - 3Y + 8) = \mathbb{V}(2X - 3Y)$. Moreover,

$$\mathbb{V}(X) = \mathbb{E}[X^2] - \left(\mathbb{E} X\right)^2 = \frac{7}{18} - \left(\frac{5}{9}\right)^2 = \frac{13}{162}, \qquad \mathbb{V}(Y) = \mathbb{E}[Y^2] - \left(\mathbb{E} Y\right)^2 = \frac{16}{9} - \left(\frac{11}{9}\right)^2 = \frac{23}{81},$$

and

$$\operatorname{Cov}(X, Y) = \mathbb{E}[XY] - \mathbb{E} X \, \mathbb{E} Y = \frac{2}{3} - \frac{5}{9} \cdot \frac{11}{9} = -\frac{1}{81}.$$

Therefore, $\mathbb{V}(2X - 3Y) = 4 \mathbb{V}(X) + 9 \mathbb{V}(Y) - 12 \operatorname{Cov}(X, Y) = \frac{26}{81} + \frac{207}{81} + \frac{12}{81} = \frac{245}{81}$.

## 16.

In the continuous case,

$$\mathbb{E}[r(X) s(Y) \mid X = x] = \int r(x) s(y) f_{Y \mid X}(y \mid x) \, dy = r(x) \int s(y) f_{Y \mid X}(y \mid x) \, dy = r(x) \, \mathbb{E}[s(Y) \mid X = x].$$

Taking $s = 1$ yields $\mathbb{E}[r(X) \mid X = x] = r(x)$. The discrete case is similar. A more general notion of conditional expectation requires Radon-Nikodym derivatives.

## 17.

By the tower property,

$$\mathbb{E}[\mathbb{V}(Y \mid X)] = \mathbb{E}\left[\mathbb{E}[Y^2 \mid X] - \left(\mathbb{E}[Y \mid X]\right)^2\right] = \mathbb{E}[Y^2] - \mathbb{E}\left[\left(\mathbb{E}[Y \mid X]\right)^2\right]$$

and

$$\mathbb{V}(\mathbb{E}[Y \mid X]) = \mathbb{E}\left[\left(\mathbb{E}[Y \mid X]\right)^2\right] - \left(\mathbb{E}[\mathbb{E}[Y \mid X]]\right)^2 = \mathbb{E}\left[\left(\mathbb{E}[Y \mid X]\right)^2\right] - \left(\mathbb{E} Y\right)^2.$$

Summing the two quantities yields $\mathbb{E}[Y^2] - (\mathbb{E} Y)^2 = \mathbb{V}(Y)$, as desired.

## 18.

Since

$$\mathbb{E}[XY] = \mathbb{E}[\mathbb{E}[XY \mid Y]] = \mathbb{E}[Y \, \mathbb{E}[X \mid Y]] = c \, \mathbb{E} Y$$

and $\mathbb{E}X =\mathbb{E}[\mathbb{E}[X\mid Y]] = c$ by the tower property, $\operatorname{Cov}(X,Y)=\mathbb{E}[XY] - \mathbb{E}X\mathbb{E}Y = 0$.

## 19.

Unlike the distribution of $X_1 \sim \operatorname{Unif}(0, 1)$, the distribution of $(X_1 + \cdots + X_n)/n$ is concentrated around $\mathbb{E}[X_1]$. As $n$ increases, so too does the concentration.
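
The growing concentration is easy to see numerically; a minimal sketch that estimates the spread of the sample mean for two sample sizes:

```python
import random
import statistics

def sample_mean_spread(n, trials=20_000, seed=0):
    """Standard deviation of the sample mean of n Unif(0, 1) draws;
    theory: sqrt(1 / (12 n)), shrinking as n grows."""
    rng = random.Random(seed)
    means = [sum(rng.random() for _ in range(n)) / n for _ in range(trials)]
    return statistics.stdev(means)
```

Comparing `sample_mean_spread(10)` with `sample_mean_spread(100)` shows the spread shrinking roughly like $1 / \sqrt{n}$.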

## 20.

For a vector $a$ with entries $a_i$,

$$\mathbb{E}[a^{\intercal} X] = \mathbb{E}\left[\sum_i a_i X_i\right] = \sum_i a_i \mathbb{E} X_i = a^{\intercal} \mathbb{E} X.$$

For a matrix $A$ with entries $a_{ij}$, define the column vector $a_{i\star}$ as the transpose of the $i$-th row of $A$. Then,

$$\left(\mathbb{E}[AX]\right)_i = \mathbb{E}[a_{i\star}^{\intercal} X] = a_{i\star}^{\intercal} \mathbb{E} X = \left(A \mathbb{E} X\right)_i.$$

Therefore, $\mathbb{E}[AX]=A\mathbb{E}X$.

Next, using our findings in Question 14,

$$\mathbb{V}(a^{\intercal} X) = \sum_i \sum_j a_i a_j \operatorname{Cov}(X_i, X_j) = a^{\intercal} \mathbb{V}(X) a.$$

As before, we can generalize this to the matrix case by noting that

$$\left(\mathbb{V}(AX)\right)_{ij} = \operatorname{Cov}(a_{i\star}^{\intercal} X, a_{j\star}^{\intercal} X) = a_{i\star}^{\intercal} \mathbb{V}(X) a_{j\star} = \left(A \mathbb{V}(X) A^{\intercal}\right)_{ij}.$$

Therefore, $\mathbb{V}(AX)=A\mathbb{V}(X)A^{\intercal}$.
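
The identity $\mathbb{V}(AX)=A\mathbb{V}(X)A^{\intercal}$ can be spot-checked on a small example; a minimal sketch with a hypothetical choice of $X$ and $A$:

```python
import random

def transformed_cov(trials=100_000, seed=0):
    """Estimate the covariance matrix of Y = A X for X = (Z1, Z1 + Z2) with
    Z1, Z2 IID N(0, 1) and A = [[1, 1], [0, 2]].  Here V(X) = [[1, 1], [1, 2]]
    and A V(X) A^T = [[5, 6], [6, 8]]."""
    rng = random.Random(seed)
    ys = []
    for _ in range(trials):
        z1, z2 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
        x = (z1, z1 + z2)
        ys.append((x[0] + x[1], 2 * x[1]))  # Y = A X
    my = [sum(y[i] for y in ys) / trials for i in range(2)]
    return [[sum((y[i] - my[i]) * (y[j] - my[j]) for y in ys) / trials
             for j in range(2)] for i in range(2)]
```

The estimated matrix should be close to $\begin{pmatrix} 5 & 6 \\ 6 & 8 \end{pmatrix}$.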

## 21.

If $\mathbb{E}[Y\mid X]=X$, then

$$\mathbb{E}[XY] = \mathbb{E}[\mathbb{E}[XY \mid X]] = \mathbb{E}[X \, \mathbb{E}[Y \mid X]] = \mathbb{E}[X^2]$$

and $\mathbb{E}Y=\mathbb{E}[\mathbb{E}[Y\mid X]]=\mathbb{E}X$. Therefore,

$$\operatorname{Cov}(X, Y) = \mathbb{E}[XY] - \mathbb{E} X \, \mathbb{E} Y = \mathbb{E}[X^2] - \left(\mathbb{E} X\right)^2 = \mathbb{V}(X).$$

## 22.

### a)

Note that $\mathbb{E}[YZ]=\mathbb{E}I_{(a,b)}(X)=b-a$. Moreover, $\mathbb{E}Y=\mathbb{E}I_{(0,b)}(X)=b$ and $\mathbb{E}Z=\mathbb{E}I_{(a,1)}(X)=1-a$. Since $\mathbb{E}[YZ]\neq\mathbb{E}Y\mathbb{E}Z$, $Y$ and $Z$ are dependent.
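
The gap between $\mathbb{E}[YZ]$ and $\mathbb{E}Y\,\mathbb{E}Z$ is easy to observe numerically; a minimal sketch with illustrative values $a = 0.3$, $b = 0.7$:

```python
import random

def indicator_moments(a=0.3, b=0.7, trials=200_000, seed=0):
    """For X ~ Unif(0, 1), Y = I_(0,b)(X) and Z = I_(a,1)(X), estimate
    E[Y], E[Z], and E[YZ]; theory: b, 1 - a, and b - a."""
    rng = random.Random(seed)
    sy = sz = syz = 0
    for _ in range(trials):
        x = rng.random()
        y, z = x < b, x > a
        sy += y
        sz += z
        syz += y and z
    return sy / trials, sz / trials, syz / trials
```

With these values, $\mathbb{E}[YZ] = 0.4$ while $\mathbb{E}Y\,\mathbb{E}Z = 0.49$.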

### b)

If $Z = 0$, then $X \leq a < b$ and hence $Y = 1$. Therefore, $\mathbb{E}[Y \mid Z = 0] = 1$ trivially. Moreover,

$$\mathbb{E}[Y \mid Z = 1] = \mathbb{P}(X < b \mid a < X < 1) = \frac{b - a}{1 - a}.$$

## 23.

Let $K \sim \operatorname{Poisson}(\lambda)$. The MGF of $K$ is

$$\mathbb{E}[e^{tK}] = \sum_{k = 0}^{\infty} e^{tk} \frac{\lambda^k e^{-\lambda}}{k!} = e^{-\lambda} \sum_{k = 0}^{\infty} \frac{\left(\lambda e^t\right)^k}{k!} = \exp\left(\lambda \left(e^t - 1\right)\right).$$

Let $X \sim N(\mu, \sigma^2)$. Then, completing the square,

$$\mathbb{E}[e^{tX}] = \frac{1}{\sqrt{2 \pi \sigma^2}} \int_{-\infty}^{\infty} e^{tx} e^{-(x - \mu)^2 / (2 \sigma^2)} \, dx = e^{t \mu + t^2 \sigma^2 / 2} \cdot \frac{1}{\sqrt{2 \pi \sigma^2}} \int_{-\infty}^{\infty} e^{-(x - \mu - t \sigma^2)^2 / (2 \sigma^2)} \, dx,$$

where the second factor is the integral of a $N(\mu + t \sigma^2, \sigma^2)$ density and hence equal to one.

Therefore, the MGF of $X$ is $\exp(t\mu+t^{2}\sigma^{2}/2)$.

Lastly, let $Y \sim \operatorname{Gamma}(\alpha, \beta)$. Then,

$$\mathbb{E}[e^{tY}] = \int_0^{\infty} e^{ty} \frac{\beta^{\alpha}}{\Gamma(\alpha)} y^{\alpha - 1} e^{-\beta y} \, dy = \frac{\beta^{\alpha}}{\Gamma(\alpha)} \int_0^{\infty} y^{\alpha - 1} e^{-(\beta - t) y} \, dy = \left(\frac{\beta}{\beta - t}\right)^{\alpha}$$

is finite whenever $t < \beta$. Therefore, under the same condition, the MGF of $Y$ is $(1 - t / \beta)^{-\alpha}$.
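
The Gamma MGF can be spot-checked by simulation; a minimal sketch with illustrative parameters (note the rate-versus-scale convention):

```python
import math
import random

def gamma_mgf_estimate(alpha=2.0, beta=3.0, t=1.0, trials=200_000, seed=0):
    """Monte Carlo estimate of E[exp(t Y)] for Y ~ Gamma(alpha, beta) with
    rate beta; theory: (1 - t / beta)^(-alpha) for t < beta.  Note that
    random.gammavariate takes a scale parameter, hence 1 / beta."""
    rng = random.Random(seed)
    total = sum(math.exp(t * rng.gammavariate(alpha, 1.0 / beta))
                for _ in range(trials))
    return total / trials
```

With $\alpha = 2$, $\beta = 3$, and $t = 1$, the theoretical value is $(1 - 1/3)^{-2} = 2.25$.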

## 24.

Suppose $\beta>t$. Then,

$$\mathbb{E}[e^{t X_1}] = \int_0^{\infty} e^{tx} \beta e^{-\beta x} \, dx = \frac{\beta}{\beta - t} = \left(1 - \frac{t}{\beta}\right)^{-1}$$

and hence, by independence,

$$\mathbb{E}\left[\exp\left(t \sum_{i = 1}^n X_i\right)\right] = \prod_{i = 1}^n \mathbb{E}[e^{t X_i}] = \left(1 - \frac{t}{\beta}\right)^{-n}.$$

Since this is the MGF of a $\operatorname{Gamma}(n, \beta)$ distribution, it follows that the sum of $n$ IID $\operatorname{Exp}(\beta)$ random variables is $\operatorname{Gamma}(n, \beta)$ distributed.
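
The conclusion can be checked against the first two moments of a $\operatorname{Gamma}(n, \beta)$ law; a minimal sketch with illustrative parameters:

```python
import random

def sum_exp_moments(n=3, beta=2.0, trials=100_000, seed=0):
    """Monte Carlo mean and variance of a sum of n IID Exp(beta) draws;
    a Gamma(n, beta) law predicts mean n / beta and variance n / beta^2."""
    rng = random.Random(seed)
    sums = [sum(rng.expovariate(beta) for _ in range(n))
            for _ in range(trials)]
    mean = sum(sums) / trials
    var = sum((s - mean) ** 2 for s in sums) / trials
    return mean, var
```

With $n = 3$ and $\beta = 2$, the estimates should land near $3/2$ and $3/4$.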