1.

Let $X_n$ be the number of dollars at the $n$-th trial. Then,
$$\mathrm{E}[X_n \mid X_{n-1}] = \frac{1}{2}(2 X_{n-1}) + \frac{1}{2}\left(\frac{X_{n-1}}{2}\right) = \frac{5}{4} X_{n-1}.$$
By the rule of iterated expectations, $\mathrm{E}[X_n] = \frac{5}{4} \mathrm{E}[X_{n-1}]$. By induction, $\mathrm{E}[X_n] = \left(\frac{5}{4}\right)^n X_0$.
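As a sanity check, a short Monte Carlo simulation (my sketch, assuming the game doubles or halves the fortune with equal probability and an initial stake of 1 dollar) should reproduce growth by a factor of $5/4$ per trial:

```python
import random

random.seed(0)

def average_fortune(n_trials, n_runs=200_000):
    """Estimate the expected fortune after n_trials plays, starting from 1 dollar."""
    total = 0.0
    for _ in range(n_runs):
        x = 1.0
        for _ in range(n_trials):
            # Double or halve with equal probability.
            x = 2 * x if random.random() < 0.5 else x / 2
        total += x
    return total / n_runs

n = 5
estimate = average_fortune(n)
exact = (5 / 4) ** n  # claimed expected fortune after n trials
print(estimate, exact)  # estimate should be close to exact
```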

2.

If $\mathrm{P}(X = c) = 1$ for some constant $c$, then $\mathrm{E}[X] = c$ and hence $\mathrm{V}(X) = \mathrm{E}[(X - c)^2] = 0$.

The converse is more complicated. We claim that whenever $Y$ is a nonnegative random variable, $\mathrm{E}[Y] = 0$ implies that $\mathrm{P}(Y = 0) = 1$. In this case, it is sufficient to take $Y = (X - \mu)^2$ to conclude that $\mathrm{P}(X = \mu) = 1$.

To substantiate the claim, suppose $\mathrm{E}[Y] = 0$. Take $A_n = \{Y > 1/n\}$. Then,
$$0 = \mathrm{E}[Y] \geq \mathrm{E}[Y \mathbf{1}_{A_n}] \geq \frac{1}{n} \mathrm{P}(A_n) \geq 0.$$
It follows that $\mathrm{P}(A_n) = 0$ for all $n$. By continuity of probability,
$$\mathrm{P}(Y > 0) = \mathrm{P}\left(\bigcup_{n=1}^{\infty} A_n\right) = \lim_{n \to \infty} \mathrm{P}(A_n) = 0.$$

3.

Since $X_1, \ldots, X_n$ are IID Uniform$(0, 1)$, it follows that $\mathrm{P}(Y_n \leq y) = \prod_{i=1}^{n} \mathrm{P}(X_i \leq y) = y^n$ for $y \in [0, 1]$. Therefore,
$$\mathrm{E}[Y_n] = \int_0^1 y \cdot n y^{n-1} \, dy = \frac{n}{n+1}.$$
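A quick simulation (my sketch, taking the maximum of $n$ Uniform$(0,1)$ draws) agrees with $n/(n+1)$:

```python
import random

random.seed(0)

n, n_runs = 5, 100_000
# Average of the maximum of n Uniform(0, 1) draws.
estimate = sum(max(random.random() for _ in range(n)) for _ in range(n_runs)) / n_runs
exact = n / (n + 1)
print(estimate, exact)  # estimate should be close to exact
```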

4.

Note that $X_n = \sum_{i=1}^{n} Z_i$ where $Z_1, \ldots, Z_n$ are IID with $\mathrm{P}(Z_i = 1) = \mathrm{P}(Z_i = -1) = 1/2$. It follows that $\mathrm{E}[X_n] = \sum_i \mathrm{E}[Z_i] = 0$ and $\mathrm{V}(X_n) = \sum_i \mathrm{V}(Z_i) = n$.
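The moments can be checked empirically (a minimal sketch, assuming the symmetric ±1-step walk):

```python
import random

random.seed(1)

n, n_runs = 20, 100_000
# Final position after n symmetric ±1 steps, repeated n_runs times.
finals = [sum(1 if random.random() < 0.5 else -1 for _ in range(n)) for _ in range(n_runs)]
mean = sum(finals) / n_runs
var = sum((x - mean) ** 2 for x in finals) / n_runs
print(mean, var)  # should be close to 0 and n
```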

5.

Let $N$ be the number of tosses until a heads is observed. Let $Z$ denote the result of the first toss. Then,
$$\mathrm{E}[N] = \mathrm{E}[\mathrm{E}[N \mid Z]] = \frac{1}{2}(1) + \frac{1}{2}(1 + \mathrm{E}[N]).$$
Solving for $\mathrm{E}[N]$ yields $\mathrm{E}[N] = 2$.
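Simulating the toss sequence directly (a minimal sketch of the fair-coin case):

```python
import random

random.seed(0)

def tosses_until_heads():
    """Count tosses of a fair coin until the first heads."""
    count = 1
    while random.random() >= 0.5:  # tails: keep tossing
        count += 1
    return count

n_runs = 100_000
estimate = sum(tosses_until_heads() for _ in range(n_runs)) / n_runs
print(estimate)  # should be close to 2
```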

6.

7.

Integration by parts yields
$$\int_0^t \mathrm{P}(X > x) \, dx = t \, \mathrm{P}(X > t) + \int_0^t x f(x) \, dx = \int_0^\infty \min(x, t) f(x) \, dx.$$
Define $g_t(x) = \min(x, t)$. Note that $g_t$ converges pointwise to $g(x) = x$ as $t \to \infty$. Moreover, $t \mapsto g_t$ is monotone increasing. The desired result, $\mathrm{E}[X] = \int_0^\infty \mathrm{P}(X > x) \, dx$, follows by Lebesgue's monotone convergence theorem.
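The identity $\mathrm{E}[X] = \int_0^\infty \mathrm{P}(X > x)\,dx$ can also be checked numerically; here is a midpoint-rule sketch for my example $X \sim$ Exponential$(1)$, where $\mathrm{P}(X > x) = e^{-x}$ and $\mathrm{E}[X] = 1$:

```python
import math

# Midpoint-rule approximation of the integral of P(X > x) = exp(-x) over [0, 50];
# the tail beyond 50 is negligible.
dx, upper = 1e-4, 50.0
steps = int(upper / dx)
integral = sum(math.exp(-(i + 0.5) * dx) * dx for i in range(steps))
print(integral)  # should be close to E[X] = 1
```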

8.

The first two claims follow from
$$\mathrm{E}[\bar{X}_n] = \frac{1}{n} \sum_{i=1}^{n} \mathrm{E}[X_i] = \mu$$
and
$$\mathrm{V}(\bar{X}_n) = \frac{1}{n^2} \sum_{i=1}^{n} \mathrm{V}(X_i) = \frac{\sigma^2}{n}.$$
As for the final claim, note that
$$\sum_{i=1}^{n} (X_i - \bar{X}_n)^2 = \sum_{i=1}^{n} X_i^2 - n \bar{X}_n^2,$$
and hence
$$\mathrm{E}[S_n^2] = \frac{1}{n-1} \left( n \mathrm{E}[X_1^2] - n \mathrm{E}[\bar{X}_n^2] \right).$$
Next, note that $\mathrm{E}[X_1^2] = \sigma^2 + \mu^2$ and $\mathrm{E}[\bar{X}_n] = \mu$. Moreover,
$$\mathrm{E}[\bar{X}_n^2] = \mathrm{V}(\bar{X}_n) + \mathrm{E}[\bar{X}_n]^2 = \frac{\sigma^2}{n} + \mu^2,$$
and hence $n \mathrm{E}[X_1^2] - n \mathrm{E}[\bar{X}_n^2] = (n-1)\sigma^2$. Substituting these findings into the equation above yields $\mathrm{E}[S_n^2] = \sigma^2$, as desired.
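Unbiasedness of the sample variance can be verified by simulation (my sketch, drawing standard normals so that $\sigma^2 = 1$):

```python
import random

random.seed(0)

n, n_runs = 5, 200_000
s2_total = 0.0
for _ in range(n_runs):
    xs = [random.gauss(0, 1) for _ in range(n)]
    xbar = sum(xs) / n
    # Sample variance with the n - 1 denominator.
    s2_total += sum((x - xbar) ** 2 for x in xs) / (n - 1)
mean_s2 = s2_total / n_runs
print(mean_s2)  # should be close to the true variance, 1
```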

9.

TODO (Computer Experiment)

10.

The MGF of a normal random variable is $\psi(t) = e^{\mu t + \sigma^2 t^2 / 2}$. Therefore, $\psi'(t) = (\mu + \sigma^2 t)\,\psi(t)$ and
$$\psi''(t) = \sigma^2 \psi(t) + (\mu + \sigma^2 t)^2 \psi(t).$$
In particular, $\mathrm{E}[X] = \psi'(0) = \mu$ and $\mathrm{E}[X^2] = \psi''(0) = \sigma^2 + \mu^2$, so that $\mathrm{V}(X) = \sigma^2$.
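Finite differences of $\psi(t) = e^{\mu t + \sigma^2 t^2/2}$ at $t = 0$ recover the first two moments; the values of $\mu$ and $\sigma$ below are arbitrary example choices:

```python
import math

mu, sigma = 1.5, 2.0

def psi(t):
    """MGF of N(mu, sigma^2)."""
    return math.exp(mu * t + sigma ** 2 * t ** 2 / 2)

h = 1e-4
d1 = (psi(h) - psi(-h)) / (2 * h)                # central difference for psi'(0) = mu
d2 = (psi(h) - 2 * psi(0.0) + psi(-h)) / h ** 2  # central difference for psi''(0) = sigma^2 + mu^2
print(d1, d2)  # should be close to 1.5 and 6.25
```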

11.

a)

This was already solved in Question 4.

b)

TODO (Computer Experiment)

12.

TODO

13.

a)

Let $Z$ denote the result of the coin toss. Then, by the rule of iterated expectations,
$$\mathrm{E}[X] = \mathrm{E}[\mathrm{E}[X \mid Z]] = \frac{1}{2} \mathrm{E}[X \mid Z = \text{heads}] + \frac{1}{2} \mathrm{E}[X \mid Z = \text{tails}].$$

b)

Similarly to Part (a),
$$\mathrm{E}[X^2] = \frac{1}{2} \mathrm{E}[X^2 \mid Z = \text{heads}] + \frac{1}{2} \mathrm{E}[X^2 \mid Z = \text{tails}].$$
Therefore, $\mathrm{V}(X) = \mathrm{E}[X^2] - \mathrm{E}[X]^2$.

14.

The result follows from
$$\mathrm{V}\left(\sum_{i=1}^{n} a_i X_i\right) = \mathrm{E}\left[\left(\sum_{i=1}^{n} a_i (X_i - \mu_i)\right)^2\right] = \sum_{i=1}^{n} \sum_{j=1}^{n} a_i a_j \operatorname{Cov}(X_i, X_j).$$

15.

First, note that . Moreover,

and

Therefore, .

16.

In the continuous case,
$$\mathrm{E}[\mathrm{E}[r(X, Y) \mid X]] = \int \left( \int r(x, y) f_{Y \mid X}(y \mid x) \, dy \right) f_X(x) \, dx = \iint r(x, y) f(x, y) \, dy \, dx = \mathrm{E}[r(X, Y)].$$
Taking $r(x, y) = y$ yields $\mathrm{E}[\mathrm{E}[Y \mid X]] = \mathrm{E}[Y]$. The discrete case is similar. A more general notion of conditional expectation requires Radon–Nikodym derivatives.

17.

By the tower property,
$$\mathrm{E}[\mathrm{V}(Y \mid X)] = \mathrm{E}[\mathrm{E}[Y^2 \mid X]] - \mathrm{E}[\mathrm{E}[Y \mid X]^2] = \mathrm{E}[Y^2] - \mathrm{E}[\mathrm{E}[Y \mid X]^2]$$
and
$$\mathrm{V}(\mathrm{E}[Y \mid X]) = \mathrm{E}[\mathrm{E}[Y \mid X]^2] - \mathrm{E}[\mathrm{E}[Y \mid X]]^2 = \mathrm{E}[\mathrm{E}[Y \mid X]^2] - \mathrm{E}[Y]^2.$$
The desired result follows from summing the two quantities: $\mathrm{E}[\mathrm{V}(Y \mid X)] + \mathrm{V}(\mathrm{E}[Y \mid X]) = \mathrm{E}[Y^2] - \mathrm{E}[Y]^2 = \mathrm{V}(Y)$.
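A Monte Carlo check of the decomposition $\mathrm{V}(Y) = \mathrm{E}[\mathrm{V}(Y \mid X)] + \mathrm{V}(\mathrm{E}[Y \mid X])$ on a toy hierarchical model of my choosing, $X \sim$ Uniform$(0,1)$ and $Y \mid X \sim N(X, 1)$, where $\mathrm{E}[\mathrm{V}(Y \mid X)] = 1$ and $\mathrm{V}(\mathrm{E}[Y \mid X]) = \mathrm{V}(X) = 1/12$:

```python
import random

random.seed(0)

n_runs = 200_000
ys = []
for _ in range(n_runs):
    x = random.random()            # X ~ Uniform(0, 1)
    ys.append(random.gauss(x, 1))  # Y | X ~ N(X, 1)
mean_y = sum(ys) / n_runs
var_y = sum((y - mean_y) ** 2 for y in ys) / n_runs
total = 1 + 1 / 12  # E[V(Y|X)] + V(E[Y|X])
print(var_y, total)  # the two should be close
```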

18.

Since
$$\mathrm{E}[XY] = \mathrm{E}[\mathrm{E}[XY \mid X]] = \mathrm{E}[X \, \mathrm{E}[Y \mid X]] = c \, \mathrm{E}[X]$$
and, by the tower property, $\mathrm{E}[Y] = \mathrm{E}[\mathrm{E}[Y \mid X]] = c$, it follows that $\operatorname{Cov}(X, Y) = \mathrm{E}[XY] - \mathrm{E}[X]\mathrm{E}[Y] = 0$.

19.

Unlike the distribution of $X_i$, the distribution of $\bar{X}_n$ is concentrated around the mean $\mu$. As $n$ increases, so too does the concentration.

20.

For a vector $a$ with entries $a_i$,
$$\mathrm{E}[a^T X] = \sum_i a_i \mathrm{E}[X_i] = a^T \mu.$$
For a matrix $A$ with entries $a_{ij}$, define the column vector $a_i$ as the transpose of the $i$-th row of $A$. Then,
$$\mathrm{E}[(AX)_i] = \mathrm{E}[a_i^T X] = a_i^T \mu.$$
Therefore, $\mathrm{E}[AX] = A\mu$.

Next, using our findings in Question 14,
$$\mathrm{V}(a^T X) = \sum_i \sum_j a_i a_j \operatorname{Cov}(X_i, X_j) = a^T \Sigma a.$$
As before, we can generalize this to the matrix case by noting that
$$[\mathrm{V}(AX)]_{ij} = \operatorname{Cov}(a_i^T X, a_j^T X) = a_i^T \Sigma a_j.$$
Therefore, $\mathrm{V}(AX) = A \Sigma A^T$.
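The identity $\mathrm{V}(AX) = A \Sigma A^T$ can be checked empirically; the 2-dimensional $X$ and the matrix $A$ below are example choices of mine, built so that $\Sigma = \begin{pmatrix} 1 & 1 \\ 1 & 2 \end{pmatrix}$ and $A \Sigma A^T = \begin{pmatrix} 13 & 5 \\ 5 & 2 \end{pmatrix}$:

```python
import random

random.seed(0)

n_runs = 200_000
y1s, y2s = [], []
for _ in range(n_runs):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    x = (z1, z1 + z2)            # Cov(X) = [[1, 1], [1, 2]]
    y = (x[0] + 2 * x[1], x[1])  # Y = AX with A = [[1, 2], [0, 1]]
    y1s.append(y[0])
    y2s.append(y[1])

def cov(us, vs):
    """Sample covariance of two equal-length samples."""
    mu, mv = sum(us) / len(us), sum(vs) / len(vs)
    return sum((u - mu) * (v - mv) for u, v in zip(us, vs)) / len(us)

v11, v12, v22 = cov(y1s, y1s), cov(y1s, y2s), cov(y2s, y2s)
print(v11, v12, v22)  # should be close to 13, 5, 2
```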

21.

If $X$ and $Y$ are independent, then
$$\mathrm{E}[XY] = \iint x y \, f_X(x) f_Y(y) \, dx \, dy = \mathrm{E}[X] \mathrm{E}[Y]$$
and $\operatorname{Cov}(X, Y) = \mathrm{E}[XY] - \mathrm{E}[X]\mathrm{E}[Y]$. Therefore,
$$\operatorname{Cov}(X, Y) = \mathrm{E}[X]\mathrm{E}[Y] - \mathrm{E}[X]\mathrm{E}[Y] = 0.$$

22.

a)

Note that . Moreover, and . Since , and are dependent.

b)

If , then and hence . Therefore, trivially. Moreover,

23.

Let . The MGF of is

Let . Then,

Therefore, the MGF of is .

Lastly, let . Then,

is finite whenever . Therefore, under the same condition, the MGF of is .

24.

Suppose $X_1, \ldots, X_n$ are IID Exponential$(\beta)$. Then,
$$\psi_{X_1}(t) = \mathrm{E}[e^{t X_1}] = \int_0^\infty e^{tx} \frac{1}{\beta} e^{-x/\beta} \, dx = \frac{1}{1 - \beta t}, \quad t < 1/\beta,$$
and hence
$$\psi_{\sum_i X_i}(t) = \prod_{i=1}^{n} \psi_{X_i}(t) = \left(\frac{1}{1 - \beta t}\right)^n.$$
Since this is the MGF of a Gamma$(n, \beta)$ distribution, it follows that the sum of $n$ IID exponentially distributed random variables is Gamma distributed.
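As a sanity check, the first two moments of a sum of IID exponentials should match those of Gamma$(n, \beta)$, namely $n\beta$ and $n\beta^2$; the parameters below are example choices of mine:

```python
import random

random.seed(0)

n, beta, n_runs = 3, 2.0, 200_000
# random.expovariate takes the rate parameter, i.e. 1/beta.
sums = [sum(random.expovariate(1 / beta) for _ in range(n)) for _ in range(n_runs)]
mean = sum(sums) / n_runs
var = sum((s - mean) ** 2 for s in sums) / n_runs
print(mean, var)  # Gamma(n, beta): mean n*beta = 6, variance n*beta^2 = 12
```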