## 1.

Chebyshev’s inequality gives $\mathbb{P}(\left|X-\mu\right|\geq k\sigma)\leq1/k^{2}$. An exact calculation yields instead $e^{-(1+k)}$. To see this, note that $\mu=\sigma=1/\beta$ (writing $\beta$ for the rate), hence $\beta(\mu\pm k\sigma)=1\pm k$ and $\mu-k\sigma\leq0$ for $k\geq1$, so that

$$\mathbb{P}(\left|X-\mu\right|\geq k\sigma)=\mathbb{P}(X\geq\mu+k\sigma)=e^{-\beta(\mu+k\sigma)}=e^{-(1+k)}.$$
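As a quick numerical check, the exact tail probability can be compared against the Chebyshev bound by simulation; the rate $\beta=2$ and the values of $k$ below are illustrative choices, not part of the problem.

```python
import math
import random

# Monte Carlo check: X ~ Exponential with rate beta, so mu = sigma = 1/beta.
# beta = 2 and k in {1, 2, 3} are illustrative.
random.seed(0)
beta = 2.0
mu = sigma = 1.0 / beta
n = 200_000
for k in (1, 2, 3):
    hits = sum(1 for _ in range(n)
               if abs(random.expovariate(beta) - mu) >= k * sigma)
    exact = math.exp(-(1 + k))       # e^{-(1+k)}
    chebyshev = 1.0 / k**2           # Chebyshev upper bound
    print(f"k={k}: empirical={hits/n:.4f}, exact={exact:.4f}, "
          f"chebyshev={chebyshev:.4f}")
```

The empirical frequencies track $e^{-(1+k)}$, which sits well below $1/k^{2}$.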

## 3.

First, note that $\mathbb{V}(\overline{X})=\mathbb{V}(X_{1})/n=p(1-p)/n$. Chebyshev’s inequality yields

$$\mathbb{P}(\left|\overline{X}-p\right|\geq\epsilon)\leq\frac{\mathbb{V}(\overline{X})}{\epsilon^{2}}=\frac{p(1-p)}{n\epsilon^{2}}.$$

Next, note that $p(1-p)\leq1/4$ for all $p\in[0,1]$, so the bound above is at most $1/(4n\epsilon^{2})$.

Let $Y_{i}=(X_{i}-\mathbb{E}X_{1})/n=(X_{i}-p)/n$ so that $\overline{X}-p=\sum_{i}Y_{i}$. Then, $\mathbb{E}Y_{i}=0$ and $-p/n\leq Y_{i}\leq(1-p)/n$. Hoeffding’s inequality yields

$$\mathbb{P}(\overline{X}-p\geq\epsilon)=\mathbb{P}\left(\sum_{i}Y_{i}\geq\epsilon\right)\leq\exp\left(-\frac{2\epsilon^{2}}{\sum_{i}(1/n)^{2}}\right)=\exp(-2n\epsilon^{2}),$$

since each $Y_{i}$ takes values in an interval of length $1/n$.

Similarly, $\mathbb{P}(\overline{X}-p\leq-\epsilon)=\mathbb{P}(\sum_{i}(-Y_{i})\geq\epsilon)\leq\exp(-2n\epsilon^{2})$. It follows that

$$\mathbb{P}(\left|\overline{X}-p\right|\geq\epsilon)\leq2\exp(-2n\epsilon^{2})$$

is tighter than the Chebyshev bound for sufficiently large $n$.
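The two bounds can be tabulated to see where Hoeffding overtakes Chebyshev; $\epsilon=0.1$ and the sample sizes below are illustrative.

```python
import math

# Chebyshev vs Hoeffding bounds on P(|Xbar - p| >= eps) for Bernoulli data.
# eps and the sample sizes are illustrative choices.
eps = 0.1
for n in (10, 50, 100, 200, 500):
    chebyshev = 1.0 / (4 * n * eps**2)        # uses p(1-p) <= 1/4
    hoeffding = 2 * math.exp(-2 * n * eps**2)
    print(f"n={n:4d}: chebyshev={chebyshev:.5f}, hoeffding={hoeffding:.5f}")
```

For small $n$ the Chebyshev bound can actually be smaller, but the exponential decay of the Hoeffding bound wins once $n$ is large, which is the sense of "sufficiently large $n$" above.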

## 4.

### a)

Applying our findings from Question 3, setting $2\exp(-2n\epsilon_{n}^{2})=\alpha$ gives $\epsilon_{n}=\sqrt{\frac{1}{2n}\log\left(\frac{2}{\alpha}\right)}$, so the interval $C_{n}=(\overline{X}-\epsilon_{n},\,\overline{X}+\epsilon_{n})$ satisfies $\mathbb{P}(p\in C_{n})\geq1-\alpha$.

### b)

TODO (Computer Experiment)
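A minimal sketch of one such experiment, assuming the illustrative values $p=0.4$, $\alpha=0.05$, and $n=100$ (none of which are prescribed by the problem):

```python
import math
import random

# Coverage experiment for the Hoeffding-based interval Xbar +/- eps_n.
# p, alpha, n, and trials are illustrative choices.
random.seed(1)
p, alpha, n, trials = 0.4, 0.05, 100, 2000
eps = math.sqrt(math.log(2 / alpha) / (2 * n))
covered = 0
for _ in range(trials):
    xbar = sum(random.random() < p for _ in range(n)) / n
    if xbar - eps < p < xbar + eps:
        covered += 1
print(f"empirical coverage: {covered/trials:.3f} (guaranteed >= {1-alpha})")
```

The empirical coverage typically sits well above $1-\alpha$, reflecting how conservative the Hoeffding bound is.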

### c)

The length of the interval is $2\epsilon_{n}$. This length is at most $c>0$ if and only if $n\geq2\log(2/\alpha)/c^{2}$.
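The threshold can be verified numerically; $\alpha=0.05$ and $c=0.1$ below are illustrative.

```python
import math

# Smallest n for which the interval length 2*eps_n is at most c.
# alpha and c are illustrative choices.
alpha, c = 0.05, 0.1
n_min = math.ceil(2 * math.log(2 / alpha) / c**2)
eps = lambda n: math.sqrt(math.log(2 / alpha) / (2 * n))
print(n_min, 2 * eps(n_min) <= c, 2 * eps(n_min - 1) <= c)
# prints: 738 True False
```

The length condition holds at $n_{\min}$ but fails at $n_{\min}-1$, confirming the "if and only if" threshold.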

TODO (Plot)

As per the hint,

TODO (Plot)

## 7.

A linear combination of IID normal random variables is itself a normal random variable. Therefore, $\overline{X}$ is a normal random variable with zero mean and variance $1/n$. Letting $Z\sim N(0,1)$, so that $\sqrt{n}\,\overline{X}$ has the same distribution as $Z$, Mill’s inequality yields

$$\mathbb{P}(\left|\overline{X}\right|>t)=\mathbb{P}(\left|Z\right|>\sqrt{n}\,t)\leq\sqrt{\frac{2}{\pi}}\,\frac{\exp(-nt^{2}/2)}{\sqrt{n}\,t}.$$

The above is tighter than the Chebyshev bound $1 / (t^2 n)$ for sufficiently large $n$.
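A small comparison of the two bounds, with an illustrative $t=0.5$:

```python
import math

# Mill's bound vs Chebyshev bound on P(|Xbar| > t) for Xbar ~ N(0, 1/n).
# t and the sample sizes are illustrative choices.
t = 0.5
for n in (1, 4, 16, 64):
    mills = math.sqrt(2 / math.pi) * math.exp(-n * t**2 / 2) / (t * math.sqrt(n))
    chebyshev = 1.0 / (n * t**2)
    print(f"n={n:3d}: mills={mills:.5f}, chebyshev={chebyshev:.5f}")
```

The Mill's bound decays exponentially in $n$ while the Chebyshev bound decays only as $1/n$, so their ratio shrinks rapidly as $n$ grows.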