\bigskip
\textit{\textbf{Note:} This homework consists of two parts. The first part (questions 1-6) will be graded and will determine your score for this homework. The second part (questions 7-8) will be graded if you submit them, but will not affect your homework score in any way. You are strongly advised to attempt all the questions in the first part. You should attempt the problems in the second part only if you are interested and have time to spare.}
\rule{6in}{2px}
\begin{center}
For each problem, justify all your answers unless otherwise specified.
\end{center}
\section*{Part 1: Required Problems}
\bigskip
\Question{Short Answer}
\begin{Parts}
\Part Let $X$ be uniform on the interval $[0,2]$, and define $Y = 2X + 1$. Find the PDF, CDF, expectation, and variance of $Y$.
\Part Let $X$ and $Y$ have joint distribution
$$
f(x,y) = \begin{cases} c x y + 1/4 & x \in [1,2] \text{ and } y \in [0,2] \\ 0 & \text{else} \end{cases}
$$
Find the constant $c$. Are $X$ and $Y$ independent?
\Part Let $X \sim \text{Exp}(3)$. What is the probability that $X \in [0,1]$? If I define a new random variable $Y = \lfloor X \rfloor$, for each $k \in \N$, what is the probability that $Y = k$? Do you recognize this (discrete) distribution?
\Part Let $X_i \sim \text{Exp}(\lambda_i)$ for $i = 1,...,n$ be mutually independent. It is a (very nice) fact that $\min(X_1,...,X_n) \sim \text{Exp}(\mu)$. Find $\mu$.
% \Part You are handing out candy on Halloween, and for some reason every child is dressed as a shark in formalwear or a zombie sheep. The arrivals of the sharks and sheep are each governed by a Poisson arrival process: in any given span of $t$ minutes, the number of dapper sharks is distributed as $\Pois(t/5)$, and the number of undead sheep as $\Pois(t/4)$. The two processes are independent of one another. Let's say that each shark spends exactly one minute choosing their candy. If a shark has just arrived at your door, what is the probability that a sheep \emph{and} another shark arrive before the current shark is done selecting, with the sheep arriving first?
\end{Parts}
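After solving part (c), the first computation can be sanity-checked numerically. The sketch below is an editor's addition, not part of the assignment; the seed and sample size are arbitrary choices. It estimates $\Pr[X \in [0,1]]$ for $X \sim \text{Exp}(3)$ by simulation and compares it with the CDF value $1 - e^{-3}$.

```python
import math
import random

# Editor's sanity-check sketch (not part of the assignment): estimate
# P(X in [0,1]) for X ~ Exp(3) by simulation, and compare with the
# exponential CDF evaluated at 1, namely F(1) = 1 - e^{-3}.
random.seed(0)       # arbitrary seed for reproducibility
n = 200_000          # arbitrary sample size
hits = sum(1 for _ in range(n) if random.expovariate(3.0) <= 1.0)
print(round(hits / n, 4))            # empirical estimate
print(round(1 - math.exp(-3.0), 4))  # exact CDF value
```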
\Question{Exponential Expectation}
\begin{Parts}
\Part Let $X \sim \text{Exp}(\lambda)$. Use induction to show that $\E [X^k] = k!/\lambda^k$ for every $k \in \N$.
\Part For any $|t| < \lambda$, compute $\E [e^{tX}]$ directly from the definition of expectation.
\Part Using part (a), compute $\sum_{k = 0}^\infty \frac{\E[X^k]}{k!}t^k$.
\end{Parts}
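The closed form claimed in part (a) can be checked by simulation once the induction is done. The sketch below is an editor's addition (the choice $\lambda = 2$, the seed, and the sample size are arbitrary): it compares Monte Carlo estimates of $\E[X^k]$ against $k!/\lambda^k$ for small $k$.

```python
import math
import random

# Editor's sanity-check sketch (not part of the assignment): Monte Carlo
# estimates of E[X^k] for X ~ Exp(2), compared against the closed form
# k!/lambda^k from part (a).
random.seed(1)       # arbitrary seed for reproducibility
lam = 2.0            # arbitrary rate parameter
n = 500_000
samples = [random.expovariate(lam) for _ in range(n)]
for k in range(1, 4):
    mc = sum(x ** k for x in samples) / n
    exact = math.factorial(k) / lam ** k
    print(k, round(mc, 3), round(exact, 3))
```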
\Question{Continuous Probability Continued}
For the following questions, please briefly justify your answers or show your
work.
\begin{Parts}
\Part If $X\sim N(0, \sigma^2_X)$ and $Y\sim N(0,\sigma_Y^2)$ are
independent, then what is $\mathbb E\left[ (X+Y)^k \right]$ for any \textit{odd}
$k\in\mathbb N$?
\Part Let $f_{\mu, \sigma}(x)$ be the density of a $N(\mu, \sigma^2)$ random
variable, and let $X$ be distributed according to $\alpha f_{\mu_1,
\sigma_1}(x) + (1-\alpha)f_{\mu_2, \sigma_2}(x)$ for some $\alpha\in[0,1]$.
Please compute $\mathbb E[X]$ and $\mathrm{Var}[X]$. Is $X$ normally
distributed?
\Part Assume $\text{Bob}_1, \text{Bob}_2, \dots, \text{Bob}_k$ each hold a
fair coin whose two sides show numbers instead of heads and tails, with the
numbers on $\text{Bob}_i$'s coin being $i$ and $-i$. Each Bob tosses his
coin $n$ times and sums up the numbers he sees; let's call this number
$X_i$. For large $n$, what is the distribution of $\left( X_1 + \dots + X_k
\right)/\sqrt{n}$ approximately equal to?
\Part If $X_1, X_2, \dots$ is a sequence of i.i.d. random variables of mean $\mu$ and
variance $\sigma^2$, what is $\lim_{n\to\infty} \mathbb P\left[ \sum_{k=1}^n
\frac{X_k - \mu}{\sigma n^{\alpha}} \in \left[ -1,1 \right] \right]$ for
$\alpha\in[0,1]$ (your answer may depend on $\alpha$ and $\Phi$, the CDF of
a $N(0,1)$ variable)?
\Part Assume we wanted to estimate the value of $\pi$ by repeatedly throwing
a needle of length $1$ cm on a striped floor, whose stripes all run
parallel at distances $1$ cm from each other. Please give an estimator of
$\pi$, and compute an approximate $95\%$ confidence interval using the
central limit theorem. (\textit{Hint}: You may assume that $\pi(\pi \pm
\varepsilon) \approx \pi^2$ for small $\varepsilon$).
\end{Parts}
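A small simulation can build intuition for the central limit theorem behavior that several of these parts rely on. The sketch below is an editor's addition (the seed, $n$, and trial count are arbitrary): for a single fair coin with sides $+1$ and $-1$, it checks that $S_n/\sqrt{n}$ has mean and variance close to those of a standard normal.

```python
import math
import random

# Editor's illustration (not part of the assignment): for a fair coin with
# sides +1 and -1, the CLT says S_n / sqrt(n) is approximately N(0, 1) for
# large n. We check the empirical mean and variance across many trials.
random.seed(2)              # arbitrary seed for reproducibility
n, trials = 1_000, 5_000    # arbitrary simulation sizes
vals = [sum(random.choice((-1, 1)) for _ in range(n)) / math.sqrt(n)
        for _ in range(trials)]
mean = sum(vals) / trials
var = sum(v * v for v in vals) / trials - mean ** 2
print(round(mean, 2), round(var, 2))  # should be close to 0 and 1
```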
\Question{Buffon's Needle on a Grid}
In this problem, we will consider Buffon's Needle, but with a catch. We now drop
a needle at random onto a large grid, an example of which is shown below.
\begin{center}
\begin{tikzpicture}
\draw[step=0.5cm,color=gray] (.75,.75) grid (3.75,3.75);
\draw[thick,color=black] (2.25, 2.45) -- (2.603553, 2.803553);
\end{tikzpicture}
\end{center}
The length of the needle is $1$, and the space between the grid lines is $1$ as well.
Recall from class that a random throw means that
the position of the center of the needle and its orientation
are independent and uniformly distributed.
\begin{Parts}
\Part Given that the angle between the needle and the horizontal lines is $\theta$,
what is the probability that the needle does not intersect any grid lines?
Justify your answer.
\Part Continue part (a) to find the probability that the needle, when dropped onto the
grid at random (with the angle chosen uniformly at random), intersects a grid line.
Justify your answer.
\textit{Hint:} You may use the fact that
$\sin \theta \cos \theta = \frac{1}{2} \sin (2\theta)$
without proof.
\Part Let $X$ be the number of times the needle intersects a gridline (so, in the example shown
above, $X = 2$). Find $\mathbb{E}[X]$. Justify your answer.
\textit{Hint:} Can you do this without using your answer from part (b)?
\Part Combine the previous parts to figure out the probability that $X = 1$, i.e.\ the needle
intersects exactly one gridline. Justify your answer.
%\Part We will fold the needle into an equilateral triangle, where each side is length
%$\frac{1}{3}$. What is the expected number of intersections that this triangle will have
%with the gridlines, when dropped onto the grid? Justify your answer.
%\Part Now assume that we took a shape that, no matter how we place it on the
%plane, will always intersect the grid in exactly four points. What must the
%circumference of this shape be?
\end{Parts}
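Answers to this question can be checked empirically. The sketch below is an editor's addition, not part of the assignment (the seed and sample size are arbitrary): it drops a unit needle onto a unit grid many times and reports the empirical probability of intersecting at least one gridline, which can be compared against the answer to part (b). By symmetry, it suffices to place the needle's center uniformly in one unit cell and choose the angle uniformly in $[0, \pi/2)$.

```python
import math
import random

# Editor's simulation sketch (not part of the assignment): drop a unit
# needle on a unit grid and estimate the probability of crossing a line.
random.seed(3)   # arbitrary seed for reproducibility
n = 200_000      # arbitrary number of throws
hits = 0
for _ in range(n):
    # Center uniform in a unit cell; angle uniform in [0, pi/2) by symmetry.
    cx, cy = random.random(), random.random()
    theta = random.uniform(0, math.pi / 2)
    dx, dy = 0.5 * math.cos(theta), 0.5 * math.sin(theta)
    # The needle crosses a vertical gridline iff its horizontal extent
    # leaves [0, 1], and similarly for horizontal gridlines.
    if cx - dx < 0 or cx + dx > 1 or cy - dy < 0 or cy + dy > 1:
        hits += 1
print(round(hits / n, 4))  # compare with your answer to part (b)
```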
\Question{Useful Uniforms}
Let $X$ be a continuous random variable whose image is all of $\mathbb R$; that
is, $\mathbb P\left[ X \in (a,b) \right] > 0$ for all $a,b \in \mathbb R$ with
$a < b$.
\begin{Parts}
\Part Give an example of a distribution that $X$ could have, and one
that it could not.
\Part Show that the CDF $F$ of $X$ is strictly increasing. That is,
$F(x+\varepsilon) > F(x)$ for all $x \in \mathbb R$ and $\varepsilon > 0$. Argue why this implies
that $F: \mathbb R\to (0,1)$ must be invertible.
\Part Let $U$ be a uniform random variable on $(0,1)$. What is the
distribution of $F^{-1}(U)$?
\Part Your work in part (c) shows that in order to sample $X$, it is enough
to be able to sample $U$. If $X$ were instead a discrete random variable
taking finitely many values, could we still use $U$ to sample $X$?
\end{Parts}
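The technique in part (c) is known as inverse transform sampling, and it can be tried out concretely. The sketch below is an editor's addition (the logistic distribution, seed, and sample size are the editor's choices, not part of the assignment): the logistic CDF $F(x) = 1/(1+e^{-x})$ is strictly increasing on all of $\mathbb R$ with an explicit inverse, so feeding a Uniform$(0,1)$ sample through $F^{-1}$ should produce a logistic sample.

```python
import math
import random

# Editor's illustration of part (c) (not part of the assignment): sample
# a logistic random variable via the inverse CDF of F(x) = 1/(1 + e^{-x}).
random.seed(4)  # arbitrary seed for reproducibility

def logistic_inverse_cdf(u):
    # F^{-1}(u) = log(u / (1 - u)) for u in (0, 1).
    return math.log(u / (1 - u))

samples = [logistic_inverse_cdf(random.random()) for _ in range(100_000)]
# Empirical check: the fraction of samples <= x should approximate F(x).
x = 1.0
empirical = sum(1 for s in samples if s <= x) / len(samples)
print(round(empirical, 3), round(1 / (1 + math.exp(-x)), 3))
```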
\Question{Balls, meet Bins}
Alice and Bob are tasked with throwing balls into bins (to set up a probability problem for later). They decide to make a game out of it: Alice and Bob will each take a ball, and once per minute, they will both simultaneously (and independently) attempt to throw their balls into a bin.
Once Alice or Bob successfully lands a ball in a bin, that person stops while the other person continues to try until they also land a throw. When this happens, the game is over.
Suppose that on every try, the probability of successfully landing the throw in a bin is $p$. What is the expected number of minutes until the game is over? Solve this using a Markov chain with three states. Then, state how your solution can be interpreted in terms of two geometric random variables.
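A closed-form answer here can be checked by direct simulation of the game. The sketch below is an editor's addition, not part of the assignment (the value $p = 0.3$, the seed, and the trial count are arbitrary): it runs the game many times and averages the number of minutes until both players have landed a throw.

```python
import random

# Editor's simulation sketch (not part of the assignment): average number
# of minutes until both Alice and Bob have landed a ball, when each
# attempt succeeds independently with probability p.
random.seed(5)  # arbitrary seed for reproducibility

def game_length(p):
    minutes, alice_done, bob_done = 0, False, False
    while not (alice_done and bob_done):
        minutes += 1
        if not alice_done and random.random() < p:
            alice_done = True
        if not bob_done and random.random() < p:
            bob_done = True
    return minutes

p, trials = 0.3, 100_000  # arbitrary example values
avg = sum(game_length(p) for _ in range(trials)) / trials
print(round(avg, 3))  # compare with your closed-form answer
```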
\bigskip
\textit{\textbf{Note:} This concludes the first part of the homework. The problems below are optional, will not affect your score, and should be attempted only if you have time to spare.}
\rule{6in}{2px}
\bigskip
\section*{Part 2: Optional Problems}
\bigskip
\Question{Markov Chains: Prove/Disprove}
Prove or disprove the following statements, using the definitions of irreducibility, periodicity, and invariance (these are restated in the next question).
\begin{Parts}
\Part
There exists an irreducible, finite Markov chain for which there exist initial distributions that converge to different distributions.
\Part
There exists an irreducible, aperiodic, finite Markov chain for which $\Pr(X_{n+1} = j \mid X_{n} = i) = 1$ or $0$ for all $i,j$.
\Part
There exists an irreducible, non-aperiodic Markov chain for which $\Pr(X_{n+1} = j \mid X_{n} = i) \neq 1$ for all $i,j$.
\Part
For an irreducible, non-aperiodic Markov chain, any initial distribution not equal to the invariant distribution does not converge to any distribution.
\end{Parts}
% original question written Spr 16; re-written Fall 19
\Question{Oski's Markov Chain}
When Oski Bear is studying for CS70, he splits up his time between reading notes and working on practice problems. To do this, every so often he will make a decision about what kind of work to do next.
When Oski is already reading the notes, with probability $a$ he will decide to switch gears and work on a practice problem, and otherwise, he will decide to keep reading more notes. Conversely, when Oski is already working on a practice problem, with probability $b$ he will think of a topic he needs to review, and will decide to switch back over to the notes; otherwise, he will keep working on practice problems.
Assume that (unlike real life, we hope!) Oski never runs out of work to do.
\begin{Parts}
\Part Draw a 2-state Markov chain to model this situation.
\nosolspace{1cm}
\Part
\noindent In the remainder of this problem, we will learn to work with the definitions of some important terms relating to Markov chains. These definitions are as follows:
\begin{enumerate}
\item (Irreducibility) A Markov chain is irreducible if, starting from any state $i$, the chain can transition to any other state $j$, possibly in multiple steps.
\item (Periodicity) $d(i) := \gcd\{n > 0 \mid P^n(i, i) = \Pr[X_n = i \mid X_0 = i] > 0 \}$, $i \in \mathcal{X}$. If $d(i) = 1 \; \forall i \in \mathcal{X}$, then the Markov chain is aperiodic; otherwise it is periodic.
\item (Matrix Representation) Define the transition probability matrix $P$ by filling entry $(i, j)$ with probability $P(i, j)$.
\item (Invariance) A distribution $\pi$ is invariant for the transition probability matrix $P$ if it satisfies the following balance equations: $\pi = \pi P$.
\end{enumerate}
For what values of $a$ and $b$ is the Markov chain irreducible?
\nosolspace{1cm}
\Part For $a=1$, $b=1$, prove that the Markov chain is periodic.
\nosolspace{1cm}
\Part For $0 < a < 1$, $0 < b < 1$, prove that the Markov chain is aperiodic.
\nosolspace{1cm}
\Part Construct a transition probability matrix using the Markov chain.
\nosolspace{3cm}
\Part Write down the balance equations for the Markov chain and solve them. Assume that the Markov chain is irreducible.
\nosolspace{2.5cm}
\end{Parts}
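A solution to the balance equations in the last part can be checked numerically. The sketch below is an editor's addition, not part of the assignment (the values $a = 0.3$ and $b = 0.5$ are arbitrary examples): starting from any initial distribution, repeated multiplication by the transition matrix $P$ converges to the invariant distribution when $0 < a, b < 1$, so the printed vector can be compared against your solution of $\pi = \pi P$.

```python
# Editor's numerical check (not part of the assignment): power iteration
# on the 2-state transition matrix for Oski's chain, with arbitrary
# example values a = 0.3, b = 0.5.
a, b = 0.3, 0.5
P = [[1 - a, a],   # state 0 = reading notes
     [b, 1 - b]]   # state 1 = practice problems
pi = [1.0, 0.0]    # arbitrary initial distribution
for _ in range(1000):
    pi = [pi[0] * P[0][0] + pi[1] * P[1][0],
          pi[0] * P[0][1] + pi[1] * P[1][1]]
print([round(x, 4) for x in pi])  # compare with your balance-equation solution
```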