\bigskip
\textit{\textbf{Note:} This homework consists of two parts. The first part (questions 1-5) will be graded and will determine your score for this homework. The second part (questions 6-7) will be graded if you submit them, but will not affect your homework score in any way. You are strongly advised to attempt all the questions in the first part. You should attempt the problems in the second part only if you are interested and have time to spare.}
\rule{6in}{2px}
\begin{center}
For each problem, justify all your answers unless otherwise specified.
\end{center}
\section*{Part 1: Required Problems}
\bigskip
\Question{Random Variables Warm-Up}
Let $X$ and $Y$ be random variables, each taking values in the set $\{0,1,2\}$, with joint distribution
\begin{align*}
\Pr[X = 0, Y = 0] &= 1/3 & \Pr[X=0,Y=1] &= 0 & \Pr[X=0,Y=2] &= 1/3 \\
\Pr[X = 1, Y= 0] &= 0 & \Pr[X=1,Y=1] &= 1/9 & \Pr[X=1,Y=2] &= 0 \\
\Pr[X=2,Y=0] &= 1/9 & \Pr[X=2,Y=1] &= 1/9 & \Pr[X=2,Y=2] &= 0.
\end{align*}
\begin{Parts}
\Part What are the marginal distributions of $X$ and $Y$?
\Part What are $\E[X]$ and $\E[Y]$?
\Part What are $\var(X)$ and $\var(Y)$?
\Part Let $I$ be the indicator that $X = 1$, and $J$ be the indicator that $Y = 1$. What are $\E[I]$, $\E[J]$ and $\E[IJ]$?
\Part In general, let $I_A$ and $I_B$ be the indicators for events $A$ and $B$ in a probability space $(\Omega,\Pr)$. What is $\E[I_A I_B]$, in terms of the probability of some event?
\end{Parts}
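The joint table above can be encoded and marginalized mechanically, which is a handy way to double-check hand computations. Below is a minimal Python sketch using exact \texttt{Fraction} arithmetic; the helper names (\texttt{marginal}, \texttt{expectation}) are illustrative, not part of the problem.

```python
from fractions import Fraction as F

# Joint distribution from the problem, as a dict {(x, y): probability}.
joint = {
    (0, 0): F(1, 3), (0, 1): F(0),    (0, 2): F(1, 3),
    (1, 0): F(0),    (1, 1): F(1, 9), (1, 2): F(0),
    (2, 0): F(1, 9), (2, 1): F(1, 9), (2, 2): F(0),
}

def marginal(joint, axis):
    """Sum the joint probabilities over the other coordinate."""
    m = {}
    for outcome, p in joint.items():
        m[outcome[axis]] = m.get(outcome[axis], F(0)) + p
    return m

def expectation(dist):
    """E[Z] for a distribution given as {value: probability}."""
    return sum(v * p for v, p in dist.items())

pX = marginal(joint, 0)  # marginal distribution of X
pY = marginal(joint, 1)  # marginal distribution of Y
```

Since every entry is an exact rational, the computed marginals sum to exactly $1$, which is itself a useful consistency check on the table.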
\Question{Marginals}
\begin{Parts}
\Part Can there exist three random variables $X_1,X_2,X_3$, each taking values in the set $\{+1,-1\}$, with the property that for every $i\neq j$, the joint distribution of $X_i$ and $X_j$ is given by
\begin{equation}
\mathbb{P}[X_i = 1, X_j = -1] = \frac{1}{2} \qquad \mathbb{P}[X_i = -1, X_j = 1] = \frac{1}{2} \qquad \mathbb{P}[X_i = X_j] = 0?
\end{equation}
If so, specify the joint distribution of $X_1,X_2,X_3$; if not, prove it.
\Part For which natural numbers $n \ge 3$ can there exist random variables $X_1,X_2,...,X_n$, each taking values in the set $\{+1,-1\}$, with the property that for every $i$ and $j$ satisfying $i - j \equiv 1 \pmod{n}$, the joint distribution of $X_i$ and $X_j$ is given by (1)? For any $n$ that work, specify the joint distribution; for those that do not, prove it.
\end{Parts}
\Question{Random Tournaments}
A \emph{tournament} is a directed graph in which every pair of vertices has exactly one directed edge between them---for example, here are two tournaments on the vertices $\{1,2,3\}$:
\begin{figure}[h]
\centering
\includegraphics[width=7cm]{src/problems/expectation/random-tournament.png}
\end{figure}
In the first tournament above, $(1,2,3)$ is a \emph{Hamiltonian path}, since it follows directed edges and visits every vertex exactly once. However, $(1,2,3,1)$ is not a valid \emph{Hamiltonian cycle}, because the tournament contains the directed edge $1 \to 3$ and not $3 \to 1$. In the second tournament, $(1,2,3,1)$ is a \emph{Hamiltonian cycle}, as are $(2,3,1,2)$ and $(3,1,2,3)$; for this problem we'll say that these are all different Hamiltonian cycles, since their start/end points are different.
Consider the following way of choosing a random tournament $T$ on $n$ vertices: independently for each (unordered) pair of vertices $\{i,j\} \subset \{1,...,n\}$, flip a coin and include the edge $i \to j$ in the graph if the outcome is heads, and the edge $j \to i$ if tails. What is the expected number of Hamiltonian paths in $T$? What is the expected number of Hamiltonian cycles?
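The random-tournament procedure above is easy to prototype. The Python sketch below (with illustrative choices of $n$ and the trial count) draws random tournaments by independent coin flips and counts Hamiltonian paths by brute force over all vertex orderings, which is feasible only for small $n$ but is useful for checking a conjectured answer empirically.

```python
import itertools
import random

def random_tournament(n):
    """Orient each unordered pair {i, j} by a fair coin flip; return the set of directed edges."""
    edges = set()
    for i, j in itertools.combinations(range(n), 2):
        edges.add((i, j) if random.random() < 0.5 else (j, i))
    return edges

def hamiltonian_paths(n, edges):
    """Count orderings of all n vertices whose consecutive edges all point forward."""
    return sum(
        all((p[k], p[k + 1]) in edges for k in range(n - 1))
        for p in itertools.permutations(range(n))
    )

# Monte Carlo estimate of the expected number of Hamiltonian paths (illustrative sizes).
n, trials = 5, 2000
estimate = sum(hamiltonian_paths(n, random_tournament(n)) for _ in range(trials)) / trials
```

Counting Hamiltonian cycles requires only a small modification: additionally check that the closing edge $(p_{n-1}, p_0)$ is present.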
\Question{Triangles in Random Graphs}
Let's say we build a simple, undirected graph $G$ on $n$ vertices by randomly adding $m$ edges, without replacement. In other words, we choose the first edge uniformly from all ${n \choose 2}$ possible edges, then the second one uniformly from among the remaining ${n \choose 2} - 1$ edges, etc. What is the expected number of triangles in $G$? (A triangle is a triple of distinct vertices with all three edges present between them.)
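Sampling $m$ edges without replacement is exactly what \texttt{random.sample} does, so the model can be simulated in a few lines. A sketch (function names are illustrative; vertices are $0,\dots,n-1$ and edges are sorted pairs):

```python
import itertools
import random

def random_graph(n, m):
    """Choose m distinct edges uniformly from all C(n, 2) possible edges."""
    return set(random.sample(list(itertools.combinations(range(n), 2)), m))

def count_triangles(edges):
    """Count vertex triples with all three pairwise edges present."""
    verts = sorted({v for e in edges for v in e})
    return sum(
        {(a, b), (a, c), (b, c)} <= edges
        for a, b, c in itertools.combinations(verts, 3)
    )
```

Averaging \texttt{count\_triangles(random\_graph(n, m))} over many trials gives an empirical check of whatever closed-form expectation you derive.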
\Question{Variance}
A building has $n$ upper floors numbered $1,2,\ldots,n$, plus a ground floor $G$. At the ground floor, $m$ people get on the elevator together, and each person gets off at one of the $n$ upper floors uniformly at random and independently of everyone else. What is the \textit{variance} of the number of floors the elevator \textit{does not} stop at?
\nosolspace{7cm}
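The elevator experiment is quick to simulate, which gives a numerical target for the variance you compute. A sketch with illustrative parameter values ($n$, $m$, and the trial count are arbitrary choices, not from the problem):

```python
import random

def floors_skipped(n, m):
    """One trial: m riders each pick a floor in {1, ..., n} uniformly; count unvisited floors."""
    stops = {random.randrange(1, n + 1) for _ in range(m)}
    return n - len(stops)

def sample_variance(samples):
    """Unbiased sample variance of a list of numbers."""
    mean = sum(samples) / len(samples)
    return sum((x - mean) ** 2 for x in samples) / (len(samples) - 1)

n, m, trials = 10, 4, 20000
data = [floors_skipped(n, m) for _ in range(trials)]
empirical_var = sample_variance(data)
```

With $20{,}000$ trials the empirical variance should land close to the exact answer, so a large discrepancy usually signals an algebra slip.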
\bigskip
\textit{\textbf{Note:} This concludes the first part of the homework. The problems below are optional, will not affect your score, and should be attempted only if you have time to spare.}
\rule{6in}{2px}
\bigskip
\section*{Part 2: Optional Problems}
\bigskip
\Question{Indicators, Probabilities, and Positivity}
\begin{Parts}
\Part Let $X$ be a nonnegative random variable, i.e.\ $X(\omega) \ge 0$ for every $\omega \in \Omega$. Prove that $\E[X] \ge 0$.
\Part Let $n$ be a natural number, $\alpha_1, \dotsc, \alpha_n \in \R$, and let $(\Omega,\Pr)$ be a probability space with some events $A_1, \dotsc, A_n \subset \Omega$. Prove that $\sum_{i=1}^n \sum_{j=1}^n \alpha_i \alpha_j \Pr(A_i \cap A_j) \ge 0$. Note that $\alpha_i$ can be less than $0$.
\Part Again let $X$ be a nonnegative random variable, and let $I$ be the indicator that $X > 0$. Prove that
$$
\Pr[X > 0] \ge \frac{(\E[X])^2}{\E[X^2]}.
$$
It may be useful to prove that $X = XI$, and to consider the random variable $(X + aI)^2$ for various values of $a \in \mathbb{R}$.
\end{Parts}
\nosolspace{5cm}
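Before proving the inequality in part (c), it can help to see it hold on a concrete distribution. The sketch below checks the second-moment bound for one arbitrarily chosen distribution of $X$ (the particular values and probabilities are illustrative only):

```python
from fractions import Fraction as F

# An arbitrary small distribution for a nonnegative X: {value: probability}.
dist = {0: F(1, 2), 1: F(1, 4), 3: F(1, 4)}

EX    = sum(x * p for x, p in dist.items())        # E[X]
EX2   = sum(x * x * p for x, p in dist.items())    # E[X^2]
p_pos = sum(p for x, p in dist.items() if x > 0)   # Pr[X > 0]

# The claimed bound: Pr[X > 0] >= (E[X])^2 / E[X^2].
bound = EX ** 2 / EX2
```

Trying several distributions this way also suggests when the bound is tight, which is a useful guide for the proof.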
\Question{Swaps and Cycles}
We'll say that a permutation $\pi = (\pi(1),...,\pi(n))$ contains a \emph{swap} if there exist $i,j\in\{1,...,n\}$ with $i \neq j$ so that $\pi(i) = j$ and $\pi(j) = i$.
\begin{Parts}
\Part What is the expected number of swaps in a uniformly random permutation of $\{1,...,n\}$?
\Part What about the variance?
\Part We say that $\pi$ is an \emph{involution} if $\pi(\pi(i)) = i$ for every $i = 1,...,n$. What is the probability that $\pi$ is an involution? The answer may depend on $n$...
\Part In the same spirit as above, we'll say that $\pi$ contains an \emph{$s$-cycle} if there exist distinct $i_1,...,i_s \in \{1,...,n\}$ with $\pi(i_1) = i_2,\pi(i_2) = i_3,...,\pi(i_s) = i_1$. Compute the expectation and variance of the number of $s$-cycles.
\end{Parts}
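For small $n$, the expectation and variance of the swap count can be computed exactly by enumerating all $n!$ permutations, which gives concrete numbers to check your formulas against. A sketch (0-indexed permutations; function names are illustrative):

```python
import itertools

def count_swaps(perm):
    """Count pairs i < j with perm[i] == j and perm[j] == i (0-indexed)."""
    n = len(perm)
    return sum(perm[i] == j and perm[j] == i
               for i, j in itertools.combinations(range(n), 2))

def exact_moments(n):
    """Mean and variance of the swap count over all n! permutations (small n only)."""
    counts = [count_swaps(p) for p in itertools.permutations(range(n))]
    mean = sum(counts) / len(counts)
    var = sum((c - mean) ** 2 for c in counts) / len(counts)
    return mean, var
```

The same enumeration idea extends to $s$-cycles in part (d): replace \texttt{count\_swaps} with a function that counts the $s$-cycles of a permutation.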