## MIT 18.6501x Fundamentals of Statistics

• If $$X_1, \dots, X_n$$ are iid from a uniform distribution $$\mathrm{Unif}(0, \theta)$$, then $$Z = \max \left\{ X_1, \dots, X_n \right\}$$ has the cumulative distribution function
$$F_Z(z) = \left(\frac{z}{\theta}\right)^n = \frac{z^n}{\theta^n}, \qquad 0 \le z \le \theta,$$
since $$Z \le z$$ exactly when every $$X_i \le z$$, and each of those independent events has probability $$z/\theta$$.
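A quick Monte Carlo sanity check of this CDF (a sketch using only the standard library; the function name and parameter values are my own):

```python
import random

def empirical_cdf_of_max(theta, n, z, trials=100_000, seed=0):
    """Estimate P(max(X_1, ..., X_n) <= z) for X_i ~ Unif(0, theta) by simulation."""
    rng = random.Random(seed)
    hits = sum(
        max(rng.uniform(0, theta) for _ in range(n)) <= z
        for _ in range(trials)
    )
    return hits / trials

theta, n, z = 2.0, 5, 1.5
analytic = (z / theta) ** n          # (z/theta)^n = (1.5/2)^5 ≈ 0.2373
simulated = empirical_cdf_of_max(theta, n, z)
print(analytic, simulated)
```

With 100,000 trials the empirical frequency should agree with $$(z/\theta)^n$$ to roughly two decimal places.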
• Definition. An estimator $$\hat\theta_n$$ is consistent if it converges in probability to the true value of the parameter $$\theta^\star$$, that is, $$\hat\theta_n \xrightarrow[n \to \infty]{(p)} \theta^\star$$.
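Tying this to the previous bullet: for $$\mathrm{Unif}(0, \theta)$$, the estimator $$\hat\theta_n = \max\{X_1, \dots, X_n\}$$ is consistent, since $$\mathbf{P}(|\hat\theta_n - \theta| > \varepsilon) = (1 - \varepsilon/\theta)^n \to 0$$. A small simulation sketch (standard library only; names are my own):

```python
import random

def max_estimator(theta, n, seed=0):
    """theta_hat = max(X_1, ..., X_n) for X_i ~ Unif(0, theta)."""
    rng = random.Random(seed)
    return max(rng.uniform(0, theta) for _ in range(n))

theta = 3.0
# With a fixed seed, larger samples are supersets of smaller ones,
# so the estimates can only move up toward theta.
estimates = {n: max_estimator(theta, n) for n in (10, 1_000, 100_000)}
for n, est in estimates.items():
    print(n, round(est, 5))
```

The printed estimates approach (but never exceed) the true $$\theta = 3$$ as $$n$$ grows.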
• Definition. A random vector is a Gaussian vector if any linear combination of its components is a (univariate) Gaussian random variable; that is, $$\boldsymbol\alpha^\top\mathbf{X}$$ is Gaussian for any non-zero vector $$\boldsymbol\alpha$$. I'm surprised I was not aware of this definition.
• An alternate formula for the covariance is $$\mathrm{cov}(X, Y) = \mathbf{E}\left[X(Y - \mu_Y)\right]$$ (expanding gives the usual $$\mathbf{E}[XY] - \mu_X \mu_Y$$); combined with linearity of expectation, it shows that the covariance is bilinear, e.g. in the first argument:
$$\mathrm{cov}(aX + bY, Z) = a \, \mathrm{cov}(X, Z) + b \, \mathrm{cov}(Y, Z).$$
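Bilinearity also holds exactly for the sample covariance, so it can be checked numerically up to floating-point rounding (a sketch using only the standard library; the helper and data are my own):

```python
import random

def cov(xs, ys):
    """Sample covariance (population form, dividing by len)."""
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

rng = random.Random(1)
N = 10_000
X = [rng.gauss(0, 1) for _ in range(N)]
Y = [rng.gauss(0, 1) for _ in range(N)]
Z = [x + rng.gauss(0, 1) for x in X]  # Z is correlated with X

a, b = 2.0, -3.0
lhs = cov([a * x + b * y for x, y in zip(X, Y)], Z)
rhs = a * cov(X, Z) + b * cov(Y, Z)
print(lhs, rhs)  # identical up to floating-point error
```

Since the identity is algebraic (not asymptotic), the two sides match to machine precision regardless of the sample size.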
• If $$(X, Y)$$ is independent of $$(U, V)$$ then all the following pairs are independent: $$(X, U)$$, $$(X, V)$$, $$(Y, U)$$, $$(Y, V)$$.
• Question. What can we estimate in statistics? It is not only the parameters, but also other functionals of the true probability distribution, such as the expectation or the covariance.