A fundamental theorem in probability theory is the law of large numbers, which comes in both a weak and a strong form. Let $X$ be a real-valued random variable, and let $X_1, X_2, X_3, \ldots$ be an infinite sequence of independent and identically distributed copies of $X$. Let $\overline{X}_n := \frac{1}{n}(X_1 + \ldots + X_n)$ be the empirical averages of this sequence. Suppose that the first moment $\mathbf{E}|X|$ of $X$ is finite. Then the weak law asserts that $\overline{X}_n$ converges in probability to $\mathbf{E} X$, thus $\lim_{n \to \infty} \mathbf{P}( |\overline{X}_n - \mathbf{E} X| \geq \varepsilon ) = 0$ for every $\varepsilon > 0$.

(If one strengthens the first moment assumption to that of finiteness of the second moment, then we of course have a more precise statement than the (weak) law of large numbers, namely the central limit theorem, but I will not discuss that theorem here. With even more hypotheses on $X$, one similarly has more precise versions of the strong law of large numbers, such as the Chernoff inequality, which I will again not discuss here.)

The weak law is easy to prove, but the strong law (which of course implies the weak law, by Egoroff's theorem) is more subtle, and in fact the proof of this law (assuming just finiteness of the first moment) usually only appears in advanced graduate texts. So I thought I would present a proof here of both laws, which proceeds by the standard techniques of the moment method and truncation. The emphasis in this exposition will be on motivation and methods rather than brevity and strength of results; there do exist proofs of the strong law in the literature that have been compressed down to the size of one page or less, but this is not my goal here.

The moment method seeks to control the tail probabilities of a random variable (i.e. the probability that it fluctuates far from its mean) by means of moments, and in particular the zeroth, first or second moment. The reason that this method is so effective is because the first few moments can often be computed rather precisely. The first moment method usually employs Markov's inequality

$\mathbf{P}( |X| \geq \lambda ) \leq \frac{1}{\lambda} \mathbf{E}|X|$  (1)

(which follows by taking expectations of the pointwise inequality $\lambda I(|X| \geq \lambda) \leq |X|$), whereas the second moment method employs some version of Chebyshev's inequality, such as

$\mathbf{P}( |X| \geq \lambda ) \leq \frac{1}{\lambda^2} \mathbf{E}|X|^2$  (2)

(note that (2) is just (1) applied to the random variable $|X|^2$ and to the threshold $\lambda^2$).

Generally speaking, to compute the first moment one usually employs linearity of expectation, whereas to compute the second moment one also needs to understand covariances (which are particularly simple if one assumes pairwise independence), thanks to identities such as

$\mathbf{Var}(X_1 + \ldots + X_n) = \mathbf{Var}(X_1) + \ldots + \mathbf{Var}(X_n)$  (3)

(valid when the $X_i$ are pairwise independent). Higher moments can in principle give more precise information, but often require stronger assumptions on the objects being studied, such as joint independence.

Here is a basic application of the first moment method:

Borel–Cantelli lemma. Let $E_1, E_2, E_3, \ldots$ be a sequence of events such that $\sum_n \mathbf{P}(E_n)$ is finite. Then almost surely, only finitely many of the events $E_n$ are true.

Proof. Let $I(E_n)$ denote the indicator function of the event $E_n$. Our task is to show that $\sum_n I(E_n)$ is almost surely finite. But by linearity of expectation, the expectation of this random variable is $\sum_n \mathbf{P}(E_n)$, which is finite by hypothesis. By Markov's inequality (1) we conclude that $\mathbf{P}( \sum_n I(E_n) \geq \lambda ) \leq \frac{1}{\lambda} \sum_n \mathbf{P}(E_n)$ for every $\lambda$; letting $\lambda \to \infty$ gives the claim.

Returning to the law of large numbers, the first moment method gives the following tail bound:

Lemma 1. (First moment tail bound) If $\mathbf{E}|X|$ is finite, then $\mathbf{P}( |\overline{X}_n| \geq \lambda ) \leq \frac{\mathbf{E}|X|}{\lambda}$.

Proof. By the triangle inequality, $|\overline{X}_n| \leq \frac{1}{n}(|X_1| + \ldots + |X_n|)$. By linearity of expectation, the expectation of the right-hand side is $\mathbf{E}|X|$. The claim now follows from Markov's inequality.

Lemma 1 is not strong enough by itself to prove the law of large numbers in either weak or strong form – in particular, it does not show any improvement as $n$ gets large – but it will be useful to handle one of the error terms in those proofs. We can get stronger bounds than Lemma 1 – in particular, bounds which improve with $n$ – at the expense of stronger assumptions on $X$.

Lemma 2. (Second moment tail bound) If $\mathbf{E}|X|^2$ is finite, then $\mathbf{P}( |\overline{X}_n - \mathbf{E} X| \geq \lambda ) \leq \frac{\mathbf{E}|X - \mathbf{E} X|^2}{n \lambda^2}$.

Proof. A standard computation, exploiting (3) and the pairwise independence of the $X_i$, shows that the variance of the empirical averages $\overline{X}_n$ is equal to $\frac{1}{n}$ times the variance of the original variable $X$. The claim now follows from Chebyshev's inequality (2).

In the opposite direction, there is the zeroth moment method, more commonly known as the union bound.
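The $\frac{1}{n}$ decay of the variance of the empirical averages, which drives the second moment tail bound, is easy to check numerically. The following Python sketch (the function name and parameters are my own, purely illustrative) estimates $\mathbf{Var}(\overline{X}_n)$ by Monte Carlo for an exponential variable with $\mathbf{Var}(X) = 1$, so the estimates should track $1/n$:

```python
import random

def empirical_average_variance(n, trials=20000, seed=0):
    """Monte Carlo estimate of Var(mean of n i.i.d. Exponential(1) samples)."""
    rng = random.Random(seed)
    # Each trial draws n i.i.d. samples and records their empirical average.
    averages = [sum(rng.expovariate(1.0) for _ in range(n)) / n
                for _ in range(trials)]
    mean = sum(averages) / trials
    # Sample variance of the collection of empirical averages.
    return sum((a - mean) ** 2 for a in averages) / trials

# For Exponential(1), Var(X) = 1, so Var of the n-sample average should be near 1/n.
for n in (1, 10, 100):
    print(n, empirical_average_variance(n), 1 / n)
```

This only illustrates the variance computation behind Lemma 2, not the lemma's proof; pairwise independence enters through the fact that the cross terms in the variance of the sum cancel, leaving the $1/n$ factor.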