This post collects some notes on the Central Limit Theorem and some (un)related topics.

## Central Limit Theorem

As stated on Wikipedia, its basic form is \[ \sqrt{n}(S_n-\mu) \rightarrow N(0,\sigma^2)\] where \(S_n\) is the sample mean of \(n\) i.i.d. variables with mean \(\mu\) and variance \(\sigma^2\): as \(n\) approaches infinity, the random variable \( \sqrt{n}(S_n-\mu) \) converges in distribution to a normal \( N(0,\sigma^2) \).

The interesting part is the scaling factor \( \sqrt{n} \). With any smaller scaling factor, the whole term goes to zero in probability; with any larger one, it blows up; only at \( \sqrt{n} \) does it converge to a distribution with constant variance.
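A minimal simulation sketch (assuming NumPy) of why the exponent matters: scale the centered sample mean of i.i.d. \(U(0,1)\) draws by \(n^{\alpha}\) and track the variance of the result as \(n\) grows.

```python
# Scale the centered sample mean of i.i.d. Uniform(0,1) draws by n**alpha
# and watch the variance of the result: only alpha = 0.5 keeps it constant.
import numpy as np

rng = np.random.default_rng(0)
mu = 0.5  # mean of Uniform(0,1); its variance is 1/12

results = {}
for alpha in (0.25, 0.5, 0.75):
    for n in (64, 1024):
        # 5,000 replications of the scaled, centered sample mean
        means = rng.uniform(size=(5_000, n)).mean(axis=1)
        results[(alpha, n)] = (n**alpha * (means - mu)).var()
        print(f"alpha={alpha}, n={n}: variance={results[(alpha, n)]:.4f}")
# alpha=0.25: the variance shrinks toward zero as n grows
# alpha=0.50: the variance stays near 1/12 (the CLT limit)
# alpha=0.75: the variance blows up with n
```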

It can be proved using the fact that convergence of moment-generating functions (MGFs) implies convergence in distribution, i.e. by showing that the MGF of \( \sqrt{n}(S_n-\mu)/\sigma \) converges to the MGF of \( N(0,1) \). Here’s a very good lecture on that: Lecture 29: Law of Large Numbers and Central Limit Theorem | Statistics 110.
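Concretely, the MGF argument can be sketched as follows (writing \(Z_i = (X_i-\mu)/\sigma\) for the standardized variables, so that \( \sqrt{n}(S_n-\mu)/\sigma = \frac{1}{\sqrt{n}}\sum_i Z_i \)):

\[ M_{\sqrt{n}(S_n-\mu)/\sigma}(t) = \left[ M_Z\!\left(\tfrac{t}{\sqrt{n}}\right) \right]^n \]

Since \(Z_i\) has mean 0 and variance 1, a Taylor expansion gives

\[ M_Z\!\left(\tfrac{t}{\sqrt{n}}\right) = 1 + \frac{t^2}{2n} + o\!\left(\tfrac{1}{n}\right) \;\Longrightarrow\; \left[ M_Z\!\left(\tfrac{t}{\sqrt{n}}\right) \right]^n \rightarrow e^{t^2/2}, \]

and \(e^{t^2/2}\) is exactly the MGF of \(N(0,1)\).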

So for \(n\) i.i.d. variables with zero mean: as \(n\) gets larger, their sample mean goes to zero (the law of large numbers), their sum blows up toward positive or negative infinity, but the term \( \sqrt{n}(S_n-\mu) \) converges in distribution to a normal.

The CLT is useful in many areas, including the Wiener process, which is often used to model stock prices; see this blog and this question.

## Variance of Sum of IID Random Variables

The \( \sqrt{n} \) scaling factor also shows up when summing a finite number of i.i.d. random variables: the variance of the sum is \(n\sigma^2\), so the standard deviation grows as \(\sqrt{n}\,\sigma\) (ref: sum of uncorrelated variables). Examples include the sum of normally distributed random variables and the Irwin–Hall distribution.

Note that this is different from applying a change of variable \(z=2x\), which scales the variance to \((2\sigma)^2 = 4\sigma^2\), whereas the sum of two i.i.d. copies has variance \(2\sigma^2\).
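A small check of this distinction (assuming NumPy): doubling one variable doubles the standard deviation, while summing two independent copies only doubles the variance.

```python
# Compare the variance of 2*x (change of variable) against x1 + x2
# (sum of two i.i.d. copies) for normals with sigma = 3.
import numpy as np

rng = np.random.default_rng(1)
sigma = 3.0
x1 = rng.normal(0.0, sigma, size=100_000)
x2 = rng.normal(0.0, sigma, size=100_000)

var_scaled = (2 * x1).var()   # change of variable z = 2x -> (2*sigma)^2 = 36
var_summed = (x1 + x2).var()  # sum of two i.i.d. copies -> 2*sigma^2 = 18
print(var_scaled, var_summed)
```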

For \(n\) normally distributed random variables \(x_i \sim N(\mu_i, \sigma^2)\) with the same variance, it follows that \(\sum_i (x_i-\mu_i) \sim N(0, n\sigma^2)\), which is equivalent to \[ \sqrt{n}(S_n-\mu) \sim N(0,\sigma^2),\] the Central Limit Theorem — holding exactly here, for every \(n\).
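A quick simulation sketch of this exactness (assuming NumPy, and taking all \(\mu_i = \mu\) for simplicity): for \(n\) i.i.d. \(N(\mu, \sigma^2)\) draws, the centered sum should have variance \(n\sigma^2\) and \(\sqrt{n}(S_n - \mu)\) should have variance \(\sigma^2\), even at a small \(n\) like 5.

```python
# Verify the centered-sum and scaled-mean variances for i.i.d. normals
# with n = 5, mu = 1, sigma = 2 (so n*sigma^2 = 20 and sigma^2 = 4).
import numpy as np

rng = np.random.default_rng(2)
n, mu, sigma = 5, 1.0, 2.0
x = rng.normal(mu, sigma, size=(200_000, n))

var_sum = (x - mu).sum(axis=1).var()                          # ~ n * sigma^2
var_scaled_mean = (np.sqrt(n) * (x.mean(axis=1) - mu)).var()  # ~ sigma^2
print(var_sum, var_scaled_mean)
```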

## Sampling Normal Distributions

Using the facts above, we can approximate samples from any normal distribution given samples from a uniform distribution \(U(0,1)\) (though the Box–Muller transform can do this exactly).

According to the Central Limit Theorem, the sum of \(n\) uniformly distributed variables approximately follows a normal distribution, \(\sum x \sim N(n/2,\sigma^2)\). By the formula for the variance of a sum of uncorrelated variables, \(\sigma^2 = n\,\mathrm{Var}(x)=n/12\); therefore \(\sqrt{n}(\sum x/n-1/2)\) converges in distribution to \( N(0,1/12) \).
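This is the classic sum-of-twelve-uniforms trick, sketched below (assuming NumPy): with \(n = 12\), the sum has mean \(12/2 = 6\) and variance \(12 \cdot \tfrac{1}{12} = 1\), so subtracting 6 gives an approximate standard normal sample with no square roots or trigonometry.

```python
# Approximate N(0, 1) samples as (sum of 12 uniforms) - 6.
import numpy as np

rng = np.random.default_rng(3)
approx_normal = rng.uniform(size=(100_000, 12)).sum(axis=1) - 6.0

print(approx_normal.mean(), approx_normal.var())  # close to 0 and 1
# To target N(m, s^2), use m + s * approx_normal.
```

The approximation is good near the center of the distribution but poor in the tails — samples can never fall outside \([-6, 6]\).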