# Exchangeable random variables


In statistics, an exchangeable sequence of random variables (also sometimes interchangeable)[1] is a sequence such that future observations behave like earlier observations. More formally, this means that given a finite sequence of observations (i.e. of realizations of the random variables), any re-ordering of this sequence is equally likely to occur. This formalizes the notion of "the future being predictable on the basis of past experience." It is closely related to the use of independent and identically distributed random variables in statistical models. Exchangeable sequences of random variables arise in cases of simple random sampling.

## Definition

Formally, an exchangeable sequence of random variables is a finite or infinite sequence $X_1, X_2, X_3, \ldots$ of random variables such that for any finite permutation $\sigma$ of the indices $1, 2, 3, \ldots$ (the permutation acts on only finitely many indices, with the rest fixed), the joint probability distribution of the permuted sequence

$$X_{\sigma(1)}, X_{\sigma(2)}, X_{\sigma(3)}, \ldots$$

is the same as the joint probability distribution of the original sequence.[1][2]

(A sequence $E_1, E_2, E_3, \ldots$ of events is said to be exchangeable precisely if the sequence of its indicator functions is exchangeable.) The distribution function $F_{X_1,\ldots,X_n}(x_1, \ldots, x_n)$ of a finite sequence of exchangeable random variables is symmetric in its arguments $x_1, \ldots, x_n$.

Olav Kallenberg provided an appropriate definition of exchangeability for continuous-time stochastic processes.[3][4]

## History

The concept was introduced by William Ernest Johnson in his 1924 book Logic, Part III: The Logical Foundations of Science.[5] Exchangeability is equivalent to the concept of statistical control introduced by Walter Shewhart also in 1924.[6][7]

## Exchangeability and the i.i.d. statistical model

The property of exchangeability is closely related to the use of independent and identically distributed random variables in statistical models. A sequence of random variables that are independent and identically distributed (i.i.d.), conditional on some underlying distributional form, is exchangeable. This follows directly from the structure of the joint probability distribution generated by the i.i.d. form.

Moreover, the converse can be established for infinite sequences, through an important representation theorem by Bruno de Finetti (later extended by other probability theorists such as Halmos and Savage). The extended versions of the theorem show that in any infinite sequence of exchangeable random variables, the random variables are conditionally independent and identically-distributed, given the underlying distributional form. This theorem is stated briefly below. (De Finetti's original theorem only showed this to be true for random indicator variables, but this was later extended to encompass all sequences of random variables.) Another way of putting this is that de Finetti's theorem characterizes exchangeable sequences as mixtures of i.i.d sequences — while an exchangeable sequence need not itself be unconditionally i.i.d, it can be expressed as a mixture of underlying i.i.d sequences.[1]

This means that infinite sequences of exchangeable random variables can be regarded equivalently as sequences of conditionally i.i.d random variables, based on some underlying distributional form. (Note that this equivalence does not quite hold for finite exchangeability. However, for finite vectors of random variables there is a close approximation to the i.i.d model.) An infinite exchangeable sequence is strictly stationary and so a law of large numbers in the form of Birkhoff-Khinchin theorem applies.[4] This means that the underlying distribution can be given an operational interpretation as the limiting empirical distribution of the sequence of values. The close relationship between exchangeable sequences of random variables and the i.i.d form means that the latter can be justified on the basis of infinite exchangeability. This notion is central to Bruno de Finetti's development of predictive inference and to Bayesian statistics. It can also be shown to be a useful foundational assumption in frequentist statistics and to link the two paradigms.[8]

The Representation Theorem: This statement is based on the presentation in O'Neill (2009) in references below. Given an infinite sequence of random variables $\mathbf{X} = (X_1, X_2, X_3, \ldots)$ we define the limiting empirical distribution function by:

$$F_{\mathbf{X}}(x) = \lim_{n \to \infty} \frac{1}{n} \sum_{i=1}^{n} \mathbb{I}(X_i \le x).$$

(This is the Cesàro limit of the indicator functions. In cases where the Cesàro limit does not exist this function can actually be defined as the Banach limit of the indicator functions, which is an extension of this limit. This latter limit always exists for sums of indicator functions, so that the empirical distribution is always well-defined.) If the sequence is exchangeable, then, conditional on $F_{\mathbf{X}}$, the elements of $\mathbf{X}$ are independent with distribution function $F_{\mathbf{X}}$. This means that for any vector of random variables in the sequence we have joint distribution function given by:

$$\Pr(X_1 \le x_1, X_2 \le x_2, \ldots, X_n \le x_n) = \int \prod_{i=1}^{n} F(x_i) \, dP(F).$$

If the distribution function $F_{\mathbf{X}}$ is indexed by another parameter $\theta$ then (with densities appropriately defined) we have:

$$p_{X_1,\ldots,X_n}(x_1, x_2, \ldots, x_n) = \int \prod_{i=1}^{n} p_\theta(x_i) \, dP(\theta).$$

These equations show the joint distribution or density characterised as a mixture distribution based on the underlying limiting empirical distribution (or a parameter indexing this distribution).
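A concrete instance of this mixture form can be computed exactly. The sketch below assumes (purely for illustration, not from the text above) a uniform mixing distribution over the Bernoulli parameter: draw $\theta \sim \text{Uniform}(0,1)$, then toss i.i.d. Bernoulli($\theta$) coins. Integrating $\theta$ out gives a joint probability that depends on the sequence only through its number of 1s, which is exactly exchangeability:

```python
import math
from fractions import Fraction
from itertools import permutations

def seq_prob(xs):
    """Joint probability of a 0/1 sequence under the mixture: draw
    theta ~ Uniform(0,1), then toss i.i.d. Bernoulli(theta) coins.
    Integrating theta^k (1-theta)^(n-k) over [0,1] (a Beta integral)
    gives k! (n-k)! / (n+1)!, where k = sum(xs) and n = len(xs)."""
    n, k = len(xs), sum(xs)
    return Fraction(math.factorial(k) * math.factorial(n - k),
                    math.factorial(n + 1))

# The probability depends only on the number of 1s, so every
# reordering of the sequence is equally likely: exchangeability.
seq = (1, 1, 0, 1, 0)
print({p: str(seq_prob(p)) for p in set(permutations(seq))})
# every permutation of seq has probability 1/60
```

Note that the resulting sequence is exchangeable but not unconditionally independent: early observations shift beliefs about $\theta$ and hence predictions about later ones.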

Note that not all finite exchangeable sequences are mixtures of i.i.d. sequences. To see this, consider sampling without replacement from a finite set until no elements are left. The resulting sequence is exchangeable, but not a mixture of i.i.d. sequences. Indeed, conditioned on all other elements in the sequence, the remaining element is known.
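The without-replacement case can be checked by direct enumeration. The sketch below uses a three-marble urn (an illustrative choice, not from the text):

```python
from fractions import Fraction
from collections import Counter
from itertools import permutations

# Urn with two red marbles (1) and one green marble (0), drawn without
# replacement until empty.  Enumerate all 3! equally likely draw orders
# and tally the resulting colour sequences.
marbles = [1, 1, 0]
counts = Counter(tuple(marbles[i] for i in order)
                 for order in permutations(range(3)))
probs = {seq: Fraction(c, 6) for seq, c in counts.items()}
print(probs)  # (1,1,0), (1,0,1), (0,1,1) each with probability 1/3

# Exchangeable: every reordering of a colour sequence is equally likely.
# But not a mixture of i.i.d. sequences: given the first two draws,
# the last draw is completely determined.
```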

## Covariance and correlation

Exchangeable sequences have some basic covariance and correlation properties which mean that they are generally positively correlated. For infinite sequences of exchangeable random variables, the covariance between the random variables is equal to the variance of the mean of the underlying distribution function.[8] For finite exchangeable sequences the covariance is also a fixed value which does not depend on the particular random variables in the sequence. There is a weaker lower bound than for infinite exchangeability and it is possible for negative correlation to exist.

Covariance for exchangeable sequences (infinite): If the sequence $X_1, X_2, X_3, \ldots$ is exchangeable then:

$$\operatorname{cov}(X_i, X_j) = \operatorname{var}\bigl(\operatorname{E}(X_i \mid F_{\mathbf{X}})\bigr) \ge 0 \quad \text{for } i \ne j.$$

Covariance for exchangeable sequences (finite): If $X_1, X_2, \ldots, X_n$ is exchangeable with $\sigma^2 = \operatorname{var}(X_i)$ then:

$$\operatorname{cov}(X_i, X_j) \ge -\frac{\sigma^2}{n-1} \quad \text{for } i \ne j.$$

The finite sequence result may be proved as follows. Using the fact that the values are exchangeable we have:

$$0 \le \operatorname{var}(X_1 + \cdots + X_n) = n\sigma^2 + n(n-1)\operatorname{cov}(X_1, X_2).$$

We can then solve the inequality for the covariance yielding the stated lower bound. The non-negativity of the covariance for the infinite sequence can then be obtained as a limiting result from this finite sequence result.

Equality of the lower bound for finite sequences is achieved in a simple urn model: An urn contains 1 red marble and n − 1 green marbles, and these are sampled without replacement until the urn is empty. Let Xi = 1 if the red marble is drawn on the i-th trial and 0 otherwise. A finite sequence that achieves the lower covariance bound cannot be extended to a longer exchangeable sequence.[9]
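In this urn model the covariance and the bound can be computed exactly. A minimal sketch, using $n = 5$ as an illustrative size:

```python
from fractions import Fraction

# One red marble among n, drawn without replacement; X_i indicates red
# on draw i.  Each X_i is Bernoulli(1/n), and X_i = X_j = 1 is
# impossible for i != j, so E[X_i X_j] = 0 and everything is exact.
n = 5
p = Fraction(1, n)        # P(X_i = 1), the same for every draw position
var = p * (1 - p)         # variance of each indicator
cov = -(p * p)            # cov(X_i, X_j) = E[X_i X_j] - p^2 = -p^2
bound = -var / (n - 1)    # the finite-exchangeability lower bound
print(cov, bound)         # both -1/25: the lower bound is attained
```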

## Examples

• Any convex combination or mixture distribution of i.i.d. sequences of random variables is exchangeable. A converse proposition is de Finetti's theorem.[10]
• Suppose an urn contains n red and m blue marbles. Suppose marbles are drawn without replacement until the urn is empty. Let Xi be the indicator random variable of the event that the i-th marble drawn is red. Then {Xi}i=1,...,n is an exchangeable sequence. This sequence cannot be extended to any longer exchangeable sequence.
• Let $(X, Y)$ have a bivariate normal distribution with parameters $\mu = 0$, $\sigma = 1$ and an arbitrary correlation coefficient $\rho \in (-1, 1)$. The random variables $X$ and $Y$ are then exchangeable, but independent only if $\rho = 0$. The density function is $p(x, y) = p(y, x) \propto \exp\!\left(-\frac{x^2 - 2\rho xy + y^2}{2(1-\rho^2)}\right)$.

## Applications

The von Neumann extractor is a randomness extractor that depends on exchangeability: it gives a method to take an exchangeable sequence of 0s and 1s (Bernoulli trials), with some probability $p$ of 0 and $q = 1 - p$ of 1, and produce a (shorter) exchangeable sequence of 0s and 1s with probability 1/2.

Partition the sequence into non-overlapping pairs: if the two elements of the pair are equal (00 or 11), discard it; if the two elements of the pair are unequal (01 or 10), keep the first. This yields a sequence of Bernoulli trials with $p = \tfrac{1}{2}$, since, by exchangeability, the odds of a given pair being 01 or 10 are equal.
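The pairing rule above can be sketched in a few lines. The function name and the biased i.i.d. source are illustrative assumptions, not part of the text:

```python
import random

def von_neumann_extract(bits):
    """Pair up the input; keep the first bit of each unequal pair
    (01 -> 0, 10 -> 1) and discard equal pairs (00, 11).  By
    exchangeability, 01 and 10 are equally likely, so every output
    bit is fair even when the source is biased."""
    return [a for a, b in zip(bits[::2], bits[1::2]) if a != b]

# Usage: a heavily biased i.i.d. (hence exchangeable) bit source.
rng = random.Random(0)
biased = [1 if rng.random() < 0.9 else 0 for _ in range(100_000)]
fair = von_neumann_extract(biased)
print(len(fair), sum(fair) / len(fair))  # much shorter, close to 0.5
```

The cost of the unbiasing is length: with bias $p$, only a fraction $2p(1-p)$ of the pairs survives, so the output is roughly $p(1-p)$ times the input length.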

Exchangeable random variables arise in the study of U statistics, particularly in the Hoeffding decomposition.[11]

## Notes

1. ^ In short, the order of the sequence of random variables does not affect its joint probability distribution.
   • Chow, Yuan Shih and Teicher, Henry, Probability theory. Independence, interchangeability, martingales, Springer Texts in Statistics, 3rd ed., Springer, New York, 1997. xxii+488 pp. ISBN 0-387-98228-0.
2. ^ Aldous, David J., Exchangeability and related topics, in: École d'Été de Probabilités de Saint-Flour XIII — 1983, Lecture Notes in Math. 1117, pp. 1–198, Springer, Berlin, 1985. ISBN 978-3-540-15203-3. doi:10.1007/BFb0099421.
3. ^ Diaconis, Persi (2009). "Book review: Probabilistic symmetries and invariance principles (Olav Kallenberg, Springer, New York, 2005)". Bulletin of the American Mathematical Society (New Series). 46 (4): 691–696. doi:10.1090/S0273-0979-09-01262-2. MR 2525743.
4. ^ Kallenberg, O., Probabilistic symmetries and invariance principles. Springer-Verlag, New York (2005). 510 pp. ISBN 0-387-25115-4.
5. ^ Zabell (1992)
6. ^ Barlow & Irony (1992)
7. ^ Bergman (2009)
8. ^ O'Neill, B. (2009) Exchangeability, Correlation and Bayes' Effect. International Statistical Review, 77(2), pp. 241–250.
9. ^Taylor, Robert Lee; Daffer, Peter Z.; Patterson, Ronald F. (1985). Limit theorems for sums of exchangeable random variables. Rowman and Allanheld. pp. 1–152.
10. ^ Spizzichino, Fabio, Subjective probability models for lifetimes. Monographs on Statistics and Applied Probability, 91. Chapman & Hall/CRC, Boca Raton, FL, 2001. xx+248 pp. ISBN 1-58488-060-0.
11. ^Borovskikh, Yu. V. (1996). "Chapter 10 Dependent variables". U-statistics in Banach spaces. Utrecht: VSP. pp. 365–376. ISBN 90-6764-200-2. MR 1419498.

## Bibliography

• Aldous, David J., Exchangeability and related topics, in: École d'Été de Probabilités de Saint-Flour XIII — 1983, Lecture Notes in Math. 1117, pp. 1–198, Springer, Berlin, 1985. ISBN 978-3-540-15203-3. doi:10.1007/BFb0099421.
• Barlow, R. E. & Irony, T. Z. (1992) "Foundations of statistical quality control" in Ghosh, M. & Pathak, P.K. (eds.) Current Issues in Statistical Inference: Essays in Honor of D. Basu, Hayward, CA: Institute of Mathematical Statistics, 99-112.
• Bergman, B. (2009) "Conceptualistic Pragmatism: A framework for Bayesian analysis?", IIE Transactions, 41, 86–93
• Borovskikh, Yu. V. (1996). U-statistics in Banach spaces. Utrecht: VSP. pp. xii+420. ISBN 90-6764-200-2. MR 1419498.
• Chow, Yuan Shih and Teicher, Henry, Probability theory. Independence, interchangeability, martingales, Springer Texts in Statistics, 3rd ed., Springer, New York, 1997. xxii+488 pp. ISBN 0-387-98228-0
• Diaconis, Persi (2009). "Book review: Probabilistic symmetries and invariance principles (Olav Kallenberg, Springer, New York, 2005)". Bulletin of the American Mathematical Society (New Series). 46 (4): 691–696. doi:10.1090/S0273-0979-09-01262-2. MR 2525743.
• Kallenberg, O., Probabilistic symmetries and invariance principles. Springer-Verlag, New York (2005). 510 pp. ISBN 0-387-25115-4.
• Kingman, J. F. C., Uses of exchangeability, Ann. Probability, 6 (1978), 183–197. MR 494344. JSTOR 2243211.
• O'Neill, B. (2009) Exchangeability, Correlation and Bayes' Effect. International Statistical Review, 77(2), pp. 241–250. doi:10.1111/j.1751-5823.2008.00059.x.
• Taylor, Robert Lee; Daffer, Peter Z.; Patterson, Ronald F. (1985). Limit theorems for sums of exchangeable random variables. Rowman and Allanheld. pp. 1–152.
• Zabell, S. L. (1988) "Symmetry and its discontents", in Skyrms, B. & Harper, W. L. Causation, Chance and Credence, pp. 155–190, Kluwer.
• — (1992). "Predicting the unpredictable". Synthese. 90: 205. doi:10.1007/bf00485351.

A sequence of random variables (we need at least two!) can fail to be iid (independent and identically distributed) in three different ways:

1. The random variables are not independent but they are identically distributed,
2. The random variables are independent but are not identically distributed,
3. The random variables are neither independent nor identically distributed.

For an example of 1., consider the "sampling without replacement" method described in this answer. Or consider an experiment in which we have a bag containing two coins with different probabilities $p_1$ and $p_2$ of turning up Heads. We choose one of the coins at random and toss the chosen coin $n$ times. Let $X_i$ be the indicator function of Heads on the $i$-th toss. Then, the law of total probability tells us that $$P\{X_i = 1\} = P\{X_i = 1\mid ~\text{coin #1}\}\times \frac 12 + P\{X_i = 1\mid ~\text{coin #2}\}\times \frac 12 = \frac{p_1+p_2}{2}.$$ Thus, the $X_i$'s are identically distributed Bernoulli random variables with parameter $\frac{p_1+p_2}{2}$. However, they are not independent random variables since the law of total probability gives us that $$P\{X_i=1, X_j=1\}=\frac{P\{X_i=1, X_j=1 \mid~\text{#1}\} + P\{X_i=1, X_j=1 \mid~\text{#2}\}}{2} = \frac{p_1^2+p_2^2}{2}$$ which does not equal $P\{X_i=1\}P\{X_j=1\} = \left(\frac{p_1+p_2}{2}\right)^2$.
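These probabilities can be reproduced exactly with rational arithmetic. A minimal sketch, using the illustrative values $p_1 = 1/4$, $p_2 = 3/4$:

```python
from fractions import Fraction

# Two coins with head probabilities p1 and p2; one is chosen uniformly
# at random and then tossed repeatedly.  X_i indicates Heads on toss i.
p1, p2 = Fraction(1, 4), Fraction(3, 4)

marginal = (p1 + p2) / 2          # P(X_i = 1) by total probability
joint = (p1**2 + p2**2) / 2       # P(X_i = 1, X_j = 1) for i != j

print(marginal)                   # 1/2: identically distributed
print(joint, marginal**2)         # 5/16 vs 1/4: not independent
```

The gap $5/16 - 1/4 = 1/16$ is exactly $\left(\frac{p_1 - p_2}{2}\right)^2$, the variance of the conditional mean: observing an early Heads makes coin #2 more likely, which makes later Heads more likely too.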

For an example of 2., consider a similar experiment in which the $i$-th (independent) toss is of a coin that has probability $p_i$ different from $p_1, p_2, \ldots, p_{i-1}$. (Let $p_i = e^{-i}$ if you need an explicit example.) Then the $X_i$ are independent Bernoulli random variables (by assumption) but they are not identically distributed since they all have different parameters.
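A quick sketch of this construction, using the explicit parameters $p_i = e^{-i}$ suggested above:

```python
import math
import random

# Independent tosses where the i-th coin has its own head probability
# p_i = e^{-i}: independent by construction, but not identically
# distributed since the parameters are all distinct.
rng = random.Random(1)
ps = [math.exp(-i) for i in range(1, 6)]
X = [1 if rng.random() < p else 0 for p in ps]
print(ps)  # strictly decreasing: no two X_i share a distribution
print(X)
```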

I will leave the exercise of coming up with an example of 3. to you.