Econometrica: Jul, 1999, Volume 67, Issue 4
Bayesian Representation of Stochastic Processes under Learning: de Finetti Revisited
https://doi.org/10.1111/1468-0262.00055
pp. 875–893
Matthew O. Jackson, Ehud Kalai, Rann Smorodinsky
A probability distribution μ governing the evolution of a stochastic process has infinitely many Bayesian representations of the form μ = ∫ μ_θ dλ(θ). Among these, a natural representation is one whose components (the μ_θ's) are ‘learnable’ (one can approximate μ_θ by conditioning μ on observation of the process) and ‘sufficient for prediction’ (μ_θ's predictions are not aided by conditioning on observation of the process). We show the existence and uniqueness of such a representation under a suitable asymptotic mixing condition on the process. This representation can be obtained by conditioning on the tail-field of the process, and any learnable representation that is sufficient for prediction is asymptotically like the tail-field representation. This result is related to the celebrated de Finetti theorem, but with exchangeability weakened to an asymptotic mixing condition, and with his conclusion of a decomposition into i.i.d. component distributions weakened to components that are learnable and sufficient for prediction.
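For concreteness, the classical special case that this result generalizes can be written out. In de Finetti's theorem the process is exchangeable and the components μ_θ are i.i.d.; for a {0,1}-valued exchangeable process the representation μ = ∫ μ_θ dλ(θ) specializes to the familiar mixture of Bernoulli distributions (the notation below is standard, not drawn from the paper itself):

\[
  \Pr(X_1 = x_1, \dots, X_n = x_n)
  \;=\; \int_0^1 \theta^{\sum_{i=1}^{n} x_i}\,(1-\theta)^{\,n - \sum_{i=1}^{n} x_i}\, d\lambda(\theta),
\]

where under each component μ_θ the coordinates are i.i.d. Bernoulli(θ). In this special case the components are learnable (the posterior over θ concentrates as observations accumulate) and sufficient for prediction (under μ_θ, past observations carry no further information about future ones), which is the property pair the paper takes as primitive in place of the i.i.d. structure.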