Dec 20, 2020

Bayesian priors and posteriors in 3D


Once a year you travel on business to New York, always looking forward to lunch with your “year abroad” friend who moved there. Your tradition is to “flip for lunch” using the same tarnished coin you found on a street in Florence that wonderful day the two of you first met.

The opening image above visualizes the probability distributions over the fairness of that old coin that could arise after four years of lunches, starting from a prior that expresses significant doubt about its fairness.

This post visualizes those distributions in three dimensions and asks what insights might have been gained.
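The stacked curves in such a figure can be sketched numerically: start from a skeptical prior and apply a conjugate Beta update after each year's flips, keeping one density curve per year. The Beta(0.5, 0.5) prior and the yearly flip counts below are invented for illustration, not the data behind the actual image.

```python
# Sketch of the year-by-year posterior "slices" that stack into a 3D plot.
# Prior and flip data are assumed for illustration only.
import math

def beta_pdf(x, a, b):
    """Density of Beta(a, b) at x, via the gamma function."""
    coef = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return coef * x ** (a - 1) * (1 - x) ** (b - 1)

a, b = 0.5, 0.5                                   # prior expressing doubt about fairness
yearly_flips = [(3, 5), (4, 6), (2, 4), (5, 7)]   # (heads, flips) per year -- invented

xs = [i / 100 for i in range(1, 100)]
curves = []
for heads, flips in yearly_flips:
    a, b = a + heads, b + flips - heads           # conjugate Beta update
    curves.append([beta_pdf(x, a, b) for x in xs])  # one "slice" of the 3D figure

print(a, b)   # 14.5 8.5 -- the final posterior's parameters
```

Each entry of `curves` is one posterior density evaluated on a grid of \(\theta\) values; plotting them at successive depths (e.g. with matplotlib's `plot_surface` or repeated `plot` calls at different y-offsets) gives the stacked-ribbon picture.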


In the first Bayesian statistics Coursera course, From Concept to Data Analysis, instructor Dr. Herbert Lee of the UCSC School of Engineering writes his notes seemingly backwards (!) on a picture window, frequently looking at us through his handiwork to make sure we are getting his point. The effect is attention-grabbing, and his windowboards are screenshot-worthy.

In the shot below Dr. Lee summarizes five important aspects of the relationship between a Binomial likelihood (a.k.a. a “Binomial process,” as when flipping a coin and counting the number of heads) and a Beta prior:

  1. When the prior distribution for the probability \(\theta\) of the Binomial process is a Beta, the posterior for \(\theta\) is also a Beta (the conjugate property)
  2. The mean of a Beta is \(\frac \alpha {\alpha + \beta}\)
  3. The parameters of the posterior are \(\alpha + \#(successes)\) and \(\beta + n - \#(successes)\)
  4. The effective sample size of the prior is \(\alpha + \beta\)
  5. The posterior mean is the weighted average of the prior mean and the data mean
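These five properties fit in a few lines of code. The sketch below checks properties 2, 3, and 5 on an example of my own choosing: a Beta(2, 2) prior (effective sample size 4) updated with 7 heads in 10 flips.

```python
# Numeric check of the Beta-Binomial update rules listed above.
# The Beta(2, 2) prior and the 7-heads-in-10-flips data are illustrative choices.

def beta_binomial_update(a, b, n, heads):
    """Property 3: posterior parameters after `heads` successes in n flips."""
    return a + heads, b + n - heads

def beta_mean(a, b):
    """Property 2: mean of a Beta(a, b) distribution."""
    return a / (a + b)

a0, b0 = 2.0, 2.0          # prior Beta(2, 2); effective sample size a0 + b0 = 4
n, heads = 10, 7           # data: 7 heads in 10 flips

a1, b1 = beta_binomial_update(a0, b0, n, heads)   # posterior Beta(9, 5)

# Property 5: posterior mean = weighted average of prior mean and data mean,
# with weights proportional to the effective sample size and to n.
w_prior = (a0 + b0) / (a0 + b0 + n)
weighted = w_prior * beta_mean(a0, b0) + (1 - w_prior) * (heads / n)

print(a1, b1)                 # 9.0 5.0
print(beta_mean(a1, b1))      # ≈ 0.6429, identical to the weighted average
```

The prior gets weight \(\frac{4}{14}\) and the data \(\frac{10}{14}\), so the posterior mean \(\frac{4}{14}\cdot 0.5 + \frac{10}{14}\cdot 0.7 = \frac{9}{14}\) matches \(\frac{\alpha}{\alpha + \beta}\) for the posterior Beta(9, 5) exactly.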