NP Bayes Reading Group: 1st meeting

We’ve started a reading group to come to grips with some of the recent developments in non-parametric (NP) Bayesian modeling, in particular, hierarchical Bayesian models for discrete data.  The defining characteristic of NP models is that the number of parameters scales with the amount of data (leading to an infinite number of parameters in the limit of infinite data).  Although these models have sparked a mini-revolution in cognitive psychology (e.g., Tenenbaum, Griffiths & Kemp 2006), they do not appear to have found much application to statistical analysis of neural data (with the exception of spike sorting — see, e.g., Wood & Black 2008).

Our first assignment is to go through the slides from Michael Jordan’s 2005 NIPS tutorial.  Last week we began, and made it through slide #23, covering the basic ideas of non-parametric models, exchangeability, De Finetti’s theorem, conjugate priors, Gibbs sampling, graphical models, and the Dirichlet & Beta distributions.
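As a quick refresher on conjugate priors: with a Beta prior on a Bernoulli success probability, the posterior is again a Beta whose parameters are updated by simple counting. A minimal sketch (the function name is ours, just for illustration):

```python
# Beta-Bernoulli conjugacy: a Beta(a, b) prior on a coin's heads-probability p,
# combined with k heads in n flips, gives the posterior Beta(a + k, b + n - k).

def beta_bernoulli_posterior(a, b, k, n):
    """Return posterior Beta parameters after observing k heads in n trials."""
    return a + k, b + (n - k)

# Example: uniform Beta(1, 1) prior, 7 heads in 10 flips.
post_a, post_b = beta_bernoulli_posterior(1, 1, 7, 10)
post_mean = post_a / (post_a + post_b)   # posterior mean = a' / (a' + b')
print(post_a, post_b, round(post_mean, 3))  # → 8 4 0.667
```

This closed-form update is what makes Gibbs sampling over Dirichlet/Beta models so tractable: each conditional is available analytically.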

A few issues set aside for further exploration:

  • proof of De Finetti’s theorem (Evan)
  • relationship between CRP and stick-breaking (JP)
  • slide 13: “A short calculation shows…” (Joe)
  • proof that # of occupied tables is O(log n).  (Memming)
  • aggregation property (Ken)
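On the O(log n) question above: the growth of the number of occupied tables is easy to see empirically. A hedged sketch of a Chinese restaurant process simulation (our own toy code, not from the tutorial) — customer i+1 starts a new table with probability α/(i+α), else joins an existing table in proportion to its size:

```python
import math
import random

def crp_tables(n, alpha, seed=0):
    """Simulate a CRP with concentration alpha; return the number of
    occupied tables after seating n customers."""
    rng = random.Random(seed)
    counts = []                          # customers seated at each table
    for i in range(n):                   # customer i arrives; i already seated
        r = rng.random() * (i + alpha)
        if r < alpha:                    # new table w.p. alpha / (i + alpha)
            counts.append(1)
        else:                            # existing table j w.p. counts[j] / (i + alpha)
            r -= alpha
            for j, c in enumerate(counts):
                if r < c:
                    counts[j] += 1
                    break
                r -= c
    return len(counts)

# Table count grows roughly like alpha * log(n):
for n in (100, 1000, 10000):
    print(n, crp_tables(n, alpha=2.0), round(2.0 * math.log(n), 1))
```

The expected number of tables is exactly Σ α/(α+i) for i = 0…n−1, which is Θ(α log n) — the simulation just makes the slow growth visible.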

Next meeting: today 3pm (Jun 17, 2011).  Memming to lead…
