Homework 8.1: Hierarchical models are hiding in plain sight (20 pts)


Say we have a set of measurements, \(\mathbf{x} = \{x_1, x_2,\ldots\}\). Each measurement has associated with it some measurement error such that \(x_i \sim \text{Norm}(\mu_i, s)\). Furthermore, there is a natural variability from measurement to measurement such that \(\mu_i \sim \text{Norm}(\mu, \sigma)\).
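To build intuition for this generative model, here is a minimal simulation sketch. The parameter values (\(\mu = 10\), \(\sigma = 2\), \(s = 0.5\), five measurements) are arbitrary choices for illustration; the problem itself leaves them free.

```python
import numpy as np

rng = np.random.default_rng(3252)

# Hypothetical parameter values, chosen only for illustration
mu, sigma, s = 10.0, 2.0, 0.5
n = 5

# Natural measurement-to-measurement variability: mu_i ~ Norm(mu, sigma)
mu_i = rng.normal(mu, sigma, size=n)

# Measurement error on top of each mu_i: x_i ~ Norm(mu_i, s)
x = rng.normal(mu_i, s)
```

Note that each \(x_i\) is generated in two stages, which is exactly what makes the model hierarchical.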

a) Write a mathematical expression for the joint generative probability density, \(\pi(\mathbf{x}, \boldsymbol{\mu}, \mu, s, \sigma)\), where \(\mathbf{x} = \{x_1, x_2,\ldots\}\) and \(\boldsymbol{\mu} = \{\mu_1, \mu_2, \ldots\}\). You may assume that the prior may be separated such that \(g(\mu, \sigma, s) = g(\mu) g(\sigma) g(s)\).

b) Calculate \(\pi(\mathbf{x}, \mu, s, \sigma)\). Hint: It may help to use the relation

\begin{align} \frac{(x_i-\mu_i)^2}{2s^2} + \frac{(\mu_i-\mu)^2}{2\sigma^2} = \frac{(x_i-\mu)^2}{2(s^2 + \sigma^2)} + \frac{\left(\mu_i - b\right)^2}{2a^2}, \end{align}

where

\begin{align} a^2 = \frac{s^2\sigma^2}{s^2+\sigma^2}\;\;\text{and}\;\; b = \frac{x_i/s^2 + \mu/\sigma^2}{s^{-2} + \sigma^{-2}}, \end{align}

which may be found by completing the square. Another hint: It is useful to know about Gaussian integrals, namely that

\begin{align} \int_{-\infty}^\infty \mathrm{d}x\,\mathrm{e}^{-(x-b)^2 / 2 a^2} = \sqrt{2\pi a^2}. \end{align}
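You can numerically spot-check both hints before using them in the derivation. This sketch verifies the completing-the-square identity (with \(b = (x_i/s^2 + \mu/\sigma^2)/(s^{-2} + \sigma^{-2})\)) and the Gaussian integral for arbitrary, hypothetical parameter values.

```python
import numpy as np

# Arbitrary test values (hypothetical, just for checking the algebra)
x_i, mu_i, mu, s, sigma = 1.3, 0.7, -0.2, 0.5, 2.0

# Left-hand side of the completing-the-square identity
lhs = (x_i - mu_i)**2 / (2 * s**2) + (mu_i - mu)**2 / (2 * sigma**2)

# Right-hand side, using the definitions of a^2 and b
a2 = s**2 * sigma**2 / (s**2 + sigma**2)
b = (x_i / s**2 + mu / sigma**2) / (s**-2 + sigma**-2)
rhs = (x_i - mu)**2 / (2 * (s**2 + sigma**2)) + (mu_i - b)**2 / (2 * a2)

print(np.isclose(lhs, rhs))  # True

# Gaussian integral: sum over a fine grid approximates the integral;
# the tails beyond b ± 10a are negligible.
xx = np.linspace(b - 10 * np.sqrt(a2), b + 10 * np.sqrt(a2), 100_001)
dx = xx[1] - xx[0]
val = np.sum(np.exp(-(xx - b)**2 / (2 * a2))) * dx

print(np.isclose(val, np.sqrt(2 * np.pi * a2)))  # True
```

A numerical check like this is a useful habit whenever you complete the square in a derivation; it catches sign and denominator slips immediately.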

c) Write an expression for \(\pi(\mathbf{x}, \mu, s, \sigma)\) in the limit where the natural variability is much greater than the measurement error (\(s \ll \sigma\)).

d) Finally, in this limit, write the expression for \(\pi(\mathbf{x}, \mu, \sigma)\) and show that for this hierarchical model, the limit of \(s\ll \sigma\) is equivalent to having a likelihood of \(x_i \sim \text{Norm}(\mu, \sigma)\;\forall i\).
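As a sanity check on this limit (not a substitute for the derivation), you can simulate the hierarchical model with \(s \ll \sigma\) and confirm that the marginal spread of the \(x_i\) is \(\sqrt{s^2 + \sigma^2} \approx \sigma\), as the non-hierarchical likelihood \(x_i \sim \text{Norm}(\mu, \sigma)\) would predict. The parameter values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3252)

# Hypothetical values with measurement error much smaller than natural variability
mu, sigma = 10.0, 2.0
s = 0.01 * sigma  # s << sigma

# Draw many samples from the hierarchical generative model
mu_i = rng.normal(mu, sigma, size=1_000_000)
x = rng.normal(mu_i, s)

# Marginal standard deviation of x_i is sqrt(s^2 + sigma^2) ~= sigma
print(np.std(x))
print(np.sqrt(s**2 + sigma**2))
```

Both printed values should be very close to \(\sigma = 2\), consistent with the hierarchical model collapsing to \(x_i \sim \text{Norm}(\mu, \sigma)\) in this limit.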

This problem shows how a hierarchical model can reduce to a non-hierarchical one. It also helps you see what is being neglected when you choose not to use a hierarchical model, since many models actually are hierarchical unless you make approximations like the one derived here.