There may be simpler/more elegant ways to show what I show below, but I find that this question comes up often and it’s good to have a toy example handy.

Consider the hierarchical model

$$Z \mid Y \sim \mathcal{N}(Y, \sigma^2 I_n), \qquad Y \mid \beta \sim \mathcal{N}(X\beta, \tau^2 I_n), \qquad \beta \sim \mathcal{N}(0, \lambda^2 I_p),$$

where for simplicity, we assume that

$$\sigma^2 = \tau^2 = \lambda^2 = 1.$$

How does increasing the number of covariates *p* affect our posterior judgements on $Y$ and $\beta$? We will analyse the simple case when the columns of $X$ are all orthogonal. Then the answer to this question is:

**Let $X$ be an $n \times p$ matrix, where $n = \dim(Y)$ and $p = \dim(\beta)$, whose columns are orthogonal. Then**

- **$\mathrm{Var}(Y \mid Z)$ strictly increases with $p$, and**
- **$\mathrm{Var}(\beta_i \mid Z)$, $i = 1, \dots, p$, is independent of $p$.**
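Before walking through the algebra, both claims can be checked numerically. The sketch below assumes the unit-variance model used throughout this post ($Z = Y + \varepsilon$, $Y = X\beta + \eta$, with $\beta \sim \mathcal{N}(0, I_p)$ and $\varepsilon, \eta \sim \mathcal{N}(0, I_n)$) and conditions the joint Gaussian on $Z$ directly:

```python
import numpy as np

# Numerical sketch of both claims, assuming the unit-variance model
# Z = Y + eps, Y = X beta + eta, beta ~ N(0, I_p), eps, eta ~ N(0, I_n).
n = 8
Q, _ = np.linalg.qr(np.random.default_rng(0).standard_normal((n, n)))

traces, beta1_vars = [], []
for p in range(1, 5):
    X = Q[:, :p]                       # n x p with orthonormal columns
    S = X @ X.T
    cov_Z = S + 2 * np.eye(n)          # Var(Z) = XX^T + 2I
    # Var(Y|Z) = Var(Y) - Cov(Y,Z) Var(Z)^{-1} Cov(Z,Y); all three blocks = S + I
    var_Y = (S + np.eye(n)) - (S + np.eye(n)) @ np.linalg.solve(cov_Z, S + np.eye(n))
    # Var(beta|Z) = I_p - Cov(beta,Z) Var(Z)^{-1} Cov(Z,beta); Cov(beta,Z) = X^T
    var_b = np.eye(p) - X.T @ np.linalg.solve(cov_Z, X)
    traces.append(np.trace(var_Y))
    beta1_vars.append(var_b[0, 0])

assert all(b > a for a, b in zip(traces, traces[1:]))  # claim 1: grows with p
assert np.allclose(beta1_vars, beta1_vars[0])          # claim 2: constant in p
```

The total posterior variance of $Y$ (the trace) grows with every orthonormal column added, while $\mathrm{Var}(\beta_1 \mid Z)$ never moves.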

To illustrate (not prove) why these two claims are true we first re-write the model as

$$Z = Y + \varepsilon, \qquad Y = X\beta + \eta,$$

where $\varepsilon \sim \mathcal{N}(0, I_n)$, $\eta \sim \mathcal{N}(0, I_n)$ and $\beta \sim \mathcal{N}(0, I_p)$. In typical applications $Z$ would constitute the observations, $Y$ the hidden state, $X$ the regressors and $\beta$ the weights attached to each component in $X\beta$. For our simple model

$$\begin{pmatrix} Z \\ Y \end{pmatrix} \sim \mathcal{N}\left(0,\; \begin{pmatrix} XX^\top + 2I & XX^\top + I \\ XX^\top + I & XX^\top + I \end{pmatrix}\right).$$
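Because the re-written model is linear in independent standard-normal blocks, the joint covariance of $(Z, Y)$ can be assembled mechanically as $LL^\top$. A small sketch, under the unit-variance assumptions above:

```python
import numpy as np

# (Z; Y) is a linear image of the independent standard-normal blocks
# (beta; eta; eps): Z = X beta + eta + eps and Y = X beta + eta,
# so Cov((Z; Y)) = L L^T for the block matrix L below.
rng = np.random.default_rng(1)
n, p = 5, 2
X = rng.standard_normal((n, p))

L = np.block([[X, np.eye(n), np.eye(n)],
              [X, np.eye(n), np.zeros((n, n))]])
joint = L @ L.T
S = X @ X.T

assert np.allclose(joint[:n, :n], S + 2 * np.eye(n))  # Var(Z)   = XX^T + 2I
assert np.allclose(joint[:n, n:], S + np.eye(n))      # Cov(Z,Y) = XX^T + I
assert np.allclose(joint[n:, n:], S + np.eye(n))      # Var(Y)   = XX^T + I
```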

**Var(Y | Z)**

Here we are interested in the effect of the number of columns in $X$ on the posterior uncertainty of $Y$, that is, $\mathrm{Var}(Y \mid Z)$. Using Schur complements on the joint covariance matrix above we obtain

$$\mathrm{Var}(Y \mid Z) = (XX^\top + I) - (XX^\top + I)(XX^\top + 2I)^{-1}(XX^\top + I) = I - (XX^\top + 2I)^{-1}.$$
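The simplification of the Schur complement is easy to verify numerically; a sketch assuming the unit-variance model, in which $\mathrm{Var}(Y) = \mathrm{Cov}(Y, Z) = XX^\top + I$ and $\mathrm{Var}(Z) = XX^\top + 2I$:

```python
import numpy as np

# Check that the Schur complement (S+I) - (S+I)(S+2I)^{-1}(S+I),
# with S = XX^T, collapses to I - (S+2I)^{-1}.
rng = np.random.default_rng(2)
n, p = 6, 3
X = rng.standard_normal((n, p))
S, I = X @ X.T, np.eye(n)

schur = (S + I) - (S + I) @ np.linalg.solve(S + 2 * I, S + I)
short = I - np.linalg.inv(S + 2 * I)
assert np.allclose(schur, short)
```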

Now, if $X = x_1$, where $x_1$ is a vector (a single covariate), then

$$V_1 := \mathrm{Var}(Y \mid Z) = I - (x_1 x_1^\top + 2I)^{-1} = \tfrac{1}{2} I + \frac{x_1 x_1^\top}{2\,(2 + \|x_1\|^2)},$$

while if $X = [x_1 \; x_2]$ (two covariates) then it can be shown that

$$V_2 := \mathrm{Var}(Y \mid Z) = I - (x_1 x_1^\top + x_2 x_2^\top + 2I)^{-1} = \tfrac{1}{2} I + \tfrac{1}{2}\, X \begin{pmatrix} \|x_1\|^2 + 2 & \langle x_1, x_2 \rangle \\ \langle x_1, x_2 \rangle & \|x_2\|^2 + 2 \end{pmatrix}^{-1} X^\top,$$

where $\langle x_1, x_2 \rangle$ denotes the inner product between $x_1$ and $x_2$.
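These one- and two-covariate closed forms are instances of the Sherman–Morrison and Woodbury identities; a quick check under the unit-variance model:

```python
import numpy as np

# Verify the one- and two-covariate closed forms for
# Var(Y|Z) = I - (XX^T + 2I)^{-1}.
rng = np.random.default_rng(3)
n = 6
x1, x2 = rng.standard_normal(n), rng.standard_normal(n)
I = np.eye(n)

# One covariate (Sherman-Morrison): I/2 + x1 x1^T / (2 (2 + ||x1||^2))
V1 = I - np.linalg.inv(np.outer(x1, x1) + 2 * I)
assert np.allclose(V1, I / 2 + np.outer(x1, x1) / (2 * (2 + x1 @ x1)))

# Two covariates (Woodbury): the 2x2 Gram matrix X^T X + 2I carries <x1, x2>
X = np.column_stack([x1, x2])
V2 = I - np.linalg.inv(X @ X.T + 2 * I)
assert np.allclose(V2, I / 2 + X @ np.linalg.solve(X.T @ X + 2 * np.eye(2), X.T) / 2)
```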

Note that both $V_1$ and $V_2$ can be written in the form

$$V_p = I - (XX^\top + 2I)^{-1}, \qquad XX^\top = \sum_{i=1}^{p} x_i x_i^\top.$$

Therefore if $X_2 X_2^\top \succeq X_1 X_1^\top$ in the positive semi-definite ordering, then $V_2 \succeq V_1$, and vice-versa. Now, when $x_1$ and $x_2$ are orthogonal,

$$V_2 = \tfrac{1}{2} I + \frac{x_1 x_1^\top}{2\,(2 + \|x_1\|^2)} + \frac{x_2 x_2^\top}{2\,(2 + \|x_2\|^2)} = V_1 + \frac{x_2 x_2^\top}{2\,(2 + \|x_2\|^2)} \succeq V_1.$$

Therefore the posterior variance of *Y* increases with the number of orthogonal regressors in *X*.
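The step from one orthogonal regressor to two can be checked directly: the gap $V_2 - V_1$ should be the rank-one positive semi-definite matrix $x_2 x_2^\top / (2(2 + \|x_2\|^2))$. A sketch, again under the unit-variance model:

```python
import numpy as np

# With x2 orthogonal to x1, V2 - V1 equals the rank-one PSD matrix
# x2 x2^T / (2 (2 + ||x2||^2)), so V2 >= V1 in the Loewner order.
rng = np.random.default_rng(4)
n = 6
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)
x2 -= (x1 @ x2) / (x1 @ x1) * x1           # project out x1: now <x1, x2> = 0
I = np.eye(n)

V1 = I - np.linalg.inv(np.outer(x1, x1) + 2 * I)
V2 = I - np.linalg.inv(np.outer(x1, x1) + np.outer(x2, x2) + 2 * I)
diff = V2 - V1

assert np.allclose(diff, np.outer(x2, x2) / (2 * (2 + x2 @ x2)))
assert np.linalg.eigvalsh(diff).min() > -1e-12   # positive semi-definite
assert np.trace(V2) > np.trace(V1)               # strictly more variance
```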

**Var($\beta$ | Z)**

Here we are interested in the effect of the number of columns in $X$ on the posterior uncertainty of $\beta$, that is, $\mathrm{Var}(\beta \mid Z)$. Using Schur complements on the joint distribution of $(Z, \beta)$, for which $\mathrm{Cov}(\beta, Z) = X^\top$ and $\mathrm{Var}(Z) = XX^\top + 2I$, we obtain

$$\mathrm{Var}(\beta \mid Z) = I_p - X^\top (XX^\top + 2I)^{-1} X = 2\,(X^\top X + 2 I_p)^{-1},$$

and therefore when $X = x_1$ (one regressor),

$$\mathrm{Var}(\beta_1 \mid Z) = \frac{2}{\|x_1\|^2 + 2}.$$
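The scalar result is a special case of the general identity $I_p - X^\top(XX^\top + 2I_n)^{-1}X = 2\,(X^\top X + 2I_p)^{-1}$, which follows from the push-through identity $X^\top(XX^\top + 2I)^{-1} = (X^\top X + 2I)^{-1}X^\top$. A quick check under the unit-variance model:

```python
import numpy as np

# Push-through check: I_p - X^T (XX^T + 2 I_n)^{-1} X == 2 (X^T X + 2 I_p)^{-1}.
rng = np.random.default_rng(5)
n, p = 7, 3
X = rng.standard_normal((n, p))

lhs = np.eye(p) - X.T @ np.linalg.solve(X @ X.T + 2 * np.eye(n), X)
rhs = 2 * np.linalg.inv(X.T @ X + 2 * np.eye(p))
assert np.allclose(lhs, rhs)

# p = 1 special case: Var(beta_1 | Z) = 2 / (||x_1||^2 + 2)
x1 = X[:, :1]
v = 1 - (x1.T @ np.linalg.solve(x1 @ x1.T + 2 * np.eye(n), x1)).item()
assert np.isclose(v, 2 / ((x1.T @ x1).item() + 2))
```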

When $X = [x_1 \; x_2]$ we obtain

$$\mathrm{Var}(\beta \mid Z) = 2 \begin{pmatrix} \|x_1\|^2 + 2 & \langle x_1, x_2 \rangle \\ \langle x_1, x_2 \rangle & \|x_2\|^2 + 2 \end{pmatrix}^{-1}.$$

If $x_2$ is orthogonal to $x_1$ then we obtain

$$\mathrm{Var}(\beta \mid Z) = \begin{pmatrix} \dfrac{2}{\|x_1\|^2 + 2} & 0 \\[4pt] 0 & \dfrac{2}{\|x_2\|^2 + 2} \end{pmatrix}.$$

Note that the top-left entry is exactly $\mathrm{Var}(\beta_1 \mid Z)$ from the one-regressor case: adding an orthogonal regressor leaves the posterior variance of each $\beta_i$ unchanged, which is the second claim above.
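A final sketch (unit-variance model) confirming that with mutually orthogonal columns of different norms, each coordinate's posterior variance stays fixed at $2/(\|x_i\|^2 + 2)$ no matter how many further orthogonal regressors are appended:

```python
import numpy as np

# With mutually orthogonal columns, Var(beta | Z) is diagonal with entries
# 2 / (||x_i||^2 + 2), independent of how many other columns are present.
n = 10
Q, _ = np.linalg.qr(np.random.default_rng(6).standard_normal((n, n)))
cols = Q[:, :5] * np.arange(1.0, 6.0)      # orthogonal columns, ||x_i|| = i

for p in range(1, 6):
    X = cols[:, :p]
    var_b = np.eye(p) - X.T @ np.linalg.solve(X @ X.T + 2 * np.eye(n), X)
    expected = np.diag(2 / (np.arange(1.0, p + 1.0) ** 2 + 2))
    assert np.allclose(var_b, expected)    # same diagonal entries for every p
```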