> So it seems to me that the exercise just demonstrates that Bayesianism-done-slyly outperformed frequentism-done-mindlessly.
This example really is Bayesianism-done-straightforwardly. The point is that you don’t need to be sly to get reasonable results.
> For the Bayesian CI, Jaynes takes a constant prior, then jumps straight to the posterior being N exp(N(θ - x1))
A constant prior means the posterior uses only the likelihoods. The jump straight to the posterior is a completely mechanical calculation: just a product and a normalization.
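To show how mechanical, here is a minimal numerical sketch in Python. The three observations (12, 14, 16) are Jaynes's truncated-exponential example as I remember it, and the grid bounds are an arbitrary choice of mine:

```python
import numpy as np

# Observations from Jaynes's example (as I recall them).
x = np.array([12.0, 14.0, 16.0])
N = len(x)

# Grid for theta; the bounds are an arbitrary choice of mine.
theta = np.linspace(8.0, 12.0, 100_001)
d = theta[1] - theta[0]

# Individual likelihoods: p(x | theta) = exp(theta - x) for x >= theta, else 0.
lik = np.where(x[:, None] >= theta[None, :],
               np.exp(theta[None, :] - x[:, None]),
               0.0)

# Constant prior: the posterior is just the product of the likelihoods,
# normalized on the grid.
post = lik.prod(axis=0)
post /= post.sum() * d

# Jaynes's closed form: N * exp(N * (theta - x1)) for theta <= x1,
# where x1 is the smallest observation.
x1 = x.min()
closed = np.where(theta <= x1, N * np.exp(N * (theta - x1)), 0.0)
print(np.abs(post - closed).max())  # ~0, up to grid/truncation error
```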
> Then I’d construct the Bayesian CI by mechanically defining the likelihood as the product of the individual observations’ likelihoods.
Each individual likelihood is zero for (x < θ), so the product is zero whenever the smallest observation satisfies (x1 < θ). You will get out the same PDF as Jaynes. CIs can be constructed in many ways from a PDF, but constructing the shortest one will give you the same interval as Jaynes.
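A sketch of that last step, again hedged on my recollection of Jaynes's numbers: the posterior increases monotonically up to x1, so the shortest interval of given mass is the one ending at x1, and its lower endpoint can be read off the CDF. A generic highest-density construction on a grid gives the same answer:

```python
import numpy as np

x = np.array([12.0, 14.0, 16.0])   # Jaynes's observations, as above
N, x1 = len(x), x.min()
mass = 0.90

# Closed form: the CDF of N * exp(N * (t - x1)) on t <= x1 is exp(N * (t - x1)),
# so the lower endpoint solves exp(N * (lo - x1)) = 1 - mass.
lo = x1 + np.log(1.0 - mass) / N
print(lo, x1)  # about (11.23, 12.0): Jaynes's 90% interval

# Generic check: take grid points in order of decreasing density until the
# requested mass is covered (a highest-density-region construction).
theta = np.linspace(8.0, 12.0, 100_001)
d = theta[1] - theta[0]
pdf = np.where(theta <= x1, N * np.exp(N * (theta - x1)), 0.0)
pdf /= pdf.sum() * d
order = np.argsort(pdf)[::-1]
keep = order[np.cumsum(pdf[order]) * d <= mass]
print(theta[keep].min(), theta[keep].max())  # same interval, up to grid error
```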
EDIT: for full effect, please do the calculation yourself.
I stopped reading cupholder’s comment before the last paragraph (to write my own reply) and completely missed this! D’oh!