If you start out with a maximum-entropy prior, then you never learn anything, ever, no matter how much evidence you observe: conditioning on what you have seen never shifts your predictions about what comes next. You do not even learn anything wrong; you remain exactly as ignorant as you began.
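To make that concrete, here is a minimal sketch (my own illustration, not from the original discussion): it puts a uniform, maximum-entropy prior over every possible length-5 sequence from a hypothetical 3-symbol alphabet, then computes the predictive distribution for the next symbol after various observed prefixes. The alphabet, the sequence length, and the `predictive` helper are all assumptions chosen for illustration.

```python
from itertools import product
from collections import defaultdict

ALPHABET = "abc"   # illustrative 3-symbol alphabet (an assumption, not from the post)
LENGTH = 5         # illustrative sequence length

# Maximum-entropy prior: every possible full-length sequence is equally probable.
prior = {seq: 1.0 / len(ALPHABET) ** LENGTH
         for seq in product(ALPHABET, repeat=LENGTH)}

def predictive(observed):
    """P(next symbol | observed prefix) under the uniform-over-sequences prior."""
    weights = defaultdict(float)
    total = 0.0
    for seq, p in prior.items():
        if seq[:len(observed)] == tuple(observed):
            weights[seq[len(observed)]] += p
            total += p
    return {sym: round(w / total, 3) for sym, w in sorted(weights.items())}

print(predictive(""))      # before observing anything
print(predictive("aa"))    # after observing "aa"
print(predictive("abca"))  # after observing "abca"
# All three calls print the same uniform distribution over {'a', 'b', 'c'}:
# no amount of evidence moves the predictions away from the prior.
```

With the same machinery but a prior that is not maximum-entropy (say, one that favors sequences with repeated symbols), the later calls would shift away from uniform, which is what "learning" looks like in this toy setup.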
Can you clarify what you mean here? Are you referring specifically to the monkey example or making a more general point?