Well, yeah, if you simplify “rational” down to “probability estimates”, then attempting to give a probability estimate of how rational you are is going to give some weird results.
(Analogously, if you had a computer program that did Bayesian calculations, and you asked it to take in these pieces of evidence and calculate the posterior probability that it performed a Bayesian calculation to get this posterior probability, you’d get similar weird results.)
It’s a case of presupposition. If you want more on Boltzmann brains, I think you need to delve into anthropics.
I much prefer to measure rationality as “the probability that a randomly chosen belief matches reality”. That way, your rationality is itself a probability: you update it whenever you get something right or wrong, and it serves as a good prior for quick judgments.
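A minimal sketch of what that bookkeeping might look like, assuming a Beta-Bernoulli model (the class name, prior parameters, and example track record are my own illustration, not anything from the comment above):

```python
from dataclasses import dataclass


@dataclass
class CalibrationTracker:
    """Tracks 'probability a randomly chosen belief matches reality' as a Beta posterior."""
    alpha: float = 1.0  # pseudo-count of beliefs that turned out correct (uniform prior)
    beta: float = 1.0   # pseudo-count of beliefs that turned out wrong

    def update(self, belief_was_correct: bool) -> None:
        # Bayesian update for a Bernoulli likelihood with a Beta prior:
        # each resolved belief bumps the matching pseudo-count by one.
        if belief_was_correct:
            self.alpha += 1.0
        else:
            self.beta += 1.0

    @property
    def estimate(self) -> float:
        # Posterior mean: usable as a prior for quick judgments on new claims.
        return self.alpha / (self.alpha + self.beta)


if __name__ == "__main__":
    tracker = CalibrationTracker()
    for outcome in [True, True, False, True]:  # hypothetical track record
        tracker.update(outcome)
    print(f"Estimated rationality: {tracker.estimate:.2f}")  # ~0.67 after 3 hits, 1 miss
```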