[SEQ RERUN] Infinite Certainty
Today’s post, Infinite Certainty, was originally published on 09 January 2008. A summary (taken from the LW wiki):
If you say you are 99.9999% confident of a proposition, you’re saying that you could make one million equally confident statements and be wrong, on average, only once. Probability 1 indicates a state of infinite certainty. Furthermore, once you assign probability 1 to a proposition, Bayes’ theorem says that it can never be changed in response to any evidence. Probability 1 is a lot harder to get to with a human brain than you would think.
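The claim that probability 1 can never be updated falls straight out of Bayes’ theorem: when the prior is 1, the evidence terms for the alternative hypothesis are multiplied by zero and drop out. A minimal sketch for a binary hypothesis (the function name and likelihood numbers are illustrative, not from the post):

```python
def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Posterior P(H | E) for a binary hypothesis H given evidence E.

    prior               -- P(H)
    likelihood_if_true  -- P(E | H)
    likelihood_if_false -- P(E | not H)
    """
    numerator = likelihood_if_true * prior
    denominator = numerator + likelihood_if_false * (1 - prior)
    return numerator / denominator

# Evidence that is 99x more likely if H is false than if H is true.
# A prior of exactly 1 is unmoved: the (1 - prior) term is zero.
print(bayes_update(1.0, 0.01, 0.99))       # stays exactly 1.0

# A prior of 0.999999 does shift, as any finite confidence must.
print(bayes_update(0.999999, 0.01, 0.99))  # drops below the prior
```

This is why the post treats probability 1 as qualitatively different from any finite confidence level rather than merely its endpoint.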
Discuss the post here (rather than in the comments to the original post).
This post is part of the Rerunning the Sequences series, where we’ll be going through Eliezer Yudkowsky’s old posts in order so that people who are interested can (re-)read and discuss them. The previous post was Absolute Authority, and you can use the sequence_reruns tag or rss feed to follow the rest of the series.
Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day’s sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.
Also see this follow-up by Yvain: Confidence levels inside and outside an argument.
Could someone please post the following summaries?
The Fallacy of Gray
Absolute Authority
Infinite Certainty
0 And 1 Are Not Probabilities
Beautiful Math
Expecting Beauty
Is Reality Ugly?
Beautiful Probability
Trust in Math
Done!
I think you should just register another username; the problem with the old one doesn’t seem to be getting fixed.
I tried a few times. Same problem.