Charlie Steiner answers
What is the probability that a superintelligent, sentient AGI is actually infeasible?
Charlie Steiner · 14 Aug 2022 23:23 UTC · −1 points
About 25 nines. So a probability of about 0.0000000000000000000000001, i.e. 10^-25.
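The "nines" shorthand converts directly to a residual probability: 25 nines of confidence that superintelligent AGI is feasible leaves 10^-25 probability that it is infeasible. A minimal sketch of that conversion (the variable names are illustrative, not from the answer):

```python
from decimal import Decimal

# "25 nines" of confidence in feasibility means
# P(feasible) = 0.999...9 (25 nines), so the residual is
# P(infeasible) = 1 - P(feasible) = 10^-25.
nines = 25
p_infeasible = Decimal(10) ** -nines

print(p_infeasible)  # the residual probability, 10^-25
```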