Yup. The MWI stuff is just a good local example of how not to justify what you believe. They're doing with AI the same thing Eliezer did with MWI: trying to justify beliefs they hold for not-very-rational reasons using advanced concepts they poorly understand, which only works on non-experts.