Yup. The MWI stuff is just a good local example of how not to justify what you believe. They’re doing with AI the same thing Eliezer did with MWI: trying to justify beliefs they hold for not very rational reasons with advanced concepts they poorly understand, which only works on non-experts.
Most of this seems unrelated to what the OP says. Are you sure you posted this in the right place?