I take the lie. The class of true beliefs has, on average, a significantly higher utility-for-believing than the class of false beliefs, but there is an overlap: the worst of the "true" beliefs is worse than the best of the "false" ones.
I'd actually be surprised if Omega couldn't program me with a true belief that caused me to drive my entire species to extinction, and probably to do worse than that. Because superintelligent optimisers are badass and wedrifids are Turing-complete.