It’s about collating reasons to believe civilization will collapse before it gets to spawn a rogue superintelligence that consumes all matter in the Laniakea supercluster. A good outcome, all things considered.
I’m curious, is that an honest belief of yours, or satire? There are some arguments that unaligned AI systems might be morally valuable.
Civilization collapsing is blatantly better than a rogue superintelligence, as it’s plausibly a recoverable disaster, so yes, that is my honest belief. I don’t consider non-organics to be moral entities, since I also believe they’re not sentient. Yeah, I’m aware those views are contested, but then, what the hell isn’t when it comes to philosophy? There are philosophers who argue for post-intentionalism, the view that our words, language, and thoughts aren’t actually about anything, for crying out loud.
I guess the Centre for Applied Eschatology would be right up your alley.