‘Singleton’ as I’ve seen it used seems to refer to one possible Singularity in which a single AI absorbs everyone and everything into itself, forming one colossal entity. We’d probably consider it a Bad Ending.
A singleton is a more general concept than an intelligence explosion. The specific case of a benevolent AGI singleton, a.k.a. FAI, is not a bad ending. Think of it as Nature 2.0, a supervised universe, not a dictator.
See Nick Bostrom (2005), “What is a Singleton?”
I stand corrected! Maybe this should be a wiki article; the term isn’t that common, but it’s awfully hard to google.
Done.