I think it’s partly meant to move from an abstract model of intelligence as a scalar variable increasing at some rate (a curve on an x/y graph) to concrete, material milestones. People can imagine “intelligence goes up rapidly! singularity!” without it being clear what that implies; I’m saying sufficient levels would imply eating the Sun, which makes it harder to confuse with things like “getting higher scores on math tests”.
I suppose a more general category would be: the relevant kind of self-improving intelligence is the sort that can repurpose mass-energy into more computation that can run its intelligence, and “eat the Sun” is an obvious target given that background notion of intelligence.
(Note: there is skepticism about the feasibility on Twitter/X, which is some info about how non-singularitarians react.)