I mostly agree, although I would also accept as “successfully self-replicating” either being sneaky enough (like your computer-virus example), or being self-sufficient enough to earn resources and spend them on acquiring the additional compute needed to create a copy of itself (and then actually doing so).
So, yeah, not quite at the red line point in my books. But not so far off!
I wouldn’t find this particularly alarming in itself though, since I think “barely over the line of able to sustain itself and replicate” is still quite a ways short of being dangerous or leading to an AI population explosion.