Am I the only one who thinks that the world as it is is unbelievably, fantastically, super bad? An AI destroying the world would only be bad because it would prevent a very good potential future from coming into existence, one that could not otherwise happen. Stopping the AI would remove all hope of that future ever happening.
I’m happy most days.
Not speaking for Flaglandbase, but I’d argue the world right now (or rather, life on earth) is super bad because it’s dominated by animal suffering. I’m also happy most days.
I agree with this, and on balance the overall history of the world is definitely one of extreme suffering.
For farmed animals in particular, we don’t need AGI to end their plight. Just regular economic growth and advocacy will do.
Also, given how long we've been suffering already, and how much is at stake, would it be so bad to delay AGI by 100 or 200 years? We could do a lot of alignment research in that time.
Yeah, if I got to decide, I would barely factor in how bad the world is right now. Delay AGI until further delay is outweighed by other x-risks.
Depends how much you weigh suffering against pleasure, I guess. If you think it's better to exist and experience positive things at the cost of great suffering, then this world is pretty awesome. If you would rather not exist (or, more accurately, believe that most beings would rather not exist if they had the choice), then things look pretty bad…
I agree the world right now is super bad. However, “delay AGI until we really know what we’re doing” doesn’t seem all that much harder than “delay AGI forever”, and most people do agree that alignment is solvable.[1] Right now, we still have far more people working on capabilities than on alignment (and they’ve been working on it for far longer); if we could change this, alignment may even get solved relatively quickly.
Eliezer has said this explicitly, e.g. on the Sam Harris podcast (Ctrl+F “alignment is impossible”).
I predict that most creatures would disagree with you, if an honest poll were taken about themselves rather than about some distant abstraction of other people. (EDIT: The link is about humans, but I predict most non-humans also prefer to be alive and aren’t better off dead.)
Which is also my prior on the attitude of “it’s fine if everyone dies” people. In historical cases where someone thought that, few people agreed, and we ended up glad they didn’t get their way. I’m sure it’s the same all over again here with you, and with some other people I’ve heard express this attitude.
You say “creatures”, but the linked source seems to be only about humans.
Yes, but I predict it will end up applying to most non-humans too.
Why would you be able to generalize from humans to factory farmed animals?
Most animals aren’t in a factory farm.