The idea is that the kind of AI this community is worried about is not the scenario that is common in sci-fi. A real AGI wouldn’t act like the ones in sci-fi.
I get where you’re going with this, but I think it’s either not true or not relevant. That is, it looks like a statement about the statistical properties of scifi (most AI in fiction is unrealistic), which might be false if you condition appropriately (there have been a bunch of accurate presentations of AI recently, so it’s not clear this still holds for contemporary scifi). What I care about, though, is whether that matters.
Suppose the line of argument is something like “scifi is often unrealistic,” “predicting based on unrealistic premises is bad,” and “this is like scifi because it’s unrealistic.” This is a weaker argument than one that just has the second piece and the third piece modified to say “this is unrealistic.” (And for this to work, we need to focus on the details of the argument.)
Suppose instead the line of argument is something like “scifi is often unrealistic,” “predicting based on unrealistic premises is bad,” and “this is like scifi because of its subject matter.” Obviously this leaves a hole—the subject matter may be something that many people get wrong, but does this presentation get it wrong?