I was thinking recently about how the phrase “artificial intelligence” causes bad intuition. The standard LW answer is to talk instead of “optimization processes”. That’s all right I guess.
In an unrelated event, I remembered the idea of the “improbability drive” from The Hitchhiker’s Guide to the Galaxy. The drive is a device that squeezes the probability distribution of the future into improbably good outcomes (like being randomly teleported across the universe due to quantum noise, or the hostess’s dress jumping two feet to the right).
Anyways, I thought it might be nice to explain an AI as an improbability drive. It employs a different set of intuitions.
As I recall, what the improbability drive did in practice was further the plot, at which point one character would remark “well, that was convenient” and be told “I know, right? How very improbable!”