One thing that I’ve tried with Google is using it to write stories. Start by searching on “Fred was bored and”.
Pick “slightly” from the results and search on “was bored and slightly”. Pick “annoyed” from those results and search on “bored and slightly annoyed”.
Trying this again just now reminds me that I let the sentence fragment grow and grow until I was down to, err, ten? hits. Then I took the next word from a hit that wasn’t making a literal copy, and deleted enough leading words to get the hit count back up.
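The grow-and-trim procedure described above can be sketched over a local corpus standing in for the web, with an occurrence count standing in for the Google hit count. The corpus text, the `min_hits` threshold, and always taking the first continuation are my own simplifications, not the original experiment (which picked from a hit that wasn’t a literal copy):

```python
# A tiny corpus stands in for the web; hit_count() stands in for the
# number of Google hits on the quoted fragment.
CORPUS = (
    "fred was bored and slightly annoyed . "
    "the dog was bored and slightly wet . "
    "fred was bored and tired . "
    "she was annoyed and left early . "
) * 3
TOKENS = CORPUS.split()

def hit_count(fragment):
    """Count occurrences of the word sequence `fragment` in the corpus."""
    n = len(fragment)
    return sum(TOKENS[i:i + n] == fragment for i in range(len(TOKENS) - n + 1))

def continuations(fragment):
    """All words that follow `fragment` somewhere in the corpus."""
    n = len(fragment)
    return [TOKENS[i + n] for i in range(len(TOKENS) - n)
            if TOKENS[i:i + n] == fragment]

def grow_story(seed, length=12, min_hits=2):
    story = seed.split()
    fragment = list(story)
    while len(story) < length:
        # Delete leading words until the fragment is common enough again,
        # mirroring "deleted enough leading words to get the hit count back up".
        while hit_count(fragment) < min_hits and len(fragment) > 1:
            fragment = fragment[1:]
        nexts = continuations(fragment)
        if not nexts:
            break
        word = nexts[0]  # simplification: always take the first continuation
        story.append(word)
        fragment.append(word)
    return " ".join(story)

print(grow_story("fred was bored and"))
```

With this toy corpus the story simply slides along whatever phrase is locally most attested, which is exactly why the output tracks the fragment length and nothing longer.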
Anyway, it seemed unpromising because the text lacked long-range coherence. Indeed, the thread of the sentences rarely seemed to run significantly longer than the search string itself.
Perhaps “unpromising” is too harsh. If I were making a serious Turing Test entry I would happily use Big Data and mine the web for grammar rules and idioms. On the other hand I feel the need for a new and different idea for putting some meaning and intelligence behind the words. Otherwise my chat bot would only be able to compete with humans who were terribly, terribly drunk and unable to get from one end of a sentence to the other kind of cricket match where England collapses and we lose the ashes on the way back from the crematorium, which really upset the, make mine a pint, now where was I?
Essentially, you tried to make a Markov-chain story generator. Yes, it generates this kind of text, where short fragments look like parts of meaningful prose, but any longer stretch reveals that it makes no sense.
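For comparison, a minimal bigram Markov chain shows the same failure directly: every adjacent word pair is attested in the training text, yet the whole has no thread. The toy training sentences here are my own invention; any corpus would do:

```python
import random
from collections import defaultdict

# Toy training text (made up for illustration).
TRAINING = ("fred was bored and slightly annoyed by the rain . "
            "the rain was cold and fred was tired . "
            "she was annoyed by the dog .").split()

# Map each word to the list of words that follow it (a bigram model).
follows = defaultdict(list)
for prev, word in zip(TRAINING, TRAINING[1:]):
    follows[prev].append(word)

# Walk the chain: each step is locally plausible, the whole is not.
random.seed(0)
word = "fred"
out = [word]
for _ in range(15):
    choices = follows.get(word)
    if not choices:
        break
    word = random.choice(choices)
    out.append(word)
print(" ".join(out))
```

Because the generator only ever conditions on the previous word, coherence cannot extend past the window it conditions on, which matches the observation above that the thread never outran the search string.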
It seems to me that there is a mental illness (though I don’t remember which one) in which people produce the same kind of speech. I’m not sure what the philosophical consequences are for the Turing test, though.
You’re probably thinking of word salad, most often associated with schizophrenia but not exclusive to it.