Over the last 12 years, I’ve chatted with small hundreds of people who were somewhere “in process” along the path toward “okay I guess I should take Singularity scenarios seriously.” From watching them, my guess is that the process of coming to take Singularity scenarios seriously is often even more disruptive than losing a childhood religion. Among many other things, I have seen it sometimes disrupt:
People’s belief that they should have rest, free time, some money/time/energy to spend on objects of their choosing, abundant sleep, etc.
“It used to be okay to buy myself hot cocoa from time to time, because there used to be nothing important I could do with money. But now—should I never buy hot cocoa? Should I agonize afresh each time? If I do buy a hot cocoa, does that mean I don’t care?”
People’s in-practice ability to “hang out”—to enjoy their friends, or the beach, in a “just being in the moment” kind of way.
“Here I am at the beach like my to-do list told me to be, since I’m a good EA who is planning not to burn out. I’ve got my friends, beer, guitar, waves: check. But how is it that I used to be able to enter ‘hanging out mode’? And why do my friends keep making meaningless mouth-noises that have nothing to do with what’s eventually going to happen to everyone?”
People’s understanding of whether commonsense morality holds, and of whether they can expect other folks in this space to also believe that commonsense morality holds.
“Given the vast cosmic stakes, surely doing the thing that is expedient is more important than, say, honesty?”
People’s in-practice tendency to have serious hobbies and to take a deep interest in how the world works.
“I used to enjoy learning mathematics just for the sake of it, and trying to understand history for fun. But it’s actually jillions of times higher value to work on [decision theory, or ML, or whatever else is pre-labeled as ‘AI risk relevant’].”
People’s ability to link in with ordinary institutions and take them seriously (e.g., to continue learning from their day job and caring about their colleagues’ progress and problems; to continue enjoying the dance club they used to dance at; to continue to take an interest in their significant other’s life and work; to continue learning from their PhD program; etc.).
“Here I am at my day job, meaninglessly doing nothing to help no one, while the world is at stake—how is it that before learning about the Singularity, I used to be learning skills and finding meaning and enjoying myself in this role?”
People’s understanding of what’s worth caring about, or what’s worth fighting for.
“So… ‘happiness’ is valuable, which means that I should hope we get an AI that tiles the universe with a single repeating mouse orgasm, right? … I wonder why imagining a ‘valuable’ future doesn’t feel that good/motivating to me.”
People’s understanding of when to use their own judgment and when to defer to others.
“AI risk is really really important… which probably means I should pick some random person at MIRI or CEA or somewhere and assume they know more than I do about my own career and future, right?”
I think that at least the kinds of “Singularity-disrupted” people Anna describes in “Reality-Revealing and Reality-Masking Puzzles” are in the fear game.