Over the last 12 years, I’ve chatted with small hundreds of people who were somewhere “in process” along the path toward “okay I guess I should take Singularity scenarios seriously.” From watching them, my guess is that the process of coming to take Singularity scenarios seriously is often even more disruptive than is losing a childhood religion. Among many other things, I have seen it sometimes disrupt:
People’s belief that they should have rest, free time, some money/time/energy to spend on objects of their choosing, abundant sleep, etc.
“It used to be okay to buy myself hot cocoa from time to time, because there used to be nothing important I could do with money. But now—should I never buy hot cocoa? Should I agonize freshly each time? If I do buy a hot cocoa does that mean I don’t care?”
People’s in-practice ability to “hang out”—to enjoy their friends, or the beach, in a “just being in the moment” kind of way.
“Here I am at the beach like my to-do list told me to be, since I’m a good EA who is planning not to burn out. I’ve got my friends, beer, guitar, waves: check. But how is it that I used to be able to enter ‘hanging out mode’? And why do my friends keep making meaningless mouth-noises that have nothing to do with what’s eventually going to happen to everyone?”
People’s understanding of whether commonsense morality holds, and of whether they can expect other folks in this space to also believe that commonsense morality holds.
“Given the vast cosmic stakes, surely doing the thing that is expedient is more important than, say, honesty?”
People’s in-practice tendency to have serious hobbies and to take a deep interest in how the world works.
“I used to enjoy learning mathematics just for the sake of it, and trying to understand history for fun. But it’s actually jillions of times higher value to work on [decision theory, or ML, or whatever else is pre-labeled as ‘AI risk relevant’].”
People’s ability to link in with ordinary institutions and take them seriously (e.g. to continue learning from their day job and caring about their colleagues’ progress and problems; to continue enjoying the dance club they used to dance at; to continue to take an interest in their significant other’s life and work; to continue learning from their PhD program; etc.)
“Here I am at my day job, meaninglessly doing nothing to help no one, while the world is at stake—how is it that before learning about the Singularity, I used to be learning skills and finding meaning and enjoying myself in this role?”
People’s understanding of what’s worth caring about, or what’s worth fighting for.
“So… ‘happiness’ is valuable, which means that I should hope we get an AI that tiles the universe with a single repeating mouse orgasm, right? … I wonder why imagining a ‘valuable’ future doesn’t feel that good/motivating to me.”
People’s understanding of when to use their own judgment and when to defer to others.
“AI risk is really really important… which probably means I should pick some random person at MIRI or CEA or somewhere and assume they know more than I do about my own career and future, right?”
I’m sure there are many people whose inner experience is like this. But, negative data point: Mine isn’t. Not even a little. And yet, I still believe AGI is likely to wipe out humanity.
Seconded: mine also isn’t.
Also, for what it’s worth, I don’t think of myself as the kind of person to naturally gravitate toward the apocalypse/“saving the world” trope. From a purely narrative-aesthetic perspective, I much prefer the idea of building novel things, pioneering new frontiers, realizing the potential of humanity, etc., as opposed to trying to prevent disaster, reduce risk, etc. I am quite disappointed in reality for not conforming to my literary preferences.
It’s interesting how people’s responses can be so different here. I’m someone who gets pretty extreme anxiety from the x-risk stuff, at least when I’m not repressing those feelings.
Yep. That just means this wasn’t written for you! I expect this wasn’t written for a lot of (most?) people here.
I really wish that the post had been written in a way that let me figure out it wasn’t for me sooner...
I think it would have saved a lot of time if the paragraph in bold had been at the top.
I came here to say something roughly like Jim’s comment, but… I think what I actually want is grounding? Like, sure, you were playing the addictive fear game and now think you’re out of it. But do you think I was? If you think there’s something that differentiates people who are and aren’t, what is it?
[Like, “your heart rate increases when you think about AI” isn’t a definitive factor one way or another, but probably you could come up with a list of a dozen such indicators, and people could see which are true for them, and we could end up with population statistics.]
I think that at least the kinds of “Singularity-disrupted” people that Anna describes in “Reality-Revealing and Reality-Masking Puzzles” are in the fear game.
I honestly don’t know. I lean toward no? But don’t believe me too much there.
The main one I’m interested in is “Do you recognize yourself in the dynamic I spelled out?”
I like Kaj bringing in that list. I think that’s helpful.
A lot of how I pick this stuff out isn’t a mental list. There’s a certain rushedness. A pressure to their excitement about and fixation on doomy things. Conversation flows in a particularly squeezy and jagged way. Body movements are… um… fitting of the pattern. :-P
There was a noticeable surge of this when Scott came out with “Meditations on Moloch”. I remember how at the EAG that year a bunch of people went and did a mock magical ceremony against Moloch. (I think Scott published it right as EAG was starting.) That totally had the energy of the thing I’m talking about. Playful, but freaked out.
I know this doesn’t help with the statistics thing. But I’m way less confident of “These are the five signs” than I am about this feeling tone.
Same. I feel somewhat jealous of people who can have a visceral in-body emotional reaction to X-risks. For most of my life I’ve been trying to convince my lizard brain to feel emotions that reflect my beliefs about the future, but it’s never cooperated with me.