Getting caught up in an information cascade that says AGI is arriving soon. A person who’s “feeling the AGI” has “vibes-based” reasons for their short timelines due to copying what the people around them believe. In contrast, a person who looks carefully at the available evidence and formulates a gears-level model of AI timelines is doing something different than “feeling the AGI,” even if their timelines are short. “Feeling” is the crucial word here.
It seems to me that I have done a lot of careful thinking about timelines, and that I also feel the AGI. Why can't you have a careful understanding of what timelines we should expect, and also have an emotional reaction to that? Reasonably concluding that many things will change greatly in the next few years deserves a reaction.
As I use the term, the presence or absence of an emotional reaction isn’t what determines whether someone is “feeling the AGI” or not. I use it to mean basing one’s AI timeline predictions on a feeling.