What is the best or most proper definition of “Feeling the AGI”?
I really like this phrase and identify with it strongly. I have used it at times to describe friends who have had that realization of where we are heading. However, when I get asked what Feeling the AGI means, I struggle to come up with a concise way to define the phrase.
What are the best definitions of Feeling the AGI that you have heard, read, or even come up with yourself?
I tend to associate “feeling the AGI” with being able to make inferences about the consequences of AGI that are not completely idiotic.
Are you imagining that AGI means that Claude is better and that some call center employees will lose their jobs? Then you’re not feeling the AGI.
Are you imagining billions and then trillions of autonomous brilliant entrepreneurial agents, plastering the Earth with robot factories and chip factories and solar cells? Then you’re feeling the AGI.
Are you imagining a future world where the idea of a human starting a company or making an important government decision is as laughably absurd as the idea of a tantrum-prone kindergartener starting a company or making an important government decision? Then you’re feeling the AGI.
The economists who forecast that AGI will cause GDP growth to increase by less than 50 percentage points are definitely not feeling the AGI. Timothy B. Lee definitely does not feel the AGI. I do think there are lots of people who “feel the AGI” in the sense of saying things about the consequences of AGI that are not completely, transparently idiotic, but who are still wrong about the consequences of AGI. Feeling the AGI is a low bar! Actually getting it right is much harder! …At least, that’s how I interpret the term “feel the AGI”.
I think it’s about salience. If you “feel the AGI,” then you’ll automatically remember that transformative AI is a thing that’s probably going to happen, whenever it’s relevant (e.g. when planning AI strategy, or when making 20-year plans for just about anything). If you don’t feel the AGI, then even if you agree, when reminded, that transformative AI is a thing that’s probably going to happen, you don’t remember it by default, and you keep making plans (or publishing papers about the economic impacts of AI, or whatever) that assume it won’t.
Getting caught up in an information cascade that says AGI is arriving soon. A person who’s “feeling the AGI” has “vibes-based” reasons for their short timelines, arrived at by copying what the people around them believe. In contrast, a person who looks carefully at the available evidence and formulates a gears-level model of AI timelines is doing something different from “feeling the AGI,” even if their timelines are short. “Feeling” is the crucial word here.
It seems to me that I have done a lot of careful thinking about timelines, and that I also feel the AGI. Why can’t you have a careful understanding of what timelines we should expect, and also have an emotional reaction to that? Reasonably coming to the conclusion that many things will change greatly in the next few years deserves a reaction.
As I use the term, the presence or absence of an emotional reaction isn’t what determines whether someone is “feeling the AGI” or not. I use it to mean basing one’s AI timeline predictions on a feeling.