My understanding is that there are people like Elon Musk and Bill Gates who have said something like that, but I think we probably need something with more substance than “we should pay more attention to it”.
Fun fact: Elon Musk and Bill Gates have actually stopped saying that. Now it’s mostly crypto people like Sam Bankman-Fried and Peter Thiel, who will likely take the blame if revelations break that crypto was always just rich people minting worthless tokens and selling them to poor people.
It’s really easy to imagine a NYT article pointing fingers at the people who donate 5% of their income to a cause (AGI) that has nothing to do with inequality, or to malaria interventions in Africa that “ignore people here at home”. That’s why I think there should be plenty of ways to explain AGI to people with short attention spans: anger and righteous rage might one day be the thing that keeps their attention spans short.
This is a serious problem that most proposed AI governance and outreach plans leave unaddressed. It’s not an unsolvable problem either, which is what irks me.